What do you mean by the term gravity?
Answers
In physics, gravity is the natural force that causes objects to fall toward the Earth.
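To make this concrete (a standard result, not stated in the answer above): near the Earth's surface every object is pulled downward with a force proportional to its mass,

$$ F = mg, \qquad g \approx 9.8\ \mathrm{m/s^2}, $$

so, for example, a 2 kg object is pulled toward the Earth with a force of about $2 \times 9.8 \approx 19.6\ \mathrm{N}$.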
Gravity has played a big part in making the universe the way it is. Gravity is what makes pieces of matter clump together into planets, moons, and stars. Gravity is what makes the planets orbit the stars, just as Earth orbits our star, the Sun. Gravity is what makes the stars clump together in huge, swirling galaxies.
In other words, gravity is the force of attraction between the Earth and objects on or near its surface, and more generally between any two masses.
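For completeness, this attraction is described by Newton's law of universal gravitation (a standard formula, not quoted in the answer above): any two masses $m_1$ and $m_2$ separated by a distance $r$ attract each other with force

$$ F = \frac{G m_1 m_2}{r^2}, \qquad G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2/kg^2}, $$

where $G$ is the gravitational constant. The same formula accounts for both a falling object at the Earth's surface and the Moon's orbit around the Earth.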