"The Guide says that there is an art to flying," said Ford, "or rather a knack. The knack lies in learning how to throw yourself at the ground and miss."
from Life, the Universe and Everything by Douglas Adams
A frequent question asked about satellites is why they don't simply fall to Earth. The same question used to be (and still is) asked about the Moon, our only natural satellite. The ancients bypassed this question with the simple assumption that the Moon, Sun, stars, and planets belonged to an entirely different realm and were not subject to the same physics as mundane terrestrial objects. Newton changed all that with his three universal laws of motion and his law of universal gravitation. Under these laws, the Moon must be pulled towards the Earth by the same gravitational force that pulls a baseball back down after it has been hit into the air. Now the question seems to become a real problem: why doesn't the Moon fall like the baseball?
Actually, Newton's laws are quite adequate for solving the problem, if you have the right perspective.
If you remember anything from high school physics, you probably remember projectile motion. Any object thrown near the Earth's surface follows a parabolic path: the horizontal component of its velocity remains constant, while the vertical component undergoes uniformly accelerated motion with the acceleration given by gravity, g = 9.8 m/s². Beyond that, some vague memory of complicated algebraic word problems no doubt stirs a sense of unease in the pit of your stomach. So let's see if we can't simplify things a little bit.
If we drop an object near the surface of the Earth, it falls towards the surface with constant acceleration, g. This means that, as it falls, it speeds up (specifically, its speed increases by 9.8 m/s for each second it is falling). If we measure the distance fallen, we find that the object has dropped 4.9 meters after 1 second, 19.6 m after 2 sec, 44.1 m after 3 sec, etc. The equation for the distance fallen after t seconds is

d = (1/2)gt² = 4.9 t² meters
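The free-fall numbers above are easy to check. Here is a short Python sketch of the calculation; the only input is g = 9.8 m/s²:

```python
g = 9.8  # acceleration due to gravity, m/s^2

def fall_distance(t):
    """Distance in metres fallen from rest after t seconds: d = (1/2) g t^2."""
    return 0.5 * g * t**2

for t in (1, 2, 3):
    print(f"after {t} s: {fall_distance(t):.1f} m")  # 4.9, 19.6, 44.1
```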
That's for an object dropped from rest, but what if the object is thrown? Without gravity, we know that an object thrown with a certain initial velocity will continue in a straight line at constant velocity (Newton's first law). The distance it travels in each second will be the same as it travelled the previous second (or in any other second). The projectile's motion can be thought of as a combination of this very simple motion with the "free fall" motion described above. In other words, first work out the straight-line, constant-speed motion the projectile would have if there were no gravity; this gives the position the object would reach without gravity. The effect of gravity is that the object falls vertically below that straight-line path by a distance given by the free-fall distance, as shown in the figure.
We can apply this to an object thrown horizontally. Without gravity, it would travel in a horizontal line at constant speed. With gravity, it falls below that horizontal line. So if I throw a baseball horizontally at 1 m/s, it will travel a horizontal distance of 1 m in the first second of travel, but it will fall a vertical distance of 4.9 m below that horizontal line in that same second. It will therefore be 4.9 m closer to the ground after 1 second of travel (assuming the ground is level, which is the same thing as horizontal). No matter what speed I throw the ball, it will fall vertically the same distance of 4.9 m in the first second. The horizontal distance travelled will depend on how fast I throw, but the vertical distance will not. So if I throw at 5 m/s, it will travel 5 m in the first second, if I throw at 50 m/s it will travel 50 m, and if I throw at 500 m/s it will travel 500 m, all while falling 4.9 m below the original horizontal line.
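The "straight line plus fall" picture above can be sketched in a few lines of Python. The function below returns both pieces of the motion for a ball thrown horizontally; as in the text, air resistance and the Earth's curvature are ignored:

```python
g = 9.8  # acceleration due to gravity, m/s^2

def thrown_horizontally(v0, t):
    """Ball thrown horizontally at v0 m/s: returns (horizontal distance,
    vertical drop below the launch line) after t seconds, in metres.
    Gravity only; air resistance and the Earth's curvature are ignored."""
    return v0 * t, 0.5 * g * t**2

for v0 in (1, 5, 50, 500):
    x, drop = thrown_horizontally(v0, 1.0)
    print(f"thrown at {v0} m/s: {x:.0f} m across, {drop:.1f} m down")
```

Whatever the throwing speed, the drop after 1 second comes out to the same 4.9 m; only the horizontal distance changes.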
Let's take a break for a minute. There are some simplifying assumptions being used here that we haven't mentioned. For example:

1. We have ignored air resistance.
2. We have treated gravity as constant, always pulling in the same direction with the same strength.
3. We have assumed that the ground is flat.

Let's see what happens when we do away with these assumptions, starting with the last one.
What if the projectile is launched at 5000 m/s? Then it will travel a horizontal distance of 5000 m (5 km) in the first second. By the end of that time it will still have fallen 4.9 m below the horizontal, but because of the curvature of the Earth, it will not be 4.9 m closer to the ground. The level of the ground will have dropped as well, so the height of the ball will have decreased by less than 4.9 m. In fact, if we throw fast enough, say at 10,000 m/s, we will find that the ground has dropped away by more than 4.9 m, so the height of the ball above the ground will actually have increased, even though the ball has still fallen 4.9 m. If we pick just the right initial speed (about 7900 m/s near the Earth's surface), we can arrange for the height above the ground to be the same after 1 second.
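You can check where these numbers come from with a little circle geometry. The sketch below (assuming a mean Earth radius of 6.371 × 10⁶ m) computes how far the ground curves away beneath a horizontal line over a given horizontal distance:

```python
import math

R = 6.371e6  # mean radius of the Earth, m (assumed round value)
g = 9.8      # acceleration due to gravity, m/s^2

def ground_drop(x):
    """How far the Earth's surface curves away below a horizontal line
    over a horizontal distance x, in metres (circle geometry)."""
    return R - math.sqrt(R**2 - x**2)

fall = 0.5 * g * 1.0**2  # the ball falls 4.9 m in the first second
for v0 in (5000, 7900, 10000):
    print(f"at {v0} m/s the ground drops {ground_drop(v0):.1f} m "
          f"while the ball falls {fall} m")

# The fall exactly matches the curvature when v = sqrt(g * R):
print(math.sqrt(g * R))  # about 7900 m/s
```

At 5000 m/s the ground drops away by only about 2 m (the ball gets closer to it), at 10,000 m/s by nearly 8 m (the ball gets farther from it), and at about 7900 m/s the two effects cancel.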
Time to look at the second assumption. We know from Newton's law of gravity that the force of gravity decreases with distance from the center of the attracting object (in this case, the Earth). Since we are setting the conditions to keep the ball at approximately the same distance from the center of the Earth, you might think that we don't have to worry about this one. But Newton's law tells us something else as well: the direction of gravity always points towards the center of the attracting object (again, the Earth). What this means is that the direction we refer to as "down" varies as we travel around the globe. Down will always be perpendicular to level ground.

To make a long story short, Newton was able to show that, for the right speed (chosen so that the height of the ball after 1 second is the same as it was initially, as described above), the ball will still be travelling parallel to the ground ("horizontally") after 1 second, with the same horizontal speed it had originally and with gravity pulling it "down" towards the ground. This is identical to the situation when it was originally thrown, so we can conclude that the same thing will be true after the next second (and the next, and so on). The ball will continue to fall 4.9 meters each second, but will always remain the same height above the ground. In fact, it will travel in a circular path all the way around the world, right back to its starting point. It becomes a satellite in a circular orbit.
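Newton's argument can be tried numerically. The sketch below steps his laws forward in small time increments (a semi-implicit Euler scheme), with gravity always pointing at the Earth's center; the values GM ≈ 3.986 × 10¹⁴ m³/s² and R = 6.371 × 10⁶ m are standard figures for the Earth, and the starting point "at the surface" ignores the atmosphere:

```python
import math

GM = 3.986e14   # gravitational parameter of the Earth, m^3/s^2 (standard value)
R = 6.371e6     # mean radius of the Earth, m; the orbit grazes the surface

# Speed for a circular orbit at radius R: v = sqrt(GM / R), about 7900 m/s.
v = math.sqrt(GM / R)

x, y = R, 0.0        # start at the surface
vx, vy = 0.0, v      # thrown horizontally
dt = 1.0             # time step, s
for _ in range(6000):  # a bit more than one ~84-minute orbit
    r = math.hypot(x, y)
    ax, ay = -GM * x / r**3, -GM * y / r**3  # gravity points at the center
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt

# The distance from the center stays essentially constant: a circular orbit.
print(math.hypot(x, y) / R)
```

The ball keeps falling towards the center every second, yet its distance from the center never changes appreciably, which is exactly Newton's conclusion.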
Of course, you can't do this near the Earth's surface, because air resistance (which invalidates the first assumption) will slow things down (actually, at 8 km/s it will burn things up!). For it to work, you must get above the Earth's atmosphere, where friction is negligible. Newton theorized that if you could build a tower tall enough to reach above the atmosphere, it would be possible to launch a projectile horizontally from its top into orbit around the Earth. Such a tower is beyond current engineering capabilities, but it can easily be simulated on a computer. You can try out a Java implementation of Newton's Tower to get a feel for how this works.
A few things to notice while you're up there: