In the simplest terms, latency is another word for delay, or lag. When you send or receive information over the internet, it has to travel a maze of cables, routers, and other transmission equipment to get from one point to another; if that trip takes a long time, the latency is said to be high.
When a packet of information is sent, a variety of factors can contribute to latency. First is simply the distance it has to travel. Then there's the transmission medium (fiber optic, WiFi, etc.). As mentioned above, routers examine each packet, possibly updating information in it such as the hop count. And of course the computers and hard drives at either end can introduce latency as well.
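To get a feel for the distance factor alone, here's a minimal back-of-envelope sketch. It assumes (these numbers are not from the article) that a signal travels through fiber at roughly two-thirds the speed of light, about 200,000 km/s, and uses a ballpark New York–to–London cable distance of 5,600 km:

```python
# Rough, illustrative estimate of one-way propagation delay.
# Assumption: signals in fiber travel at ~200,000 km/s
# (about 2/3 the speed of light in a vacuum).
FIBER_SPEED_KM_PER_S = 200_000

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds over the given distance."""
    return distance_km / FIBER_SPEED_KM_PER_S * 1000

# Hypothetical example: ~5,600 km, roughly New York to London by cable.
print(propagation_delay_ms(5600))  # -> 28.0 (milliseconds, one way)
```

Note that 28 ms is only the physics floor for that one leg; every router hop, queue, and endpoint along the path adds to it, which is why real round-trip times are considerably higher.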
When latency is very small, it's usually unnoticeable. As it creeps up, it starts to degrade the user experience, and in a case like a video game it can make the application absolutely unusable. Imagine a driving game: you see a curve and try to steer into it, but the latency is so high that by the time the packet reporting your steering input reaches the destination computer, you've already crashed.
In "getting it done faster" news, TechRepublic is reporting that "Fastpass: Centralized network arbitration eliminates data-center latency."