mdonnelly says:
No, nothing is "thrown away". A "good" compression algorithm looks for repeatable sequences that can be expressed with fewer bytes.
Look at a zip file. You can compress the average ASCII file by about 75%, and get every piece of that file back exactly as it was. Binary files with random bytes will not compress as well, but they do compress, and all the data is extracted to recreate the original. Try zipping up an exe file, then extract it and run the program. It works.
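To see the lossless round trip for yourself, here's a quick sketch using Python's zlib module (the same DEFLATE algorithm zip files use) on some repetitive text:

```python
import zlib

# Lossless round trip: compress, decompress, and verify every byte
# comes back exactly as it was -- the same guarantee a zip file makes.
original = b"the quick brown fox jumps over the lazy dog " * 200
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original  # nothing was thrown away
ratio = 1 - len(compressed) / len(original)
print(f"compressed {len(original)} bytes down to {len(compressed)} ({ratio:.0%} smaller)")
```

Random bytes fed through the same code will barely shrink at all, which is exactly the point about exe files and other binary data.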
RAR does the same as zip, but the algorithm is better, so the compressed file size is smaller.
Please do some reading up on video compression algorithms. MPEG-2, MPEG-4 and VC-1 are lossy compression algorithms. Bits into the encoder != bits out of the decoder, which is the very definition of a lossy codec. However, they've gotten much, much better in the years since MPEG-2 came out, and even MPEG-2 itself has improved over the years since its introduction.
MPEG-4 and VC-1 both use different approaches to the compression than MPEG-2 and as such are able to use a higher level of compression.
Keep in mind the goal is to compress without noticeable artifacts creeping into the equation.
The raw, uncompressed bit rate for HDTV is 1920 (width) x 1080 (height) x 24 (bit depth) x 30 (complete frames) per second. This translates to just under 1.5 Gigabits per second for raw, uncompressed HD. ATSC allows for a maximum bandwidth of roughly 19.2 Megabits per second with MPEG-2 encoding. That's a factor of ~75:1 compression. You simply cannot get that level of compression with a lossless compression algorithm.
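The back-of-the-envelope math above is easy to check; run through it and the required ratio lands right around the figure quoted:

```python
# HD bit-rate math from the paragraph above.
width, height = 1920, 1080   # pixels per frame
bit_depth = 24               # bits per pixel
fps = 30                     # complete frames per second

raw_bps = width * height * bit_depth * fps
atsc_bps = 19.2e6            # ATSC MPEG-2 payload, ~19.2 Mbit/s

print(f"raw HD: {raw_bps / 1e9:.2f} Gbit/s")                # just under 1.5 Gbit/s
print(f"compression needed: ~{raw_bps / atsc_bps:.0f}:1")   # ~78:1
```

Lossless algorithms top out around 2:1 or 3:1 on typical video, nowhere near the ~75:1 needed here.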
So yes, information is in fact thrown away; however, the algorithms are very smart about what gets thrown away. The same occurs with the Dolby Digital and DTS audio algorithms.
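A toy illustration of "bits in != bits out": quantize 8-bit samples down to 4 bits (discarding the low bits), then reconstruct. This is a crude stand-in for the perceptual quantization real lossy codecs apply, but it shows why the decoder can never give back the exact original:

```python
# Quantize 8-bit sample values to 4 bits, then reconstruct.
# The reconstruction is close, but the discarded bits are gone for good.
samples = [12, 37, 200, 129, 255, 64]

quantized = [s >> 4 for s in samples]        # keep only the top 4 bits
reconstructed = [q << 4 for q in quantized]  # best guess at the original

print(samples)        # [12, 37, 200, 129, 255, 64]
print(reconstructed)  # [0, 32, 192, 128, 240, 64]
```

Real codecs like MPEG-2 quantize frequency coefficients rather than raw pixels, throwing away detail the eye is least likely to miss, but the principle is the same.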
This isn't the computer world, with its Lempel-Ziv style entropy-based compression algorithms. Here's a quick tutorial showing you that MPEG-2 and MPEG-4 are indeed lossy compression algorithms.
Cheers,