HD-Lite significantly degrades the picture even for 1280 sources and 1280 TV sets!
The point I am trying to make is very simple: every time you rescale the image you are significantly degrading it - in particular, you are losing visible image resolution, or picture clarity. Not only when you are scaling down, but also when you are scaling up!
When you are scaling down from 1920 to 1280 you essentially have to throw away one of every three pixels, and you have to alter the remaining two. So you are losing one third of the image detail, or more. This part is obvious.
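To see the arithmetic, here is a minimal sketch of a 3:2 downscale using a simple averaging filter (an assumption for illustration - real broadcast scalers use more sophisticated polyphase kernels, but any filter has to blend neighboring pixels together):

```python
# Minimal sketch of a 3:2 horizontal downscale (1920 -> 1280).
# Each output pixel covers 1.5 source pixels, so every group of
# 3 source pixels (a, b, c) must be blended into 2 output pixels.
def downscale_3_to_2(row):
    out = []
    for i in range(0, len(row) - len(row) % 3, 3):
        a, b, c = row[i], row[i + 1], row[i + 2]
        out.append((a + 0.5 * b) / 1.5)   # left output: all of a, half of b
        out.append((0.5 * b + c) / 1.5)   # right output: half of b, all of c
    return out

fine_detail = [0, 255, 0, 255, 0, 255]    # 6 pixels of single-pixel detail
print(downscale_3_to_2(fine_detail))      # [85.0, 85.0, 170.0, 170.0]
```

The single-pixel black/white alternation comes out as blocks of mid-grays: the finest detail is exactly what the downscale destroys first.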
It might be much less obvious that you are also degrading the image when you are scaling it up. When you go up from 1280 to 1920 you have to artificially add one pixel for every two pixels of the source image. And this cannot be done without some image degradation.
As a quick illustration, imagine that as a source you have a 1280-pixel-wide test pattern with alternating black and white pixels (vertical stripes). If you had a way of passing that image unaltered to an HDTV display with the same 1280 native horizontal resolution, you would see a nice, smooth gray field consisting of black and white vertical stripes, one pixel wide each.
Now, let's take that alternating pixel pattern and rescale it to 1920. Guess what? You will no longer have that sharp pattern! Since you converted every two pixels into three, you have altered the image. It's not the same pattern any more! Even if you then convert it back to 1280, the resulting 1280 image will be very different from the original 1280 image. You will not get that original pattern of black and white pixels: white pixels will no longer be white, and black pixels will no longer be black.
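That stripe experiment is easy to simulate. The toy model below assumes plain linear interpolation for the 2:3 upscale and simple averaging for the 3:2 downscale (real scalers differ in the filter, not in the outcome), and runs an alternating black/white pattern up to 1.5x size and back:

```python
def upscale_2_to_3(row):
    """2:3 upscale: every source pixel pair (a, b) becomes three pixels
    (a, (a+b)/2, b) -- the middle one is synthesized by interpolation."""
    out = []
    for i in range(0, len(row) - 1, 2):
        a, b = row[i], row[i + 1]
        out.extend([a, (a + b) / 2, b])
    return out

def downscale_3_to_2(row):
    """3:2 downscale: every 3 pixels are blended back into 2 by averaging."""
    out = []
    for i in range(0, len(row) - len(row) % 3, 3):
        a, b, c = row[i], row[i + 1], row[i + 2]
        out.extend([(a + 0.5 * b) / 1.5, (0.5 * b + c) / 1.5])
    return out

pattern = [0, 255] * 4                    # alternating black/white pixels
round_trip = downscale_3_to_2(upscale_2_to_3(pattern))
print(round_trip)                         # [42.5, 212.5, 42.5, 212.5, ...]
```

Same pixel count as the original, but the blacks came back as 42.5 and the whites as 212.5: the one-pixel stripes survive only as a lower-contrast shadow of themselves.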
This illustrates the point I am trying to make: every time you rescale the image you are losing something. Whether you go up or down. The more times you rescale the image, the softer the image gets. (As Joe Kane said in a more philosophical statement in his class in Denver the other week: "You only get exactly what the message says when there is a total match between what's transmitted and what's received. If there is any deviation in any direction in the receiver away from what the transmitter is doing you are going to lose something.")
It's a common misconception that if your TV set cannot resolve 1920 lines, or if the source is not true 1920, then HD-Lite is not an issue for you. This is not true. Additional image conversions introduced by the satellite service provider degrade the image and the results of this image degradation can be observed even if the source is not 1920 or if the TV set's native horizontal resolution is only 1280 or 1366.
For the sake of argument, let's assume that the source image (be it HDCAM or whatever) is encoded as 1280x1080. Usually it's better than that, but for simplicity, let's assume that it's only 1280 H. And let's assume that your TV set's native horizontal resolution is just 1280 too. Here is what happens with that image before it gets to your TV set, the way I understand it (and I am oversimplifying the process here and focusing on horizontal resolution transformations only):
Step A: You start with a 1280x1080 source image (in our example).
Step B: The image is then converted internally by the satellite service provider (or by the content provider) to 1920x1080 - and stored on the master tape (or hard drive) ready for the broadcast/uplink.
Step C: The 1920x1080 image is then converted to 1280x1080 during the satellite uplink (this is the first "HD-Lite conversion").
Step D: Your satellite receiver box cannot output 1280x1080, as this is not a standard resolution, so it rescales the image back to 1920x1080 (this is the second "HD-Lite conversion").
Step E: The scaler in your TV set rescales the 1920x1080 image to the native resolution of the TV set - say, 1280x720, 1366x768, etc.
Every step of the way you are degrading the image and losing picture clarity. You may start with a perfect 1280 image (at step A) and end up with an image that also has 1280 pixels of horizontal resolution (at step E), but the final image will be very different from the original one. You will no longer be able to visually resolve the same level of detail and the picture will be much softer. Why? Because those two additional "HD-Lite conversions" (from 1920 to 1280 and then back to 1920, at steps C and D) took their toll on the picture quality. And a very significant toll: you lost almost one third of your image detail during just those two steps! And that's on top of any other image degradation caused by other conversions, MPEG-2 compression, etc.
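Steps A through E can be put into a toy simulation. The sketch below uses a generic linear-interpolation resampler (an assumption standing in for whatever filter each device in the chain really uses - no filter can undo the chain) and measures "detail" as the average difference between neighboring pixels, which is 255 for a perfect one-pixel stripe pattern:

```python
def resample(row, n_out):
    """Resample a 1-D pixel row to n_out samples by linear interpolation
    (a stand-in for the real scaler at each step of the chain)."""
    n_in = len(row)
    out = []
    for j in range(n_out):
        x = j * (n_in - 1) / (n_out - 1)  # position in source coordinates
        i = int(x)
        frac = x - i
        nxt = row[min(i + 1, n_in - 1)]
        out.append(row[i] * (1 - frac) + nxt * frac)
    return out

def detail(row):
    """Average neighbor-to-neighbor difference: a crude sharpness metric."""
    return sum(abs(row[i + 1] - row[i]) for i in range(len(row) - 1)) / (len(row) - 1)

src = [0, 255] * 640                      # Step A: 1280-wide stripe pattern
master = resample(src, 1920)              # Step B: upconverted master
uplink = resample(master, 1280)           # Step C: first HD-Lite conversion
receiver = resample(uplink, 1920)         # Step D: second HD-Lite conversion
tv = resample(receiver, 1280)             # Step E: TV scales to native 1280
print(detail(src), detail(tv))            # detail collapses down the chain
```

The output row at step E has exactly as many pixels as the source at step A, yet its measured detail is far below the source's - which is the whole point: the pixel count came back, the picture clarity did not.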
Of course, the difference will be more noticeable when both the source and your display are capable of resolving 1920x1080. But the main point I am trying to make here is that even if both the source and the TV set have lower resolution (say, 1280), the two artificial "HD-Lite image conversions" introduced by the satellite service provider will still significantly degrade your resulting image. Don't think that just because your TV set is not 1920x1080, or because a particular source is not 1920x1080, HD-Lite is not an issue for you. The two additional image conversions degrade your image regardless of the source resolution and regardless of the native resolution of your display. Your TV set and the source might be capable of resolving 1280 lines of horizontal resolution or more, but because of those additional image conversions, you will get much less than that.