DIRECTV Getting Sued over HD Lite

HD-Lite significantly degrades the picture even for 1280 source and 1280 TV sets!

Don Landis said:
That is, we have to assume we start with a program source that is indeed 1080i x 1920 pixels at the point of origin...
No, Don, I wasn't making such an assumption in my post above. I think you are still missing the point, my friend. OK, I will try to explain it in more detail.

The point I am trying to make is very simple: every time you rescale an image you degrade it significantly; in particular, you lose visible image resolution, or picture clarity. Not only when you scale down, but also when you scale up!

When you scale down from 1920 to 1280 you essentially have to throw away one of every three pixels, and you have to alter the remaining two. So you are losing one third of the image detail, or more. This part is obvious.

It might be much less obvious that you also degrade the image when you scale it up. When you go up from 1280 to 1920 you have to artificially add one pixel for every two pixels of the source image, and this cannot be done without some image degradation.

As a quick illustration, imagine that as a source you have a 1280-pixel-wide test pattern with alternating black and white pixels (vertical stripes). If you had a way of passing that image unaltered to an HDTV display with the same 1280 native horizontal resolution, you would see a nice smooth gray field consisting of black and white vertical stripes, one pixel wide each.

Now, let's take that alternating pixel pattern and rescale it to 1920. Guess what? You no longer have that sharp pattern! Since you converted every two pixels into three, you have altered the image. It's not the same pattern any more! Even if you then convert it back to 1280, the resulting 1280 image will be very different from the original. You will not get back that original pattern of black and white pixels: white pixels will no longer be white, and black pixels will no longer be black.
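The stripe-pattern argument can be sketched numerically. The following is a hypothetical illustration only, using simple linear interpolation (real scalers use better filters, but the effect is the same in kind): build a 1280-wide row of alternating black/white pixels, rescale it to 1920 and back, and count how many pixels changed.

```python
def resample(row, new_width):
    """Resample a 1D row of pixel values to new_width using linear interpolation."""
    old_width = len(row)
    out = []
    for i in range(new_width):
        # Map the output pixel position back into source coordinates.
        x = i * (old_width - 1) / (new_width - 1)
        lo = int(x)
        hi = min(lo + 1, old_width - 1)
        frac = x - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# A 1280-wide test pattern of alternating white (255) and black (0) pixels.
original = [255 if i % 2 == 0 else 0 for i in range(1280)]

# Rescale up to 1920, then back down to 1280.
round_trip = resample(resample(original, 1920), 1280)

# Count the pixels that no longer match the original pattern.
changed = sum(1 for a, b in zip(original, round_trip) if abs(a - b) > 1)
print(f"{changed} of 1280 pixels changed after 1280 -> 1920 -> 1280")
```

Running this shows that the vast majority of pixels end up as mid-gray blends rather than pure black or white, which is exactly the "white pixels are no longer white" effect described above.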

This illustrates the point I am trying to make: every time you rescale the image you are losing something. Whether you go up or down. The more times you rescale the image, the softer the image gets. (As Joe Kane said in a more philosophical statement in his class in Denver the other week: "You only get exactly what the message says when there is a total match between what's transmitted and what's received. If there is any deviation in any direction in the receiver away from what transmitter is doing you are going to lose something.")

It's a common misconception that if your TV set cannot resolve 1920 lines, or if the source is not true 1920, then HD-Lite is not an issue for you. This is not true. Additional image conversions introduced by the satellite service provider degrade the image and the results of this image degradation can be observed even if the source is not 1920 or if the TV set's native horizontal resolution is only 1280 or 1366.

For the sake of argument, let's assume that the source image (be it HDCAM or whatever) is encoded as 1280x1080. Usually it's better than that, but for simplicity, let's assume it's only 1280 H. And let's assume that your TV set's native horizontal resolution is just 1280 too. Here is what happens to that image before it gets to your TV set, the way I understand it (and I am oversimplifying the process here and focusing on horizontal resolution transformations only):

Step A: You start with a 1280x1080 source image (in our example).
Step B: The image is then converted internally by the satellite service provider (or by the content provider) to 1920x1080 - and stored on the master tape (or hard drive) ready for the broadcast/uplink.
Step C: The 1920x1080 image is then converted to 1280x1080 during the satellite uplink (the first "HD-Lite conversion").
Step D: Your satellite receiver box cannot output 1280x1080, as this is not a standard resolution, so it rescales it back to 1920x1080 (the second "HD-Lite conversion").
Step E: The scaler in your TV set rescales the 1920x1080 image to the native resolution of the TV set: say, 1280x720, 1366x768, etc.

Every step of the way you degrade the image and lose picture clarity. You may start with a perfect 1280 image (at step A) and end up with an image that also has 1280 pixels of horizontal resolution (at step E), but the final image will be very different from the original one. You will no longer be able to visually resolve the same level of detail, and the picture will be much softer. Why? Because those two additional "HD-Lite conversions" (from 1920 to 1280 and then back to 1920 at steps C and D) took their toll on the picture quality. And a very significant toll: you lost almost one third of your image detail during just those two steps! And that's on top of any other image degradation caused by other conversions, MPEG-2 compression, etc.
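As a rough sanity check on steps A through E, one can simulate the whole chain on a high-detail test row and compare how much modulation (RMS swing around the mean gray level) survives with and without the two HD-Lite conversions. This is a hypothetical sketch: it uses plain linear interpolation and ignores MPEG-2 compression entirely, so the numbers only illustrate the direction of the effect, not its real-world magnitude.

```python
def resample(row, new_width):
    """Linearly interpolate a 1D row of pixel values to new_width."""
    n = len(row)
    out = []
    for i in range(new_width):
        x = i * (n - 1) / (new_width - 1)
        lo = int(x)
        hi = min(lo + 1, n - 1)
        f = x - lo
        out.append(row[lo] * (1 - f) + row[hi] * f)
    return out

def modulation(row):
    """RMS swing around the mean; 127.5 for a full-contrast stripe pattern."""
    mean = sum(row) / len(row)
    return (sum((v - mean) ** 2 for v in row) / len(row)) ** 0.5

def run_chain(row, widths):
    for w in widths:
        row = resample(row, w)
    return row

# Step A: a 1280-wide row of alternating black/white pixels (finest detail).
source = [255 if i % 2 == 0 else 0 for i in range(1280)]

# Steps B and E only: up to 1920, then down to a 1366-native panel.
without_hd_lite = run_chain(source, [1920, 1366])
# Steps B, C, D, E: the same, plus the 1920 -> 1280 -> 1920 HD-Lite round trip.
with_hd_lite = run_chain(source, [1920, 1280, 1920, 1366])

print("modulation, no HD-Lite:  ", round(modulation(without_hd_lite), 1))
print("modulation, with HD-Lite:", round(modulation(with_hd_lite), 1))
```

In this simplified model the chain with the extra HD-Lite round trip retains less of the original stripe contrast than the chain without it, even though both start and end at the same resolutions.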

Of course, the difference will be more noticeable when both the source and your display are capable of resolving 1920x1080. But the main point I am trying to make here is that even if both the source and the TV set have lower resolution (say, 1280), the two artificial "HD-Lite image conversions" introduced by the satellite service provider will still significantly degrade your resulting image. Don't think that just because your TV set is not 1920x1080, or because a particular source is not 1920x1080, HD-Lite is not an issue for you. The two additional image conversions degrade your image regardless of the source resolution and regardless of the native resolution of your display. Your TV set and the source might be capable of resolving 1280 lines of horizontal resolution or more, but because of those additional image conversions, you will get much less than that.
 
I don't disagree with you that rescaling is a destructive process. However, your detailed explanation of what is being done in the DBS handling is completely alien to me and not what I was told is being done.

The DBS provider usually starts out with direct feeds, and the providers supply these feeds in a variety of formats, the most common being HDCAM at 1080i x 1440 pixels, with bit rates typically topping out at 45 Mbps (e.g. the CBS network feed). Others may use a combination of HDCAM and D5 sources for uplink. The DBS company will compress the source feed to 8-20 Mbps depending on content, allocated and budgeted bandwidth availability, and policy. This is where the detail gets crushed. That is the resampling process in a nutshell: in effect, all MPEG-2 source feeds are resampled to different bit rates, and those bit-rate reductions RESULT in lower visible resolution at the end. A tape-to-tape transfer is a simple way to do a down-conversion of the pixel count: film transferred to D5 at 1920 H pixels and then dubbed to HDCAM is automatically reduced to 1440 H pixels, since the recorded bandwidth of HDCAM cannot support 1920 H pixels of detail. The newer HDCAM SR (SQ mode) can now handle this.
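The bit-rate squeeze described here can be put in rough numbers. This back-of-envelope sketch simply divides the quoted bit rates (the 45 Mbps contribution feed versus the 8-20 Mbps DBS recompression) by the pixel rate of a standard 1080i frame; the figures are illustrative arithmetic, not measurements of any actual feed.

```python
def bits_per_pixel(bitrate_mbps, width=1920, height=1080, fps=29.97):
    """Average bit budget per pixel at a given bit rate for a 1080i frame."""
    pixels_per_second = width * height * fps
    return bitrate_mbps * 1_000_000 / pixels_per_second

# Contribution feed vs. typical DBS recompression targets quoted above.
for rate in (45, 20, 13, 8):
    print(f"{rate:>2} Mbps -> {bits_per_pixel(rate):.3f} bits/pixel")
```

The encoder at the low end of the DBS range has only a fraction of the bit budget per pixel that the contribution feed had, which is where the "crushing" of detail comes from.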

Ilya, the bottom line is, and I hope you understand this, that we are on the same side here. However, I feel that to do battle with the big boys at DirecTV (and E*) one has to be technically accurate in his engineering and in how the system works, and finally, not ask for something you can't receive in the end anyway. In addition, not ask for something that never existed in the first place.

Unfortunately, I see both of these last parts in so many people's posts, and they just dilute the validity of the complaint. E.g.:

It is silly to insist that all HDTV has to be 1920 pixels horizontal resolution.

It is silly to insist we receive 1920 pixels H resolution when we may only have a mediocre CRT or at tops a DLP that can do a maximum of 1280 pixels.

Better in both of these cases to allow for a lower HD format and enjoy more variety of HD programming.

I submit that if we restricted our programming to only that which is true 1920-pixel resolution, we would have but a tiny fraction of the programming available today. Maybe this goes along the lines of "be careful what you insist upon, you just may get it..." One HD channel that is a studio HD camera trained on a demo shot! Or a loop of a 1920 HD test pattern on the demo channel.

All I want is this: keep all HD channels at 1280 H pixels or higher, and keep all bit rates at 13 Mbps or higher for MPEG-2. I won't use the term "HD Lite" in my requests, since that term has no officially recognized standard definition. I think this request is reasonable, and if it were followed by DirecTV and E*, people would be quite satisfied with the PQ, except for those who have a Blu-ray player and a 1080p x 1920 monitor for comparison.
 
Don Landis said:
I don't disagree with you that rescaling is a destructive process. However, your detailed explanation of what is being done in the DBS handling is completely alien to me and not what I was told is being done.
So, which of the five described steps do you have a problem with? The only one I am not absolutely sure about is step B, as I don't know for sure what kind of rescaling they do internally. I suspect they convert everything to 1920x1080i first, but I am not positive. In any case, this is not very important: steps C and D (rescaling to 1280 and then back to 1920, the so-called "HD-Lite conversion") are what really matters for the above discussion. And we all know that those two steps do indeed take place.

Don Landis said:
The DBS provider usually starts out with direct feeds, and the providers supply these feeds in a variety of formats, the most common being HDCAM at 1080i x 1440 pixels.
And that's ok with me, as long as the satellite service provider passes through whatever resolution was received from the content provider with minimum possible change.

What is not ok with me is when they take that original resolution (be it 1920 or 1440) and then rescale it two or three more times before the signal gets out of the IRD. As I tried to explain above, those extra transformations significantly degrade the image even for those of us who still have displays capable of 1280 H max (though I bet most of our forum members have displays capable of 1366 and above, and I don't believe too many still have CRTs ;))

...not to ask for something you can't receive in the end anyway. IN addition, not to ask for something that never existed in the first place...
It is silly to insist that all HDTV has to be 1920 pixels horizontal resolution.
It is silly to insist we receive 1920 pixels H resolution when we may only have a mediocre CRT or at tops a DLP that can do a maximum of 1280 pixels.
Don, it sounds like you keep insisting that the HD-Lite issue is irrelevant if the source is not 1920 or if the TV set is unable to resolve above 1280 H. I was trying to prove the opposite in my post above. It's not the resolution I have a problem with, it's the two additional image transformations that I think damage the picture quality regardless of resolution.

One point on which I totally agree with you is that there are other, and perhaps much more important problems with the picture quality beyond just downgraded resolution and image clarity. Compression artifacts bother me much more than soft image, for example. So, I am not trying to say that HD-Lite is the root of all our problems. All I am trying to do is to explain why I think it is relevant.

Better in both of these cases to allow for a lower HD format and enjoy more variety of HD programming...
All I want is Keep all HD channels at 1280 H pixels or higher and keep all bit rates at 13 Mbs or higher for MP2.

That's where our positions differ completely. I would prefer fewer channels but at the best possible quality: just give us whatever resolution the content provider delivered, with minimal or no additional rescaling, and give us the highest possible bit rate, ideally 16-17 Mbps. For that, I would be willing to give up 1/3 of the channels of your choice. ;)
 
Smthkd said:
Dang Guys, are you writing love letters on here or what!? :D Too many paragraphs!!!
Do ya just hate it when thorough information is shared?

I'd rather read these and learn something than read the quick attack hits from certain posters here. :up
 
Smthkd said:
Dang Guys, are you writing love letters on here or what!? :D Too many paragraphs!!!
Sorry about that! I think we are almost done. :D
 
Directv Sued

Satmeister..... Words of wisdom: "You can't see what you can't get"... and... "To prove you are now getting less of something, you must first prove you used to have more of the same thing."
Well put, Satmeister :) I'm reading, learning and taking notes... Knowledgeable people speaking make those of us who want to learn throw anger away and invest in understanding. Thanks for the words of wisdom :)

PeaceOfMind
 
Directv Sued

Don Landis.... Ilya..... Thanks for sharing your knowledge and understanding. Little did you know, you have answered many of the questions that I and many others have had for quite some time. Nothing beats positive people sharing their wisdom. Thanks :)

PeaceOfMind
 
Ilya-

"So, which of the five described steps you are having problem with? The only one I am not absolutely sure about is step B, as I don't know for sure what kind of rescaling they do internally. I suspect they convert everything to 1920x1080i first,"

"B" Yes, B stuck out like a sore thumb first, but then, as I read from the top, I had no idea where you got the 1080i x 1280 source program from. So A is also an unknown for me.

As I said, the most common method that rescales the pixel specification is tape dubbing. This is done at sources like HBO, Showtime, etc., from the film chain to a D5 tape master and then to HDCAM distribution sub-masters, before uplinking at the higher bit rates. I got this info straight from a man who does this for one of the few companies involved. He is a telecine operator.

The data-stream scaling of pixel resolution happens at the DBS and is a result of compression softening. They don't generally time-shift the programming like TV stations do, but only further compress the stream with real-time compression THAT RESULTS IN a lowering of the pixel resolution. There is no such thing as adding program resolution back once it has been lost to compression. All you can do is add noise.


Step C is normal, but your description is more of a cut-to-the-chase result of the process than what is really going on. The resolution reduction is a result of the compression and what it produces.

Step D: Yes, the receiver may output a signal in this form, but any time the upscale process is in play the result is additional noise; once the picture detail is gone, the original is not put back by present-day methods. All that can be done in this case is image-enhancement processing, which artificially adds clarity to the appearance of the program.

Step E: This is a necessary and final step to convert the data bits into an array of pixel data for display on your particular display's native digital imager. In the case of analog, the equivalent process builds the analog voltage curves for scan-line imaging on the CRT.


"Don, sounds like you keep insisting that HD-Lite issue is irrelevant if the source is not 1920 or if the TV set is unable to resolve above 1280 H. I was trying to prove the opposite in my post above. It's not the resolution I am having problem with, it's the two additional image transformations that I think damage the picture quality regardless of the resolution."

Hope I'm understanding you correctly here, but I have to disagree, on this basis:

Resolution reduction is the RESULT of those transformations, if they are taking place, and I have a problem with that result. If transformations vis-à-vis compression could take place without resolution reduction (not possible with present technology), then transform all you want; it won't matter.


"That's where our positions differ completely. I would prefer less channels but at the best possible quality: just give us whatever resolution the content provider delivered with minimum or no additional rescaling and give us the highest possible bitrate, ideally 16-17 Mbps. For that, I would be willing to give up 1/3 of the channels of your choice. "

Agreed! This is where we differ. I feel there is no point in having the highest "possible" quality if the chain of quality is broken at the consumer display and it requires a drastic reduction in programming to achieve. Your definition of the highest quality is not really that, but rather an olive-branch compromise, I think. It is reduced quality from the source, but as I said much earlier, unless you are using the latest 1080p x 1920 monitor, you really gain nothing from having 1920 pixels in the programming.
What I'm saying is that forcing DBS providers to send all HDTV at the maximum 1920 pixels and allowing nothing less is too restrictive and unnecessary. BUT, and this is important so please do not discount it: when it is determined that the predominant number of the DBS company's customers own the display equipment to see that quality, THEN is the time to upgrade the DBS quality with less compression, delivering the maximum bandwidth allowed and the maximum resolution we can view. In the meantime, we maximize the number of programs at the quality people can actually see.
Your compromise suggestion is humorous, since it puts you at exactly where I would like to be: 16 Mbps of compressed MPEG-2 signal is really equivalent to something like HDCAM at about 1440 H pixels. :) But give up 1/3 of the programming of my choice? Maybe you don't want to go there, my friend! :D

Well, our stab at bringing some technical reality to this "emotional topic" has been fun, but I believe you and I agree we're on the same side. I don't want to be emotionally ridiculous and require something that is unobtainable, but I don't like what DirecTV is doing either. They have taken it too far. I trust their claim that when they have the new birds in operation, and MPEG-4, they will make things better. They have in the past. Let's see what happens. Also, I don't want to give Dish Network the idea that they can pass off "HDTV" at 720p x 500 pixels resolution either. I know they will try it if they think they can get away with it. This suit is good in that it may help define for the DBS providers and others what we really want. Let's hope the lawyers... get it!

Thanks folks for your kind comments, but I agree that maybe this dialog has gone into overtime, besides, I have to get back to work!
 
Don Landis said:
I had no idea where you got the 1080i x 1280 source program from.
I simply used that 1280 value as an extreme example, to illustrate the main point that even if both the source and the monitor were 1280 H only, even in that extreme case HD-Lite is still an issue, due to the image degradation caused by the additional rescaling. Of course, in real life the source has better resolution than that (1440 or 1920), and many displays have better resolution too (1366 and above), which makes the HD-Lite issue (with its 1280 H intermediate resolution) a much more obvious case. ;)

Ok, here are the real-life scenarios, assuming a 1440 source:

Without HD-Lite:

1440 --> 1920 --> native (e.g. 1366)​

With HD-Lite (one of the following - not sure which one, I suspect the first one):

1440 --> 1920 --> 1280 --> 1920 --> native (e.g. 1366), or
1440 --> 1280 --> 1920 --> native (e.g. 1366)​

Basically, there are two problems with HD-Lite, the way I see it: the first is that 1280 value; the second is the additional image rescaling that takes place.

Ilya.
 
Don Landis said:
At the end of the day, I hope Cohen wins so a base line standard will be established. But to win, I believe this "HD Lite" will need to be defined. It won't be as easy as resolution. In my opinion, a better way is to define it in terms of compression amount and scheme while not resampling the production original resolution.

Don,

First, thank you for taking the time to write out the technical issues that are at hand, because as long and drawn out as they are, you have brought the entire problem to the table.

Second, the Cohen Class, will bring the real issues into the arena so that the DBS providers will be forced to deal with them.

Last but not least, the real issues as you have described them, will force the entire industry to separate the BS that the Marketing Gurus have shoved down our throats, from the facts. There should be no doubt that the Cohen Class will shed some light on the behavior of these DBS providers.

This has been the most educational Thread I've read, and again I thank you for your efforts to educate us. Our Dealer Class Action Arbitration has just been given a great Victory as well. See the Transmitter News Article for Sept 25, 2006
 

Attachments

  • Decision_CalAppCt_B1882781.pdf (79.9 KB)
I still do not understand: how can they downrez pixels disproportionately? How do they take something down from 1920x1080 to 1280x1080? I mean, wouldn't 1280x1080 be almost a square, like a 4x3 image?
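As an aside, the "almost square" intuition has a standard answer: formats like 1440x1080 and 1280x1080 store the 16:9 frame with non-square (anamorphic) pixels, which the display stretches back out to full width. A quick hypothetical check of the pixel aspect ratios implied by a 16:9 display:

```python
def pixel_aspect_ratio(stored_width, display_aspect=16 / 9, height=1080):
    """How much each stored pixel must be widened on display to fill 16:9."""
    return (display_aspect * height) / stored_width

# Square pixels at 1920; increasingly stretched pixels at 1440 and 1280.
for w in (1920, 1440, 1280):
    print(f"{w}x1080 stored -> pixel aspect ratio {pixel_aspect_ratio(w):.2f}")
```

So a 1280x1080 frame is not shown as a narrow, nearly square picture; each stored pixel is simply displayed 1.5 times as wide as it is tall.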
 
hdtvtechno said:
Hey, you know what came to mind just now... just think: DirecTV gets mandated by the judge to up the resolutions on all HD channels and switch the HD locals from MPEG-4 to MPEG-2... :D

Don't hold your breath:eek:
 
Don Landis said:
I feel there is no point in having the highest "possible" quality if the chain of quality is broken at the consumer display and it requires a drastic reduction in programming to achieve.
I totally disagree. The broadcaster should deliver the highest quality. If the consumer wants to watch it on a less-than-optimum display, then that is his prerogative. I suppose that if your paper boy finds that you do not read all of your newspaper, then he should just throw half of it in the trash before it gets to you. :rolleyes:
 
In case anyone was interested, the original complaint that was filed is attached below.
 

Attachments

  • DirecTVComplaint.pdf (116.6 KB)