DIRECTV Getting Sued over HD Lite

First of all, 1080x1440 is a 4:3 aspect ratio; it is HD in 4:3. If DTV is broadcasting 16:9 at that resolution, I don't see how that is possible unless they are squeezing the image and having it stretched to full screen width, which is what I assume they are in fact doing. This is clearly a misrepresentation of what they are actually broadcasting, and if you viewed it at its proper 4:3 shape the content would be tall and unwatchable.

The standard for widescreen HD is 1080x1920 interlaced or 720x1280 progressive. 1080x1440 is NOT an HD widescreen standard; it is an HD fullscreen standard.

But of course there is the larger issue: compression. You can have any resolution you want, but if you compress the sh*t out of it, it's going to look like hell. This is what the cable companies do. Even if DTV were to give back the lines of resolution they steal, they could compress the image harder to make up for it and you'd be right where you are now. The only explanation in my mind for why they don't do this is that they have compared the two and decided that doing it their way is less noticeable, or looks better, than the alternative. The part that blows my mind is that this is the future of their product, and they broadcast dozens upon dozens of channels of crap that no one watches that could be used to improve their HD bandwidth.
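To put some rough numbers on it (the bitrate below is made up for illustration; it is not DTV's actual figure), here's the bits-per-pixel math in a quick Python sketch:

```python
# Rough bits-per-pixel comparison: resolution alone doesn't determine quality.
# The 8 Mbps bitrate is an illustrative guess, not an actual DirecTV figure.

def bits_per_pixel(width, height, fps, bitrate_mbps):
    """Average compressed bits available per pixel per frame."""
    return (bitrate_mbps * 1_000_000) / (width * height * fps)

# A full-resolution channel starved for bitrate...
print(bits_per_pixel(1920, 1080, 30, 8))   # ~0.13 bits/pixel
# ...vs. the same bitrate after a 1/3 horizontal downres.
print(bits_per_pixel(1280, 1080, 30, 8))   # ~0.19 bits/pixel
```

Either way the encoder is scraping by on a fraction of a bit per pixel; trading resolution for compression headroom just moves the damage around.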

DSS service has always been crap. Even the SD content is woefully inferior to analog reception or cable; it's something like 544x480, which isn't even 4:3, and they compress the sh*t out of it to boot. If you want the best HD signal you can get and you live within range, use OTA. Yes, it's compressed too, but usually much less, it's broadcast at the proper resolution, and it's free. I think that was the best point made: a free service should not be drastically better in quality than premium services that consumers are purchasing with the assurance that they are getting the "best picture possible."

Sorry for rambling on. This may be my first post; I have been lurking for a while. Anyway, hi, it's good to make your acquaintance.
 
vedhead said:
seriously, we should all get a piece of this pie...


I agree. I think we should all sue 'em. In this mad, suit-happy world, why not? Let's stick it to 'em for sticking it to us. Let's drive up their legal expenses, and who knows, we might win. Heck, if we get some money outta the deal, it will help offset the increase in our monthly bills as they pass those legal costs on to us. Then we'll have more to bitch about. Where do I sign up??? :eek:
 
I don't like lawsuits, but this one might be good. I think it might pin down what HD is and what providers are allowed to call HD. It might help consumers choose between providers on quality without it being subjective. So I, for one, would love to help. I think the industry needs to clarify HD and let consumers know what they are really providing.
 
leww37334 said:
They can obtain new revenue that way by getting more people to sign up for the HD package. D* simply is not interested in providing any additional national HD channels.
Thank you for posting the ATSC spec. However, the HD LILs are provided at no charge for those who have the MPEG-4 equipment; no HD package is required. That is why it has always floored me that they pump HD LIL instead of national HD.

I voted with my wallet on this the other day. I have made arrangements to return both my H20 and HR20-700. I guess if D* will not go the way of TiVo ... I may go the way of cable or E*.
 
I posted this in the Dish Network forum, but I think I will repeat it here too:

Here is what makes the HD-Lite case so outrageous, IMO:

While most TV manufacturers are trying to improve the picture quality and resolution of their TV sets year after year, DirecTV and Dish Network seem to be moving backwards: they started with full 1920x1080i resolution on most HD channels and then gradually downgraded the resolution by 1/4 or by 1/3. At a time when more and more TV sets are finally ready to display Full HD (the ultimate 1920x1080 progressive), DirecTV and Dish Network are taking a pristine HD source and intentionally downgrading its picture quality in an attempt to squeeze more and more channels into the pipe.
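The arithmetic behind "by 1/4 or by 1/3" is simple (a quick illustrative Python check):

```python
# Horizontal pixels dropped when 1920 is downresed to the commonly cited widths.
for reduced in (1440, 1280):
    removed = (1920 - reduced) / 1920
    print(f"1920 -> {reduced}: {removed:.0%} of the pixels thrown away")
# 1920 -> 1440: 25% thrown away (1/4); 1920 -> 1280: 33% thrown away (1/3)
```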

I don't know if this Class Action can succeed and personally I don't care about any penalties that DirecTV or Dish Network will have to pay to their customers. What is important though is that a strong message will be sent to all service providers out there, that there are a lot of customers who care about picture quality more than channel quantity, and that not everyone is willing to tolerate some garbage labeled as "astonishing picture clarity".

I hate lawsuits, but perhaps this is the best way of sending a message, so that the company would listen...
 
Nothing beats TRUE FREE HD OTA! Now if only I could pick up SciFi in HD!
 
projection freak said:
First of all, 1080x1440 is a 4:3 aspect ratio; it is HD in 4:3. If DTV is broadcasting 16:9 at that resolution, I don't see how that is possible unless they are squeezing the image and having it stretched to full screen width, which is what I assume they are in fact doing. This is clearly a misrepresentation of what they are actually broadcasting, and if you viewed it at its proper 4:3 shape the content would be tall and unwatchable.

The standard for widescreen HD is 1080x1920 interlaced or 720x1280 progressive. 1080x1440 is NOT an HD widescreen standard; it is an HD fullscreen standard.


etc...

Welcome to forum posting! Let me try to give you some clues about why your thinking is confusing you. While what you posted is partly true, your math leaves out some important parts of the video equation. This is causing you to form incorrect conclusions, get erroneous math results, and coin new terminology that really doesn't exist, such as "HD fullscreen standard." By definition, all HDTV is 16x9 screen AR. Don't feel bad, because so many people fail in this understanding of what makes up a video image, a digital image, and the final resultant image you receive after signal processing via satellite.

The main part you didn't list, and the one that flaws your understanding, is the pixel aspect ratio, or PAR. You have probably heard of OAR, screen AR, projector target AR, and I could go on with a few others, but the important part to understand when using pixel counts to determine the screen AR is the PAR. This is how you can have a 1440 horizontal pixel count while still maintaining the 16x9 image AR. BUT, and this is important, that is not the only issue in play for the horizontal resolution of a digital image either. Image resolution is the ability to clearly resolve a measurable number of lines of resolution. Here is where compression enters the picture, and why greater compression will bury the resolve of the image ever deeper in the video noise. So many lay people go only as far as the ATSC pixel spec on the upper limit of the HDTV standard. Then they claim that anything less than 1920 is "HD Lite"; by that definition, even 1080i x 1919 is HD Lite.
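If you want to see the PAR arithmetic, here is a quick Python sketch (illustrative only): the display aspect ratio is the storage aspect ratio times the pixel aspect ratio.

```python
from fractions import Fraction

def display_aspect(width, height, par):
    """Display aspect ratio = storage (pixel-count) aspect ratio x PAR."""
    return Fraction(width, height) * par

# With square pixels, 1440x1080 really would be 4:3...
print(display_aspect(1440, 1080, Fraction(1, 1)))   # 4/3
# ...but with a 4:3 pixel aspect ratio the same raster displays as 16:9.
print(display_aspect(1440, 1080, Fraction(4, 3)))   # 16/9
# A full 1920x1080 raster reaches 16:9 with square pixels.
print(display_aspect(1920, 1080, Fraction(1, 1)))   # 16/9
```

The pixels in a 1440-wide 16x9 image are simply wider than they are tall; nothing is being "squeezed and stretched" in the sense of a misrepresentation.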
In reality, "HD Lite" is vernacular, common language with no standard definition. Kind of like porn: I can't define it, but I know it when I see it. Right?
The point of contention I have with HD Lite is this: defining it against the common standard of HDTV production, which is considered HDCAM resolution, defined by Sony as 1080i x 1440 pixels as recorded to tape, dictates that ALL HDTV production IS "HD Lite" and that only HDTV mastering in D5 is done at full HDTV. I would bet that nobody has ever seen true HDTV by the maximum pixel standards of the ATSC spec unless you worked in a TV studio where Sony studio cameras are fed to Panasonic D5 recorders and your monitor was one of the latest 1080p x 1920 DLP imagers. That's a tall order! So this is why I like to define HD Lite more narrowly than most lay people do: 1080i x anything less than 1440. Thus, D*'s claim that 1080i x 1280 is where they want to be with HDTV is what I call HD Lite. But the Voom channels and the other Dish Network 1080i x 1440 channels are nothing more than HDCAM distribution maximum resolution, limited not by Dish Network but by the original production of the program itself. HDNet uses HDCAM equipment, and by Sony's specification that is limited at the production stage to 1440 horizontal pixels.

So what types of programming can be at full D5 HDTV resolution of 1080i x 1920?
I would suggest much of the live, direct-to-air programming such as NFL games, where most of the cameras are imaging at 1080i x 1920, as on a CBS HD broadcast. Some studio feeds that are live. And finally, any movie that was telecined from 35mm or larger film to D5 where the distribution tape was dubbed to D5 for air. At one time a great number of movies were distributed to DirecTV in D5 format for their PPV channel; in those days the DirecTV resolution was indeed a full 1920 horizontal pixels. HBO was using a combination of HDCAM and D5 for distribution from their transfer facility to the uplink, so they also have some full-1920 movies as well as other studio productions, but field sports is all HDCAM (1440) production.
The logic behind D* using a spec of 1080i x 1280 is a reasonable choice given today's viewing technology, but that won't last for long. Currently, the majority of HDTV viewers have monitors that can only image 1280 pixels of native horizontal resolution, and many have even less, especially the "HD-ready" TVs.
Furthermore, all digital imagers will convert your 1080i to a 720p standard for display. Unless you have one of the higher-res monitors, such as the JVC D-ILA or the newer TI 1080p x 1920 DLPs, all you will get to view is 1280 on your native-res monitor; anything higher gets lost on your monitor anyway. Therefore, selecting a limiting resolution of 1280 for both the 1080i and 720p broadcasts was a good choice by D* engineers if they are planning to downres the image to save bandwidth.

Now, I am not trying to defend the practice of "HD Lite," but I believe the image I have seen from D* on their Showtime, HBO, and HDNet channels is degraded far below even 1280 pixels; I would be surprised if it resolves any higher than 700-800 pixels these days! While I don't agree with the lay zealots who post their interpretations of the HDTV technology, I do agree with them in spirit: D* should not downres an HDTV signal below what is considered production standard. IMO, that would be either the 720p x 1280 or the 1080i x 1440 (HDCAM) standard.

Back to the people who have purchased the latest 1080p x 1920 DLP monitors: sorry to burst your fantasy bubble, but I'm afraid you will not be seeing true native HDTV resolution on many broadcasts at all. Your best bet to ensure full native-resolution programming is to get it on Blu-ray or HD DVD, as these formats are being mastered in the native resolution of your state-of-the-art monitor.

What about viewing an interlaced signal on a digital monitor? In effect, interlaced technology was designed for CRTs, i.e., scan-line displays, not digital pixel-based displays. Digital technology has been able to fix most of the problems that arise from putting an interlaced image on a pixel display, but in reality it produces (rather, converts interlace to progressive) what is technically a 540p x (whatever the native horizontal res is) image, most likely 1280 pixels on a good HDTV DLP imager. A CRT uses phosphor persistence, an electro-chemical phenomenon, to display the interlaced lines, while a digital LCD or DLP has to electrically convert the timing of the interlace into a progressive 540-line image from the 1080 / 2 frame. Confused? To explain it in more detail I'd need to draw pictures of the phosphor raster vs. the pixel matrix. Not in the mood to do that here, but the engineering guides do it well. :)
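For the code-minded, here is a toy Python/NumPy sketch of the field structure (illustrative only; random numbers stand in for real luma samples):

```python
import numpy as np

# A 1080i frame carries two 540-line fields sampled at two different instants;
# a fixed-pixel display has to build progressive frames out of them.
frame = np.random.rand(1080, 1440)      # stand-in for one interlaced frame

top_field    = frame[0::2, :]           # lines 0, 2, 4, ... -> 540 lines
bottom_field = frame[1::2, :]           # lines 1, 3, 5, ... -> 540 lines

# "Bob" deinterlacing: line-double each field back up to full height.
# Each displayed frame carries only 540 lines of true vertical detail.
bobbed = np.repeat(top_field, 2, axis=0)
print(top_field.shape, bobbed.shape)    # (540, 1440) (1080, 1440)
```

Smarter deinterlacers blend or motion-adapt between fields, but the 540-lines-per-instant limit is baked into the interlaced signal itself.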

OK, I'll stop here, but I suggest you get some good publications on image science, or from the engineering side of the technology. This only scratches the surface, because to fully comprehend the problems one needs to combine what the people who engineer monitors know with the broadcast and DBS technologies, program distribution practices, and the TV production and film conversion industries. It is not as simple as quoting some pixel specs from one ATSC document and believing you have a full understanding of how TV is done in HDTV. I don't claim to know it all, far from it, but I do have a considerable background as a consulting BE, PE, and active TV program producer. This is what I do for a living and I'm still learning. A good start in your studies is to begin with the ISF people, including Joe Kane, Guy Kuo, and others. They hang out on AVS Forum and you can ask them direct questions about the stuff you find confusing. Better yet, they often attend the Home Theater Cruise, and you can sit in on their lectures for more understanding.
 
1440x1080i is only an acceptable excuse if it is a limitation of the camera, IMO. The problem with D* isn't entirely resolution but also the choking of the bitrate. No HD resolution has a chance of looking good when it's bit-starved to the point that theirs is. HD Lite is a combination of both.
 
Don:

Thanks for your valuable information and insights. It puts HD in a clearer perspective and addresses issues like the value of 1080p sets today, and the fact that, to a true purist, almost all HD broadcasts today are "HD Lite." It will be interesting to see what happens in the next 9-12 months overall, as providers ramp up their bandwidth and get the opportunity to narrow the gap between their transmissions and pure HD.

Again, we appreciate your detailed data.
 
vurbano: to be precise, I believe the chipsets in the HDCAM camera section actually do output 1920 pixels, but the tape format is limited to 1440 as recorded. I do believe the CineAlta and other high-end field camcorders are capable of a cable feed that will yield the full 1920 horizontal pixels. However, most production outside of a live multicamera feed to air, or to tape in a field production truck, is recorded to the camcorder's local HDCAM deck.

And it is quite true that the real culprit in this whole HD Lite issue is not so much the horizontal resolution as the bitrate and compression scheme. It must be stated both ways, because with MPEG-4 one can achieve greater compression without sacrificing resolution or introducing motion blocking and image-softness artifacts.

Satmeister: yes, I am quite concerned about how all this pans out with the coming proliferation of local broadcasts in HD. I've been against this for a long time, but spot beams have lessened the sting of LIL HDTV. IMO, local broadcast via satellite is a huge waste of satellite resources we really don't have yet today. MPEG-4 and spot beams, as I said, lessen the sting, but if we had been prepared to offer HDTV LIL as people want it, this whole HD Lite issue would never have happened. But that's just my opinion. I do know that the horizontal pixel reduction is mostly a result of compression taken to the extreme, not actual resampling. I believe this is why we see such poor image quality, far less than the claimed 1280 pixels, which, IMO, would satisfy most people if they really got it. It's when you tune into Showtime while D* has so many NFL games hogging the TPs that we see the image softened to more like 700 pixels. Obviously this is when HD Lite gets really annoying, and I believe it is the real trigger for the class action suit by Cohen.

Another somewhat related fact few realize about HDTV and "HD Lite": all the little consumer and "prosumer" HD camcorders now on the market also fall short of that often-quoted ATSC 1080i x 1920 standard. Sony's Z1U HDV camcorder and its competitors all use higher tape compression to achieve recording. The Sony compresses to the same 25 Mbps bitrate used by MiniDV; by comparison, HDCAM runs at 144 Mbps. (Newer HDCAM SR and SQ formats were introduced in 2003 that allow even higher bitrates, up to 880 Mbps, and can achieve the full 1920 horizontal pixels. One should not confuse these formats and their capabilities; the fact is, very few of these recent HDCAM SR and SQ decks are in use at this time.)
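To put those bitrates in perspective, here is a rough Python sketch of the implied compression ratios (assuming an 8-bit 4:2:2 uncompressed baseline, which is a simplification; the real tape formats differ in chroma sampling, so treat these as ballpark figures):

```python
# Ballpark compression ratio: uncompressed data rate / recorded bitrate.

def compression_ratio(width, height, fps, bitrate_mbps, bits_per_pixel=16):
    # 16 bits/pixel corresponds to 8-bit 4:2:2 sampling (an assumption here).
    uncompressed_mbps = width * height * fps * bits_per_pixel / 1e6
    return uncompressed_mbps / bitrate_mbps

print(round(compression_ratio(1440, 1080, 30, 144), 1))  # HDCAM: ~5:1
print(round(compression_ratio(1440, 1080, 30, 25), 1))   # HDV:  ~30:1
```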
HDV uses the same track width as MiniDV, so it can record on the same tape for the same running time as MiniDV; only the pixel dimensions are maintained at the HDCAM 1080i x 1440. Interesting, eh? Now take a related, competing HDV format: while it also uses the HDCAM resolution, it records at even higher compression than the Sony, at 19.4 Mbps. I think Canon may be using this. But the lower data rate should allow for something more, right? It does. Another factor in all these recording brands that share the same pixel size but differ in compression is the GOP. No, that has nothing to do with the Republican Party. :) It is the Group of Pictures. Sony has a higher number of pictures in the group (higher is bad) than the competing formats. This matters for image updates during recording and affects the accuracy of fast motion in the recording. With high-end HDCAM there is 1 frame in the group; with Sony HDV there are 15 frames in the group. Consequently, the HDV format gets a full motion update only every half second. Most of the time this is OK, but not in fast action. Another problem with a long GOP comes during editing: when I edit HDV, I have to be very careful where I put a dissolve so as to avoid doubling the GOP to a full second. While the HDV GOP is bad for action, it becomes disastrous during edits, and it is one more issue I have to deal with to avoid motion jumping within an edit. These things are all tradeoffs and have to be considered. With DirecTV, all they want to do is offer more channels, and what has suffered is picture quality.
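The half-second figure is simple arithmetic; here's a quick illustrative Python check (assuming 29.97 fps material):

```python
# Seconds between full refresh frames for a given GOP length.

def gop_refresh_seconds(gop_length_frames, fps=29.97):
    return gop_length_frames / fps

print(gop_refresh_seconds(1))    # HDCAM, effectively 1-frame GOP: ~0.03 s
print(gop_refresh_seconds(15))   # Sony HDV, 15-frame GOP:         ~0.50 s
print(gop_refresh_seconds(30))   # GOP doubled across a dissolve:  ~1.00 s
```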

At the end of the day, I hope Cohen wins, so that a baseline standard will be established. But to win, I believe this "HD Lite" will need to be defined, and it won't be as easy as resolution. In my opinion, a better approach is to define it in terms of the amount and scheme of compression, while not resampling the production's original resolution.
 
Don, although I agree that there is much more to it than just pixel resolution, there is one important point that you seem to be missing in all your posts on the HD Lite topic. The two additional conversions, from 1920 down to 1280 for the satellite transmission and then back up to 1920 in the box, degrade the picture quality, resulting in a softer image. The change is significant and can be observed regardless of the original source resolution, and even regardless of the native resolution of the display. Even if the source is HDCAM and the TV set is just 1280x720, the two additional conversions introduced by the satellite provider still lose image information and degrade the resulting picture. I can prove this mathematically if you really want. :D
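For the skeptical, here is a toy NumPy sketch of the idea: one scan line with fine detail, resampled 1920 -> 1280 -> 1920 with plain linear interpolation. Real encoders and boxes use better filters, but the loss is the same in kind.

```python
import numpy as np

# One synthetic scan line with detail near the limit of what 1280 samples carry.
x_full = np.linspace(0, 1, 1920)
line = np.sin(2 * np.pi * 500 * x_full)          # ~500 cycles of fine detail

# Downres to 1280 samples for "transmission", then back up to 1920 in the "box".
x_down = np.linspace(0, 1, 1280)
downres = np.interp(x_down, x_full, line)
restored = np.interp(x_full, x_down, downres)

# The round trip cannot recover what the downres threw away.
print(f"mean reconstruction error: {np.abs(restored - line).mean():.3f}")
```

The printed error is clearly non-zero: fine detail lost in the downres cannot be restored by upconverting afterwards.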
 
DIRECTV Sued

Don Landis... Satmeister... Ilya... Thanks for the educated posts; I am learning a lot. You don't let anger get in the way of sharing knowledge of why things are the way they are. Once one understands how things actually work, the anger subsides and a positive learning experience begins. Thanks for breaking it down so that the average person can understand what is really going on. My anger has been replaced with understanding... Thanks again.

PeaceOfMind:) :up
 
"I can prove this mathematically if you really want. "

Ilya, you don't have to, because in theory, and in an isolated case, you are indeed correct. That is, we have to assume we start with a program source that is indeed 1080i x 1920 pixels at the point of origin, distributed to the satellite provider in a form that maintains this high degree of resolution. Then the signal is downresed by a resampling process to 1080i x 1280. In theory, this reduces the image quality from the original source.
But my point is that:
1. We have so few program sources that offer this highest degree of resolution, and we have other factors that also destroy quality, such as increased compression, which I feel are far more destructive than the simple resolution resampling.
2. Given a program source of higher resolution than we can see (due to monitor limitations), if we are given the same program at the monitor's native resolution, we will see no difference.

What I want to see from the providers is an HDTV signal that offers true 720p x 1280 pixel resolution. If I owned a CRT and needed an interlaced HDTV signal, because I was using a scan-line display, I would want the 1080i to be at minimum 1440. Why not 1920? Because it has been shown in the Sony labs that while their best CRT can achieve 1920 lines of visual resolution, it had to be operated in a manner that was not favorable to an audience: a very small screen at very low brightness settings, to avoid focus blooming of the electron beam on the phosphor target.

Ilya, I suppose what I'm saying is that we don't need 1920-pixel resolution by today's monitor standards. When 1080p x 1920 native monitors become the norm, probably in 2-4 years, the DBS downres proponents will need to revisit their thinking on this. As I said before, when people say they can see a difference between a downresed 1280-pixel signal and one that is originally 1280 and not downresed, I believe the downresed signal is really not 1280 at all, but a muddy, overcompressed image that could pass no more than 700 lines in a visual resolution test!

Bottom line: my opinion on the whole "HD Lite" issue is that 1080i x 1440 (HDCAM) is still HDTV and is superior to what most people can resolve. Plus, production will not yield much more, as the field production technology just doesn't offer it. 1080i x 1280 HD Lite is OK if 1280 is all our monitors can show and the signal truly carries 1280 horizontal pixels; I suspect it carries much less, due to overcompression. My suspicion would have to be demonstrated case by case, since DBS adjusts compression on the fly by human decision.


Peace of Mind: thanks. I like to put emotions aside and look at the whole picture. While I deplore the overcompression of HDTV we have been seeing, I also recognize that there is no scheme by D* (or even E*) to sabotage HDTV, but rather a desire to offer more choices at a time when their bandwidth is severely stretched. I believe that as time progresses and the additional bird capacity comes online, DBS distribution quality will catch up to consumers' viewing standards and equipment technology.


An interesting sidebar is to compare PQ on standard analog NTSC over the years. With NTSC, "broadcast quality" was always several pegs better than what consumer displays could show, until home theater and big screens became mainstream. All of a sudden, in the late '90s, people everywhere were clamoring for HDTV quality. Broadcast suddenly found its "broadcast quality" superiority had disappeared overnight, and now broadcast suppliers are running neck and neck with consumers' ability to display everything they send. Overcompression is a big mistake, and the providers need to get used to the idea that they must hold the bitrate to a firm minimum standard, even if it means limiting the number of channels until they have the bird capacity, or else stop advertising the signal as "HDTV." I don't have a problem with D* and E* offering EDTV or widescreen SD, though I'm sure the marketing guys would complain about that. You can't please everyone, but let's not make the mistake of limiting program choices by making D* and E* send data that no one can see anyway.
 
Don Landis said:
1. We have so few program sources that offer this highest degree of resolution, and we have other factors that also destroy quality, such as increased compression, which I feel are far more destructive than the simple resolution resampling.
2. Given a program source of higher resolution than we can see (due to monitor limitations), if we are given the same program at the monitor's native resolution, we will see no difference.

....... You can't please everyone, but let's not make the mistake of limiting program choices by making D* and E* send data that no one can see anyway.
DON -- Kudos, Kudos. :up

There is a select group of folks who have made their infamous posting careers out of whining hundreds of times about things that are not reality: this HD provider over that HD provider, HD Lite (a term invented by naysayers), and the glories of 1080p (which can't really be seen via any HD content provider today).

In simple terms:

1) You can't see what you can't get (1080p isn't there, except on HD DVD / Blu-ray). There is no real need for mainstream 1080p sets/projection in the market for some time; it is only a way for the manufacturers to sell new equipment, not new services (for some time to come).

2) All content providers apply some form of compression across their spectrum of HD channels, cable and satellite alike, no matter how many folks are in denial about it, making this-provider-versus-that-provider comparisons a joke. While this practice needs to be addressed, it needs to be addressed by all providers, and it is rooted in how the available bandwidth is divided up. As bandwidth grows, these things disappear altogether.

3) The premise of this thread, the lawsuit, is going to be virtually impossible to validate, since no legal mandate exists for a specific definition of HD (in fact, most legislation pertains to Digital Television, which most know isn't quite the same thing, and even that is somewhat vague, accepting only 720p and 1080i as standards). To prove you are now getting less of something, you first have to prove you used to have more of the same thing; good luck with that in court on this issue.

4) A number of us have contended that equipment and cabling, installation quality, weather, location, broadcast equipment, the original content provider's transmission rate, and other factors all go into one's final in-home HD reception. This is the case for anyone: the potential variation factors remain the same, but they can impact final HD viewing results as much as (or in some cases even more than) the actual 720p, 1080i, or other transmission resolution. In reality, no two homes see exactly the same HD results bit by bit, given these comparative differences between the two locations, unless everything is exactly the same, which is almost impossible. Despite the false assumption that transmission quality is 100% constant from any particular provider, the fact is that these conditions can change hourly or daily (cable and sat).

Don, you have blown all the perfectionist theorists and negative zealots out of the water with your facts. More importantly, you have exposed the totally bogus foundation of those pointing to Utopian HD in 2006 based on these various HD broadcast elements. :up
 
I also would have no issues [for now] with the quality of the HD signal I receive if D* could do something to make their SD signal even remotely acceptable, given that the majority of the channels offered are SD, not HD.
 
Satmeister said:
3) The premise of this thread, the lawsuit, is going to be virtually impossible to validate, since no legal mandate exists for a specific definition of HD (in fact, most legislation pertains to Digital Television, which most know isn't quite the same thing, and even that is somewhat vague, accepting only 720p and 1080i as standards). To prove you are now getting less of something, you first have to prove you used to have more of the same thing; good luck with that in court on this issue.

While you may not see the validity of his suit, don't be surprised if he wins. I'm not a lawyer, but just because the HD standards may seem vague doesn't mean this guy loses.

A sharp lawyer can make a case out of nothing.
 
