projection freak said:
First of all, 1080x1440 is a 4:3 aspect ratio; it is HD in 4:3. If DTV is broadcasting 16:9 with that resolution, I don't see how that is possible, unless they are squeezing the image and having it stretched to full screen width, which is what I assume they are in fact doing. This is clearly a misrepresentation of what they are actually broadcasting, and if you viewed the proper 4:3 content it would be tall and unwatchable.
The standard for widescreen HD is 1080x1920 interlaced or 720x1280 progressive. 1080x1440 is NOT an HD widescreen standard. It is an HD Fullscreen standard.
etc...
Welcome to forum posting! Let me try to give you some clues about why your thinking is confusing you. While what you posted is partly true, your math is leaving out some important parts of the video equation. This is causing you to form incorrect conclusions, produce erroneous math results, and coin new terminology that really doesn't exist, such as "HD fullscreen standard." By definition, all HDTV is 16x9 screen AR. Don't feel bad, because so many people fail in this understanding of what makes up a video image, a digital image, and the final resultant image you receive after signal processing via satellite.
The main part you didn't list, and the one that flaws your understanding, is the pixel aspect ratio, or PAR. You have probably heard of OAR, screen AR, projector target AR, and I could go on with a few others, but an important part of using pixel count to determine the screen AR is the PAR. This is how you can have a 1440 horizontal pixel count while still maintaining the 16x9 image AR. But, importantly, that is not the only issue in play for the horizontal resolution in a digital image either. Image resolution is the ability to clearly define a measurable resolve of lines of resolution. Here is where compression enters the picture, and why greater compression will bury the resolve of the image even deeper into the video noise. So many lay people go only as far as the ATSC pixel spec on the upper limit of the HDTV standard. Then they claim that anything less than 1920 is "HD Lite." Therefore, by that definition, 1080i x 1919 is HD Lite.
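If the PAR arithmetic is hard to picture, here is a quick back-of-the-envelope sketch in Python (nothing official, just the numbers from above): stored pixel grid times PAR gives the displayed aspect ratio, which is how a 1440x1080 grid with 4:3-shaped pixels still fills a 16x9 frame.

```python
from fractions import Fraction

def display_aspect_ratio(stored_width, stored_height, pixel_aspect_ratio):
    """Displayed AR = (stored width x PAR) / stored height."""
    return Fraction(stored_width, stored_height) * pixel_aspect_ratio

# 1440x1080 stored with a 4:3 pixel aspect ratio still displays as 16:9.
print(display_aspect_ratio(1440, 1080, Fraction(4, 3)))   # 16/9
# Square pixels (PAR 1:1) need the full 1920 width for the same 16:9 frame.
print(display_aspect_ratio(1920, 1080, Fraction(1, 1)))   # 16/9
```

Run it either way and you land on 16/9, which is the whole point: pixel count alone doesn't tell you the screen AR.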
In reality "HD Lite" is a vernacular of common language and has really no standard definition. Kind of like Porn, ie can't define it but I know it when I see it. Right?
The point of contention I have with HD Lite is this: if you define it against the common HDTV production standard, which is HDCAM resolution as defined by Sony at 1080i x 1440 pixels as recorded to tape, then ALL HDTV production IS "HD-Lite," and only HDTV mastering in D5 is done at full HDTV. I would bet that nobody has ever seen true HDTV by the maximum pixel standards of the ATSC spec unless you worked in a TV studio where Sony studio cameras were fed to Panasonic D5 recorders and your monitor was one of the latest DLP 1080p x 1920 imagers. That's a tall order! So this is why I like to define HD-Lite somewhat lower than most lay people do, and that would be 1080i x anything less than 1440. Thus, D*'s claim that 1080i x 1280 is where they want to be with HDTV is what I call HD-Lite. But the Voom channels and other Dish Network 1080i x 1440 channels are nothing more than HDCAM distribution at maximum resolution, limited not by Dish Network but by the original production of the program itself. HDNet uses HDCAM equipment, and by Sony specification this is limited at the production stage to 1440 H pixels.
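To put the two competing thresholds side by side, here is a throwaway sketch (my own labels and example numbers, nothing from any standards body): the purist rule calls anything under 1920 H pixels HD-Lite, while my production-standard rule only flags anything under HDCAM's 1440.

```python
def is_hd_lite(h_pixels, threshold=1440):
    """My working definition: below HDCAM's 1440 H pixels is HD-Lite."""
    return h_pixels < threshold

PURIST_THRESHOLD = 1920   # the full ATSC pixel grid

feeds = {"D5 master": 1920, "HDCAM (HDNet, Voom)": 1440, "D* target": 1280}

for name, h in feeds.items():
    print(f"{name:22s} purist: {is_hd_lite(h, PURIST_THRESHOLD)!s:5s} "
          f"production-standard: {is_hd_lite(h)}")
```

By the purist rule everything but a D5 master is HD-Lite; by mine, only the 1280 feed is.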
So what types of programming can be at full D5 HDTV resolution of 1080i x 1920?
I would suggest much of the live, direct-to-air broadcasts such as NFL games, where most of the cameras are imaging at 1080i x 1920, such as a CBS HD broadcast. Some studio feeds that are live. And finally, any movie that was telecined from 35mm or greater film to D5, where the distribution tape was dubbed to D5 to air. At one time a great number of movies were distributed to DirecTV in D5 format for their PPV channel. In those days the DirecTV resolution was indeed a full 1920 H pixels. HBO was using a combination of HDCAM and D5 for distribution from their transfer facility to uplink, so they also have some full 1920 movies as well as other studio productions, but field sports are all HDCAM (1440) production.
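The common thread in all of the above is that the resolution that reaches the air is set by the weakest stage in the chain. A hypothetical illustration, with stage names and figures simply pulled from the paragraphs above rather than anyone's published numbers:

```python
def delivered_h_resolution(stages):
    """The horizontal pixel count that survives a chain is set by its weakest stage."""
    return min(stages.values())

nfl_live   = {"camera": 1920, "studio feed": 1920, "uplink": 1920}
hdcam_show = {"camera": 1920, "HDCAM tape": 1440, "uplink": 1440}
d5_movie   = {"35mm telecine": 1920, "D5 master": 1920, "D5 dub to air": 1920}

for name, chain in (("live NFL feed", nfl_live),
                    ("HDCAM production", hdcam_show),
                    ("D5 movie", d5_movie)):
    print(f"{name:18s} -> {delivered_h_resolution(chain)} H pixels")
```

Only the chains that stay on D5 (or live) end to end ever hit 1920; anything that passes through HDCAM tops out at 1440 no matter who airs it.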
The logic behind D* using a spec of 1080i x 1280 is a good choice given today's viewing technology, but that won't last for long. Currently, the majority of HDTV viewers' displays can only image 1280 pixels of native horizontal resolution, and many have even less in their monitor's native resolution, especially the "HD-ready" TVs.
Furthermore, all digital imagers will convert your 1080i to a 720p standard for display. Unless you have one of the higher-res monitors such as the JVC D-ILA or the newer TI 1080p x 1920 DLPs, all you will get to view is 1280 on your native-res monitor. Anything higher gets lost anyway on your monitor. Therefore, selecting a limiting resolution of 1280 for both the 1080i and 720p broadcasts was a good choice by D* engineers if they are planning to down-res the image to save bandwidth.
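To see why detail beyond your panel's 1280 native columns is wasted, here is a rough illustration (a crude area-average scaler of my own, nowhere near what a real video scaler does, but the idea is the same): a full-resolution alternating line pattern at 1920 width largely washes out once it is squeezed onto 1280 columns.

```python
import numpy as np

# Worst-case 1-D test pattern: alternating black/white columns at the full 1920 width.
src = np.tile([0.0, 1.0], 960)

def area_resample(signal, new_len):
    """Crude area-average resampler: each output pixel averages the stretch of
    source pixels it covers, including the fractional ones at each end."""
    scale = signal.size / new_len
    out = np.empty(new_len)
    for i in range(new_len):
        a, b = i * scale, (i + 1) * scale
        lo, hi = int(np.floor(a)), int(np.ceil(b))
        weights = np.ones(hi - lo)
        weights[0] -= a - lo        # partial coverage of the first source pixel
        weights[-1] -= hi - b       # partial coverage of the last source pixel
        out[i] = np.average(signal[lo:hi], weights=weights)
    return out

panel = area_resample(src, 1280)                          # squeeze onto a 1280-native panel
print("contrast at 1920:", src.max() - src.min())         # 1.0  -> lines fully resolved
print("contrast at 1280:", round(panel.max() - panel.min(), 2))  # ~0.33 -> most of it is gone
```

The lines don't just shrink, they smear into each other, which is exactly what "lost on your monitor" means in practice.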
Now, I am not meaning to defend the practice of "HD-Lite," but I believe that the image I have seen from D* on their Showtime, HBO, and HDNet channels is lightened far below even 1280 pixels. I would be surprised if it can be resolved any higher than 700-800 pixels these days! While I don't agree with the lay zealots who post their interpretation of the HDTV technology, I do agree with them in spirit: D* should not down-res an HDTV signal below what is considered production standard. IMO, that would be either the 720p x 1280 or the 1080i x 1440 (HDCAM) standard.
Back to people who have purchased the latest DLP 1080p x 1920 monitors. Sorry to burst your fantasy bubble, but I'm afraid you will not be seeing true native HDTV resolution on many broadcasts at all. Your best bet to ensure full native resolution programming is to get it on Blu-ray or HD DVD formats, as these are being mastered in the native resolution of your state-of-the-art monitor.
What about viewing an interlaced signal on a digital monitor? In effect, interlaced technology was designed for CRTs, or scan-line displays, not digital pixel-based displays. Digital technology has been able to fix most of the problems that arise from an interlaced image on a pixel display, but in reality it produces (rather, converts interlace to progressive) what is technically a 540p x (whatever the native H res is) image. That is most likely 1280 pixels on a good HDTV DLP imager. A CRT uses phosphor persistence, an electro-chemical phenomenon, to achieve image display of the interlaced lines, while a digital LCD or DLP has to electrically convert the timing of the interlace into a progressive 540-line image from the 1080 / 2 image. Confused? To explain it in more detail, I'd need to draw pictures of the phosphor raster vs. the pixel matrix. Not in the mood to do that here, but engineering guides do it well.
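For those who want the 540 number made concrete, here is a bare-bones sketch of a "bob"-style conversion (a simplification of what real deinterlacers do): each 1080i frame is really two 540-line fields captured at different instants, and a pixel display that paints them one after the other is effectively showing 540 progressive lines per field.

```python
import numpy as np

def split_fields(frame):
    """Split a 1080-line interlaced frame into its two 540-line fields."""
    top_field = frame[0::2]        # lines 1, 3, 5, ... (first field)
    bottom_field = frame[1::2]     # lines 2, 4, 6, ... (second field)
    return top_field, bottom_field

def bob_deinterlace(field, full_height=1080):
    """'Bob': stretch one 540-line field back to full height by line doubling.
    Vertical detail is limited to the 540 lines actually in the field."""
    return np.repeat(field, full_height // field.shape[0], axis=0)

frame_1080i = np.random.rand(1080, 1440)   # dummy luma plane, HDCAM-ish width
top, bottom = split_fields(frame_1080i)
print(top.shape, bottom.shape)             # (540, 1440) each
print(bob_deinterlace(top).shape)          # (1080, 1440), but only 540 unique lines
```

The output frame has 1080 rows, but every other row is a copy, which is why I say the converted image is technically 540p at whatever horizontal res your panel has.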
OK, I'll stop here, but I suggest you get some good publications on image science or from the engineering side of the technology. This only scratches the surface, because to fully comprehend the problems, one needs to combine what the people who engineer monitors know with the broadcast and DBS technologies, the program distribution practices, and the TV production and film conversion industry. It is not as simple as quoting some pixel specs from one ATSC document and believing you have a full understanding of how TV is done in HDTV. I don't claim to know it all, far from it, but I do have a considerable background as a consulting BE, PE, and active TV program producer. This is what I do for a living, and I'm still learning. A good start in your studies is to begin with the ISF people, including Joe Kane, Guy Kuo, and others. They hang out on AVS Forum, and you can ask them direct questions on this stuff you find confusing. Better yet, they often attend the Home Theater Cruise, and you can sit in on their lectures for more understanding.