Seeing a lot of 720p TVs with no 1080i

Joewee

Aug 9, 2005
Has anyone noticed that stores are starting to sell a lot of 720p TVs and no 1080i sets? It seems to keep prices down, but I'm wondering what everyone thinks about this. How is the 1080i quality when it is downconverted to 720p? Is this good for the industry or not?
 
Most 720p TVs will sync to 1080i. I guess whether you want to call it downconverted depends on your frame of reference. Per field, 720p draws 180 more horizontal lines (720 progressive lines versus 1080i's 540 lines per field), but it has 640 fewer vertical columns if it's full 1920x1080i content. Of course, this also depends on the actual resolution of the screen (1024x768, 1366x768, 1280x720, etc.). One nice thing about 720p is that you always get the full HD resolution. With 1080i, it depends: the source may have 1920, 1440, or 1280 columns of resolution. In all cases but 1920, 720p delivers more pixels per second. It has about 11% fewer pixels than 1920x1080i, but about 19% more than 1440x1080i and about 33% more than 1280x1080i.
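
If you want to check the arithmetic yourself, here's a quick back-of-the-envelope sketch in Python, assuming the usual ATSC rates of 60 full frames per second for 720p and 30 full frames per second for 1080i:

def pixels_per_second(width, height, frames_per_second):
    # Total pixels delivered each second by a given format.
    return width * height * frames_per_second

# 720p: 1280x720 progressive, 60 full frames per second.
p720 = pixels_per_second(1280, 720, 60)

# 1080i: 60 fields, but only 30 full frames per second, at the three
# horizontal resolutions commonly used for 1080i sources.
for width in (1920, 1440, 1280):
    i1080 = pixels_per_second(width, 1080, 30)
    print(f"720p vs {width}x1080i: {p720 / i1080:.2f}x")
# Prints 0.89x, 1.19x and 1.33x, i.e. about 11% fewer pixels per second
# than 1920x1080i, 19% more than 1440 and 33% more than 1280.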

Of course, all of that may be meaningless if the provider overcompresses the station's data rate.
 
It's amazing how cheap they are selling them for and how fast 1080p is becoming the new standard.

Will something new come after 1080p?
 
Seems to me that the most important criterion is that the display resolution match that of the source camera. Any material captured at 1080 can't be presented directly on a 720 display - it has to be processed to make the lines fit, and detail will be lost or altered.

The same is true, of course, if something captured at 720 is expanded to fill a native 1080 screen. What you see isn't really what the camera captured.
 
That's why our TV is 1080p. I didn't want to mess with the 720p vs 1080i crap. My TV will display anything you throw at it. :)
 
At least there's not much difference for HDTV, unless your TV is bigger than 50"; I've heard people claim they can tell the difference at that size.

Now, Blu-ray or HD DVD is another story. It's full 1080p, so it would be wise to get a TV that can truly handle that format.
 
Let me clear up a few things before people get confused by this whole i vs. p thing. The concept of interlacing came about in the CRT era. It is the process whereby the even lines and the odd lines are drawn on the screen one after the other, which allowed broadcasters to use half the bandwidth. In the era of digital transmission, specifically ATSC, there are two standards for HD content: 720p (1280x720) and 1080i (1920x1080). All HDTVs can receive both of these formats, but only CRTs are capable of showing 1080i content natively. All digital fixed-pixel displays are progressive scan and cannot display 1080i natively, so you need to look at the native pixel count of the TV. Some TVs are 1280x720, several cheaper plasmas are 1024x768, many LCDs are 1366x768, the higher-end LCDs are 1920x1080, and several DLP and LCoS sets are also 1920x1080.
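
To make the even/odd idea concrete, here's a minimal Python sketch, with frames modeled as plain lists of scanlines purely for illustration:

def split_fields(frame):
    # An interlaced signal sends the even-numbered scanlines in one field
    # and the odd-numbered scanlines in the next, halving the bandwidth.
    return frame[0::2], frame[1::2]

def weave(even_field, odd_field):
    # The simplest "weave" deinterlacer: interleave the two fields back
    # into one full frame (fine for static images, combs on motion).
    frame = [None] * (len(even_field) + len(odd_field))
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

frame = [f"line {n}" for n in range(1080)]
even, odd = split_fields(frame)
assert weave(even, odd) == frame  # lossless only if nothing moved between fields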

These fixed-pixel displays can accept multiple resolutions at the input side, such as 480i, 480p, 720p, 1080i, and even 1080p, but they have to convert everything to the native pixel grid through a process of deinterlacing and scaling. Some TVs have good scalers, some don't. That is why a TV with a poor scaler might show content that arrives in its native format well but make any other content look bad.
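
As a toy illustration of the scaling step, here's a nearest-neighbor vertical scaler in Python. Real TV scalers use filtered, multi-tap resampling, but the line-mapping problem is the same:

def scale_lines(src_lines, native_count):
    # Nearest-neighbor vertical scaling: for each native panel line,
    # pick the closest source line and throw the rest away.
    src_count = len(src_lines)
    return [src_lines[i * src_count // native_count] for i in range(native_count)]

source = list(range(1080))           # a 1080-line frame (after deinterlacing)
native = scale_lines(source, 768)    # squeezed onto a 1366x768 panel
print(f"{len(source) - len(set(native))} of {len(source)} source lines never reach the screen")
# Prints: 312 of 1080 source lines never reach the screen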

So, if a TV is advertised as HDTV and says 720p native, don't fret. It will support 1080i, 720p, 480p, and most probably 480i (maybe not through HDMI, but through component, etc.). In fact, I bought an inexpensive 37" LCD that is 1366x768 native, but it supports 1080p. It just downconverts to the native pixels. I confirmed this by connecting my PS3 to it and setting the resolution on the PS3 to 1080p.

I hope this clears up a few things.
 
I think that most of us understand the difference between interlaced and progressive scan. I think we even understand that a 720p set can receive a 1080i picture and present it at a resolution the set is capable of. In fact, I only see one post that indicates some sets might not be able to do this, and I am not sure the poster meant it quite that way.

I think that the OP was commenting on the fact that 1080i sets seem to be disappearing. I am not sure that is true, but if it is, it is interesting.
 
... I bought an inexpensive 37" LCD that is 1366x768 native, but it supports 1080p. It just downconverts to the native pixels. I confirmed this by connecting my PS3 to it and setting the resolution on the PS3 to 1080p. ...
This conversion is what concerns me.

If 1080 programming is displayed on a 720 screen, the conversion process results in three horizontal lines being changed to two. That can't be done without loss of image detail.

Similarly, if 720 programming fills a 1080 display, two original lines are expanded to fill three and the picture won't be what the camera captured. The 1080 display could be set to use only 720 of its lines, of course, but then the picture would fill only 2/3 of the display vertically.
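
A rough way to picture that 3-into-2 squeeze is a Python sketch that blends every three source lines into two with simple weighted averaging; real scalers use more sophisticated filters, but the loss is of the same kind:

def squeeze_3_to_2(lines):
    # Collapse every 3 source lines into 2 output lines by blending:
    # output line A leans on source lines 0-1, output line B on lines 1-2.
    # Whatever detail distinguished the three originals is averaged away.
    out = []
    for i in range(0, len(lines) - 2, 3):
        a, b, c = lines[i], lines[i + 1], lines[i + 2]
        out.append((2 * a + b) / 3)
        out.append((b + 2 * c) / 3)
    return out

src = [0, 100, 0] * 360              # 1080 lines of fine alternating detail
dst = squeeze_3_to_2(src)            # 720 lines
print(len(dst), min(dst), max(dst))  # the 0-to-100 contrast flattens to ~33 everywhere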
 
It seems to me there has just been a change in how the retailers or the manufacturers have decided to sell these sets. My Sony (an LCD projection set) that I bought a little over a year ago was originally advertised as 1080i. Now the same TV is being sold as 720p. I think they are trying to ease the confusion between the i and the p and boost sales of the more expensive 1080p sets. If you see a TV that says 720p or 1080p, most people will go for the 1080p. If you see 1080i and 1080p and you don't know the difference between the i and the p, most people will probably go for the savings of buying the 1080i.
 