720p or 1080i output?

teneightyp
SatelliteGuys Pro · Original poster
Mar 16, 2008 · Duluth, MN
I have a 1080p TV and a VIP211 receiver. Should I set the receiver's output to 720p or 1080i under the HDTV setup menu? Why? Thanks!
 
I think it is a matter of preference. I have a 1080p Sony XBR4 and just switched back and forth between 720p and 1080i; my impression was that 1080i looked better. But I am not sure I would be able to tell the difference if someone changed it without me knowing.
 
I have a 1080p set and really can't tell the difference between the two. Some channels I think look better in 720p and others in 1080i.
 
It comes down to resolution.... 720p will give you about a 1-megapixel picture... 1080 (i or p) will give you about 2 megapixels (1920x1080), so you get roughly twice the pixels... Of course, that depends on three things (rough numbers are sketched after the list)...

1. The Channel (CBS/NBC broadcast in 1080i, ABC/FOX in 720p)

2. The uplink or satellite resolution (in some cases, HD Lite is transmitted, which is 1440x1080, so it becomes about 1.5 megapixels). Still roughly 70% more pixels than 720p.

3. The TV... No point sending 1080i to a TV that is only 1280x720 (or 1360x768)...
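
For anyone who wants to check those numbers, here's a quick back-of-the-envelope sketch (plain Python, nothing Dish-specific; the HD Lite figure assumes the usual 1440x1080 horizontal squeeze):

```python
# Rough per-frame pixel counts for the formats discussed above.
formats = {
    "720p": (1280, 720),            # full progressive frame
    "1080i/1080p": (1920, 1080),    # full frame (both fields, for 1080i)
    "HD Lite 1080i": (1440, 1080),  # horizontally squeezed variant
}

for name, (width, height) in formats.items():
    pixels = width * height
    print(f"{name:>13}: {width}x{height} = {pixels:,} pixels (~{pixels / 1e6:.1f} MP)")
```

That works out to roughly 0.9, 2.1 and 1.6 megapixels respectively, which is where the "1 megapixel vs 2 megapixel" shorthand above comes from.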
 
3. The TV... No point sending 1080i to a TV that is only 1280x720 (or 1360x768)...

If the TV does a better job of converting from 1080i to 720p, then yes, you are better off setting the box to 1080i. If the opposite is true, then the best setting is 720p. Hence the reason you need to try both and decide on your own.

Ideally we'd have a native setting that passes through whatever format the channel is coming down as, but for now, since most channels are 1080i, I'd set it to that and let the TV do the rest of the work.
 
There is also the argument that for fast-moving sports, 720p is better. But I have never been convinced of it.

That's true with older LCDs that have motion blur (coupled with deinterlacing) problems. It's less true these days.

My gf prefers 1080i on most channels on my 768p LCD. I prefer 720p, since I don't like the 540p the set delivers from 1080i sources (poor deinterlacer). During football season I set my 211 to 720p out, since I'm generally watching ESPN, Big Ten or Fox SportsNet. (Our local ABC affiliate is, ironically, 1080i!)

On my 1080p set that deinterlaces properly, I usually set everything to 1080i. Sometimes I will set the 720p native channels to 720p, as my 1080p set scales better than the Dish box, too.
 
The difference is clear on my new 1080p set: 1080i channels should be shown at 1080i, but 720p channels at 720p. Hoping for native resolution pass-through to be added sometime.
 
It comes down to resolution.... 720p will give you about a 1-megapixel picture... 1080 (i or p) will give you about 2 megapixels (1920x1080), so you get roughly twice the pixels... Of course, that depends on three things...

This is kind of true, but a 1080i signal only sends half the picture (every other line) at a time so it's kind of hard to say it's actually 1920x1080. It appears to be that way because the refresh rate is so high.

In reality, going screen draw per screen draw, 720p is a higher resolution since it draws the full screen on each refresh.
 
This is kind of true, but a 1080i signal only sends half the picture (every other line) at a time so it's kind of hard to say it's actually 1920x1080. It appears to be that way because the refresh rate is so high.

In reality, going screen draw per screen draw, 720p is a higher resolution since it draws the full screen on each refresh.

In the space of 1/30th of a second, 1080i still gives you 1920x1080; it's just that on each half of the interlace it's only giving you 1920x540. The originating camera still has a 1920x1080 CCD, it's just sending half the information on each half of the screen draw.

So, yes, in high-motion sports events you could argue that the effective resolution drops to 1920x540, although it happens so fast it would be hardly noticeable except for that "motion blur". For more stationary images, the full resolution would be evident.

In 1/60th of a second, the amount of information sent for 1080i and 720p is almost identical... But I can tell you that on my 1080 SXRD I can really tell the difference in crispness between the NFL on CBS and on FOX.
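
To put those per-refresh numbers side by side, here's a small sketch (assuming the nominal 60 Hz field/frame rates and ignoring compression):

```python
# Pixels delivered per refresh interval for each format.
p720_frame = 1280 * 720     # one full progressive frame, drawn every 1/60 s
i1080_field = 1920 * 540    # one interlaced field (every other line), every 1/60 s
i1080_frame = 1920 * 1080   # two fields woven together, every 1/30 s

print("per 1/60 s :  720p =", p720_frame,      " 1080i field =", i1080_field)
print("per 1/30 s :  720p =", p720_frame * 2,  " 1080i frame =", i1080_frame)
print("per second :  720p =", p720_frame * 60, " 1080i =", i1080_field * 60)
```

Per 1/60 s the two are within about 12% of each other (921,600 vs 1,036,800 pixels), which is the "almost identical" point above; over 1/30 s the full 1080i frame pulls ahead.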
 
This is kind of true, but a 1080i signal only sends half the picture (every other line) at a time so it's kind of hard to say it's actually 1920x1080. It appears to be that way because the refresh rate is so high.

In reality, going screen draw per screen draw, 720p is a higher resolution since it draws the full screen on each refresh.

Not a higher resolution, but yes more than a million pixels for the same time (1/30 of a second).

720p: 1280x720 x 2 frames = 1,843,200 pixels per 1/30 s
1080i: 1920x1080 = 2,073,600 pixels per 1/30 s

With many shows still being film-based (24 frames per second), 1080i wins. The 720p advantage of drawing twice as often is neutralized, since the repeated frames carry no new information, and each unique frame ends up as only 921,600 pixels.
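
Extending that arithmetic to film-based material (a rough sketch, assuming a 24 fps source and counting only pixels that carry new information):

```python
# Unique pixels per second when the source is 24 fps film.
fps_film = 24
unique_720p = 1280 * 720 * fps_film    # repeated 720p frames add nothing new
unique_1080i = 1920 * 1080 * fps_film  # each film frame still arrives at full resolution

print(f"720p  : {unique_720p:,} unique pixels per second of a 24 fps source")
print(f"1080i : {unique_1080i:,} unique pixels per second of a 24 fps source")
```

That's roughly 22 million vs 50 million unique pixels per second, which is the sense in which 1080i "wins" for film-based shows.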
 
That's true with older LCDs that have motion blur (coupled with deinterlacing) problems. It's less true these days.

LCD motion blur is totally unrelated to this discussion as we are talking about the resolution of the source.

Just trying to keep people from getting confused.

And I'm with Rocky, I've yet to see 720p better for sports/motion than a quality, high bitrate 1080i feed. CBS is tops for HD Sports.
 
Most prime-time scripted TV shows are 24 frame/sec 1080p (at least on CBS and NBC). When transmitted as 1080i60, many displays can correctly reconstruct the 24fps 1080p frames from the 1080i fields and then display them with a 3:2 cadence at 60Hz (or 5:5 if you're lucky enough to have a 120Hz set). If you set your receiver to output 720p instead, you've prevented yourself from ever viewing anything better than 720p.

On the other hand, Lost on ABC broadcast at 720p looks pretty darn good on my 1080p set (which can recover 24fps 1080p perfectly), so signal format isn't everything when it comes to picture quality.
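
For the curious, here's a tiny illustration of that 3:2 cadence (purely a sketch, assuming an ideal 24 fps source shown on a 60 Hz display):

```python
# Map 24 fps film frames onto 60 Hz refreshes with a 3:2 cadence:
# the first frame is held for 3 refreshes, the next for 2, and so on.
def pulldown_3_2(frames):
    shown = []
    for i, frame in enumerate(frames):
        shown.extend([frame] * (3 if i % 2 == 0 else 2))
    return shown

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 second at 24 fps
print(pulldown_3_2(film))     # 10 refreshes = 1/6 second at 60 Hz
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

A set doing the reconstruction described above just detects that repeating pattern in the incoming fields and rebuilds the original 24 frames; a 120Hz set can show each frame 5 times instead, which is the 5:5 cadence mentioned.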
 
