Output resolution 720p vs 1080i


brad1138 — SatelliteGuys Pro, Original poster — Mar 20, 2006
I recently purchased a new 70" TV. I have already calibrated the TV and it looks great. I was playing around with Hopper settings and noticed that when I set the Hopper output resolution to 720p, the menu and guide look razor sharp. When I set it to 1080i, they are soft and fuzzy. I don't think it has anything to do with my TV; other sources such as my PS3 are razor sharp at 1080i. It is harder to tell if the picture itself is actually sharper, but when I pause on ESPN, I think the graphics look sharper when it is set to 720p, though nowhere near as big a difference as the Hopper's guide/menu.

I know that in the end, if the channel is a 720P channel (FOX), you see 720P at your set, and if it is a 1080i channel (CBS) you see 1080i. (Edit: I had heard that a long time ago and never questioned it, but that is incorrect.)

Has anyone else noticed this? Anyone with a Hopper and 1080 i/P TV, can you see if you see the same thing?
 
Actually, no, your Hopper outputs either 720p or 1080i depending on what you have set; it is one or the other. My thinking is that you are overthinking it. The guide and menu may be some semblance of HD, but they are not HD. Myself, once again, I am only thinking about the programming I watch. If the menu/guide is a little less crisp, I couldn't care less. It's how the programming looks that matters....
 
I seem to notice the guide and menus are slightly fuzzier using 1080i on my 50-inch Panny plasma with the Hopper. I don't notice a difference in the actual picture.

 
I know that in the end, if the channel is a 720P channel (FOX), you see 720P at your set, and if it is a 1080i channel (CBS) you see 1080i.
I presume your edit was adding the part that this statement is incorrect? Your comments seem to be focused on "images", e.g. menus, guides, graphics (on ESPN), and so on. The Dish graphics are overlays and may not be affected by that setting (?). Or maybe they are, and are closer to 720 resolution, so when the setting is 1080i they're scaled up slightly.
 
Your TV, nearly certainly, is DISPLAYING a 1080P picture. Whatever signal it gets, it converts to 1080P. Some conversions are better than others. You can experiment with setting what signal is output by the Dish box, vs what the TV converts.
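The "whatever signal it gets, it converts" point can be quantified. Here is a small sketch (my own illustration, not anything from the thread) of the scale factors a native-1080p panel must apply to common source formats; a non-integer factor means every output pixel is interpolated from several source pixels, which is where scaler quality starts to matter:

```python
# Scale factors a native 1920x1080 panel applies to each source format.
# Illustration only; resolutions are the standard broadcast frame sizes.
sources = {"480p": (720, 480), "720p": (1280, 720), "1080i/p": (1920, 1080)}
panel_w, panel_h = 1920, 1080

for name, (w, h) in sources.items():
    print(f"{name}: x{panel_w / w:.3f} horizontal, x{panel_h / h:.3f} vertical")
```

Note that 720p scales by exactly 1.5x in both directions, while a 1080i/p source needs no rescaling at all, only deinterlacing.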

Sadly, we STILL do not have native pass thru and probably never will.
 
Just leave it at 720p and let the TV upconvert it to 1080p if it is able. I have my Hopper set for 720p, and it runs to my Google Revue unit, which outputs the signal at 1080p. So it looks great. I would think it would be easier to upconvert from 720p to 1080p than from 1080i to 1080p; you know, progressive to progressive rather than interlaced to progressive. Besides, whether it is 720p or 1080i, it looks the same; it is both HD.
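The interlaced-vs-progressive distinction can be sketched in a few lines. A 1080i signal arrives as alternating 540-line fields, and the simplest deinterlacer ("weave", which works fine for static scenes) interleaves two fields into one 1080-line progressive frame; a 720p signal arrives as complete frames and only needs scaling. A toy Python sketch (my own illustration, not the Hopper's actual pipeline):

```python
# Toy "weave" deinterlace: combine two 540-line fields into one 1080-line frame.
def weave(top_field, bottom_field):
    """Interleave two fields (lists of scanlines) into one progressive frame."""
    frame = []
    for a, b in zip(top_field, bottom_field):
        frame.append(a)  # scanline from the top field (lines 0, 2, 4, ...)
        frame.append(b)  # scanline from the bottom field (lines 1, 3, 5, ...)
    return frame

top = [f"line{2 * i}" for i in range(540)]         # top field: 540 scanlines
bottom = [f"line{2 * i + 1}" for i in range(540)]  # bottom field: 540 scanlines
frame = weave(top, bottom)
print(len(frame))  # 1080 progressive scanlines
```

In real hardware, motion between the two fields makes weave produce combing artifacts, which is why deinterlacer quality varies so much between devices.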
 
Frankly, on higher-end TVs (or even just better brands) one does not see the difference with either resolution, for either picture or EPG. Most likely it is how the HDTV is handling the conversion to its native resolution (in other words, OP, you see ALL images at your HDTV's NATIVE resolution no matter what the resolution of the source, as the HDTV does the scaling just before displaying the picture). It may have to do with your recent calibration.

Look, I'm not against calibrations, but people should be aware that a calibration can allow one to see more FLAWS and even result in a worse "aesthetic" PQ. A calibration is a less "forgiving" picture because calibrations in general bypass some of the electronics for PQ enhancement. While some of those settings, especially the "noise" reducers, are best left off, some of the processing is best left ON, for the very good reason that the PQ from the studio or network uplink will suffer somewhat by the time it reaches your HDTV. I really don't view my HDTV as a Conrac monitor in a TV production studio.

Most calibrations aim for the greatest contrast and the most accurate color, which is not necessarily what people view as the "best" color. People are well aware that the HDTV is enhancing the saturation, but they PREFER that over reality, which is more bland; they want that HDTV to "pop." It's a preferred aesthetic, and even movie makers and TV directors go to pains to REMOVE any banality from the look and use popping color for mood, symbolism, or just plain aesthetics.

Aside from color looking "washed out," among the most ironic complaints I've heard after calibration is that viewers can see the grain of the film far too often and consider it "noise" (I've seen these and more for myself on calibrated HDTVs), along with the contrast: YES, calibrated ideally according to specs, the picture looks a bit too dark in some places. But there are more.

The problem with calibrations is that they are done absolutely OBJECTIVELY. Yet in music recording studios, video editing suites, and transfer facilities everywhere, they have and use all the OBJECTIVE electronic tools, but also make SUBJECTIVE adjustments. In other words, the music recording engineer/producer will ALWAYS use their EARS for the final verdict and SUBJECTIVELY alter what the tool says is correct into what it ought to be. The video editor/transfer tech ALWAYS makes final adjustments by eye to get it to look "right." These electronic tools provide an objective base from which the best people use their ears and eyes to finish the job.

All things are a compromise, and I suspect your calibrator (most likely doing exactly what he should be doing) made adjustments objectively for the best PQ, and may have bypassed or turned off processing he found objectionable for the purposes of calibration. But your HDTV was designed to take a not-so-good image, clean it up, scale it properly, and enhance it so that you could not tell the difference between a 720p and a 1080i source, especially in something like an EPG, which could be an overlay; and EPGs are the source of a lot of PQ complaints. You may be getting the EPG raw from the Dish box, so the problem only rears its ugly head when the EPG is displayed, but I think one could also see other instances of it in real-world PQ.

Yours truly can see NO difference in the quality of the EPG on any of my high-end or entry-level HDTVs with the box set to 720p or 1080i, without external processing. All but one now have some form of external processing, which I find far more beneficial than calibration, and it provides far greater control.

IMHO, TV and movies are NOT real life. I enjoy a processed, but not over-processed, green that really isn't a true real-life green (and ALL music and film and TV IS PROCESSED to alter the original image at some POINTS, plural, along the chain), and I feel very strongly that film grain is NOISE. Most filmmakers never intend for film grain to be noticed. The League of Gentlemen very purposely WANTED that film grain to be MORE than noticeable (a mistake, I personally believe, and thankfully the 3rd series did NOT have any of the awful film grain; hmm, I wonder why they changed their minds?), as their goal was to make it look like one of those "B" horror films with their poor film print quality and absolutely abominable telecine, most likely a live running film projector into a mirror/prism at the time of broadcast, with generous helpings of film GRAIN, all unintended by the filmmakers.

Again, I think calibration in general is actually a good thing, especially for mid- to lower-end TVs, if it can get more contrast out of the lower-cost ones; there is a limit to the color palette on less expensive TVs as well, but often people can see the difference there. The high-end HDTVs seem to come out of the factory so well calibrated that paying extra for a personal calibration borders on a waste of money, with only minimal gains, if any. So decide on calibration wisely, and check places like AV Forums to see if it is worth it for particular models.

But even a great calibration isn't a cure-all, nor will it provide you with the best PQ. Calibration does virtually nothing to address some of the most common and painful-to-look-at PQ problems: mosquito noise, banding, compression artifacts, any artifact in general, noise, and the list goes on if one watches more than Blu-rays, which themselves may suffer from a lousy transfer that a calibrated HDTV will only make look WORSE. The solution there, if not your HDTV's built-in processing (most HDTV built-in processors are pretty coarse and don't do nearly as good a job as external processing), is external processing, but that would put you back where you were BEFORE the calibration. Or one could turn off the external processor to see a source with the benefit of calibration only, and turn on processing for TV services, where even OTA broadcasts now have crappy PQ in dire need of processing at home.

Just know and understand what one gets into with calibration. Its greatest strength is with Blu-ray movies, where the muted real-life colors and lack of processing can enhance the film-watching experience and, perhaps, draw you into the movie more, without distractions like popping colors, etc. As for TV, I've got Anchor Bay and Marvell Qdeo doing their necessary magic (the Faroudja I had seemed to perform identically to the Anchor Bay), while my Series 6 Sammy's built-in processor (the only HDTV without external processing) just can't kick the mosquito noise without a noticeable compromise. People can tell it doesn't look as good as the other HDTVs; even worse, they say, than my horrid entry-level Sharp (far inferior to the Series 6 Sammy), which now looks great, they say, via Anchor Bay, compared to the non-externally-processed Series 6 Sammy. Everybody notices the effect of the external processing I use, but they have no idea it is being externally processed; they just think it is all the TV. They aren't people who even know external processing exists. But I get compliments, and they think their very expensive HDTVs do not look as good. That makes me feel a bit better about the investment I made.

Long answer, but no, I do not notice what you've noticed, OP. And thankfully so. :)
 
Haha, I've always tried the various "settings" people use for the same TV models I've had, and I almost always ask, "How can you watch it like this?"

 
If only your post could flesh out your response with more details. :biggrin
 

What he said...
 
 
Just leave it at 720p and let the TV upconvert it to 1080p if it is able. I have my Hopper set for 720p, and it runs to my Google Revue unit, which outputs the signal at 1080p. So it looks great. I would think it would be easier to upconvert from 720p to 1080p than from 1080i to 1080p; you know, progressive to progressive rather than interlaced to progressive. Besides, whether it is 720p or 1080i, it looks the same; it is both HD.

I disagree with this idea for people with 1080p TVs. 1080i and 1080p are the exact same resolution: both are 1920x1080. To convert from 1080i to 1080p, all your TV really has to do is deinterlace. If you are watching 720p content with your receiver set to 1080i, the receiver has to upconvert from 1280x720 to 1920x1080, and the TV then has to deinterlace this signal.

720p, on the other hand, has a resolution of 1280x720. If you are watching a 1080i TV channel with your receiver set to output 720p, the receiver has to downconvert from 1920x1080 to 1280x720. Your 1080p TV would then have to upconvert the new 1280x720 image back to 1920x1080 to fit the native resolution of its display.

When you set the receiver to 720p, the image's resolution is converted twice when you are watching a 1080i channel. Setting the receiver to 1080i only requires one resolution conversion, when you are watching a 720p channel.

When you are watching a 1080i channel on a receiver set to 1080i, you don't have to convert the resolution at all. When you are watching a 720p channel on a receiver set to 720p, you still have to convert the resolution once to get to a 1080p TV's native resolution.

I think it makes sense that the fewer times you convert an image's resolution, the better. If you own a 1080p TV, I think it makes more sense to set your receiver to 1080i and cut out a few resolution conversions.
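The counting argument above can be made concrete with a small sketch (my own illustration, not anyone's actual pipeline; CBS and FOX stand in for typical 1080i and 720p broadcasters, and deinterlacing is deliberately not counted as a rescale):

```python
# Count how many resolution conversions (rescales) a frame goes through for
# each combination of channel format and receiver output, on a 1080p panel.
CHANNELS = {"CBS": (1920, 1080), "FOX": (1280, 720)}   # 1080i vs 720p broadcasts
OUTPUTS = {"1080i": (1920, 1080), "720p": (1280, 720)}  # receiver output setting
PANEL = (1920, 1080)                                    # native 1080p display

def conversions(channel_res, output_res, panel_res):
    """Count rescale steps; deinterlacing 1080i -> 1080p is not a rescale."""
    steps = 0
    if channel_res != output_res:  # receiver rescales the broadcast to its output
        steps += 1
    if output_res != panel_res:    # TV rescales the receiver output to the panel
        steps += 1
    return steps

for ch, ch_res in CHANNELS.items():
    for out, out_res in OUTPUTS.items():
        print(f"{ch} via {out}: {conversions(ch_res, out_res, PANEL)} rescale(s)")
```

Running this gives 0 rescales for a 1080i channel via 1080i, 1 for a 720p channel via either setting, and 2 for a 1080i channel via 720p, which is exactly the case the post argues against.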

Are there any TV stations that actually broadcast in 1080i? Are you sure your local does?

There are lots of 1080i stations: CBS and NBC for starters on the local side. In my experience most cable channels are 1080i too, with Disney- and Fox-owned channels being the big exception. If you are curious about the resolution of a particular channel, its Wikipedia page almost always has that information.
 
Well, if the video gets converted two times, then my picture would look twice as good, right? :eureka

I really don't know that much about picture upconverting and which would be better. I only know that my bedroom HDTV, an older tube set from early in the last decade, looks better using 720p upconverted to 1080p. Using 1080i there are flaws and visible pixels, and the video looks worse. To me 1080p looks best. That is why I use the Google Revue units to do all the converting to 1080p on all 4 of my HDTVs. The picture looks great to me, and I have it split over my HDMI splitter to both my computer room and living room, so I get 1080p on both HDTVs. When I used just the 1080i setting, I could see more noise or a roughness in the background scenes. 1080p is so much smoother and looks great to me.
 
