A definitive answer - 720p vs 1080i? (and other questions)

SaintsFan

Active SatelliteGuys Member
Original poster
May 5, 2004
I just got hooked up yesterday but have been trying to absorb info from this forum for the last few days. A few questions-
1. Straight answer - which is supposed to yield the best picture, 720p or 1080i?
2. There is no information in my program guide - what's the problem?
3. Slightly unrelated, but is there a big difference in sound between optical and coax?
 
1. There is no answer. It depends on your TV and how you like to see TV material.
2. You need to tell your receiver where you are located; put in your ZIP code.
3. No. Optical is better for longer runs and coax for shorter, but there is really no audible difference.
 
To your #3 question... I would go with fiber optic always. I've had problems with losing signal when using digital coax (different receivers, DVD players, etc. - it still happened). Plus you don't have to worry about EMI/RF interference with fiber optic, because it transfers the signal with light.
 
What do you mean, it depends on my TV?
I have a 57'' Hitachi Ultravision that supports progressive scan.
 
He means: does your TV support the different display modes, like 720p? Some RPTVs don't.
 
mini1 said:
1. There is no answer. It depends on your TV and how you like to see TV material.
2. You need to tell your receiver where you are located; put in your ZIP code.
3. No. Optical is better for longer runs and coax for shorter, but there is really no audible difference.


Where do you put in your ZIP code? Are you talking about the website?

Oh, BTW, how does mapping work?
 
Yep... look, there is no real answer to which is better. We each have our preferences. If you really want to understand what each means in scientific terms, read the link in my sig.

-MP
 
Pick the one that supports the native resolution of your TV. If it's a CRT RPTV, then it's probably 1080i. If it's an LCD or DLP, it's probably 720p.

If it supports both NATIVELY, then you are the man and should be watching all material in their source format.
 
Well, your sig is wrong. It should read that 1080i could be 540p. Even then you don't have the fixed pixel count. The United States is the only country that broadcasts in 720p and 1080i; Japan broadcast in 1035i and has now adopted 1080i. Another thing these 720p promoters aren't saying is how many fields and frames per second. NTSC broadcasts 525 lines at 59.94 fields/29.97 frames per second, whereas ATSC HDTV signals are double this.

Look at this chart

[chart: 9808d3.gif - comparison of HDTV formats]


As you can see, 1080i at 60 fields per second is considered better than even 1080p at 30 fps.
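For anyone who wants to sanity-check the chart with their own numbers, here's a rough back-of-the-envelope sketch (Python, assuming the standard ATSC pixel counts). Note it counts raw pixel rate only - it says nothing about temporal smoothness or perceived sharpness, which is where most of the 720p vs 1080i argument actually lives:

```python
# Back-of-the-envelope pixel throughput for common ATSC formats.
# Interlaced formats deliver half their lines on each pass (one field).
formats = {
    # name: (width, height, passes_per_second, interlaced?)
    "720p/60":  (1280, 720, 60, False),
    "1080i/60": (1920, 1080, 60, True),
    "1080p/30": (1920, 1080, 30, False),
}

for name, (w, h, rate, interlaced) in formats.items():
    lines_per_pass = h // 2 if interlaced else h
    mpix_per_sec = w * lines_per_pass * rate / 1e6
    print(f"{name}: {mpix_per_sec:.1f} Mpixels/s")
# 720p/60:  55.3 Mpixels/s
# 1080i/60: 62.2 Mpixels/s
# 1080p/30: 62.2 Mpixels/s
```

Interestingly, 1080i/60 and 1080p/30 carry the same raw pixel rate; any ranking between them comes down to how you weigh motion sampling against interlacing artifacts.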
 
Like I said... read the link. If you think Dr. Smith doesn't know what he's talking about, then so be it. But he certainly has the credentials to back it up :). I'm done arguing the point. Native 720p on a native 720p set looks amazing, and most of the people who think otherwise have never seen it in reality.

-MP
 
He is so full of bull: he is trying to say 1080i/60fps is equal to 540i/30fps. Look at the graphs I provided. What exactly is Dr. Smith supposed to be a Dr. in? Home theater display technologies? No, he is a Dr. in Computer Science.
 
rexoverbey said:
He is so full of bull: he is trying to say 1080i/60fps is equal to 540i/30fps. Look at the graphs I provided. What exactly is Dr. Smith supposed to be a Dr. in? Home theater display technologies?

Why are you asking dumb questions instead of just reading the info on the web page that madpoet has been asking you to read?
 
Who asked you, andrzej? The guy is a graphic artist... big deal, so is my wife. How does that give him expertise in displays? Guess what? I have a CS degree also; that doesn't mean they teach you display technologies. In a CS major they teach you programming, algorithms, information systems, and networking.

If you looked at where I got the pictures I posted, it was far more relevant. Mine were from the Communications Engineering & Design home page (I think that would be a lot more relevant than the Dr.).
 
Excuse me for getting in the crossfire. All I have is an old 480i analog set, and I am looking into HDTV and VaVaV*. I am purely a lay person about this. But here's what I see, and please tell me objectively if I am missing something...

On the one hand, 720p is "better" in that it is rendered more cleanly with a single pass. On the other hand, 1080i is "better" in that it has more pixels of resolution, but takes two passes to refresh all those pixels.

So, the bottom line is that EITHER ONE could be better depending on what you are viewing. If you are viewing slow-moving scenery on DHD, 1080i viewed on a 1080i-native monitor will render a "better" picture because of its higher resolution than 720p. If you are viewing fast action, sports, or action movies, 720p viewed on a 720p-native monitor will render a "better" picture because of the cleaner movement from its single-pass rendering.

Naturally, a native 1080i display is going to render native 1080i in all situations better than it will render converted 720p, and a native 720p display is going to render native 720p in all situations better than it will render converted 1080i.
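The spatial-vs-temporal tradeoff described above can be put roughly in numbers (a sketch only, assuming the standard ATSC line counts; note that 1080i's 60 fields are distinct moments in time, so motion is sampled 60 times a second, but each sample carries only half the vertical detail):

```python
# 720p delivers a full picture in one pass; 1080i builds one
# complete frame out of two interlaced fields.
p720_lines_per_frame = 720
i1080_lines_per_frame = 540 * 2        # two fields = one full frame

p720_full_pictures_per_sec = 60        # 60 progressive frames
i1080_full_pictures_per_sec = 60 // 2  # 60 fields = 30 complete frames

print("Spatial detail (lines/frame):  720p =", p720_lines_per_frame,
      " 1080i =", i1080_lines_per_frame)
print("Full pictures per second:      720p =", p720_full_pictures_per_sec,
      " 1080i =", i1080_full_pictures_per_sec)
```

So 1080i wins on still detail (1080 vs 720 lines per complete frame), while 720p wins on clean motion (60 full pictures a second vs 30), which matches the scenery-vs-sports rule of thumb above.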

Now, this is purely my guess based on what I have read. Am I close to understanding this correctly?