1080p/24

There has been a lot of concern and confusion over the difference between 1080i and 1080p. This stems from the inability of many TVs to accept a 1080p signal. To make matters worse, the help lines at many of the TV manufacturers (that means you, Sony) are telling people that their newly bought 1080p displays are really 1080i. They are idiots, so let me say this in big bold print: as far as movies are concerned, THERE IS NO DIFFERENCE BETWEEN 1080i AND 1080p. See, I did it in caps too, so it must be true. Let me explain (if your eyes glaze over, the short version is at the end).
For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT-based TV, every modern TV is progressive scan (as in LCD, Plasma, LCOS, DLP). They are incapable of displaying a 1080i signal as 1080i. So what we're talking about here mostly applies to people with 1080p native displays.
Movies and almost all TV shows are shot at 24 frames per second (either on film or on 24fps HD cameras). Nearly all TVs have a refresh rate of 60Hz, which means the screen refreshes 60 times a second. In order to display something that is 24fps on something that is essentially 60fps, you need to make up, or repeat, frames. This is done using a method called 3:2 pulldown (or more accurately 2:3 pulldown). The first frame of film is doubled, the second frame of film is tripled, the third frame of film is doubled, and so on, creating a 2,3,2,3,2,3 sequence. It basically looks like this: 1a,1b,2a,2b,2c,3a,3b,4a... Each number is the original film frame. This lovely piece of math allows 24fps film to be converted for display on 60Hz products (nearly every TV in the US, ever).
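To make that cadence concrete, here is a minimal Python sketch of 2:3 pulldown (the function name and frame labels are mine, purely for illustration):
```python
# Minimal sketch of 2:3 pulldown: mapping 24 film frames per second
# onto a 60Hz display by alternately repeating frames 2x and 3x.

def pulldown_2_3(frames):
    """Expand a 24fps frame sequence to 60Hz using 2:3 pulldown."""
    out = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3   # the 2,3,2,3,... cadence
        out.extend([frame] * repeats)
    return out

film = ["1", "2", "3", "4"]    # four film frames = 1/6 second at 24fps
print(pulldown_2_3(film))      # ['1','1','2','2','2','3','3','4','4','4']
# 4 film frames -> 10 display refreshes, i.e. 24fps -> 60Hz
```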
This conversion can be done in a number of places. With DVDs, it was all done in the player. With HD DVD, it is done in the player to output 1080i. With Blu-ray, there are a few options. The first player, the Samsung, added the 3:2 to the signal, interlaced it, and then output that (1080i), or de-interlaced the same signal and output that (1080p). In this case, the only difference between 1080i and 1080p is where the de-interlacing is done. If you send 1080i, the TV de-interlaces it to 1080p. If you send your TV the 1080p signal, the player is de-interlacing the signal. As long as your TV de-interlaces the 1080i correctly, there is no difference.
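The reason nothing is lost with film-based material is that both fields of an interlaced frame come from the same progressive film frame, so a "weave" de-interlacer can reassemble the original exactly. A toy sketch of that round trip (the frame model here is hypothetical):
```python
# Sketch: interlacing splits a progressive frame into two fields;
# weave de-interlacing puts them back together bit-for-bit.

def interlace(frame):
    """Split a progressive frame (a list of scan lines) into two fields."""
    return frame[0::2], frame[1::2]    # even lines, odd lines

def weave(top, bottom):
    """Reassemble a progressive frame from its two fields."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

original = [f"line{n}" for n in range(1080)]   # one 1080p film frame
top, bottom = interlace(original)
assert weave(top, bottom) == original          # nothing was lost
```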
The next Blu-ray players (from Pioneer and the like) will have an additional option: they will be able to output the 1080p/24 from the disc directly. At first you may think that if your TV doesn't accept 1080p, you'll miss out on seeing the "unmolested" 1080p/24 from the disc. Well, even if your TV could accept the 1080p/24, the TV would still have to add the 3:2 pulldown itself (the TV is still 60Hz). So you're not seeing the 1080p/24 as-is regardless.
The only exception to that rule is if you can change the refresh rate on the TV. Pioneer's plasmas can be set to refresh at 72Hz. These will take the 1080p/24 and do a simple 3:3 pulldown (repeating each frame 3 times).
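The arithmetic behind that is simple: 72 is an exact multiple of 24, so every frame can be held for the same number of refreshes. A quick sketch (the names are mine, for illustration):
```python
# Sketch: at 72Hz, 24fps divides evenly, so a simple 3:3 pulldown
# (every frame shown exactly 3 times) needs no uneven 2/3 cadence.

fps, refresh = 24, 72
assert refresh % fps == 0          # 72 / 24 = 3, an exact multiple
repeats = refresh // fps           # each film frame held for 3 refreshes

def pulldown_3_3(frames):
    return [f for frame in frames for f in [frame] * repeats]

print(pulldown_3_3(["1", "2"]))    # ['1', '1', '1', '2', '2', '2']
```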
Short Version
What this all means is this:
• When it comes to movies (as in HD DVD and Blu-ray), there will be no visible difference between the 1080i signal and the 1080p signal, as long as your TV correctly de-interlaces 1080i. So even if you could input 1080p, you wouldn't see a difference (because there is none).
• There is no additional or new information in a 1080p signal from movie-based content.
• The only time you would see a difference is with native 1080p/60 content, which at this point would only come from a PC and maybe the PS3. 1080p/60 does carry more information than 1080i/30 (see the quick arithmetic below), but unless you're a gamer you will probably never see native 1080p/60 content. It is incredibly unlikely that broadcasters will ever transmit 1080p (too much bandwidth) or that 1080p/60 content will show up on discs (too much storage space, and no one is using it to record/film).
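For the curious, the raw pixel-rate arithmetic behind that last bullet (counting uncompressed pixels per second; the variable names are just illustrative):
```python
# 1080i/30 sends 60 half-height (540-line) fields per second;
# 1080p/60 sends 60 full 1080-line frames per second.

width, height = 1920, 1080

i_rate = width * (height // 2) * 60   # 1080i: 60 fields of 540 lines
p_rate = width * height * 60          # 1080p/60: 60 full frames

print(f"1080i/30:  {i_rate:,} pixels/sec")   # 62,208,000
print(f"1080p/60: {p_rate:,} pixels/sec")    # 124,416,000, exactly double
```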
So all of you people who bought 1080p displays only to be told by the companies that you had bought 1080i TVs, relax. The TV will convert everything to 1080p. Now if you bought a TV that doesn't de-interlace 1080i correctly, well, that's a whole other story.

Did that answer your question, even though I added a lot of other information to tie it all together?
Also, check out this link - Feature Article
 
A few things I'd like to add:

1. Interlacing of movie-based material is called telecine (1080/24p -> 1080/60i), and the reverse process is inverse telecine (IVTC).

2. There is a difference in how the bits are stored on DVD versus HD DVD/Blu-ray: HD DVD and Blu-ray store progressive 1080/24p frames (telecined by the player if needed), while DVDs store the material already telecined.

3. Every player capable of 1080p output can also do 1080/60i. Only entry-level HD players can't do 1080p.

4. 1080/24p output capability is of significant value (it eliminates judder) when using a projector, since many projectors can run at a refresh rate that is a multiple of 24Hz. See the sketch below.
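Here is a small sketch of the judder point: with 2:3 pulldown on a 60Hz display, film frames are held on screen for alternating lengths of time, while at a 24Hz multiple like 72Hz every frame is held equally long (the function is mine, for illustration):
```python
# Compare how long each film frame stays on screen (in ms) for a
# given repeat pattern and refresh rate.

def hold_times(pattern, refresh_hz):
    return [round(repeats * 1000 / refresh_hz, 1) for repeats in pattern]

print(hold_times([2, 3, 2, 3], 60))   # [33.3, 50.0, 33.3, 50.0] -> uneven = judder
print(hold_times([3, 3, 3, 3], 72))   # [41.7, 41.7, 41.7, 41.7] -> uniform = smooth
```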

Diogen.
 
