1080p Doesn't Matter

Poke

Thought this was a good read since there has been debate on this in the past..

http://www.broadbandreports.com/shownews/77475

An interesting read over at Gamespot dissecting why the 1080p high-definition standard doesn't really matter. The author breaks down why the emerging standard isn't particularly important as it pertains to TV, gaming, and film. As you might imagine, limited bandwidth (at least via existing compression schemes) is a main reason you won't see 1080p content via dish or cable any time soon:
"The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade."

IPTV over VDSL outfits like AT&T, limited to around 24Mbps for the full pipe at first swing, are even less likely to embrace 1080p content. It's a good read that should make people who've dropped big cash on a non-1080p display feel much better.

Here's the read from GameSpot. IPTV, as we know, is already having issues with bandwidth; it's not good to have everything going through your net connection. Anyway, the article below is the original source:

http://www.gamespot.com/pages/forums/show_msgs.php?topic_id=24908759

Pay attention, class. There will be a test at the end of this class.

1080p does not matter. Here's why:

There are a number of facts that must be grasped first:

All digital displays are progressive scan by nature.
Virtually all film releases are shot at 24 frames per second and are progressive scan.
1080i delivers 30 frames per second, and 1080p delivers 60 frames per second.
All HDTV broadcasts and virtually all games will be limited to 720p or 1080i for the foreseeable future.

Got all that? Good. Now let's go into the explanation.

Movies

Take a movie. It's 24 frames per second, progressive scan. This is the nature of how movies are shot on film today. Just about all movies are shot this way; the only exceptions are films where the director or producer wants to make an artistic statement. But if you saw it at your local multiplex, it's in 24fps progressive.

Now, let's put it onto a disc so we can sell it. First, we scan each individual frame of the movie, one by one, at a super high resolution (far higher than even 1080p.) This gives us a digital negative of the film, from which every digital version of the film will be made (this means the HD, DVD, On-demand, PPV, digital download, digital cable and PSP versions were all made from this one digital negative.) We'll only concern ourselves with the HD version for now.

Because it's HD, we'll take the digital negative and re-encode it in MPEG2, .h264 or VC1 at 1920x1080 and 24 frames per second to match the source material. And this is how it is on the disc when you get it from the store, whether it's Blu-ray or HD-DVD.

Once you put it in your disc player to view the film, a number of things happen.

1080i/1080p
Because the film is in 24fps, and 1080i is 30fps, every second the player has to come up with 6 additional frames to make up the gap. It does this through a process called 3:2 pulldown, whereby 4 film frames (1/6th of a second of the film) are processed to create 5 video frames (1/6th of a second on your TV screen). Just exactly how this is done is outside the scope of this post, but the important thing to realize is that none of the picture data is lost during this process; it's just re-formatted.
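If you want to see that cadence spelled out, here's a rough Python sketch of the idea (an illustration only, not how any real player implements it):

```python
# A rough sketch (not any player's actual code) of the 3:2 cadence described
# above: four 24fps film frames become ten 60Hz fields -- five 30fps interlaced
# frames -- with no picture data thrown away, only repeated.

def pulldown_32(film_frames):
    """Expand 24fps film frames into a 3:2 field cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))  # 3 fields, then 2
    return fields

fields = pulldown_32(["A", "B", "C", "D"])
print(len(fields))                            # 10 fields = 1/6 of a second at 60Hz
print(list(zip(fields[0::2], fields[1::2])))  # AA, AB, BC, CC, DD frame pairs
```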

Now, here's the crucial difference between 1080i and 1080p, as it relates to movies. With 1080i transmission, the player interlaces the frames during the pulldown and sends the interlaced frames to the TV set to be deinterlaced. With 1080p transmission, the player never interlaces the frames. Regardless, you will get the exact same result. The only exception is if you have a crap TV that doesn't deinterlace properly, but chances are that TV won't support 1080p anyway.

So 1080p doesn't matter for movies.

Television

Television is a little different. Television is typically not shot on film, it's shot on video which is a vastly different technique. While movies are almost always shot at 24fps, standard-def NTSC TV is shot at 30fps interlaced, and HDTV is shot at whatever the production company decides, usually 1080i at 30fps, or 720p at 60fps, depending on the network. What, no 1080p? Nope. Why? Bandwidth.

The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade.
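To put the article's own numbers side by side (the 2x figure for 1080p is the article's assumption, not a measured bitrate):

```python
# Back-of-the-envelope check of the bitrates quoted above. The 2x figure for
# 1080p is the article's own assumption, not a measurement.
ATSC_CHANNEL_MBPS = 19.4                      # total per broadcast channel
STREAM_1080I_MBPS = 12.0                      # "decent quality" MPEG2 1080i
STREAM_1080P_MBPS = 2 * STREAM_1080I_MBPS     # per the article's assumption

print(ATSC_CHANNEL_MBPS - STREAM_1080I_MBPS)  # ~7.4 Mbps left for subchannels
print(ATSC_CHANNEL_MBPS - STREAM_1080P_MBPS)  # negative -- 1080p doesn't even fit
```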

So 1080p doesn't matter for television.

Games

Ah, now we come to the heart of the matter. Games. The reason why there will be very few 1080p games is a simple one: lack of memory. All graphics cards, including those found in Xbox 360 and PS3, have what's known as a frame-buffer. This is a chunk of memory set aside to store the color information of every pixel that makes up a frame that will be sent to the screen. Every single calculation the graphics card makes is designed to figure out how to fill up the frame-buffer so it can send the contents of the frame-buffer to the screen.

Time to break out the calculators, because we're doing some math.

A 720p frame is 1280 pixels wide by 720 pixels high. That means one 720p frame contains 921,600 pixels. Today's graphics cards use 32-bit color for the final frame. This means each pixel requires 32 bits - 4 bytes - to represent its color information. 921,600x4 = 3,686,400 bytes or a little over 3.5MB.

A 1080i frame is 1920 pixels wide by 540 high. That's 1,036,800 pixels, 4,147,200 bytes or a little less than 4MB.

Now, a 1080p frame. 1920 wide by 1080 high. 2,073,600 pixels, 8,294,400 bytes, a smidgen less than 8MB.
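The same arithmetic in one place, if you want to check it yourself (these are sizes for a single raw 32-bit color buffer only):

```python
# Raw size of a single 32-bit color buffer at each resolution, matching the
# arithmetic above. Real renderers keep several such buffers (plus depth,
# stencil and textures), so the per-frame cost multiplies quickly.
BYTES_PER_PIXEL = 4  # 32-bit color

def framebuffer_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 * 1024)

for name, (w, h) in {"720p frame": (1280, 720),
                     "1080i (one 1920x540 field)": (1920, 540),
                     "1080p frame": (1920, 1080)}.items():
    print(f"{name}: {framebuffer_mb(w, h):.2f} MB")
# 720p frame: 3.52 MB, 1080i field: 3.96 MB, 1080p frame: 7.91 MB
```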

Ooh, but the 360 has 512MB, and the PS3 has 256MB for graphics. How is 8MB going to hurt? Oh, it hurts. Graphics cards will have several internal frame-buffers to handle different rendering passes, and each one requires memory. And the textures and mapping surfaces all have to fit within that same memory space. In the case of the 360, there's also audio and game data fighting for the same space (though the "space" is twice as big on Xbox 360.) That's why GTHD looked like crap, because in order to get it running in 1080p, they sacrificed most of the rendering passes and other effects.

This is why the vast, vast majority of Xbox 360 and PS3 next-gen games will stick to 1080i or 720p.

So 1080p doesn't matter for games.

In conclusion, 1080p does not matter. Period. If you think it does, you're just buying in to Sony's marketing hype.

Class dismissed.
 
If I understood the explanation of 1080p on the Yahoo site, 1080p will display 60 frames per second and 1080i will display 30 frames per second.
The frames in 1080p will be redundant. In other words, two full frames made up of the even and odd scans, which = 2 identical frames, where 1080i will trick the eye into seeing 1 full frame per second.
Two identical frames will look the same as one full frame, so 1080p is useless unless you think two identical frames will make it look better.
The author says he can't see any difference.
Now this happens before the 3:2 pulldown to 24 frames per second.
 
Poke:

Ever heard of continuing education? Here it comes...

Movies

Take a movie. It's 24 frames per second, progressive scan. This is the nature of how movies are shot on film today. Just about all movies are shot this way; the only exceptions are films where the director or producer wants to make an artistic statement. But if you saw it at your local multiplex, it's in 24fps progressive.

Even digital film is 24fps.

Now, let's put it onto a disc so we can sell it. First, we scan each individual frame of the movie, one by one, at a super high resolution (far higher than even 1080p.) This gives us a digital negative of the film, from which every digital version of the film will be made (this means the HD, DVD, On-demand, PPV, digital download, digital cable and PSP versions were all made from this one digital negative.)

Only true for actual film-based productions. An increasing number of productions are going fully digital.

We'll only concern ourselves with the HD version for now.

Because it's HD, we'll take the digital negative and re-encode it in MPEG2, .h264 or VC1 at 1920x1080 and 24 frames per second to match the source material. And this is how it is on the disc when you get it from the store, whether it's Blu-ray or HD-DVD.

This isn't too bad, but the next part is where the wheels come off the cart.



1080i/1080p
Because the film is in 24fps, and 1080i is 30fps, every second the player has to come up with 6 additional frames to make up the gap. It does this through a process called 3:2 pulldown whereby 4 film frames (1/6th of a second of the film) are processed to create 5 video frames (1/6th of a second on your TV screen).

The process is called TELECINE, not 3:2 pulldown. When we start talking about the telecine process, we are no longer dealing with frames, but rather fields (odd and even half frames).

Just exactly how this is done is outside the scope of this post, but the important thing to realize is that none of the picture data is lost during this process; it's just re-formatted.

When done perfectly yes. Unfortunately, experience tells us that errors happen quite frequently in the process. Assuming that things will change when we make the leap from SD to HD is a big leap of faith that has yet to be demonstrated as true.

Now, here's the crucial difference between 1080i and 1080p, as it relates to movies. With 1080i transmission, the player interlaces the frames during the pulldown and sends the interlaced frames to the TV set to be deinterlaced. With 1080p transmission, the player never interlaces the frames. Regardless, you will get the exact same result. The only exception is if you have a crap TV that doesn't deinterlace properly, but chances are that TV won't support 1080p anyway.

And if your set can accept 24p and then double/triple it to 48 or 72Hz, then it's a bunch of unnecessary steps that result in judder.

Judder is terrible. Have you ever watched pans and seen jerky motion on the screen? This is especially true when credits roll, but any type of pan that's faster than a slow pan will exhibit judder.

This is a direct artifact of doing a non-integer conversion between 24fps and Xfps.

Outputting 24fps and allowing the display to "deal with it" allows a display capable of handling 2x or 3x to be judder-free.

24p --> i60 (not 30p) --> 60p == Judder

24p --> 48p or 72p != Judder

I know which one I want.
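The arithmetic behind that comparison, sketched out (illustration only): 60 is not an integer multiple of 24, while 48 and 72 are, so the repeat pattern each film frame gets is uneven in one case and even in the others.

```python
# Illustrative arithmetic only: how many display refreshes each 24fps film
# frame gets. An uneven repeat pattern is what you perceive as judder on pans;
# an even pattern (2,2,... at 48Hz or 3,3,... at 72Hz) is not.

def cadence(film_fps, display_hz, frames=6):
    """Repeat count per film frame when mapping film_fps onto display_hz."""
    counts, shown = [], 0
    for i in range(1, frames + 1):
        target = (i * display_hz) // film_fps  # refreshes elapsed by frame i
        counts.append(target - shown)
        shown = target
    return counts

print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven cadence: judder
print(cadence(24, 72))  # [3, 3, 3, 3, 3, 3] -> even cadence: no judder
print(cadence(24, 48))  # [2, 2, 2, 2, 2, 2] -> even cadence: no judder
```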

So 1080p doesn't matter for movies.

Hahahahaha. Ignorance == bliss.


Television

Television is a little different. Television is typically not shot on film, it's shot on video which is a vastly different technique.

This is already wrong. Every weekly episodic television show (i.e., your sitcoms and your dramas) is shot on film. Mostly 16mm, though I think the best-looking primetime dramas are shot on 35mm film. Reality shows are generally shot on video, as are live-action sports/news/talk shows that are presented in high definition.

So more than half of your primetime lineup is film-based. Already, this is more than half wrong, but let's continue anyway.


While movies are almost always shot at 24fps, standard-def NTSC TV is shot at 30fps interlaced, and HDTV is shot at whatever the production company decides, usually 1080i at 30fps, or 720p at 60fps, depending on the network.

Wrong, wrong, wrong, wrong, wrong. Did I mention this is wrong? What's wrong about it?

HDTV is shot at whatever the production company decides, usually 1080i at 30fps

When we're talking about shooting interlaced video, there is no such thing as a real frame. Each field is a discrete point in time. You cannot reassemble a "frame" that never existed by simply taking the odd/even field pairs and saying "Voilà, we have a frame." This is, by the way, called weave deinterlacing. It looks like crap on native interlaced video material because each field is offset in time by 1/60th of a second. It's full of what is referred to as jaggies -- rough diagonal edges from the misalignment of the disjoint fields.

You need something that has both motion-adaptive and directionally correlated deinterlacing to smooth out the jaggies. It ain't a piece of cake; it's actually the most compute-intensive task we require of our systems.
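If you want to see why weave falls apart on native interlaced material, here's a toy Python sketch of the jaggies problem (made-up 8x12 "images", nothing like a real deinterlacer):

```python
# Toy illustration (nothing like a real deinterlacer): weave two fields that
# were captured 1/60s apart. An edge that moved between the fields comes out
# ragged -- the "jaggies" described above.
import numpy as np

h, w = 8, 12
scene_t0 = np.zeros((h, w), dtype=int)
scene_t1 = np.zeros((h, w), dtype=int)
scene_t0[:, 4:] = 1          # vertical edge at column 4 at time t0
scene_t1[:, 6:] = 1          # edge has moved to column 6 by t1 (1/60s later)

odd_field = scene_t0[0::2]   # field 1: odd lines, sampled at t0
even_field = scene_t1[1::2]  # field 2: even lines, sampled at t1

weaved = np.empty((h, w), dtype=int)
weaved[0::2] = odd_field     # weave: interleave the two fields as-is
weaved[1::2] = even_field

print(weaved)                # alternating rows disagree about where the edge is
```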

This, right here:

What, no 1080p? Nope. Why? Bandwidth.

The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade.

So 1080p doesn't matter for television.

Wrong, wrong, wrong, wrong, wrong again.

Guess which takes up more bandwidth for video-based sources: 1080i60 or 1080p30?

I'll give you a hint, it's 1080i60.

So much for the bandwidth argument.

For film-based sources, guess which takes up more bandwidth:
1080i60 or 1080p24 (which the display would then telecine)?

Once again our "winner" is 1080i60.
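To put rough numbers on it: 1080i60 delivers the same raw pixel rate as 1080p30 and more than 1080p24, and on top of that, interlaced material compresses less efficiently than progressive material. A quick sketch of the raw rates:

```python
# Raw pixels per second, before compression -- only a rough proxy for encoded
# bandwidth, since interlaced material also compresses less efficiently than
# progressive material.
formats = {
    "1080i60": 1920 * 540 * 60,   # 60 fields/s, each 1920x540
    "1080p30": 1920 * 1080 * 30,  # 30 full frames/s
    "1080p24": 1920 * 1080 * 24,  # 24 full frames/s (film cadence)
}
for name, pixels_per_second in formats.items():
    print(f"{name}: {pixels_per_second / 1e6:.1f} Mpixels/s")
# 1080i60: 62.2, 1080p30: 62.2, 1080p24: 49.8
```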

I like these articles where someone tries to make a point but doesn't have a grasp of the technical fundamentals.
 
dude2 said:
If I understood the explanation of 1080p on the Yahoo site, 1080p will display 60 frames per second and 1080i will display 30 frames per second.
The frames in 1080p will be redundant. In other words, two full frames made up of the even and odd scans, which = 2 identical frames, where 1080i will trick the eye into seeing 1 full frame per second.
Two identical frames will look the same as one full frame, so 1080p is useless unless you think two identical frames will make it look better.
The author says he can't see any difference.
Now this happens before the 3:2 pulldown to 24 frames per second.

Yes/no/it depends.

It's not quite as simple as that.

We are going to 1080p displays, and the best signal to feed them is... wait for it....



1080p.

Regards,
 
