Thought this was a good read, since there has been debate on this topic in the past.
http://www.broadbandreports.com/shownews/77475
An interesting read over at Gamespot dissecting why the 1080p high-definition standard doesn't really matter. The author breaks down why the emerging standard isn't particularly important as it pertains to TV, gaming, and film. As you might imagine, limited bandwidth (at least via existing compression schemes) is a main reason you won't see 1080p content via dish or cable any time soon:
"The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade."
IPTV over VDSL outfits like AT&T, limited to around 24Mbps for the full pipe at launch, are even less likely to embrace 1080p content. It's a good read that should make people who've dropped big cash on a non-1080p display feel much better.
Here's the read from GameSpot. As we already know, IPTV is having bandwidth issues, and it's not great to have everything going through your net connection. Anyway, the original source article is below:
http://www.gamespot.com/pages/forums/show_msgs.php?topic_id=24908759
Pay attention, class. There will be a test at the end of this class.
1080p does not matter. Here's why:
There are a number of facts that must be grasped first:
All digital displays are progressive scan by nature.
Virtually all film releases are shot at 24 frames per second and are progressive scan.
1080i delivers 30 full frames per second (as 60 interlaced fields), while 1080p can deliver up to 60 full frames per second.
All HDTV broadcasts and virtually all games will be limited to 720p or 1080i for the foreseeable future.
Got all that? Good. Now let's go into the explanation.
Movies
Take a movie. It's 24 frames per second, progressive scan. This is the nature of how movies are shot on film today. Just about all movies are shot this way; the only exceptions are films where the director or producer wants to make an artistic statement. But if you saw it at your local multiplex, it's in 24fps progressive.
Now, let's put it onto a disc so we can sell it. First, we scan each individual frame of the movie, one by one, at a super high resolution (far higher than even 1080p). This gives us a digital negative of the film, from which every digital version of the film will be made (this means the HD, DVD, on-demand, PPV, digital download, digital cable and PSP versions are all made from this one digital negative). We'll only concern ourselves with the HD version for now.
Because it's HD, we'll take the digital negative and re-encode it in MPEG-2, H.264, or VC-1 at 1920x1080 and 24 frames per second to match the source material. And this is how it is on the disc when you get it from the store, whether it's Blu-ray or HD-DVD.
Once you put it in your disc player to view the film, a number of things happen.
1080i/1080p
Because the film is in 24fps, and 1080i is 30fps, every second the player has to come up with 6 additional frames to make up the gap. It does this through a process called 3:2 pulldown, whereby 4 film frames (1/6th of a second of the film) are processed to create 5 video frames (1/6th of a second on your TV screen). Exactly how this is done is outside the scope of this post, but the important thing to realize is that none of the picture data is lost during this process; it's just repackaged.
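For the curious, the 3:2 cadence can be sketched in a few lines of Python. The frame labels and the (top field, bottom field) tuple representation are purely illustrative, not a real video API, but they show how 4 film frames become 5 video frames with nothing thrown away:

```python
def three_two_pulldown(film_frames):
    """Repackage groups of 4 progressive film frames (A, B, C, D) into
    5 interlaced video frames using the classic 3:2 cadence (each film
    frame contributes 2, 3, 2, 3 fields in turn). Each video frame is a
    (top_field, bottom_field) pair; data is rearranged, never discarded."""
    video = []
    for i in range(0, len(film_frames), 4):
        a, b, c, d = film_frames[i:i + 4]
        video += [
            (a, a),  # frame 1: top A, bottom A
            (b, b),  # frame 2: top B, bottom B
            (b, c),  # frame 3: top B (repeated), bottom C
            (c, d),  # frame 4: top C, bottom D
            (d, d),  # frame 5: top D (repeated), bottom D
        ]
    return video

# 24 film frames in -> 30 video frames out, closing the 24fps/30fps gap
print(len(three_two_pulldown([str(n) for n in range(24)])))  # -> 30
```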
Now, here's the crucial difference between 1080i and 1080p as it relates to movies. With 1080i transmission, the player interlaces the frames during the pulldown and sends the interlaced frames to the TV set to be deinterlaced. With 1080p transmission, the player never interlaces the frames. Either way, you get the exact same result. The only exception is if you have a crap TV that doesn't deinterlace properly, but chances are that TV won't support 1080p anyway.
So 1080p doesn't matter for movies.
Television
Television is a little different. Television is typically not shot on film; it's shot on video, which is a vastly different technique. While movies are almost always shot at 24fps, standard-def NTSC TV is shot at 30fps interlaced, and HDTV is shot at whatever the production company decides, usually 1080i at 30fps or 720p at 60fps, depending on the network. What, no 1080p? Nope. Why? Bandwidth.
The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade.
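The arithmetic behind that claim is easy to check. Using the article's own figures:

```python
# Back-of-the-envelope ATSC budget check, using the bitrates quoted above.
channel_budget_mbps = 19.4                 # total payload per ATSC broadcast channel
mpeg2_1080i_mbps = 12.0                    # decent-quality MPEG-2 1080i stream
mpeg2_1080p_mbps = 2 * mpeg2_1080i_mbps    # twice the pixels, roughly twice the bits

print(mpeg2_1080i_mbps <= channel_budget_mbps)  # -> True: 1080i fits, with ~7Mbps left over
print(mpeg2_1080p_mbps <= channel_budget_mbps)  # -> False: 1080p alone blows the budget
```

A 24Mbps MPEG-2 1080p stream over a 19.4Mbps channel simply doesn't fit, no matter how you slice up the subchannels.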
So 1080p doesn't matter for television.
Games
Ah, now we come to the heart of the matter. Games. The reason why there will be very few 1080p games is a simple one: lack of memory. All graphics cards, including those found in Xbox 360 and PS3, have what's known as a frame-buffer. This is a chunk of memory set aside to store the color information of every pixel that makes up a frame that will be sent to the screen. Every single calculation the graphics card makes is designed to figure out how to fill up the frame-buffer so it can send the contents of the frame-buffer to the screen.
Time to break out the calculators, because we're doing some math.
A 720p frame is 1280 pixels wide by 720 pixels high. That means one 720p frame contains 921,600 pixels. Today's graphics cards use 32-bit color for the final frame. This means each pixel requires 32 bits - 4 bytes - to represent its color information. 921,600x4 = 3,686,400 bytes or a little over 3.5MB.
A 1080i frame (a single interlaced field, really) is 1920 pixels wide by 540 high. That's 1,036,800 pixels, 4,147,200 bytes or a little less than 4MB.
Now, a 1080p frame. 1920 wide by 1080 high. 2,073,600 pixels, 8,294,400 bytes, a smidgen less than 8MB.
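Those three calculations can be reproduced in a few lines; the helper name here is just for illustration:

```python
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Raw size of one frame buffer at 32-bit color (4 bytes per pixel)."""
    return width * height * bytes_per_pixel

# The article's three figures, recomputed:
sizes = {
    "720p":  framebuffer_bytes(1280, 720),    # 3,686,400 bytes, ~3.5MB
    "1080i": framebuffer_bytes(1920, 540),    # 4,147,200 bytes, ~4MB (one field)
    "1080p": framebuffer_bytes(1920, 1080),   # 8,294,400 bytes, ~8MB
}
for name, size in sizes.items():
    print(f"{name}: {size:,} bytes ({size / 2**20:.1f} MiB)")
```

Note that 1080p costs more than double a 720p buffer, and that cost is paid again for every render pass that needs its own buffer.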
Ooh, but the 360 has 512MB, and the PS3 has 256MB for graphics. How is 8MB going to hurt? Oh, it hurts. Graphics cards keep several internal frame-buffers to handle different rendering passes, and each one requires memory. And the textures and mapping surfaces all have to fit within that same memory space. In the case of the 360, there's also audio and game data fighting for the same space (though the "space" is twice as big on Xbox 360). That's why GTHD looked like crap: in order to get it running in 1080p, they sacrificed most of the rendering passes and other effects.
This is why the vast, vast majority of Xbox 360 and PS3 next-gen games will stick to 1080i or 720p.
So 1080p doesn't matter for games.
In conclusion, 1080p does not matter. Period. If you think it does, you're just buying into Sony's marketing hype.
Class dismissed.