ALL VOOM TO GO MPEG4 AUGUST 15th

The vast majority of digital video cameras in use today have 1440x1080 CCDs, right? It will be a while (read: a few years) before we see video camera content that actually delivers 1920x1080 in any significant quantity.

Then it becomes a question of whether you spend the bandwidth to do the unsqueeze ahead of transmission or have the receiver do the unsqueeze.

This doesn't hold true for film originated material or digital movie cameras.

I'm not against full resolution but if the source isn't full resolution it's really an argument over who's going to do the anamorphic processing.
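The trade-off above is just pixel-aspect-ratio arithmetic. A minimal Python sketch (the figures are the commonly cited 1440/1920 rasters, not anyone's measured numbers):

```python
# 1440x1080 storage presented as a 16:9 picture: each stored pixel must be
# 4/3 as wide as it is tall, whether the stretch happens before transmission
# or in the receiver.
stored_w, stored_h = 1440, 1080
display_w, display_h = 1920, 1080

# pixel aspect ratio needed so 1440 stored columns cover 1920 display columns
par = (display_w / display_h) / (stored_w / stored_h)
print(par)                      # 4/3, i.e. about 1.333

# the "unsqueeze" is simply a horizontal scale by that factor
print(round(stored_w * par))    # 1920
```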

Cheers,

Actually, the only pro format that outputs 1440 is HDCAM. The HDCAM camera actually produces a 1920x1080i image, but HDCAM tape decks record 1440 due to their compression scheme. It is still in use, but being phased out. All sporting events and shows are done in 1920x1080i and always have been. For example, La Liga on World Sport is produced in 1920x1080i. OB trucks all use either Ikegami or Thomson LDK 6000 cameras, which are 1920x1080i. Newer trucks use Sony 1920x1080p cameras.
 
No, as John said in the post above, most Voom content was created in 1440x1080, not 1920x1080. No point in making the channels 1920 if there's only 1440 of data, unless you think your TV would do a bad job of scaling 1440 to 1920.

Just because many of the shows for Voom were created using HDCAM does not mean the channels are not 1920x1080i. Voom can actually be credited with inventing HD lite, as before them all channels were 1920x1080i. They started using 1440 to save bandwidth, not because their channels were produced that way. As a matter of fact, all the channels on Voom were 1440x1080i, and that includes HBO, Cinemax, and Starz. I have no problem with some channels originating at 1440x1080i. My problem is with providers throwing away resolution. If a channel is delivered at 1920x1080, then it should be passed along that way. That is all.
 
The vast majority of digital video cameras in use today have 1440x1080 CCDs, right? It will be a while (read: a few years) before we see video camera content that actually delivers 1920x1080 in any significant quantity.

Then it becomes a question of whether you spend the bandwidth to do the unsqueeze ahead of transmission or have the receiver do the unsqueeze.

This doesn't hold true for film originated material or digital movie cameras.

I'm not against full resolution but if the source isn't full resolution it's really an argument over who's going to do the anamorphic processing.

Cheers,

There is no anamorphic process in HD. HD uses square pixels. HDCAM is 1440x1080 square pixel. No HD equipment supports anamorphic squeeze and that includes Blu-ray and HD DVD. HD lite process does not create anamorphic squeeze, it just throws away pixels.
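To make "just throws away pixels" concrete, here is a naive Python sketch of a 1920-to-1440 horizontal decimation (a real downscaler would low-pass filter first; this is only the crude idea):

```python
# Naive "throw away pixels" 1920 -> 1440 decimation: keep 3 of every 4
# columns of a scanline. Real scalers filter before resampling; this
# only illustrates where the lost resolution goes.
def decimate_row(row):
    return [px for i, px in enumerate(row) if i % 4 != 3]

row_1920 = list(range(1920))     # stand-in for one scanline of samples
row_1440 = decimate_row(row_1920)
print(len(row_1440))             # 1440
```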
 
My problem is with providers throwing away resolution. If a channel is delivered at 1920x1080, then it should be passed along that way. That is all.
This is simply knowing just enough about technology to get the wrong idea.
MPEG2 and MPEG4 automatically throw away resolution in the compression process.
If you have a still picture in .TIF format from your camera, you might be seeing 1920x1080. On your TV, you never see it. Digital video (like 35mm movies) is an illusion giving you the impression that you are watching a realistic scene.
Just watch the channel. If it looks good, it is good PQ. If it looks bad, it is bad PQ.
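For readers who want to see how a transform codec "throws away resolution," here is a toy Python sketch, not a real MPEG encoder: it transforms one 8-sample row with a DCT, zeroes the high-frequency coefficients the way coarse quantization effectively does, and reconstructs an approximation:

```python
import math

# Toy illustration only: 8-point DCT-II and its inverse, done by brute force.
def dct(block):
    n = len(block)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(block)) for k in range(n)]

def idct(coeffs):
    n = len(coeffs)
    return [(coeffs[0] / 2 +
             sum(coeffs[k] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                 for k in range(1, n))) * 2 / n for i in range(n)]

row = [52, 55, 61, 66, 70, 61, 64, 73]   # one row of luma samples (made up)
coeffs = dct(row)
coeffs[4:] = [0.0] * 4                   # discard the 4 highest frequencies
approx = idct(coeffs)
# the reconstruction is close to, but not identical to, the original row
print([round(v) for v in approx])
```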
 
This is simply knowing just enough about technology to get the wrong idea.
MPEG2 and MPEG4 automatically throw away resolution in the compression process.
If you have a still picture in .TIF format from your camera, you might be seeing 1920x1080. On your TV, you never see it. Digital video (like 35mm movies) is an illusion giving you the impression that you are watching a realistic scene.
Just watch the channel. If it looks good, it is good PQ. If it looks bad, it is bad PQ.


I know enough about the technology. Don't you worry about that. MPEG is lossy compression, and I know that very well. However, when you throw away resolution before compression, you lose even more. That is the whole point.
To me the picture on E* looks fine.
 
Actually, the only pro format that outputs 1440 is HDCAM. The HDCAM camera actually produces a 1920x1080i image, but HDCAM tape decks record 1440 due to their compression scheme. It is still in use, but being phased out. .... Newer trucks use Sony 1920x1080p cameras.

So maybe I can buy one of these old HDCAMs for a song? :eureka

So rather than say "anamorphic process" maybe we should say "dithering?"
 
There is no anamorphic process in HD. HD uses square pixels. HDCAM is 1440x1080 square pixel. No HD equipment supports anamorphic squeeze and that includes Blu-ray and HD DVD. HD lite process does not create anamorphic squeeze, it just throws away pixels.

Um, by definition HDCAM is an anamorphic format. 1440x1080 is a 4:3 aspect ratio if you're using square pixels.
 
navychop, I suppose that depends on how well you can sing. :)


What I find so amazing is how so many smart people can't comprehend that we had quite a bit of HD production being done with these lower-cost HDCAMs in the beginning, as well as today. IMO, upsampling 1440 to 1920 is not replacing what you call HD lite. It doesn't give you any better picture, not that you could see it anyway if all you are using is a digital monitor that is native 1280. It gives you the ability to mesh the content with other 1920 frames. But 1440 upsampled to 1920 will look noticeably softer.

A few years ago, before "HD lite" became the buzzword of lay people wishing to sound like they knew something, we had a more interesting debate that also could not be settled with any factual evidence: which was better, 720p x 1280 or 1080i x 1920? If one studies the history, 1080i was listed because the interlaced version of HD was intended for interlaced-scan monitors, that is, CRTs. As we progressed (pun intended), 720p became the buzz, since it looks so much smoother on a progressive monitor. Now we have the new kid on the block, 1080p, which looks fabulous on a true 1080p monitor, but source material for it is only available on disc, not broadcast.

So what will the next debate be: anything less than 1080p x 1920 is HD lite? I see that coming.
 
Figaro, Figaro, F-I-G-A-R-O!
(I'm doomed).

Naw, anything less than 7,680 x 4,320 will be "Ultra HD Lite." But the good news is, there'll only be a single format: HVD! ;)
 
Um, by definition HDCAM is an anamorphic format. 1440x1080 is a 4:3 aspect ratio if you're using square pixels.


Oh gee!

HDCAM is a tape recording format that records 1080i x 1440 with non-square pixels. The camera head is 1080i x 1920. For playback, the content is upsampled to 1920 again. The argument is that the HDCAM format does not maintain true 1920 resolution through the production process. In effect, what you are seeing IS 1440 maximum, as the rest was discarded for recording.

But if numbers make the HD lite 1440 crowd happy, we should print the specs that way and end the argument. Me, I know better and prefer to just watch anything beyond 1280 on my present monitor, and I'm happy. Because I no longer use a CRT display, I convert everything to 720p x 1280, because that is the native resolution of my monitor.
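The playback-side upsample described above is a 3-to-4 horizontal resample. A minimal linear-interpolation sketch in Python (actual decks use multi-tap polyphase filters, so this is only illustrative):

```python
# Minimal 1440 -> 1920 horizontal upsample of one scanline using linear
# interpolation. Hardware uses better filters; this just shows the idea
# that no new detail is created, only interpolated.
def upsample_row(row, out_w):
    in_w = len(row)
    out = []
    for x in range(out_w):
        pos = x * (in_w - 1) / (out_w - 1)   # map output column to input
        i = int(pos)
        frac = pos - i
        nxt = row[min(i + 1, in_w - 1)]
        out.append(row[i] * (1 - frac) + nxt * frac)
    return out

row_1440 = [float(v % 256) for v in range(1440)]  # stand-in luma scanline
row_1920 = upsample_row(row_1440, 1920)
print(len(row_1920))                              # 1920
```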
 
I know enough about the technology. Don't you worry about that. Mpeg is lossy compression and I know that very well. However when you throw away resolution before compression you lose even more. That is the whole point.
To me the picture on E* looks fine.
No, when you throw away resolution before compression, you lose less.
And that is why the picture looks fine.
One of the methods used by MPEG2 and MPEG4 compression is to throw out resolution. If you lower the resolution first, that is an aid to a real-time compression system. And by lowering the resolution, you are lowering the bandwidth input to the system, which therefore has to compress the stream less to hit its target bitrate.
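A back-of-envelope calculation shows why a 1440-wide input helps the encoder. The assumptions here (8-bit 4:2:0 sampling, 30 fps) are illustrative, not any broadcaster's actual numbers:

```python
# Uncompressed bitrate feeding the encoder: 8 bits of luma plus 4 bits of
# chroma per pixel in 4:2:0, at an assumed 30 frames per second.
def raw_mbps(w, h, fps=30, bits_per_pixel=12):
    return w * h * fps * bits_per_pixel / 1e6

full = raw_mbps(1920, 1080)
lite = raw_mbps(1440, 1080)
print(round(full), round(lite), round(lite / full, 2))  # 746 560 0.75
# 1440-wide input is 75% of the pixels, so the encoder has 25% less data
# to squeeze into the same target bitrate.
```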
 
According to VOOM, nearly all of their HD content is stored in 1920x1080i; a high-ranking VOOM exec once stated this on a Charlie Chat. Much of it is converted from film to 1920x1080i. I'm sure some of it was captured on 1440x1080 cameras, but if it was upconverted to 1920x1080 for storage, then downconverting it back to 1440x1080 will not restore the pristine original image. The downconversion process will further degrade the quality of the image.

Thus it is my belief that nearly all of VOOM's HD content would look better if it were broadcast in full 1920x1080 (given sufficient bandwidth). This was further confirmed back in the days when a few of the channels were transmitted in 1920x1080, and they looked great.
 
All Voom MPEG-4? Not.

As of August 19, I still have 5 of the 10 Voom channels that my model 811 receiver was getting. I still get MonstersHD, FilmFestHD, AnimaniaHD, EquatorHD and RaveHD.

Anyone else with a non-ViP series receiver still see these?
 