Interesting discussion on the 1080i vs. 720p battle

Walter L.

Supporting Founder
Original poster
Mar 29, 2004
Portland, Oregon
There is an interesting discussion on the 1080i vs. 720p battle going on at AVSForum. Rich Peterson has posted an explanation by Mark Schubin of why 1080i networks (e.g., CBS) seem to be winning the PQ battle against 720p networks (e.g., ABC, Fox), even when showing high-speed sports. Basically, the explanation says that the advantage is not in the format itself, but in what kind of cameras are available for a given format. Anyway, here it is:

There are many factors that affect picture quality aside from format.
These include lens design, lens condition, lens mounting, camera type,
camera design, camera condition, and camera setup. Those can have huge
effects on picture quality. But I'll concentrate on image format.

An interlaced format has a number of drawbacks relative to a progressive
format. Regardless of image, there is a "pi" effect when scanning lines
are visible that can draw attention, but that's unlikely to be an issue
on an HDTV consumer display viewed at normal distances.

There is also a reduction in vertical resolution, referred to as the
"interlace coefficient," which researchers have placed anywhere from 0.5
(half the resolution) to 1.0 (no reduction). In NHK's early HDTV
testing, they found 731 (total) lines progressive to be visually
equivalent to 1125 (total) lines interlaced for still images, but that
research was conducted in an early era of tube cameras.
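
That NHK equivalence implies a coefficient of roughly 0.65. Here is a quick back-of-the-envelope sketch of the arithmetic (just the two line counts quoted above, nothing more):

```python
# Implied "interlace coefficient" from NHK's still-image equivalence:
# 731 total lines progressive looked about as good as 1125 total lines interlaced.
progressive_lines = 731
interlaced_lines = 1125
coefficient = progressive_lines / interlaced_lines
print(f"implied interlace coefficient: {coefficient:.2f}")  # ~0.65
```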

Another deficiency that can appear in interlace even in still images is
called interline flicker or twitter. If a horizontal line in an image
appears in only one scanning line, it will appear only 29.97 times per
second instead of 59.94. The smaller number is below the human vision
flicker threshold for most viewing conditions. The line, therefore,
will appear to flicker. That's a problem for graphics, but most cameras
are normally set up to do line averaging (two rows on the image sensor
are added to create a scanning line; in the next interlaced field, each
of those rows has a different partner). That reduces the effect.
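
Here is a minimal sketch of that line-averaging idea, using a made-up 8-row sensor column (the sizes are invented purely for illustration):

```python
import numpy as np

# Toy sensor column: a detail that lands on a single sensor row.
frame = np.zeros(8)
frame[3] = 1.0

# Line averaging: each scanning line averages two adjacent sensor rows, and the
# pairing shifts by one row for the other field.
field_a = (frame[0::2] + frame[1::2]) / 2     # row pairs (0,1), (2,3), (4,5), (6,7)
field_b = (frame[1:-1:2] + frame[2::2]) / 2   # row pairs (1,2), (3,4), (5,6)

print(field_a)  # the detail appears at half amplitude here...
print(field_b)  # ...and at half amplitude here, so it no longer blinks on and off
```

Without the averaging, one field would never see the one-row detail at all and the other would see it at full strength, which is exactly the 29.97-per-second on/off twitter described above.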

The other deficiencies of interlace all relate to moving images. When
vertical motion is at a rate that is a multiple of one scanning line per
field, vertical resolution is halved because what was in a scanning line
in one interlaced field appears in a different scanning line in the next
interlaced field, and the adjacent interlaced scanning line never sees
the additional detail in the source. That's a serious issue in graphics
(consider credits at the end of a show), but, thanks to gravity, it's
less of an issue in football.

In horizontal motion, the fields get separated. That's a problem in
signal compression and processing, which is why progressive scanning has
been said to have a compression-efficiency advantage over interlace
roughly equivalent to the interlace coefficient. It's obviously a big
problem in freeze-frames. It can be a problem for displays that show all
lines at once, such as LCD, DLP, and some plasma displays. But it is
not a problem for normally interlaced displays like CRTs (direct-view or
projection) other than any interlace-coefficient losses.
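
To picture that field separation, here is a toy weave of two fields of a horizontally moving edge (the picture size and the 4-pixel motion are invented purely for illustration):

```python
import numpy as np

width = 12
field_even = np.zeros((2, width), dtype=int)
field_even[:, 4] = 1                      # edge at x=4 when the first field is sampled
field_odd = np.zeros((2, width), dtype=int)
field_odd[:, 8] = 1                       # edge at x=8 one field (~1/60 s) later

frame = np.empty((4, width), dtype=int)
frame[0::2] = field_even                  # lines 0 and 2 from the earlier moment
frame[1::2] = field_odd                   # lines 1 and 3 from the later moment
print(frame)                              # alternate lines disagree: the "comb" effect
```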

So, all else being equal, progressive should look better than interlace.
But all else is absolutely NOT equal. Even ignoring lens, camera,
maintenance, and setup issues, interlace has one very significant
advantage over progressive. It has half the information rate.

A 1280 x 720 progressive camera has a little under a million pixels per
frame. At 59.94 frames per second, it approaches sixty million pixels
per second. An interlaced camera has only 29.97 frames per second, so
it can use roughly twice as many pixels per frame and achieve the same
number of pixels per second. 1920 x 1080 is roughly two million pixels
per frame.
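
The raw pixel-rate arithmetic behind that, for anyone who wants to check the numbers:

```python
pixels_per_sec_720p = 1280 * 720 * 59.94     # every line, ~60 times a second
pixels_per_sec_1080i = 1920 * 1080 * 29.97   # a full frame only ~30 times a second

print(f"720p59.94:  {pixels_per_sec_720p / 1e6:.1f} million pixels/s")   # ~55.2
print(f"1080i29.97: {pixels_per_sec_1080i / 1e6:.1f} million pixels/s")  # ~62.1
```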

If we assume NHK's research still holds true today, then the 720 lines
of a progressive camera will actually provide slightly better vertical
resolution than the 1080 of an interlaced camera. But there's no
question that the 1920 pixels per line of the interlaced camera are far
more than the 1280 of a progressive camera.
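
Putting rough numbers on that comparison (taking the ~0.65 coefficient implied by the NHK figures at face value, which is an assumption, not a measurement):

```python
coefficient = 731 / 1125                      # implied interlace coefficient, ~0.65
effective_vertical_1080i = 1080 * coefficient
print(f"1080i: ~{effective_vertical_1080i:.0f} effective lines vs. 720 progressive lines")
print("but horizontally: 1920 vs. 1280 pixels per line")
```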

That's a limiting-detail discussion. There's also sharpness. The
psychovisual sensation of sharpness is proportional to the square of the
area under a curve plotting contrast ratio against detail fineness. All
such curves (normally called "modulation-transfer function" or MTF
curves) have a shape somewhat like the right side of a bell-shaped
curve, i.e., high at the left, sloping down slightly on a "shoulder,"
dropping faster after the shoulder, and then flaring out at the bottom
in a "toe." The shoulder area is what is most significant for
sharpness. If the shoulder can be made higher and broader, sharpness
increases even when images are viewed after recording on an analog VHS
cassette. The toe area, being low in contrast ratio, is relatively
insignificant, which is how Sony got away with dropping all resolution
over 1440 pixels per line in the professional HDCAM format (JVC and
Panasonic do something similar in D9 HD and DVCPRO HD, respectively).
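
To see why the shoulder dominates, here is a toy comparison of two invented MTF curves. The curve shapes and numbers are made up; only the "sharpness goes as the square of the area" rule comes from the text above:

```python
import numpy as np

f = np.linspace(0.0, 1.0, 200)                  # detail fineness, normalized
mtf_broad_shoulder = np.exp(-(f / 0.55) ** 2)   # e.g., an oversampled camera
mtf_narrow_shoulder = np.exp(-(f / 0.40) ** 2)  # e.g., a camera with just-enough pixels

sharpness_broad = np.trapz(mtf_broad_shoulder, f) ** 2
sharpness_narrow = np.trapz(mtf_narrow_shoulder, f) ** 2
print(f"relative sharpness: {sharpness_broad / sharpness_narrow:.2f}x")  # ~1.9x in this toy case
```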

It has LONG been known that more pixels in the camera make a broader
shoulder. Ordinary high-end standard-definition cameras intended for
use in analog broadcasts (which, in the U.S., cannot carry more than
about 440 pixels per line) have typically had about 1300 pixels per line
for the purpose of raising the shoulder of the MTF curve. It works.
That's why the pictures from those cameras look better than pictures
from older cameras, even when viewed off VHS cassettes recorded off
analog broadcasts.

1080-line HDTV cameras typically have 1920 x 1080 sensors. The pictures
would look better if they had, say, 4000 x 1080, but the technology
hasn't really been available to do that economically yet.
Unfortunately, most 720-line HDTV cameras typically have 1280 x 720
sensors. 1280 is fewer pixels per line than in even some high-end SDTV
cameras. It makes for a shortened, lowered shoulder and, therefore,
significantly less sharpness than in a typical 1080-line camera. The
720-line format does not preclude more pixels per line in the camera; it
just hasn't been done until very recently.

Finally, let me discuss format conversion or scaling. The ABC network
distributes 720p signals, but not all ABC affiliates broadcast it. WFAA
in Dallas, for example, uses 1080i in-house. So whatever ABC
distributes goes through a format-conversion stage that is likely to
reduce image quality prior to transmission. I don't know what Comcast's
Dallas-area cable systems are doing (I suspect passing whatever the
broadcast is), but, back when they were AT&T's, an executive of the
company told Congress they would convert any 1080i to 720p for the
aforementioned compression efficiency. Add, say, a Pioneer 720p plasma
panel and an early Pioneer set-top ATSC receiver that could only emit
1080i, and you could come up with this bizarre format-conversion scenario:

- ABC decides to show a clip from CBS in a "Monday Night Football" show
and converts it from 1080i to 720p.
- WFAA in Dallas converts ABC's 720p to 1080i.
- Hypothetically, a Dallas-area cable operator converts WFAA's 1080i to
720p.
- The Pioneer set-top box (back in the days when most cable operators
used 8-VSB for DTT retransmission) converts the 720p to 1080i.
- And the Pioneer plasma display converts the 1080i back to 720p.

That's FIVE passes through format converters, regardless of lens,
camera, maintenance, setup, and production issues.
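
As a toy illustration of how those passes eat into fine vertical detail, here is a crude simulation that models each stage as nothing more than a linear vertical rescale (real converters also de-interlace and re-interlace, which costs more; every number here is invented for illustration):

```python
import numpy as np

def rescale(signal, new_len):
    """Resample a column of line values to a new line count (linear interpolation)."""
    old_x = np.linspace(0.0, 1.0, len(signal))
    new_x = np.linspace(0.0, 1.0, new_len)
    return np.interp(new_x, old_x, signal)

lines = np.linspace(0.0, 1.0, 1080)
source = np.sin(2 * np.pi * 300 * lines)      # fine vertical detail, full contrast

signal = source.copy()
for stage in (720, 1080, 720, 1080, 720):     # ABC -> WFAA -> cable -> STB -> display
    signal = rescale(signal, stage)

print(f"fine-detail contrast retained: {signal.std() / source.std():.0%}")
```

Even that kindest-possible model loses fine detail on every pass.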

So, to sum up, the only advantage 1080i, as a format, has over 720p is
1920 pixels per line, but those above 720p's 1280 fall in the MTF toe
and are not, therefore, very significant. Unfortunately, 1080i CAMERAS
have 1920 pixels per line and most 720p cameras do not (although they
COULD). That affects the shoulder of the MTF curve and gives 1080i a
big advantage in sharpness.

Many in the industry eagerly await Sony's 1080p camera, which should
make lovely 720p pictures.
 
I'd also argue that most people are not watching on a device capable of displaying native 720p, and a lot of the problems they have with the image are actually introduced by the scaling their box/display does to get it to 1080i.
 
My pj is native 720p, and the 720p football games are not as nice as the 1080i productions. On closeups there is no difference; both are very vibrant, crisp, and clear. The huge difference is on wide field shots: the grass "dances," spots between players appear out of focus, and there are "squigglies," kind of like mosquito noise, around objects on the field. I never see any of these artifacts from a CBS or HDNET sports production. BTW, it doesn't seem to matter what resolution I put my stb in either.
 
