Don,

What, in layman's terms, would be a good way of understanding a digital display device's resolution capability?
 
Actually, with digital it is easy... pixels of native resolution.

When selecting a monitor, the question to ask is: what is the monitor's native resolution? All digital monitors are rated this way and should be operated this way for best results. This doesn't mean that a monitor receiving a different resolution won't display it; it just means that a signal at anything other than the native resolution will require conversion or discarding of data, and that may not give the best-looking image on the screen.

A monitor that is specified as 720x1280 has a native resolution of 720 pixels high by 1280 wide. Send it a signal that is 720x1281 and one pixel gets discarded. Send it a signal that is 720x1279 and the image will be distributed over the 1280 by whatever method the monitor uses to rescale; it could be a black bar, or it could be replication of one or more pixels, depending on the design.
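To make that concrete, here is a minimal Python sketch of one possible scaler. Nearest-neighbor sampling is an assumption on my part; an actual monitor may letterbox or use smarter filtering, so this only illustrates the principle of discarding versus replicating pixels:

```python
# Minimal sketch: map one scan line of arbitrary width onto a fixed
# 1280-pixel native panel using nearest-neighbor sampling (an assumed
# method; real scalers may letterbox or filter instead).

NATIVE_WIDTH = 1280

def map_to_native(line, native_width=NATIVE_WIDTH):
    """Resample a list of source pixels onto the panel's native pixels."""
    src_width = len(line)
    # Each native pixel picks the nearest source pixel.
    return [line[i * src_width // native_width] for i in range(native_width)]

# A 1281-pixel signal: one source pixel is never sampled (discarded).
wide = list(range(1281))
print(len(map_to_native(wide)))        # 1280 -> panel can only show 1280
print(len(set(map_to_native(wide))))   # 1280 unique of 1281 sources

# A 1279-pixel signal: one source pixel is replicated to fill the panel.
narrow = map_to_native(list(range(1279)))
print(len(narrow))                     # 1280
print(len(set(narrow)))                # 1279 -> one pixel duplicated
```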

What this means is that any image information sent to that monitor that is more than 1280 pixels wide will be imaged by those 1280 pixels, and all you'll ever see is the 1280. And that is why (coming back full circle) I maintain that offering up more than 1280 pixels in a signal is lost anyway. With analog the story changes, because there is no finite limit. Analog is much more subjective to the eye and is so variable that double-blind testing and other averaging tests must be performed to reach a consensus. Again, with digital it is easy, because a pixel is either on or off.
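The same assumed nearest-neighbor scaler from the sketch above shows why detail beyond the native pixel count can never reach the screen: a 1920-pixel source with the finest possible alternation loses most of its transitions when imaged by 1280 pixels.

```python
# Sketch: detail finer than the native pixel grid cannot survive
# resampling (nearest neighbor is again an assumed scaling method).

def map_to_native(line, native_width=1280):
    src_width = len(line)
    return [line[i * src_width // native_width] for i in range(native_width)]

source = [i % 2 for i in range(1920)]   # finest possible pixel-level detail
panel = map_to_native(source)

# Count light/dark transitions that actually reach the 1280 native pixels.
alternations = sum(panel[i] != panel[i + 1] for i in range(len(panel) - 1))
print(alternations, "of", len(panel) - 1)  # far fewer than 1279: much of
                                           # the 1920-pixel detail is gone
```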
 
Okay, so we can rely on the H x V native resolution reported in the manual of any given digital-capable display device. Now, for the statement that in "analog...there is no infinite limit": I understand that Japan has been broadcasting HD for a long time now, but they do it in analog. Does that mean they use a much larger bandwidth?
 
Stacy, I don't know that that is true. I do know that back in 1988 NHK, with Hitachi, did some of the first HDTV broadcasts in 1125-line analog, but MY most recent knowledge was during the 2002(?) Japan Olympics, where NHK and NBC did that in 1080i using the ATSC digital standard. I believe that Japan, as well as the rest of the world, has adopted the ATSC digital HDTV broadcast standard. Sorry, but I have not kept up on the old analog HD experiments and who might still be using them. As for those original analog HD experiments, to answer your last question: YES, analog HD used much more bandwidth than ATSC, which was designed to stay within the 6 MHz of frequencies per TV channel, same as current-day analog NTSC. But now we're diverging into RF, and that is a different subject than monitors. :)
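For what it's worth, the published 8VSB figures from the ATSC A/53 standard multiply out to the familiar ~19.39 Mbps payload in that 6 MHz channel. This is only a back-of-the-envelope sketch of that arithmetic:

```python
# Back-of-the-envelope check: the payload ATSC fits into the same
# 6 MHz channel as analog NTSC, using the published 8VSB parameters.

symbol_rate = 10.762237e6    # 8VSB symbols per second in a 6 MHz channel
symbols_per_segment = 832    # includes 4 segment-sync symbols
segments_per_field = 313     # includes 1 field-sync segment
ts_packet_bytes = 188        # one MPEG-2 transport packet per data segment

segments_per_sec = symbol_rate / symbols_per_segment
data_segments_per_sec = (segments_per_sec
                         * (segments_per_field - 1) / segments_per_field)

payload_bps = data_segments_per_sec * ts_packet_bytes * 8
print(f"{payload_bps / 1e6:.2f} Mbps")   # ~19.39 Mbps in a 6 MHz channel
```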



"Now, for the statement that in "analog...there is no infinite limit"."

You misquoted me. I said there is no FINITE limit. :)
 
Don Landis said:
MY most recent knowledge was during the 2002(?) Japan Olympics, where NHK and NBC did that in 1080i using the ATSC digital standard. I believe that Japan, as well as the rest of the world, has adopted the ATSC digital HDTV broadcast standard.

I know ATSC is not worldwide. Japan came up with their own format, Europe uses DVB-T with COFDM modulation, and Australia came up with a hybrid format that no one else uses.

I know Canada picked ATSC, and I think Mexico did too.
 
You are technically correct. Japan adopted ISDB, which uses the same video standards as ATSC, but the modulation is different.
 
