What no one will tell you about HDTV...


How many techs have been told that the picture was better after re-pointing a dish?


There is absolutely no way a digital signal can possibly produce ghosts or degrade to analog quality. The first thing you notice is block errors; no block errors, no degradation. Ghosting? You're killing me.

Welcome to our nightmare.
 
Please don't kill this thread. It is much more entertaining than reading the threads lamenting the lack of, and predicting the future of SciFi and USA HD. :)
 
I won't, nor do I think the others will... read the URL at the top of your browser; this is Satelliteguys, not some other site. ;)
 
Admirable

You have indeed further elevated the status of Satelliteguys, in my estimation, to the top forum available on the subject! Hands down. Bravo!

So, if that is the case and the discussion hasn't gotten out of hand, then your enjoyment shall increase as I provide you with more information for your scrutiny, criticism, and input.

I ask one thing though...destroy me, or vindicate me, ONE point at a time, o.k.? Let's discuss a point and find a valid consensus.

Here's a starting point.

"Bitrate: Measured as "bits per second," and used to express the rate at which data is transmitted or processed. The higher the bitrate, the more data that is processed and, typically, the higher the picture resolution. Digital video formats typically have bitrates measured in megabits-per-second (Mbps). (One megabit equals one million bits.) The maximum bitrate for DVD playback is 10 Mbps; for HDTV it's 19.4 Mbps."

Have fun, got to go for now.
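
To put that quoted definition in perspective, here is a rough back-of-the-envelope sketch. The frame size, frame rate, and chroma sampling below are my own assumptions for illustration (1920x1080 at 30 frames per second, 8-bit 4:2:0), not figures from the thread; the point is simply how hard the encoder has to compress to fit a 19.4 Mbps stream:

```python
# Rough sketch: how much compression a 19.4 Mbps HD stream implies.
# Assumptions (not from the thread): 1920x1080 at 30 frames/s,
# 8-bit 4:2:0 chroma sampling (12 bits per pixel on average).

WIDTH, HEIGHT = 1920, 1080
FRAMES_PER_SEC = 30
BITS_PER_PIXEL = 12          # 4:2:0, 8-bit: 8 (luma) + 4 (chroma, averaged)
STREAM_BITRATE = 19.4e6      # the 19.4 Mbps figure quoted above

raw_bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
raw_bitrate = raw_bits_per_frame * FRAMES_PER_SEC           # uncompressed
compressed_bits_per_frame = STREAM_BITRATE / FRAMES_PER_SEC

print(f"Uncompressed: {raw_bitrate / 1e6:.0f} Mbps "
      f"({raw_bits_per_frame / 1e6:.1f} Mbit per frame)")
print(f"At 19.4 Mbps: {compressed_bits_per_frame / 1e6:.2f} Mbit per frame")
print(f"Required compression ratio: {raw_bitrate / STREAM_BITRATE:.0f}:1")
```

Roughly a 40:1 squeeze per frame, and that budget is set at the encoder.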
 
HD PQ vs signal strength

Above a certain bit error rate (BER) threshold, there is almost NO EFFECT on PQ from increasing the signal quality (which reduces the BER).

At or near that threshold, small changes in BER can have noticeable effects on PQ.

In practice, this BER threshold VARIES slightly with the video signal, because MPEG requires less data to render a set of static frames than to render the same size set of moving or different frames.

The threshold is fairly sharp: once you start seeing artifacts in the picture from a high BER, pixelization, blocking, and/or complete loss of picture will follow from pretty small additional drops in signal strength.

Bear in mind that FEC data can be corrupted as well, and errors in that data could affect an otherwise good picture. However, this situation is not too likely unless the signal is near the BER threshold as noted above, as the BERs for picture data and FEC data are the same.
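
That "cliff" behavior is easy to see with a toy calculation. The sketch below is only an illustration, assuming a DVB-style Reed-Solomon (204,188) outer code that can repair up to 8 bad bytes per packet, and ignoring the inner convolutional code; it asks how often a packet arrives with more damage than the FEC can fix as the BER climbs:

```python
# Toy illustration of the BER "cliff": packets survive FEC almost perfectly
# until the error rate crosses a threshold, then fail very quickly.
# Assumptions (not from the thread): Reed-Solomon (204,188) outer code that
# corrects up to 8 bad bytes per 204-byte packet; inner coding ignored.

from math import comb

PACKET_BYTES = 204
CORRECTABLE = 8   # RS(204,188) fixes up to 8 byte errors per packet

def packet_loss_probability(bit_error_rate: float) -> float:
    """Probability a packet has more byte errors than the FEC can fix."""
    p_byte = 1.0 - (1.0 - bit_error_rate) ** 8   # byte is bad if any bit is
    ok = sum(comb(PACKET_BYTES, k) * p_byte**k * (1 - p_byte)**(PACKET_BYTES - k)
             for k in range(CORRECTABLE + 1))
    return 1.0 - ok

for ber in (1e-4, 1e-3, 2e-3, 3e-3, 5e-3, 1e-2):
    print(f"BER {ber:.0e}: ~{packet_loss_probability(ber):.2%} of packets lost")
```

The lost-packet fraction sits at essentially zero, then shoots toward 100% across about one decade of BER, which is why the picture looks perfect right up until it falls apart.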

Other system noise and artifacts can also affect PQ. Some of these may be high-order, subtle effects which are system-dependent--and would be more likely on poorly-designed equipment.

On properly-designed equipment, a solid signal (strength/quality/BER) and good connections/cabling/wire routing will ensure good and stable PQ.

A placebo effect can apply, too. "I think/feel it's better" often applies after a change is made which is thought to improve things. An objective resolution/contrast test pattern is the best way to prove/disprove this.

As most agree, maximizing signal while getting the lowest BER (with the BER taking priority) is best.
 
Yes. That is a definition of bitrate.

And? Why are we at this starting point? Is there something you wish us to respond to here?

I am guessing, because you highlighted the part about higher bitrate = higher picture resolution, that your point is that it somehow supports your original contention?

If that is your point, bzzzt, sorry, wrong. Was it?
 
Yeah... and this supports your earlier argument how? The installer or tech can't control the bit rate at the consumer end.
 
The info is wrong again. The max bitrate for HDTV is not 19.4 megabits. And second, it again has nothing to do with your signal quality.
 
True, a quick Google search will yield several different results on what the "max" bit rate is. I guess jeff just picked one.
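
For what it's worth, the 19.4 figure usually traces back to the ATSC terrestrial (8VSB) payload rate rather than anything satellite-specific. A quick sanity check of that number, using the published 8VSB parameters (my summary, not something stated in this thread):

```python
# Where the oft-quoted "19.4 Mbps" comes from: the ATSC 8VSB terrestrial
# broadcast payload rate. (Satellite carriers use different modulation and
# coding, so their usable bitrates are different numbers.)

symbol_rate = 4.5e6 * 684 / 286        # 8VSB symbol rate, ~10.762 Msym/s
segments_per_sec = symbol_rate / 832   # 832 symbols per data segment
data_segments = segments_per_sec * 312 / 313  # 1 of 313 segments is field sync
payload_bps = data_segments * 188 * 8  # each data segment carries one
                                       # 188-byte MPEG-2 transport packet

print(f"ATSC payload rate: {payload_bps / 1e6:.3f} Mbps")  # ~19.393 Mbps
```

Satellite carriers use different modulation and FEC, so the usable rate per transponder is another number entirely, which is probably why a search turns up several different "max" values.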
 
