HDMI Scam

You can call me crazy if you want, but better HDMI cables give better picture and sound.

I guarantee that with a double blind test you would not be able to tell the difference between a $5 and a $150 HDMI cable, even with vectorscopes and waveform monitors.

I am sure you see and hear a difference because you convinced yourself, or someone convinced you, that there was one, not because of any quantifiable evidence.
 
If you have to calibrate for a living to see the advantage of a $150 Monster cable over a $5 cable from Monoprice, that is the best proof it is a scam for 99.99% of users...

Diogen.

That's why I said it's not worth it for video. Very important for audio.
 
Since HDMI with HDCP is encrypted, all the bits have to make it or the block of bits will not decode properly, leading to visual/audio problems.

If your cheap cable does not show any issues, you can be assured that all your bits have made it and a more expensive cable will not make a difference.

From the Wikipedia HDMI article; note that they use error correction on the audio and auxiliary data to ensure all the bits make it:
Both HDMI and DVI use TMDS to send 10-bit characters that are encoded using 8b/10b encoding for the Video Data Period and 2b/10b encoding for the Control Period. HDMI adds the ability to send audio/auxiliary data using 4b/10b encoding for the Data Island Period.[72] Each Data Island Period is 32 pixels in size and contains a 32-bit Packet Header, which includes 8 bits of BCH ECC parity data for error correction and describes the contents of the packet.[73] Each Packet contains four subpackets, and each subpacket is 64 bits in size, including 8 bits of BCH ECC parity data, allowing for each Packet to carry up to 224 bits of audio data.[74] Each Data Island Period can contain up to 18 Packets.[75] Seven of the 15 Packet types described in the HDMI 1.3a specifications deal with audio data, while the other 8 types deal with auxiliary data.[73] Among these are the General Control Packet and the Gamut Metadata Packet. The General Control Packet carries information on AVMUTE (which mutes the audio during changes that may cause audio noise) and Color Depth (which sends the bit depth of the current video stream and is required for Deep Color).[76][77] The Gamut Metadata Packet carries information on the color space being used for the current video stream and is required for xvYCC.
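To make the packet arithmetic in that excerpt concrete, here is a quick sketch (Python). All the figures come straight from the quote above; nothing here is measured or taken from the spec itself.

```python
# Quick arithmetic check of the Data Island packet numbers quoted above.
# All figures come from the Wikipedia excerpt; nothing here is measured.

SUBPACKETS_PER_PACKET = 4
SUBPACKET_BITS = 64          # each subpacket, including its BCH parity
SUBPACKET_PARITY_BITS = 8    # BCH ECC parity bits per subpacket

payload_bits = SUBPACKETS_PER_PACKET * (SUBPACKET_BITS - SUBPACKET_PARITY_BITS)
print(payload_bits)          # 224 bits of audio data per Packet, as quoted

PACKETS_PER_DATA_ISLAND = 18
print(PACKETS_PER_DATA_ISLAND * payload_bits)  # up to 4032 payload bits per Data Island Period
```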
 
That's why I said it's not worth it for video. Very important for audio.
This needs elaboration before it has a chance to be true.

Is the audio stream already decoded or being bitstreamed to a receiver for decoding?
What equipment are we talking about ($$$)? What distances? What audio material? What room (any treatment)?

Diogen.
 
Ben- With all due respect to your occupation, I have to say the cable scam is real. Not just for digital video but for digital audio as well.

In the digital world, if you lose enough 1's and 0's, as you put it, you won't get picture or sound at all. There is NO SUCH THING as a gradually degraded signal in digital; it behaves more like a threshold. You either get a perfect signal or you don't, so the real test is more like a switch. Better construction can improve a cable's lifespan, durability, and reliability, but it cannot improve picture or sound quality on a digital link.

Analog is an entirely different animal and can suffer signal-quality degradation from cables with oxidized copper and poor shielding. With the same oxidized copper and poor shielding on an HDMI cable you may lose picture and sound altogether, but the picture won't shift to a washed-out look and the audio won't lose its high frequencies. Those are some of the claims I have heard as justification for spending obscene amounts of money on a cable.

Another reason to avoid cheap cables is that a poorly constructed cable may develop an intermittent connection due to breakage inside the connector. Hopefully, better, more expensive cables will be more durable, but that is not guaranteed by the money spent.

The main rule for digital cabling is that if it conducts the signal, it will be 100%. Otherwise it will be intermittent or won't transmit at all.

The cost of a cable is often based on many factors besides the cable itself. Advertising costs and retail markups can add significant $$$ to a cable, costs that an equally well-built cable sold without them doesn't carry. Save money: buy direct!

One more thing people need to consider: HDMI cable length. As the cable gets longer, the capacitance between the conductors increases, and at high frequencies this starts to look like a short to the digital pulses. Longer cables have to be designed with lower and lower capacitance to prevent this shorting of the signal. In very short cables you don't have the problem, because the total capacitance over the cable length is so small it has no effect on the frequencies being handled. There are exceptions, however. I have an older HDMI cable that is 10 ft long but whose diameter is only 3/16" including all insulation. It works on the lowest video signals but not on anything above 480i. Its capacitance, even at a 10 ft length, is just too high.
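As a rough back-of-the-envelope illustration of that length effect, here is a sketch using a single-pole RC model. Real HDMI cables are transmission lines, so this only shows the trend; the per-foot capacitance values and the source resistance are assumptions chosen for illustration, not measurements of any particular cable.

```python
# Crude lumped-RC sketch of how total cable capacitance grows with length.
# Real HDMI links are transmission lines, so this single-pole model only
# illustrates the trend; the values below are assumed, not measured.
import math

SOURCE_R_OHMS = 50.0   # assumed effective source resistance
CAP_PER_FT_PF = {"low-capacitance cable": 10.0, "thin high-capacitance cable": 30.0}

def corner_freq_mhz(cap_per_ft_pf: float, length_ft: float) -> float:
    """Single-pole RC corner frequency for the whole cable run, in MHz."""
    c_total_farads = cap_per_ft_pf * 1e-12 * length_ft
    return 1.0 / (2 * math.pi * SOURCE_R_OHMS * c_total_farads) / 1e6

for name, cap in CAP_PER_FT_PF.items():
    for length_ft in (3, 10, 25):
        print(f"{name}, {length_ft} ft: corner ~{corner_freq_mhz(cap, length_ft):.0f} MHz")
# The corner frequency falls in direct proportion to length, which is why a
# short run can get away with construction that fails on a longer one.
```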

Bottom line: spec out the cable for the length and video frequencies, pick what looks like good, durable construction for a long service life, and then shop for the lowest price.

BTW- I have an equally poor opinion of ISF calibrators. No offense, but I have spoken to many who are borderline conmen as opposed to real calibration technicians who know their monitor science. It is too often claimed that an ISF calibration will make a dramatic difference, simply to justify the cost of the service. While I agree calibration can make a difference, it is not a guarantee. A knowledgeable individual with some basic calibration tools can come close enough that chasing a near-perfect image through ISF becomes an exercise in diminishing returns. There are just too many variables in monitor calibration that affect the picture, which is ultimately what you are adjusting. Once a calibration is confirmed, the settings drift as the bulb ages, and the image varies as the room lighting and reflected light change. It is not a perfect world with image calibration. There is far more variance in the signal feeding the monitor than there is between a good test-DVD calibration and one done to ISF standards. Having said that, if you are not able to perform the calibration with a test DVD yourself, then hiring a professional may be exactly what is required. I'm just saying an ISF calibration is not a guaranteed visual improvement merely because it is a very scientific process.
 
You can call me crazy if you want, but better HDMI cables give better picture and sound. The funny thing is it actually makes a bigger difference in sound! I've compared many HDMI cables, and while a better cable does in fact improve picture quality, very few people can see it. I calibrate (ISF) for a living and can see very small improvements. However, the improvement at reasonable viewing distances is not worth the extra money on HDMI. The reason I buy better cables is for AUDIO. Most high-end audio now goes through HDMI to your receiver. These high-bitrate tracks NEED every advantage they can get. Yes, HDMI is all digital, it is 1s and 0s, BUT you do not get ALL the 1s and 0s; that is why EVERY SINGLE HDMI component has an interpolation circuit that "fills in" the missing bits. Cheaper cables will drop more bits, requiring more interpolation (guessing) of what comes next. If you don't use external audio equipment, then I say why bother with super expensive cables. However, if you have external amps and receivers that you are using, buy the good stuff. It makes a big difference. Buy one, play some decent music, and compare short clips of your CDs with a cheap cable and a better-shielded, better-material cable. You can notice the "space" opens up and the highs especially become clearer.
And if you use optical audio, then it does not matter which HDMI you use for audio
 
And if you use optical audio, then it does not matter which HDMI you use for audio

Optical audio is much lower bandwidth than HDMI audio. The optical spec is for the old lossy Dolby 5.1, while HDMI can carry a full 7.1 channels uncompressed.
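For a rough sense of that gap, here is a quick bit-rate sketch (payload only, ignoring framing and ECC overhead; the sample rate and bit depth are just format ceilings used for illustration):

```python
# Rough bit-rate comparison behind the "optical is much lower bandwidth" point.
# Payload figures only, no framing/ECC overhead; intended purely as an illustration.

def pcm_bitrate_mbps(channels: int, sample_rate_hz: int, bits: int) -> float:
    return channels * sample_rate_hz * bits / 1e6

dolby_digital_mbps = 0.640              # lossy Dolby Digital 5.1 tops out at 640 kbps
hdmi_pcm_mbps = pcm_bitrate_mbps(8, 192_000, 24)   # uncompressed 7.1 LPCM over HDMI

print(f"Optical (DD 5.1): {dolby_digital_mbps:.3f} Mbps")
print(f"HDMI (7.1 LPCM):  {hdmi_pcm_mbps:.1f} Mbps")   # ~36.9 Mbps
```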

Again, the HDMI signal has error correction built in: all the bits arrive or they do not, not a random bit error here and there.
 
There are two variables in play when converting a digital sound bit to an audible sound: the signal itself and its clock. The signal itself, when transferred over SPDIF and/or HDMI, is 100% perfect. If it weren't, we would hear white noise due to the DD/DTS encoding not being recognized as such by the receiver. Playing DTS CDs on Windows PCs is the best example of this.

The (inevitable) fluctuations in the clock create jitter. How significant that is, and how audible, depends on many factors. For example, it is easy to make a case that any receiver under $500 isn't capable of properly reproducing anything better than 20kHz/16-bit (the DACs' capabilities). Hence, feeding it a higher-spec sound stream can produce audio that depends on other factors, including cables. Assuming you have perfect hearing, speakers, etc.
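Jitter does have a textbook number attached to it. As a small sketch, the usual limit for a full-scale sine at frequency f sampled with RMS clock jitter t_j is SNR = -20*log10(2*pi*f*t_j); the jitter values below are just examples.

```python
# Standard SNR ceiling set by sampling-clock jitter on a full-scale sine.
# This only evaluates the textbook formula; the jitter values are examples.
import math

def snr_from_jitter_db(freq_hz: float, jitter_rms_s: float) -> float:
    return -20 * math.log10(2 * math.pi * freq_hz * jitter_rms_s)

for jitter_ps in (100, 1_000, 10_000):
    snr = snr_from_jitter_db(20_000, jitter_ps * 1e-12)
    print(f"{jitter_ps:>6} ps RMS jitter -> {snr:.0f} dB SNR at 20 kHz")
# ~100 ps keeps the jitter floor near the ~98 dB needed for 16-bit audio;
# nanoseconds of jitter eat visibly into that margin.
```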

Bottom line: "I don't hear it!" and "It doesn't exist!" are very different things...

Diogen.
 
Again, I bet that in a double blind test using scopes, no one could reliably tell the difference between an HDMI signal carried by a working $5 cable and one carried by a $150 cable.
Scopes? As in oscilloscopes?
There would be a HUGE difference on proper equipment.

Would that be audible? It depends. It definitely can be.

But if you bought a home theater in a box and never listened to anything but 64kbps MP3s - it won't.

Diogen.
 
Again, I bet that in a double blind test using scopes, no one could reliably tell the difference between a working $5 cable and a $150 cable.

I did this back in the days when this first started blowing up.

I spent 25 years as an R&D engineer in the Hewlett-Packard/Agilent measurement divisions. I took Monster, some other brands of yuppie cable, Radio Shack, generic Wal Mart and the cheap stuff that comes in the box.

I tested with a vector network analyzer for signal loss (attenuation) and frequency-dependent phase shift through the cables. None of the cables except the cheapie in-the-box stuff showed any significant attenuation, and they were all about equal. In the phase shift tests, Monster was about the worst. Not counting the cheap stuff, most of the other cables had about the same frequency-dependent phase shifting characteristics.
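To show what frequency-dependent phase shift means in practice, here is a small sketch: if the phase through a cable is not a straight line versus frequency, the group delay varies and different frequency components arrive at slightly different times. The phase data below is invented for illustration and has nothing to do with the cables actually tested.

```python
# Illustration of "frequency-dependent phase shift": group delay is -dphi/domega,
# so a non-linear phase curve means some frequencies arrive later than others.
# The two phase curves below are made up for the example.
import numpy as np

freq_hz = np.linspace(1e6, 500e6, 500)                       # 1 MHz .. 500 MHz sweep
ideal_phase = -2 * np.pi * freq_hz * 15e-9                   # pure 15 ns delay (straight line)
wobbly_phase = ideal_phase + 0.05 * np.sin(freq_hz / 40e6)   # small phase nonlinearity

def group_delay_ns(phase_rad: np.ndarray, f_hz: np.ndarray) -> np.ndarray:
    return -np.gradient(phase_rad, 2 * np.pi * f_hz) * 1e9

print("linear-phase cable: delay spread",
      f"{np.ptp(group_delay_ns(ideal_phase, freq_hz)):.3f} ns")
print("wobbly-phase cable: delay spread",
      f"{np.ptp(group_delay_ns(wobbly_phase, freq_hz)):.3f} ns")
```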

I presented this on the newsgroups along with the data (early '90s) and got soundly blasted by the 'enlightened audiophiles'. I was told that measurements proved nothing and that a good cable provided 'coloration' necessary for an audiophile sound. I asked what this 'coloration' was and how it could be measured. Someone suggested an audio analyzer and a microphone in front of the speakers. Someone else did that test, and when it showed no difference, the argument was again that this was an esoteric quality that wouldn't show up in measurements but which could be heard by the 'chosen'.

I concluded that this feeling was the increased spring in their step from being able to walk around with lighter wallets.
 
Read my previous posts. Oscilloscopes and waveform monitors. If one cannot reliably pick out which signal is better or worse, there is no difference. The differences are in the observer's brain and nowhere else. It is the same psychology that leads to belief in other things that aren't there.
 
Read my previous posts. Oscilloscopes and waveform monitors. If one cannot reliably pick out which signal is better or worse, there is no difference. The differences are in the observer's brain and nowhere else. It is the same psychology that leads to belief in other things that aren't there.

I wasn't disagreeing. A network analyzer is kind of a super scope with a frequency source and a display that reads amplitude and phase as a function of frequency. It's a better test than just a scope.

My results were pretty much what you said, but it carried no weight with the pricey cable crowd.

The only point where I agreed was that the 30 gauge junk being shipped back then was garbage. Fortunately, the cables being shipped today are much better.
 
Since HDMI with HDCP is encrypted, all the bits have to make it

Just to set the record straight (not to advocate the overpriced cables ;) )

Unlike network protocols (like TCP/IP), the HDMI protocol does not guarantee reliable data delivery. It has no retransmission mechanism, for example. Wrong or missing data is not always recovered; it may simply be ignored (or sometimes interpolated, as Ben Bassinger pointed out), which can potentially result in video or audio noise and/or sound/image degradation.

For example, here is a quote from the HDMI 1.3 spec related to audio decoding (http://www.hdmi.org/download/HDMI_Spec_1.3_GM1.pdf):

7.7 Error Handling (Informative)
The behavior of the Sink after detecting an error is implementation-dependent. However, Sinks
should be designed to prevent loud spurious noises from being generated due to errors. Sample
repetition and interpolation are well known concealment techniques and are recommended.
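As a toy illustration of the two concealment techniques the spec names, here is a sketch; the sample values and the strategies shown are hypothetical and not how any particular sink actually implements them.

```python
# Toy illustration of the two concealment techniques the spec recommends.
# 'None' marks samples whose packet failed its check; the data and strategies
# here are hypothetical, not taken from any real sink implementation.

def conceal_repeat(samples):
    """Replace a bad sample by repeating the last good one."""
    out, last = [], 0
    for s in samples:
        last = last if s is None else s
        out.append(last)
    return out

def conceal_interpolate(samples):
    """Replace an isolated bad sample with the average of its neighbours."""
    out = list(samples)
    for i, s in enumerate(out):
        if s is None:
            prev = out[i - 1] if i > 0 and out[i - 1] is not None else 0
            nxt = samples[i + 1] if i + 1 < len(samples) and samples[i + 1] is not None else prev
            out[i] = (prev + nxt) / 2
    return out

received = [0.10, 0.20, None, 0.40, 0.50]   # one corrupted audio sample
print(conceal_repeat(received))             # [0.1, 0.2, 0.2, 0.4, 0.5]
print(conceal_interpolate(received))        # [0.1, 0.2, 0.3, 0.4, 0.5]
```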
 
The differences are in the observer's brain and nowhere else.
Jitter is not snake oil but a scientific entity. You can calculate it. You can be trained to hear it.
Just like you can be trained to hear the difference between 128kbps and 256kbps MP3, close to 100% of the time.

Diogen.
 
I wish I could challenge you to a double blind test. I would bet a year's pay that you couldn't pick out the cheap cable more than half the time (i.e., no better than statistical guessing).
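For what it's worth, "more than half the time" has a precise statistical meaning. Here is a quick sketch of how often a given score happens by pure guessing; the trial counts are arbitrary examples.

```python
# Probability of scoring at least n_correct out of n_trials by pure guessing,
# i.e. what a double-blind result has to beat. Trial counts are arbitrary examples.
from math import comb

def p_by_chance(n_trials: int, n_correct: int) -> float:
    """P(at least n_correct successes in n_trials fair coin flips)."""
    return sum(comb(n_trials, k) for k in range(n_correct, n_trials + 1)) / 2 ** n_trials

for n, k in [(10, 6), (10, 9), (20, 15), (20, 17)]:
    print(f"{k}/{n} correct: p = {p_by_chance(n, k):.3f} by chance")
# 6/10 happens by luck ~38% of the time; 17/20 (p ~ 0.001) is what a
# convincing double-blind result looks like.
```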

jayn_j, sorry for the confusion. I was responding to diogen in my last post. I did not see yours.
 
Optical audio is much lower bandwidth than HDMI audio. The optical spec is for the old lossy Dolby 5.1, while HDMI can carry a full 7.1 channels uncompressed.

Again, the HDMI signal has error correction built in: all the bits arrive or they do not, not a random bit error here and there.
Well, I only have a 7-year-old 6.1 audio system set up as 5.1 that does not have HDMI, so the cable does not matter.

Besides, I have component video going to my TV anyway.

Who cares if HDMI can carry 7.1 channels, since few things are broadcast in that many channels anyway? All broadcast TV and MOST DVDs do not carry 7-channel audio.
 
I wish I could challenge you to a double blind test. I would bet a year's pay that you couldn't pick out the cheap cable more than half the time.
I wouldn't accept the challenge, because I can't. But that doesn't mean nobody can...
I like that you're backing off the blanket statement:
The differences are in the observer's brain and nowhere else.

Diogen.
 
Well, I only have a 7-year-old 6.1 audio system set up as 5.1 that does not have HDMI, so the cable does not matter.
Well, if so, then you are missing out on all the advancements in sound quality that were introduced with Blu-ray. The home theater sound quality has improved dramatically from DVD to Blu-ray. The new lossless audio formats like Dolby TrueHD and DTS-HD Master Audio leave Dolby Digital in the dust.

And there are two ways you can take advantage of these new Blu-ray audio formats: either upgrade your receiver to one that can handle HDMI, or get a Blu-ray player with 5.1 analog audio outputs (if they still make those). That will make a huge improvement even with your existing AV receiver, provided it has 5.1 or 6.1 analog inputs.
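To give a rough sense of the gap between lossy Dolby Digital and the lossless formats, here is a quick sketch comparing the bitrate DD 5.1 is squeezed into with the PCM it has to represent; the figures are nominal format numbers used only for illustration.

```python
# Rough sense of why lossy Dolby Digital and the lossless Blu-ray formats
# differ: compare the DD 5.1 bitrate with the PCM it represents.
# Figures are nominal format numbers, used only for illustration.
DD_51_KBPS = 448                          # typical DVD Dolby Digital 5.1 bitrate
pcm_51_kbps = 6 * 48_000 * 24 / 1000      # 5.1 channels, 48 kHz, 24-bit PCM

print(f"PCM source:    {pcm_51_kbps:.0f} kbps")
print(f"Dolby Digital: {DD_51_KBPS} kbps "
      f"(~{pcm_51_kbps / DD_51_KBPS:.0f}:1 lossy compression)")
# Dolby TrueHD / DTS-HD MA are lossless, so they hand the receiver back the
# full PCM stream instead of a roughly 15:1 compressed approximation of it.
```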
 
I use an Oppo BD-83SE with analog 7.1 outputs. This gives optimized audio delivery, since the digital-to-analog conversion is contained within the BD player.

When digital audio is lost, the "coloration" I hear is a loud pop, if the loss is long enough for the system to respond. But this is not normally referred to as coloration; it is simply signal loss. Coloration of sound should never occur in the reproduction equipment. It is bad enough that we still have microphones and speakers that color the sound; at least everything in between should be inert to the audio color. Consequently, anyone who claims HDMI coloration is important is justifying the distortion in his system. :)
 
