HDMI Cables??


paulman182

SatelliteGuys Pro
Original poster
Sep 10, 2006
East Kentucky
I need an HDMI cable for when I get my HD DVR next week.

I see them at Wal-Mart for $30 or so, and I see them on that auction site for $7.

Is there any reason to spend big bucks for a six-foot HDMI cable?

Thanks!
 
paulman182 said:
I need an HDMI cable for when I get my HD DVR next week.

I see them at Wal-Mart for $30 or so, and I see them on that auction site for $7.

Is there any reason to spend big bucks for a six-foot HDMI cable?

Thanks!

$30 isn't the big bucks... look at HDMI cables at Fry's or Best Buy!
 
It doesn't really matter who makes the HDMI cable. It's a digital connection. It either works or it doesn't.

I spent at most $15 for my HDMI cable and it works just fine. :)
 
Neutron said:
It doesn't really matter who makes the HDMI cable. It's a digital connection. It either works or it doesn't.

I spent at most $15 for my HDMI cable and it works just fine. :)
I didn't spend much either, but I have read that shielding can make a difference, and a thicker, better-shielded cable will cost more.
 
There could be a 20-page argument sparked by that comment, but suffice it to say... NO. There is no difference any human can detect in a double-blind test. Monoprice.com is where to go to get your cables.
 
So the transfer rates they advertise make no difference, whether it's 3 gigabits or 10 gigabits?
Yes and no.
As a rule of thumb: if the cable is under 6', the specs hardly matter. Otherwise they do.
It doesn't mean that the marketing BS is true, of course...
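For a rough sense of the numbers, here is a back-of-the-envelope sketch (it assumes the standard 148.5 MHz pixel clock for 1080p60 and HDMI's three TMDS data channels at 10 wire bits per 8-bit symbol; no particular cable is implied):

Code:
# Back-of-the-envelope TMDS rate for 1080p60 over HDMI.
# Assumes the standard 148.5 MHz pixel clock, 10 bits on the wire per
# 8-bit symbol (TMDS encoding), and 3 data channels.
pixel_clock_hz = 148.5e6
wire_bits_per_symbol = 10
channels = 3
gbps = pixel_clock_hz * wire_bits_per_symbol * channels / 1e9
print(f"1080p60 needs about {gbps:.2f} Gbit/s")   # about 4.5 Gbit/s

So a 6' cable only has to carry about 4.5 Gbit/s for 1080p60; the 10-gigabit ratings are headroom for deeper color and longer runs.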

The audio you hear has two components: the audio bits themselves and the timing (clock).
The first is transferred bit-perfect (if bitstreamed over HDMI). The second is not, and it can affect the experience...

Diogen.
 
diogen said:
The audio you hear has two components: the audio bits themselves and the timing (clock).
The first is transferred bit-perfect (if bitstreamed over HDMI). The second is not, and it can affect the experience...

Diogen.

Yeah, I've heard that one several times. The problem is that timing errors are not going to be generated by poor cables. Cables can affect the cleanness of pulse edges, the overall strength of the signal, and the phase relationship between voltage and current at the leading edge of a pulse. They will not make a pulse train go slower or produce variable clock timing.

That would be a function of the devices generating and receiving the pulses. A cheaper player that relies on the rotational speed of the disc to produce the clock would have some clock variance; however, virtually all current players buffer the raw bitstream. By the same token, mid- to higher-end receivers will correct for minor clock variations by passing the clock's sample trigger through a phase-locked loop. That's inexpensive, and it provides a rock-solid reference. The only time it could create a problem is when the skew is so bad that the corrected sample trigger acquires the wrong sample, and if conditions are that bad, you have other problems: the video will be macroblocking as well.
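To illustrate the phase-locked-loop point, here is a toy software sketch of the idea (not anyone's actual hardware; the jitter level and loop gain are made-up numbers):

Code:
import random

# Toy first-order phase-locked loop tracking a jittery sample clock.
# The trick is to chase each incoming edge only a fraction (GAIN) of
# the way, so random jitter averages out in the recovered trigger.
NOMINAL = 1.0 / 48000.0      # ideal 48 kHz sample period, seconds
JITTER = NOMINAL * 0.05      # assumed 5% RMS edge jitter
GAIN = 0.1                   # loop gain: fraction of each error applied

t_in = t_pll = 0.0
worst_raw = worst_pll = 0.0
for n in range(1, 10001):
    t_in += NOMINAL
    edge = t_in + random.gauss(0.0, JITTER)   # jittery received edge
    t_pll += NOMINAL                          # free-running local clock
    t_pll += GAIN * (edge - t_pll)            # nudge toward the edge
    ideal = n * NOMINAL
    worst_raw = max(worst_raw, abs(edge - ideal))
    worst_pll = max(worst_pll, abs(t_pll - ideal))

print(f"worst raw edge error:   {worst_raw / NOMINAL:.3f} periods")
print(f"worst PLL output error: {worst_pll / NOMINAL:.3f} periods")

Run it and the PLL's worst-case error comes out several times smaller than the raw edges', which is the whole point: the recovered trigger barely moves on any one noisy edge.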

Sorry for getting all geeky here; my 30 years in the data acquisition industry are showing. I feel that sometimes the industry shills and audiophile snobs (BTW, I am NOT accusing diogen of being a snob) start looking for any and all excuses to get consumers to spend more money. There are plenty of reasons to buy a better player or amplifier, but this isn't one of them. For cables, the only things that should matter are resistance and capacitance, and frankly those aren't going to make a whit of difference for any run under 12'.
 
I'm not an audiophile and I'm old (can't hear anything above 12-15kHz).
I don't believe in most of the audiophile talk but I believe in science.

Just one story, about 10 years old...

I read a gazillion times that 128 kbps MP3s are really close to the original and that nobody
(statistically speaking) can hear the difference between a 256 kbps stereo track and its WAV original.
And I believed that, because I can't hear the difference.

Until I met a person who can.
In 9 cases out of 10 he was able to do what I thought was utterly impossible: pick the 192/256 kbps MP3 from the WAV.
On relatively average equipment.

Bottom line: I still can't hear it.
But I think twice before claiming nobody ever will...
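Assuming those were simple 50/50 forced choices, a quick check shows 9 of 10 is well past guessing:

Code:
from math import comb

# Probability of scoring 9 or more out of 10 two-way trials by pure
# guessing: (C(10,9) + C(10,10)) out of 2^10 equally likely outcomes.
p = (comb(10, 9) + comb(10, 10)) / 2 ** 10
print(f"{p:.4f}")   # 0.0107 -- about 1 chance in 93

So whatever he was hearing, it almost certainly wasn't luck.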

BTW, HDMI as a standard is such a mess, it's not even funny...

Diogen.
 
diogen said:
I'm not an audiophile and I'm old (can't hear anything above 12-15kHz).
I don't believe in most of the audiophile talk but I believe in science.

Just one story, about 10 years old...

I read a gazillion times that 128 kbps MP3s are really close to the original and that nobody
(statistically speaking) can hear the difference between a 256 kbps stereo track and its WAV original.
And I believed that, because I can't hear the difference.

Until I met a person who can.
In 9 cases out of 10 he was able to do what I thought was utterly impossible: pick the 192/256 kbps MP3 from the WAV.
On relatively average equipment.

Bottom line: I still can't hear it.
But I think twice before claiming nobody ever will...

BTW, HDMI as a standard is such a mess, it's not even funny...

Diogen.

Agree entirely on the standard. If you recall, I had a pretty intense argument a couple of years ago on the subject, where I stated that the HDMI alliance was more concerned with manufacturers using the logo properly and paying the royalties than with performance testing. When I read the certification specs, they were all about marketing, not about performance.

I agree that some people can tell subtle differences. In this case, though, I believe those differences would show up in how consistently the player generated its clocks and how well the receiver did the time-base correction. I really don't think the cable could color the signal, and I'd love to see a well-designed test that showed a double-blind difference on a reasonable sample size and explained why. I just don't think it could be the cable.

Also, about the guy who could do it: was the number of bits in the encoding DAC consistent? Were both signals generated from the same analog source? Was the number of bits in the decoding DAC the same? If not on any of these, you would hear a difference in the number of distinct steps, and that could show up as lower-frequency harmonics. I think it would be very difficult to design a fair test there.
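To put a number on the "distinct steps" point, here's a toy comparison (the bit depths are hypothetical, not the actual test hardware) quantizing the same sine at two depths:

Code:
import math

# Quantize one cycle of a sine to two bit depths and compare the
# worst-case error. Fewer bits -> coarser steps -> larger error,
# which shows up as added distortion regardless of the cable.
def quantize(x, bits):
    levels = 2 ** (bits - 1)        # signed full-scale step count
    return round(x * levels) / levels

N = 1000
for bits in (16, 8):
    worst = max(
        abs(quantize(math.sin(2 * math.pi * n / N), bits)
            - math.sin(2 * math.pi * n / N))
        for n in range(N)
    )
    print(f"{bits}-bit worst-case error: {worst:.6f} of full scale")

The 8-bit error is roughly 256 times the 16-bit one; a level difference like that is exactly what a fair test has to hold constant.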

BTW, I'm just as old (61) and I drop off quickly at about 11kHz.
 
I did the encoding myself. Classical and pop, in 30-second clips.
Cut directly from the CD WAV vs. encoded using LAME and converted back to WAV.
Don't remember what program I used (it's been a while).

The resulting CD had twenty 30-sec clips. It was played twice from beginning to end.

During playback, Denon equipment (CD player and receiver) and Paradigm speakers were used.
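For anyone who wants to rerun that kind of test, the clip prep scripts easily. A sketch, assuming the lame command-line encoder is installed (filenames are placeholders; -b sets a constant bitrate and --decode converts back to WAV):

Code:
import random
import subprocess

# Encode each WAV clip to 256 kbps MP3 with LAME, decode it back to
# WAV so both versions play through the identical chain, then shuffle
# originals and lossy copies into a burn order with a hidden answer key.
clips = ["clip01.wav", "clip02.wav"]          # placeholder 30-sec clips
playlist = []
for wav in clips:
    mp3 = wav.replace(".wav", "_256.mp3")
    lossy = wav.replace(".wav", "_256.wav")
    subprocess.run(["lame", "-b", "256", wav, mp3], check=True)
    subprocess.run(["lame", "--decode", mp3, lossy], check=True)
    playlist += [(wav, "original"), (lossy, "lossy")]

random.shuffle(playlist)                      # hide which is which
with open("answer_key.txt", "w") as key:
    for i, (path, label) in enumerate(playlist, 1):
        key.write(f"track {i:02d}: {path} = {label}\n")
print(f"{len(playlist)} tracks shuffled; key saved to answer_key.txt")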

The one he got wrong was ABBA's "Dancing Queen".

Diogen.
 
