HDMI cables with RedMere technology from Monoprice

rharkins

SatelliteGuys Guru
Original poster
Mar 8, 2006
Kansas City, MO USA
I have to fish my HDMI cables through a built-in in my living room, so I sprang for the RedMere cables from Monoprice. They were a few bucks more ($20 for 15 feet), but they are amazingly thin and flexible.

I've tested both cables, and they seem to work well. They are directional, however. One end is marked 'display', and must go to the TV.

There was only one other mention of them on the forum, and that was Monster's press release from 2010.

Does anyone have experience with these? My only concern is introducing another piece of active electronics into the mix...
 
Make sure to test them with your equipment before fishing through the walls.
This is a new technology and it may still have compatibility issues.
 
I saw these cables on the Monoprice website. I started to buy two but changed my mind. Can you notice any difference in picture quality? I know about the difference in size.
 
I was thinking about getting a long one to test the signal, but I really don't have any use for cables longer than 6 feet. I'm either setting up the Xbox/tablet right next to the AVR or streaming from my PCs over wired Ethernet. I have a friend who wants a 50-foot run from PC to projector though.
 
I've used them on a few installs, including my own. They work as advertised, but I wouldn't bother with them for any application under 15'. Basically it acts almost like a repeater, so you don't get sparklies in the picture or handshake errors on a long run to a projector or remote room, while being able to use a thinner 28ga cable rather than a beastly 22ga.

Just keep in mind that if a standard cable already works well for you, RedMere cables will not make any improvement in picture. Remember, HDMI is digital, and digital is either perfect or it's not. There's nothing between 0 and 1.. :D
 
JerseyMatt:

If you're getting sparkles, the transmission is not perfect so your premise is flawed.


I don't think that's necessarily what he meant. Mojoe asked if there is any difference in picture quality. With HDMI being digital, there is no improvement in picture quality from one cable over another like there was in the analog days. With digital it either works or it doesn't. If there is an issue with image quality, either the TV is not producing a good image or the content never had a good image to begin with. I think what he means by sparkles is the digitization and pixelation that happen when a few packets are lost during transmission. With digital, the transmission is either perfect or it's not.
 
Yeah that's exactly what I was getting at.. Thanks! :)

Sparklies are lost/corrupted bits in the stream. They can be random like snow, or they can affect the same pixels consistently. It's similar to the blockiness/pixelation you see in a digital cable or satellite stream when there is a glitch, but it is distinctly different in that it affects individual pixels.
 
Some people think that since HDMI is digital, it either works or it doesn't. There is nothing in between.

That's a very common misconception! I think it comes from network protocols like TCP/IP which have error-correction. If a network packet gets damaged during transmission the TCP/IP protocol will keep resending it. So you either get correct data or you don't get anything at all.
Unfortunately, the HDMI protocol doesn't have error correction (to be more precise, it only has error correction for some control data, but not for the video image data). So, if the bits are changed (which may happen for a number of reasons, particularly when you have a very long cable), then the image gets degraded: you see sparkles, in the worst case entire frames get skipped, or some other artifacts show up.

As for the RedMere technology, the purpose of it is not to "improve" the picture, but to allow longer HDMI runs with thinner cables - something that is not normally possible without repeaters.
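A back-of-the-envelope sketch of why uncorrected errors show up as isolated sparkles rather than a lost picture: each flipped bit corrupts only the one color channel of the one pixel it belongs to. The bit error rate below is a made-up figure for a marginal (not broken) link, purely for illustration:

```python
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 24       # 8 bits each for R, G, B
FPS = 60
BIT_ERROR_RATE = 1e-9     # assumed: one flipped bit per billion on a marginal link

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL        # ~49.8 million bits
bits_per_second = bits_per_frame * FPS                  # ~3 Gbit/s of pixel data

# Expected flipped bits per second; each one corrupts a single
# color channel of a single pixel -- a "sparkle".
errors_per_second = bits_per_second * BIT_ERROR_RATE
fraction = errors_per_second / (WIDTH * HEIGHT * FPS)

print(f"pixel bits per frame: {bits_per_frame:,}")
print(f"expected sparkles per second: {errors_per_second:.1f}")
print(f"fraction of pixels affected: {fraction:.2e}")
```

At that assumed rate you would see a few wrong pixels per second scattered across roughly 124 million transmitted pixels; a repeater or an active cable like RedMere helps by keeping the error rate effectively at zero over a long run.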
 
Ilya:

That's a very common misconception! ... Unfortunately the HDMI protocol doesn't have error correction (only for some control data, not for the video image data). So, if the bits are changed, the image is degraded: you see sparkles, entire frames get skipped, or other artifacts show up.

As for the RedMere technology, the purpose of it is not to "improve" the picture, but to allow longer HDMI runs with thinner cables - something that is not normally possible without repeaters.

No, you also missed the point of the statement. We are comparing it to analog, where you can (usually) visually see a difference in the picture by using a better quality cable. For example, a 50 ft component run to a projector using RG6 cables will usually look better than the same run with some 12 ft crap cables coupled together. The analog signal loses intensity and picks up interference along the way. You can have 60% of the signal reach the display with cheap cables, while better cables might get 90% of the signal there.

If you have an HDMI connection with a standard cable, and the picture is already good (with no artifacts), it means 100% of the digital stream is reaching the display. Therefore switching to an active cable such as RedMere will have no effect, because you can't get better than the perfect signal you have.
 
I am not disagreeing with that! If the picture is already perfect (no perceivable loss of bits) then you can't improve it by using a better cable. However, as you said yourself, the picture is not always perfect. You mentioned seeing sparklies with longer cables. And that's exactly the indication that the digital stream is not 100% perfect and can be improved by using a better cable or repeaters.

I think that's the point John Kotches was trying to make.
 
Ilya:

I don't know of any real-time protocols that allow for data re-transmission... If they did, they wouldn't be real time ;-)

There is error correction in the Audio and Control sections but none in the video. It's likely due to the volume of data for video. Audio and control are minuscule by comparison. At the maximum allowed spec, audio would be less than 37 Mbit/s exclusive of protocol overhead, as compared to Gbit/s for video.
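The audio-versus-video volume gap is easy to verify with quick arithmetic. The figures below assume HDMI's maximum audio format (8 channels of 192 kHz, 24-bit samples) and 1080p60 as a representative video mode:

```python
# Maximum HDMI audio payload: 8 channels x 192 kHz x 24-bit samples
audio_bps = 8 * 192_000 * 24
print(f"audio: {audio_bps / 1e6:.1f} Mbit/s")   # ~36.9 Mbit/s, i.e. under 37

# 1080p60 video payload: 148.5 MHz pixel clock x 24 bits per pixel
video_bps = 148_500_000 * 24
print(f"video: {video_bps / 1e9:.2f} Gbit/s")   # ~3.56 Gbit/s of pixel data

print(f"video carries roughly {video_bps // audio_bps}x the data of audio")
```

On the wire, TMDS 8b/10b encoding inflates the video figure further (to about 4.46 Gbit/s for 1080p60), so the real gap is even larger.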
 
Ilya: I don't know of any real-time protocols that allow for data re-transmission... If it did it wouldn't be real time ;-)

Theoretically you could have some (CRC-like) error correction without re-transmission. But that would require significant overhead in the amount of transmitted data and more importantly, it would require significant processing power. Not feasible for HDMI applications. ;)
 
JerseyMatt:

Your wording was poor at best -- digital transmission isn't perfect; that's what error correction is for. It's not analog, but digital systems by design have checks to ensure integrity.

See my previous comment re: error correction on the video stream.
 
Theoretically you could have some (CRC-like) error correction without re-transmission. But that would require significant overhead in the amount of transmitted data and more importantly, it would require significant processing power. Not feasible for HDMI applications. ;)

Parity checking wouldn't have been terribly dear to implement, at least then you could get counters to increment when there was a parity error.

Of course it's easy for me to Armchair QB the solution 10 years or so down the road :)
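A single even-parity bit per word, as suggested, really is cheap: it detects any odd number of flipped bits (enough to drive an error counter) but cannot locate or fix the flip, and an even number of flips slips through. A minimal sketch in Python (the 10-bit word size is just a nod to TMDS symbol width, not HDMI's actual framing):

```python
def parity_bit(word: int) -> int:
    """Even-parity bit: 1 if the word has an odd number of 1 bits."""
    return bin(word).count("1") % 2

def check(word: int, parity: int) -> bool:
    """True if the received word is consistent with its parity bit."""
    return parity_bit(word) == parity

word = 0b1011001110          # a 10-bit word with six 1 bits
p = parity_bit(word)         # -> 0

# An intact word passes the check:
assert check(word, p)

# A single flipped bit is detected (but not located or corrected):
corrupted = word ^ (1 << 4)
assert not check(corrupted, p)

# An even number of flips slips through undetected -- the scheme's limit:
double = word ^ (1 << 4) ^ (1 << 7)
assert check(double, p)
print("single-bit flip detected; double-bit flip missed")
```

That detect-but-not-correct trade-off is what makes parity inexpensive: one extra bit per word, versus the much larger overhead of a true forward-error-correction code.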
 
Uhh. OK.. First of all, HDMI has NO error correction, as has already been established. That is WHY we get sparklies and artifacts. So there was no point in bringing it up.

Second, with regard to HDMI (the topic of THIS thread) the signal IS either PERFECT or it is NOT. If you have no perceptible loss of bits, then the signal can be considered PERFECT and there is absolutely NOTHING in the signal chain that can be changed to improve it. If there are sparklies or other artifacts in the picture, then it can NOT be considered perfect and there IS something that can be changed (i.e., upgrading a cable or installing a repeater) to improve it.


We are talking to laymen, not engineers, and as such things do need to be put into terms that an average human can comprehend.
 
JerseyMatt:

Uhh. OK.. First of all, HDMI has NO error correction, as has already been established. That is WHY we get sparklies and artifacts. So there was no point in bringing it up.

HDMI has error correction on the Data Island (which is audio data) and the Control Island (which is control data). It does not have correction on the video side which we've already mentioned. To say HDMI has no error correction is inaccurate. To say HDMI Video has no error correction is accurate.

Jason (Neutron) and I make the distinction, you don't.

Second, with regard to HDMI (the topic of THIS thread) the signal IS either PERFECT or it is NOT. If you have no perceptible loss of bits, then the signal can be considered PERFECT and there is absolutely NOTHING in the signal chain that can be changed to improve it. If there are sparklies or other artifacts in the picture, then it can NOT be considered perfect and there IS something that can be changed (i.e., upgrading a cable or installing a repeater) to improve it.

The issue is that your wording was poor, and some of us wanted clarification.

We are talking to people here, some of whom are engineers.

Also, if you have a problem with my posts, I invite you to click on the triangle with exclamation point to report my posts to the moderator.
 
Guys, please let's stay on topic here and watch the language. Thank you.
 
I have had good success with HDMI over dual Cat 6 at around 100'. With longer cable lengths it's easier for me to run Ethernet cable rather than an HDMI cable.

http://www.amazon.com/dp/B003EE8OL6/?tag=satell01-20

is what I used.

I've used a Gefen Cat5e for about 7 years. That was back when they were the only game in town for this type of product. I can't speak to the one you're using but I am pretty sure you would tell us it sucked if it didn't do the job ;) With the size of the HDMI connector, category cabling would always be easier than HDMI for a wall pull IMO.

The RedMere cable solution is a nice option where you aren't going into the wall, say for example in a family room where the components are in a cabinet and the TV is mounted on the wall. The minimal size of the cable could be easily dressed and almost completely hidden from view.
 
