If you're talking DVI-D, you've got to be kidding. There is absolutely NO WAY that you will see a gradual picture quality difference due to cable type on a DIGITAL signal. PERIOD. The signal either decodes or it doesn't: once the run is long enough to cause real signal loss, you start getting pixelation and dropouts, the same way dish rain fade works.
With a 25' run at 720p and 1080i, I got random flecks of red, green, and blue; that's digital noise. I've seen it with expensive and inexpensive cables alike, and it was exacerbated when I had some DVI/HDMI converters inline.
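To see why one bad bit shows up as a bright single-color fleck rather than general softness, here's a toy sketch (Python, purely illustrative; a real TMDS error corrupts a 10-bit character before decoding, but the visible effect is the same: one channel of one pixel jumps to a wrong value):

# Each color channel travels as its own serial stream, so a single
# corrupted bit changes one channel of one pixel -- often from dark
# to bright, which reads as a red, green, or blue "sparkle".

pixel = (12, 10, 8)                  # a dark gray pixel (R, G, B)

def flip_bit(value, bit):
    # Flip one bit of an 8-bit channel value.
    return value ^ (1 << bit)

# Corrupt the most significant bit of the red channel:
corrupted = (flip_bit(pixel[0], 7), pixel[1], pixel[2])
print(pixel, "->", corrupted)        # (12, 10, 8) -> (140, 10, 8): a red fleck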
We're talking about pushing more than 1 Gbit per second of time-sensitive data to the display. This isn't Ethernet, where a dropped packet just gets retransmitted; there's no retry for live video, so every corrupted bit goes straight to the screen.
It's even worse if you're talking about 1080p, which doubles the pixel clock.
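To put rough numbers on that, here's a back-of-envelope sketch (Python; the pixel clocks are the standard values for these formats, and the 30-bit figure comes from TMDS encoding every 8-bit value as a 10-bit character across three channels):

# Raw video bit rates for common single-link DVI/HDMI modes.
MODES = {                  # standard pixel clock, MHz
    "720p60":  74.25,
    "1080i60": 74.25,
    "1080p60": 148.5,      # double the clock of 720p/1080i
}

BITS_PER_PIXEL_PAYLOAD = 24   # 8 bits x 3 channels
BITS_PER_PIXEL_ON_WIRE = 30   # 10-bit TMDS characters x 3 channels

for mode, clock_mhz in MODES.items():
    payload = clock_mhz * BITS_PER_PIXEL_PAYLOAD / 1000   # Gbit/s
    wire    = clock_mhz * BITS_PER_PIXEL_ON_WIRE / 1000
    print(f"{mode}: {payload:.2f} Gbit/s payload, {wire:.2f} Gbit/s on the wire")

Even 720p works out to about 1.8 Gbit/s of payload (2.2 Gbit/s on the wire), and 1080p doubles that, which is why a marginal cable that survives 720p can fall apart at 1080p.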
So yes, you can see a cable quality difference, but it isn't about price; it's about shielding, impedance, and proper termination.
As for DVI-A, that's nothing more than analog video with a different connector. It carries RGBHV (the same signals as VGA) rather than component, and it exists mostly so that a DVI-I connector can deliver both digital and analog to a monitor over a single cable. Since it's analog, yes, cable quality can make a real difference there.
Finally, DVI itself is already being phased out in favor of HDMI.
What sucks is that HDMI is a technologically superior interface saddled with a really crappy connector.
Regards,