HDMI and DVI are signal-compatible, so that converter is essentially just a passive pass-through; the adapter itself isn't the issue.
Generally it is highly unlikely that a "bad" cable would give you a grainy picture. HDMI/DVI is a serial digital stream, so like any other it will either work or it won't, or at worst it will produce blocks and areas where the picture breaks up or tears, just like digital TV when the signal is poor. A uniform degradation of the whole picture is not likely to come from the cable - with VGA, sure, but not HDMI/DVI.
I suppose the video chipset could be systematically corrupting a bit or two in a pattern of pixels that might give that effect, but swapping the card should rule that out. A driver bug? Seems unlikely - this is nvidia?
The native resolution suggestion is an interesting one - the TV is likely using very different circuitry to scale a digital image than an analogue one, and digital images are often much harder to scale well (cheaply!). It's definitely a good idea to find the real native resolution of the panel. Read the specs very carefully: they will often state 1080, but the detailed specs may reveal that this just means it will accept a 1080 frame and scale it down to a native 1280x768 panel. Make sure you are matching the panel's native resolution in your xorg configuration. nvidia-settings has some useful data on scaling and native sizes.
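If the panel turns out to be, say, 1280x768 native, one way to take the TV's scaler out of the equation is to pin that mode in xorg.conf. A minimal sketch - the Identifier strings and the 1280x768 mode are placeholders for whatever your panel actually reports:

```
Section "Monitor"
    Identifier "TV"
    # Placeholder mode: use the panel's true native resolution here,
    # so the TV's internal scaler stays out of the signal path.
    Option "PreferredMode" "1280x768"
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    SubSection "Display"
        Modes "1280x768"
    EndSubSection
EndSection
```

Running `xrandr` with the TV connected will list the modes the driver has read from the EDID, which is usually the quickest way to see what the panel itself claims as native.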