I think we may be talking at cross purposes.
I understand that my video card is taking successive lines and outputting them progressively from an input signal that arrived from an interlaced source. It would have to, or the screen wouldn't form a viewable image, and in that sense it is de-interlacing.
What I'm talking about is the post-processing compensation options (like those in the web admin) that exist to correct pictures going through the process described above.
Because the source was interlaced, successive lines come from fields sampled 1/50th of a second apart, so with motion there is often a horizontal offset between adjacent lines due to the difference in the times they were sampled. If these lines are then displayed interlaced, this in effect neutralises the problem. But if you reorder the lines to make them progressive, you are putting significantly offset lines next to each other, and that horizontal effect is "combing" - and it looks terrible!
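To make the effect concrete, here is a minimal sketch (my own toy illustration, not anything from VDR) of why naive "weave" de-interlacing combs: two fields are captured 1/50 s apart, an edge moves between them, and interleaving their lines puts the offset next to each other.

```python
# Toy illustration of combing: two fields sampled 20 ms apart are
# woven into one progressive frame while a vertical edge moves.
WIDTH, HEIGHT = 16, 8

def field(edge_x, parity):
    # One interlaced field: only rows of the given parity (0=even,
    # 1=odd), with '#' left of the moving edge and '.' to its right.
    return {y: "#" * edge_x + "." * (WIDTH - edge_x)
            for y in range(parity, HEIGHT, 2)}

def weave(top, bottom):
    # Naive weave de-interlace: just interleave the two fields' lines.
    rows = {**top, **bottom}
    return [rows[y] for y in range(HEIGHT)]

# The edge moved 4 pixels between the two field captures.
frame = weave(field(6, 0), field(10, 1))
for row in frame:
    print(row)
# Alternating rows disagree by 4 pixels: that sawtooth is "combing".
```

Lines from the same field still agree with each other; only lines from *different* instants disagree, which is exactly why displaying the fields separately (interlaced) hides the problem.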
De-interlacing is a very heavy-duty post-processing function, which I presume is why they provide several different levels in the console.
Sure, many video cards may have this function in hardware, but they aren't going to know to use it unless they are told to.
I guess my question is: are there any other options to control this problem without taking the heavy CPU hit? (Although, as I say, I would have thought this could be done in hardware.) Can vdr detect that the source is interlaced, pass it through as interlaced, and let the TV switch? Frankly, the benefit of a progressive-scan image is tiny compared with the terrible combing caused by converting an interlaced signal to progressive!
So: passthrough automatically, hardware de-interlacing to save CPU, or perhaps some other approach?
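For completeness, one classic cheap middle ground (again, a toy sketch of the general technique, not a VDR feature) is "bob" de-interlacing: display each field on its own and fill the missing lines from the same field, so lines from different instants are never mixed and combing cannot occur, at the cost of halved vertical resolution.

```python
# Toy "bob" de-interlacer: reconstruct a full frame from ONE field by
# duplicating the nearest line of the same field into the gaps.
def bob(field_lines, height):
    # field_lines: dict mapping row index -> pixel string, containing
    # only the rows present in this field (even or odd parity).
    out = []
    for y in range(height):
        if y in field_lines:
            out.append(field_lines[y])
        else:
            # Fill the missing line from the adjacent line of this
            # field, so every line comes from the same instant.
            src = y - 1 if (y - 1) in field_lines else y + 1
            out.append(field_lines[src])
    return out

# Even field only: rows 0 and 2 present, rows 1 and 3 filled in.
print(bob({0: "####....", 2: "######.."}, 4))
```

It is far lighter than motion-adaptive de-interlacing because it never compares fields, which is presumably the sort of trade-off the different quality levels in the console represent.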