There's really no reason to believe that the DVI circuitry itself will yield different picture quality between a Radeon and a GeForce.
If you're just displaying a simple Windows desktop, and the resolution and color settings are identical between the two cards, then the picture should be identical. Not close to each other, but identical. Any quality difference would show up as obvious pixel errors or as the monitor failing to sync, not as a subtle difference in image quality, and it would likely be corrected by using a higher-quality or shorter DVI cable.
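If you want to check this yourself, a bit-exact comparison is straightforward, assuming you have some way to grab each card's output losslessly (a capture card, say) and Pillow installed. The filenames here are just placeholders:

```python
# Sketch: check whether two lossless captures of the desktop are bit-identical.
# Assumes Pillow is installed; radeon.png and geforce.png are hypothetical
# lossless grabs of each card's output at identical resolution and color depth.
from PIL import Image, ImageChops

a = Image.open("radeon.png").convert("RGB")
b = Image.open("geforce.png").convert("RGB")

diff = ImageChops.difference(a, b)
if diff.getbbox() is None:
    print("Outputs are bit-identical, as expected.")
else:
    print("Pixels differ -- look for a driver/settings difference, not the DVI stage.")
```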
If gotapex is trying to imply that the DVI quality is subtly different between the two cards, specifically when displaying, say, a 24-bit Windows desktop, then I would bet that somewhere in his tests there was some low-level difference in the software drivers or settings between the two cards, such as software gamma correction. Gamma correction should be turned completely off in a test like that.
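To make concrete why gamma matters here: even a mild driver-side gamma curve remaps nearly every pixel value, so the two cards would no longer be sending bit-identical desktops even though the DVI hardware is doing nothing different. A rough sketch of the idea, where the 2.2 exponent and 8-bit range are just illustrative assumptions:

```python
# Illustration only: how a driver-side gamma curve remaps 8-bit pixel values.
# The 2.2 exponent is an assumed example, not a measured driver setting.

def gamma_lut(gamma: float) -> list[int]:
    """Build a 256-entry lookup table applying the given gamma."""
    return [round(255 * (v / 255) ** (1.0 / gamma)) for v in range(256)]

identity = gamma_lut(1.0)   # gamma off: output == input
adjusted = gamma_lut(2.2)   # a typical correction curve

# Count how many of the 256 possible values get changed by the curve.
changed = sum(1 for v in range(256) if identity[v] != adjusted[v])
print(f"{changed} of 256 pixel values are remapped by the gamma curve")
```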
Now with DVD playback, the situation is different. The DVI output stages will still produce identical results, but the hardware scaling that happens before the DVI stage differs between the two cards. That's where the Radeon's scaling algorithms come into play.
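As a rough illustration of why the scaler matters (this is not either vendor's actual hardware algorithm), here is how two common approaches, nearest-neighbor and linear interpolation, produce different pixels from the same source data:

```python
# Toy comparison of two upscaling methods on a single row of pixel values.
# Neither is claimed to be what ATI or NVIDIA implement in hardware; the point
# is only that different scaling math yields different output pixels.

def nearest(row, out_len):
    """Nearest-neighbor: pick the closest source pixel."""
    scale = len(row) / out_len
    return [row[min(int(i * scale + 0.5), len(row) - 1)] for i in range(out_len)]

def bilinear(row, out_len):
    """Linear interpolation between the two nearest source pixels."""
    scale = (len(row) - 1) / (out_len - 1)
    out = []
    for i in range(out_len):
        x = i * scale
        lo = int(x)
        hi = min(lo + 1, len(row) - 1)
        frac = x - lo
        out.append(round(row[lo] * (1 - frac) + row[hi] * frac))
    return out

src = [0, 64, 128, 255]      # a 4-pixel source row
print(nearest(src, 8))       # blocky result
print(bilinear(src, 8))      # smoother gradient, different pixel values
```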
For other video applications I would assume that the situation is similar to DVD, but I don't know for sure.
For gaming, that's a matter of taste. Again, the DVI stage will be identical, and the video scaler will likely not be employed. So now we're left with the 3D engines. My understanding is that the GeForce wins for raw power, but the Radeon may be good enough, allowing you to enjoy its advantages in the DVD arena as well.