Originally Posted by jonnythan
I moved to the Sandy Bridge from a GeForce 8200. The contrast and black levels are markedly worse, and there seem to be some small stuttering issues on some broadcasts. Driving me bonkers.
I figure if the ancient integrated nVidia 8200 was good enough, a GT 620 has to be at least as good...
I actually get quite good results with an i3 Sandy Bridge. I have several Nvidia cards around here, including a GTS 450 I could stick in the box. It is being used for OTA and cable TV (HDHomeRuns and a Ceton). Absolutely zero issues that are not already in the source. I have a TiVo HD as well, and everything is being fed to the TV via a Lumagen Radiance. The i3 GPU is doing just fine and the levels are very good. While the HTPC is not calibrated, the TV is, via the 125-point color auto-calibration on the Lumagen.
I don't know what display driver I am using for the Intel graphics, but it must be at least a year old. I never touch the box; it runs 24x7, and other than necessary patch updates it is 100 percent dedicated to live and recorded TV in WMC. Could it be better with Nvidia? Perhaps a tiny bit, but since you are kind of limited on decoders and renderers with TV in Media Center, I am quite content. Considering the power demands of a discrete card and the marginal possible picture-quality improvement, it just does not make sense. BTW, I have messed with madVR and LAV and all that, but since they are pretty much useless with WMC and protected content, there is not much point. If I were watching a lot of movies on the box and running JRiver or MPC-HC with madVR and LAV, I would think about additional GPU power. I have zero stuttering or motion issues with the Intel. Any glitches are signal or Ceton.
I run the Lumagen and get very nearly perfect results with the 3D LUT 125-point color calibration; I also use a Darbee Darblet. For WMC-based TV, this TV cannot look any better. No point in the heat and energy use. Maybe your settings are not right, or Intel farkled recent driver versions.