In theory, a pure digital signal should be the best.
With VGA the signal has to be converted twice: digital to analog in the graphics card, then back to digital in the display. The quality of those conversions varies, and each pass is a chance to lose information.
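Just to make the idea concrete, here's a toy Python sketch of that round trip. The 0.7 V full-scale level matches VGA's actual video range, but the noise amount is purely illustrative:

```python
import random

def vga_round_trip(pixel, noise_mv=5.0, full_scale_mv=700.0):
    # DAC in the graphics card: 8-bit value -> voltage in VGA's 0-0.7 V range
    volts = pixel / 255.0 * full_scale_mv
    # noise picked up in the cable and converters (illustrative amount)
    volts += random.gauss(0.0, noise_mv)
    # ADC in the display: voltage -> back to an 8-bit value
    recovered = round(volts / full_scale_mv * 255)
    return max(0, min(255, recovered))

random.seed(1)
errors = [abs(vga_round_trip(p) - p) for p in range(256)]
print("worst round-trip error:", max(errors), "out of 255 steps")
```

With good converters and a short cable the error rounds to zero; cheap ones or a long run and you start losing levels, which is the "good or poor quality" part.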
I'm not familiar with all the digital tricks HDTVs use, but it may be that once the digital signal is inside the TV, the set manipulates the image in software.
An analogy is using DVI on an LCD and finding that VGA looks better.
People often run an LCD at a non-native resolution. Even though the LCD is digital, it has to use scaling algorithms to display an image at a non-native resolution, and that introduces lag, artifacts, etc. This is why some people run the display 1:1: the panel stays at its native resolution (where it's fastest) and they put up with black bars around the image (see the sketch below).
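For what it's worth, the black-bar math for 1:1 mapping is simple. A quick Python sketch (the panel and source sizes are just example numbers):

```python
def one_to_one_bars(panel_w, panel_h, src_w, src_h):
    """With 1:1 pixel mapping the source is shown unscaled and
    centered, so the leftover panel area becomes black bars."""
    if src_w > panel_w or src_h > panel_h:
        raise ValueError("source exceeds the panel; 1:1 is impossible")
    side = (panel_w - src_w) // 2   # bar on each side, in pixels
    top = (panel_h - src_h) // 2    # bar above and below, in pixels
    return side, top

# e.g. 720p content shown 1:1 on a 1920x1080 panel
print(one_to_one_bars(1920, 1080, 1280, 720))  # -> (320, 180)
```

No scaler in the path means no scaler lag and no scaler artifacts; the trade-off is simply the unused border.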
VGA can also look smoother than DVI, again just talking about computer LCDs here. Single-link DVI tops out at a 165 MHz pixel clock, which in practice means about 60 Hz at common desktop resolutions, while VGA can be driven higher. Now we have dual-link DVI, which doubles that bandwidth and can handle 120 Hz.
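Rough numbers, in Python, to show why. The 5% blanking overhead is only an estimate (real timings come from the CVT/GTF standards), but the 165/330 MHz pixel-clock ceilings are the actual single/dual-link DVI limits:

```python
SINGLE_LINK_MAX_MHZ = 165.0   # single-link DVI pixel-clock ceiling
DUAL_LINK_MAX_MHZ = 330.0     # dual link doubles the TMDS capacity

def pixel_clock_mhz(w, h, hz, blanking=1.05):
    """Rough pixel clock: active pixels * refresh rate * blanking
    overhead. 1.05 approximates reduced-blanking timings."""
    return w * h * hz * blanking / 1e6

for hz in (60, 120):
    clk = pixel_clock_mhz(1920, 1080, hz)
    link = "dual" if clk > SINGLE_LINK_MAX_MHZ else "single"
    print(f"1920x1080 @ {hz} Hz ~ {clk:.0f} MHz -> needs {link}-link DVI")
```

At 60 Hz you're around 130 MHz and single link is fine; at 120 Hz you're around 260 MHz, which is why 120 Hz needed dual link.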
Any time you get a worse image from pure digital than from analog, I would suspect some kind of post-processing going on in the TV or LCD.
Originally Posted by gunbunnysoulja
On my Sammy HL-S6187w, I very much prefer VGA over HDMI for my HTPC.
I wanted to use HDMI to keep it simple, but now I have to use VGA for video and HDMI for audio.
I compared many times, and VGA was superior every time, particularly with text, with both inputs set to 1:1 pixel mapping and overscan off.