Any consumer-grade video processing circuitry, particularly something as complex as a scaler, is bound to introduce some artifacts, throw away a little resolution, etc. Think of what is happening: three analog signals (RGB) must undergo A/D conversion, be buffered, and have mathematical equations run to produce new pixel values; some of those equations are for other processing that gets bundled in, such as gamma correction or overscan. Just the introduction of overscan alone is likely to change the value of every "pixel" in the incoming signal. So when I say "scaling" I'm really talking about the entire circuit path between the VGA connector and the DMD chip, not just the scaling per se. Anyone who has seen high-quality 720p material on the Sharp with and without the scaler bypass can immediately see the effect.
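To make the overscan point concrete, here's a minimal sketch of what a scaler has to do along one row of pixels. This is a generic bilinear resampler, not the algorithm any particular chip uses; the function name and the ~5% overscan figure are just illustrative. Once you crop the edges and stretch the remainder back to full width, nearly every output sample lands between two source pixels and has to be interpolated, so nearly every value changes:

```python
def bilinear_resample(row, out_len, start=0.0, end=None):
    """Resample `row` over the source range [start, end] into out_len samples."""
    if end is None:
        end = float(len(row) - 1)
    out = []
    for i in range(out_len):
        # Map the output index to a fractional source coordinate.
        t = start + (end - start) * i / (out_len - 1)
        i0 = int(t)
        i1 = min(i0 + 1, len(row) - 1)
        frac = t - i0
        # Linear blend of the two nearest source pixels.
        out.append(row[i0] * (1 - frac) + row[i1] * frac)
    return out

row = [float(10 * i % 256) for i in range(100)]  # synthetic test pattern

# 1:1 pass-through: every output sample lands exactly on a source pixel,
# so the signal comes through untouched.
identity = bilinear_resample(row, len(row))

# ~5% overscan: crop the edges, then stretch back to the original width.
overscanned = bilinear_resample(row, len(row), start=2.5, end=96.5)

changed = sum(a != b for a, b in zip(row, overscanned))
print(f"{changed} of {len(row)} pixel values changed")
```

The 1:1 case survives because the sample grid lines up exactly with the source pixels; shift or stretch that grid even slightly and the interpolation touches everything. That's the whole-pipeline degradation being described above, before gamma correction or any other processing even enters the picture.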
You have to get into prosumer or broadcast-grade scalers to avoid introducing degradation; these scalers are not found in consumer displays.
When DVI connections start showing up in our homes, the problem will not quite be solved. In any given TV there may be an arbitrary amount of processing applied to the DVI signal before the pixels reach the display panel; having our video in digital form at the source does not guarantee pixel-perfect addressability!