Originally Posted by stephenju
256 shades are 8-bit. Right?
Correct. As I understand it, all digital *source* formats are 8-bit/color (at least until "Deep Color" arrives) ... the issue is whether the processing (more important for scaling displays) can smooth out the 8-bit gradients.
For instance, it would make a difference in 1080i/p (or 480i/p) conversion to 720p ... not so much for a native 720p source. The extra 2 bits of internal precision ward off rounding/truncation errors in the resampling math.
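To make that concrete, here's a toy sketch in Python (made-up numbers, not any real scaler's actual filter). A resize typically averages neighboring pixels over several passes; if every intermediate result gets truncated back to 8 bits, the fractional parts are thrown away at each stage, while 2 extra bits of headroom keep them around until one final rounding:

    def avg8(a, b):
        # 8-bit pipeline: truncate back to an integer at every stage
        return (a + b) // 2

    a, b, c = 10, 11, 11          # neighboring 8-bit pixel values
    stage1 = avg8(a, b)           # exact 10.5  -> truncated to 10
    result8 = avg8(stage1, c)     # exact 10.75 -> truncated to 10

    # 10-bit pipeline: shift up 2 bits first, round once at the end
    A, B, C = a << 2, b << 2, c << 2            # 40, 44, 44
    stage1_10 = (A + B) // 2                    # 42 (the .5 survives)
    result10 = ((stage1_10 + C) // 2 + 2) >> 2  # 43 -> rounds to 11

    print(result8, result10)      # 10 vs 11 (exact answer is 10.75)

That off-by-one-per-stage error is the kind of thing that shows up as banding in smooth gradients, which is why the extra precision matters more when the scaler is doing real work.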
... And so-called "720p" flat panels are another beast.
... Then again, if the DLP chip itself is capable of more than 8-bit/color display, then it would make a difference at all source resolutions. Personally, that's all beyond my feeble grasp of the topic.
Feel free to correct any inaccuracies.
EDIT: Ok ... stupid me, I see that DLP chips are 10-bit devices ... makes sense now that you'd need to "scale" the color info from 16-235 (or 0-255) up to 0-1023 for display.
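If I have the arithmetic right (my guess, not anything from an actual chip's firmware), that expansion would look something like this in Python:

    # Expand 8-bit video levels (16-235) to the full 10-bit range (0-1023).
    def expand_to_10bit(y):
        y = min(max(y, 16), 235)   # clip below-black/above-white;
                                   # a real device might preserve those instead
        return round((y - 16) * 1023 / (235 - 16))

    print(expand_to_10bit(16))     # 0    (black)
    print(expand_to_10bit(235))    # 1023 (reference white)
    print(expand_to_10bit(125))    # 509  (roughly mid-gray)

For a full-range 0-255 source you'd just map with in * 1023 / 255 instead. Either way each 8-bit input code spreads across roughly 4 output codes, which is where those extra bits come in handy.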