Originally Posted by Gradius2
They don't know what 10-bit does?
Well, nothing an example (or two) can't help:
Well, hold on now. That comparison above looks like a change in source color resolution, not display color resolution. Any current display, when presented with a decent source, will look like the second picture, not the first. I mean really, does anyone have an 8-bit display showing a picture that really looks like that first one? I don't, nor have I ever. So again, we have a tendency to confuse source issues with display limitations. This has been going on ever since I've been on AVS.
In fact, when flat panels were first introduced, many thought their panels were improperly displaying color gradients when their pictures occasionally looked just like the first one above. I was fortunate enough to have a CRT HD display on hand as well as my Fujitsu flat panel. Both displays showed banding like your first picture, because the issue in question was source related. When I popped in a different, better DVD with similar colors and color gradients, it became clear that either display was absolutely capable of producing a picture resembling your second one. And the Fujitsu was not a 10-bit panel.
So again, I'm not sure what degree of improvement a 10-bit panel brings to the table relative to an 8-bit one. The numbers are great, and talking about millions, billions, or trillions of colors is great, but what does the picture look like, and what are the real OBSERVED differences? I don't know. Does anyone? Has anyone actually seen it?
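To make the source-vs-display point concrete, here's a minimal sketch (Python with NumPy; my own illustration, not from either post) of why banding is baked into the source data itself. A gradient stored at a lower effective bit depth simply contains fewer distinct levels, so the steps exist before any display ever sees the signal:

```python
import numpy as np

def quantize(ramp, bits):
    """Quantize values in [0, 1] to 2**bits discrete levels."""
    levels = 2 ** bits
    return np.round(ramp * (levels - 1)) / (levels - 1)

# A smooth horizontal gradient, modeled as a 1-D ramp of 4096 samples.
ramp = np.linspace(0.0, 1.0, 4096)

# The same gradient stored at different source bit depths.
for bits in (6, 8, 10):
    q = quantize(ramp, bits)
    steps = len(np.unique(q))
    print(f"{bits}-bit source: {steps} distinct levels per channel")
```

A heavily compressed DVD can behave like the low-bit case here: the missing levels can't be restored by the panel, so an 8-bit and a 10-bit display will both show the same bands from that source.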