Originally Posted by Mr.D
In my experience, applications like Photoshop, Shake, and Nuke (not sure about After Effects, but I consider it prosumer) have always demonstrated cleaner visual output when moving to cards with 10-bit or greater processing (and this is coming at it from when even high-end cards only did 8-bit processing).
In terms of the level of precision I generally use when appraising these sorts of attributes, I don't think I'm merely looking at 8-bit data hidden in 10-bit or greater dither. And this is when creating imagery that has to stand up to a display chain that is truly transparent to at least 10-bit, if not 12.
Well, the newest version of Photoshop does support 10-bit output now that it can render the image on the GPU, but it requires a Quadro or FirePro card.
There is no support for 2D 10-bit output in Windows at all; it requires D3D or OpenGL rendering. OS X doesn't have any native support for 10-bit colour, even on Quadro or FirePro cards.
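To give a concrete sense of what "requires OpenGL rendering" means on Windows: the application has to explicitly ask the driver for a 10-bit-per-channel pixel format, e.g. through the WGL_ARB_pixel_format extension. A minimal sketch (window and dummy-context setup omitted; whether the driver actually grants the format depends on the card and driver, which in practice means Quadro or FirePro):

```cpp
// Sketch: requesting a 10-bit-per-channel (30-bit colour) pixel format
// on Windows via the WGL_ARB_pixel_format extension. Assumes a dummy
// GL context is already current so wglChoosePixelFormatARB can be loaded.
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h" // WGL_*_ARB tokens and the function pointer typedef

bool find10BitFormat(HDC hdc, int* formatOut)
{
    auto wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return false; // extension not available

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,   10, // 10 bits per colour channel...
        WGL_GREEN_BITS_ARB, 10,
        WGL_BLUE_BITS_ARB,  10,
        WGL_ALPHA_BITS_ARB,  2, // ...plus 2 bits of alpha = 32bpp
        0
    };

    UINT count = 0;
    // If the driver doesn't expose a 10-bit format, this simply returns
    // no matches and the application falls back to the usual 8-bit path.
    return wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, formatOut, &count)
        && count > 0;
}
```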
I don't believe Shake has 10-bit support, and I'm not sure about Nuke (a quick search suggests that it does not).
That's not to say that having a 10-bit LUT won't be beneficial at all. I suspect that if you are using ICC profiles, the data in the video card LUT will at least be held at greater precision, but it's certainly not going to be transparent to 10-bit if you are doing that (though it may be transparent to 8-bit?).
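To illustrate the LUT precision point with a toy example: a calibration curve is computed at high precision but has to be quantised to whatever the card's LUT holds. An 8-bit LUT collapses some of the 256 desktop input levels onto the same entry (banding), while a 10-bit LUT can keep them all distinct. A sketch, where the gamma-1.1 curve is just a stand-in and not any real profile:

```cpp
// Toy illustration: quantising the same correction curve into an
// 8-bit vs a 10-bit video card LUT. Fewer distinct output levels
// means coarser quantisation, i.e. more visible banding.
#include <cmath>
#include <cstdio>
#include <set>

int countDistinctLevels(int lutBits)
{
    const int levels = 1 << lutBits;
    std::set<int> distinct;
    for (int i = 0; i < 256; ++i) {          // 8-bit input from the desktop
        double v = i / 255.0;
        double corrected = std::pow(v, 1.1); // stand-in calibration curve
        // Quantise to the LUT's precision:
        distinct.insert((int)std::lround(corrected * (levels - 1)));
    }
    return (int)distinct.size();
}

int main()
{
    // With an 8-bit LUT some inputs collapse onto the same entry;
    // with a 10-bit LUT all 256 inputs stay distinct.
    std::printf("8-bit LUT:  %d distinct outputs\n", countDistinctLevels(8));
    std::printf("10-bit LUT: %d distinct outputs\n", countDistinctLevels(10));
}
```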
Originally Posted by 703
AMD and Nvidia cards can output 10-bit RGB via HDMI/DisplayPort without upsampling. E.g. MadVR has a 16-bit internal processing pipeline, which is then dithered down to the 10-bit RGB output bitdepth.
That's not true at all. MadVR does use 16-bit internal precision and has a D3D output, but it does not support 10-bit output; it only passes dithered 8-bit data to the display.
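The dithering step itself is easy to picture; something along these lines (simple random dither shown for illustration, this is not madVR's actual algorithm):

```cpp
// Sketch: reducing a 16-bit intermediate value to 8-bit output with
// random dither rather than plain truncation. Truncation produces
// banding; dither trades it for fine noise.
#include <cstdint>
#include <random>

uint8_t ditherTo8Bit(uint16_t value16, std::mt19937& rng)
{
    // 16-bit -> 8-bit divides by 257 (65535 / 255), leaving a
    // fractional ideal value. Add noise of about one output step
    // before rounding so the average output tracks the input.
    std::uniform_real_distribution<double> noise(-0.5, 0.5);
    double dithered = value16 / 257.0 + noise(rng);
    if (dithered < 0.0)   dithered = 0.0;
    if (dithered > 255.0) dithered = 255.0;
    return (uint8_t)(dithered + 0.5); // round to nearest
}
```

Averaged over many pixels and frames, the extra precision survives as low-level noise instead of visible banding, which is why dithered 8-bit can look surprisingly close to true 10-bit.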
There has been talk that 10-bit output may be added to the Fullscreen Exclusive mode in a future update, but as of v0.80 it is not there. This is what I mean about not believing what your display is telling you. With the exception of a very limited number of specialist applications, almost nothing actually passes more than 8-bit data to the GPU. It's not as simple as a developer flipping a switch to enable support; the application has to use a specific OpenGL or Direct3D output path to enable it.
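To make "a specific Direct3D output path" concrete: in D3D9, a 10-bit back buffer (D3DFMT_A2R10G10B10) is only allowed in fullscreen exclusive mode, which is presumably why any 10-bit support would land in FSE first. A rough sketch, with error handling trimmed:

```cpp
// Sketch: creating a D3D9 fullscreen-exclusive device with a 10-bit
// back buffer (D3DFMT_A2R10G10B10). D3D9 only permits this format in
// fullscreen exclusive mode, so windowed playback stays 8-bit.
#include <d3d9.h>

IDirect3DDevice9* create10BitDevice(HWND hwnd, UINT width, UINT height)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return nullptr;

    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth  = width;
    pp.BackBufferHeight = height;
    pp.BackBufferFormat = D3DFMT_A2R10G10B10; // 10 bits per colour channel
    pp.BackBufferCount  = 1;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow    = hwnd;
    pp.Windowed         = FALSE;              // fullscreen exclusive only
    pp.FullScreen_RefreshRateInHz = D3DPRESENT_RATE_DEFAULT;

    IDirect3DDevice9* device = nullptr;
    // Fails on hardware/drivers that don't expose a 10-bit display mode.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    d3d->Release();
    return device; // nullptr if the 10-bit mode wasn't available
}
```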