Originally Posted by Masharak
Banding is not due to nVidia or AMD or anything like that. It's due to slight measurement inaccuracies when you measure with a non-perfect-accuracy device like the i1Display Pro. It may be that less banding was seen with AMD on 10 bit devices since AMD supported 10 bit before nVidia could support 12 bit+.
nvidia has been able to send 12 bit since the DX10 generation, so cards like the GTX 260 and the like. The bit depth option in the driver is cosmetic; it is only useful for forcing 8 bit output with high bit depth input, that's it.
nvidia adds a ton of banding when a simple ICM profile is loaded and the output is at 8 bit; it's just a flaw in the nvidia driver. Use madVR in overlay mode and it should look worlds better.
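For what it's worth, here's a minimal sketch of the mechanism, assuming the driver applies the profile's 1D calibration curve and then rounds straight back to 8 bit (the curve here is a made-up mild gamma tweak, not any real profile): adjacent input levels collapse onto the same output code, which is exactly what shows up as banding, while dithering before rounding, which is what madVR does, preserves the gradient.

```python
import numpy as np

# Hypothetical 1D calibration curve such as an ICM profile's vcgt might
# carry (assumption: a mild gamma 2.2 -> 2.4 correction).
def curve(x):
    return x ** (2.4 / 2.2)

ramp = np.linspace(0.0, 1.0, 256)          # ideal 8-bit grey ramp
corrected = curve(ramp)                    # full-precision corrected values

# Plain rounding back to 8 bit (roughly what the driver does): adjacent
# input levels merge into the same output code -> visible bands.
truncated = np.round(corrected * 255).astype(np.uint8)
print("distinct levels without dithering:", np.unique(truncated).size)  # < 256

# Dithering before rounding (what madVR does): the fractional part survives
# as spatial noise, which the eye averages back into a smooth gradient.
rng = np.random.default_rng(0)
dithered = np.round(corrected * 255 + rng.uniform(-0.5, 0.5, 256))
dithered = np.clip(dithered, 0, 255).astype(np.uint8)
```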
Originally Posted by Masharak
I also learned that while 8 bit uses only 0-255 levels, 12 bit uses 0-4000+ levels! This makes me wonder whether ArgyllCMS would measure all of those 4000+ levels when 12 bit depth mode is enabled...
How? Your colorimeter can't see the difference, and how would you send 12 bit to the driver in the first place? It's not even possible. You could technically send 16 bit, but...
That's per pixel. You can use dithering to represent a higher bit depth across multiple pixels; that is the idea of dithering in the first place.
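A tiny sketch of that idea (the one-bit patch is just an illustrative extreme): no single 1-bit pixel can show the value 0.3, but the average over a dithered patch can.

```python
import numpy as np

rng = np.random.default_rng(0)
target = 0.3                                  # a level no 1-bit pixel can show
patch = rng.uniform(0, 1, (64, 64)) < target  # dither: each pixel is 0 or 1
print(patch.mean())                           # ~0.3 averaged over the patch
```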
12 bit is 2^12 - 1 = 4095, so 0-4095, but a display that can actually resolve all of those levels doesn't exist.
And feel free to calculate how many measurements would be needed to make even partial use of 10 bit: 8 bit already has over 16 million possible colors.
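For scale, the raw combination counts per bit depth (simple arithmetic, nothing vendor-specific):

```python
# Number of distinct RGB triplets at each per-channel bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits               # values per channel: 256, 1024, 4096
    combos = levels ** 3             # full RGB cube
    print(f"{bits} bit: {levels} levels/channel, {combos:,} colors")

# 8 bit:  256 levels/channel,  16,777,216 colors
# 10 bit: 1024 levels/channel, 1,073,741,824 colors
# 12 bit: 4096 levels/channel, 68,719,476,736 colors
```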
The test charts use percentages of the maximum color, so they are not tied to any particular bit depth.
The test I linked you checks how many bits deep the colorimeter thinks your "display" is.
Originally Posted by Masharak
I also purchased a High Speed HDMI cable and it allowed me to use 12 bit even @ 60 Hz! And no, High Speed HDMI is not a gimmick; it's not the same as HDMI versions, which are determined by the devices themselves and not the cable. High Speed HDMI cables have much higher bandwidth than Standard HDMI cables, which were actually made for 720p and 1080i; they often handled more than that, but were still limited by bandwidth.
Do Standard Speed HDMI cables even still exist? RGB 1080p60 with 12 bit is nothing special.
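Rough numbers behind that, assuming the standard 148.5 MHz CEA-861 pixel clock for 1080p60 (HDMI deep color scales the TMDS clock by bits-per-channel / 8):

```python
# Back-of-the-envelope TMDS clock for 1080p60 RGB at various bit depths.
pixel_clock_mhz = 148.5                      # CEA-861 timing for 1080p60
for bpc in (8, 10, 12):
    tmds_mhz = pixel_clock_mhz * bpc / 8     # deep color scales the clock
    print(f"{bpc} bpc: TMDS clock ~ {tmds_mhz:.2f} MHz")

# Category 1 ("Standard") cables are only certified to 74.25 MHz;
# Category 2 ("High Speed") cables to 340 MHz. So 12 bpc 1080p60
# (~222.75 MHz) does need a High Speed cable, but sits far below
# that cable's limit, hence "nothing special".
```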
Originally Posted by Masharak
BD content was supposed to use 12 bit depth, but it didn't...
Never heard that before. And a 12 bit panel was already very unrealistic back when the BD spec was being planned.