Ok, so I just wanted to get a sense of how responsive my system is to changes in the LUT, so I ran an experiment with Matlab and Psychtoolbox (similar to what I imagine dispcal -R does).
Starting with a linearized gamma table, I displayed a full-screen image of a light grey patch.
The gamma table had 256 rows, each value specified to a precision of 0.0001 and scaled between 0 and 1.
I then raised the values in the table in steps of 0.0001 while measuring the grey patch with the i1 Display 3 (running off a nearby laptop).
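In case it's useful to anyone, here's roughly what the procedure looks like in Psychtoolbox (a minimal sketch rather than my actual script; the window setup, starting table, row index, and wait time are placeholders):

% Sketch of the LUT-bump experiment (assumes Psychtoolbox is installed).
screenNum = max(Screen('Screens'));
[win, rect] = Screen('OpenWindow', screenNum, 0);

% Start from a 256x3 normalized gamma table. An identity ramp is used
% here as a stand-in; the actual linearized table came from calibration.
lut = repmat(linspace(0, 1, 256)', 1, 3);
Screen('LoadNormalizedGammaTable', win, lut);

% Full-screen grey patch at gray level 100 (out of 255).
Screen('FillRect', win, 100);
Screen('Flip', win);

% Bump the LUT row for that gray level upward in steps of 0.0001.
% (Row 101 if row 1 maps to level 0 in MATLAB's 1-based indexing.)
row = 101;
for bump = 1:10
    lut(row, :) = lut(row, :) + 0.0001;
    Screen('LoadNormalizedGammaTable', win, lut);
    WaitSecs(5);   % time to note the reading from the meter
end

Screen('CloseAll');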
I was able to get consistent changes in luminance with LUT increments of 0.0005. For example, if I raised a LUT value by 0.0001, the luminance reading didn't budge (or rather, it didn't move outside its normal fluctuation of about plus/minus 0.01 nits). But as soon as I raised it by another four bumps of 0.0001, the reading suddenly jumped up.
For example, at gray level 100 (out of 255), the reading was 4.50-4.52 nits.
The 100th LUT value before any adjustments was 0.3882.
Changing it to 0.3883 produced no change in luminance whatsoever, but when the value reached 0.3887, the luminance suddenly went up to 4.54-4.56 nits.
If I'm interpreting this correctly, that means that changing the LUT by five ten-thousandths (0.0005) had an impact on the voltage being sent through the DVI-to-VGA cable. That's a precision of one part in two thousand, which works out to roughly 11 bits (2^11 = 2048).
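For anyone who wants to check that arithmetic, here's a quick MATLAB sanity check (numbers taken from above, nothing measured here):

step = 0.0005;        % smallest LUT change that reliably moved the reading
levels = 1 / step;    % = 2000 distinguishable steps across [0, 1]
bits = log2(levels)   % ~= 10.97, i.e. roughly 11 bits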
Next, I just have to figure out a good psychophysical approach to estimating JNDs (I'll look through the literature), but it's good to know I have at least 10 bits of LUT precision to work with.