Originally Posted by omarank
I was talking about the gamut reduction (lowering of saturation) as the luma level drops, as shown in the attached image.
That's not a thing, as far as I'm aware. It depends somewhat on how you define "saturation", though. When it comes to color appearance there are several metrics analogous to saturation, and some of them are defined in a way that makes them change with luminance level. From chapter 4 of the first edition of "Color Appearance Models" by Mark Fairchild:
Colorfulness: Attribute of a visual sensation according to which the perceived color of an area appears to be more or less chromatic.
Chroma: Colorfulness of an area judged as a proportion of the brightness of a similarly illuminated area that appears white or highly transmitting.
Saturation: Colorfulness of an area judged in proportion to its brightness.
By these definitions both Chroma and Saturation remain constant at different luminance levels (assuming the spectral shape remains the same), while Colorfulness changes with luminance level.
But all of this is irrelevant to the discussion - the relative saturation of a set of narrow primaries compared to a set of broad primaries remains constant over lightness (unless some real-world factor intrudes), so any separation function can be independent of lightness.
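A quick numerical sketch of that point (the XYZ numbers below are purely illustrative, not a real display primary): scaling a stimulus's luminance scales X, Y and Z by the same factor, so its xy chromaticity - and hence its saturation relative to any other stimulus - is unchanged.

```python
def chromaticity(X, Y, Z):
    """CIE xy chromaticity coordinates from XYZ tristimulus values."""
    s = X + Y + Z
    return (X / s, Y / s)

# A saturated red-ish stimulus (made-up numbers for illustration).
X, Y, Z = 41.2, 21.3, 1.9

for k in (1.0, 0.1, 0.01):           # full brightness down to 1%
    x, y = chromaticity(k * X, k * Y, k * Z)
    print(f"luminance scale {k:>5}: x={x:.4f}, y={y:.4f}")
# The printed (x, y) pair is the same at every scale (to printing precision),
# because the common factor k cancels in X/(X+Y+Z) and Y/(X+Y+Z).
```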
I would imagine that the reduction in gamut/saturation with decreasing luma levels implies that there is a change in the spectral shape of the primaries too.
Why? The physics doesn't (and shouldn't) change with luminance level.
In fact, some studios that use OLED mastering monitors use a LUT (often baked into the calibration LUT) to simulate this effect (as shown by LCDs, CRTs and projectors), because OLEDs maintain their gamut/saturation all the way down to black.
So do other display technologies.
It's possible, though, that this is referring to other effects - as light levels fall to really low (scotopic) levels, our color vision disappears, because the LMS cones stop responding while the rods continue to work. But this is a property of human vision, not of the display.
I am referring to this article by LightIllusion here.
It's poorly explained, but I guess they are referring to the fact that LCD displays have a fixed minimum black level, so of course the effective spectral characteristic of the primaries "broadens" as you approach black. This is simply the result of the minimum black being added to the primary values.

Exactly the same effect will occur for OLED displays if you measure them at a distance in a real-world environment with ambient light - the "off" state is not a perfect absorber, so a small amount of ambient light will be reflected for black, diluting the primaries and desaturating near black. (In the diagram the percentage brightness seems way off for this, though.)

This has little to do with the primaries and everything to do with the black level of the display. And yes, you can simulate a less-than-perfect black level on a display that has a better one.
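A rough sketch of that additive-black effect (all values below are made up for illustration, not measured from any real display): adding a fixed black-level XYZ to a scaled-down primary pulls the mixture's chromaticity away from the primary and toward the black point as the drive level falls.

```python
def chromaticity(X, Y, Z):
    """CIE xy chromaticity coordinates from XYZ tristimulus values."""
    s = X + Y + Z
    return (X / s, Y / s)

RED   = (41.2, 21.3, 1.9)    # hypothetical red primary at full drive
BLACK = (0.05, 0.05, 0.06)   # hypothetical fixed minimum black (panel leakage or ambient reflection)

for drive in (1.0, 0.1, 0.01, 0.001):
    # Light output = scaled primary plus the constant black level.
    mix = tuple(drive * p + b for p, b in zip(RED, BLACK))
    x, y = chromaticity(*mix)
    print(f"drive {drive:>6}: x={x:.3f}, y={y:.3f}")
```

At full drive the black contribution is negligible, but near black it dominates, so the measured chromaticity drifts toward the black point and the primary appears to desaturate - without any change in the primary's own spectrum.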
This effect may not even be interpreted as a saturation loss in a subjective sense - it could be perceived as something else, such as being occluded by a "fog".