I'm watching at 90 nits as well. Subjectively I find 100 nits too bright.
This discussion was held years back in one of the old Home Theater Geeks episodes.
(A systems integrator there calibrated to different nit levels and also preferred those lower than 100 nits for better immersion, i.e. less 'activation' on the part of the viewer.)
In film projection, it’s the projector without any film, projecting onto the screen. With digital projectors, it’s a 100% white image. It’s interesting to note that the two are not identical: film base attenuates the light through the projector, so a white film frame would measure lower, but digital projectors use no such film, so 100% is 100%. The target luminance is between 12 and 22 foot-lamberts (fL). The nominal target is 16fL, but a group of surveyed viewers much preferred the 22fL screen brightness. Many movie houses are dimmer, around 7-10fL. Yes, it’s a cost thing. Xenon bulbs are expensive, and last longer if you don’t burn them as bright.
16 fL ≈ 55 nits
22 fL ≈ 75 nits
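For anyone who wants to check those numbers: 1 foot-lambert is about 3.426 cd/m² (nits), so the conversion is a simple multiply. A quick sketch (function names are just my own):

```python
FL_TO_NITS = 3.426  # 1 foot-lambert ≈ 3.426 cd/m² (nits)

def fl_to_nits(fl: float) -> float:
    """Convert foot-lamberts to nits (cd/m^2)."""
    return fl * FL_TO_NITS

def nits_to_fl(nits: float) -> float:
    """Convert nits (cd/m^2) back to foot-lamberts."""
    return nits / FL_TO_NITS

# The cinema targets mentioned above:
print(round(fl_to_nits(16), 1))  # 16 fL ≈ 54.8 nits
print(round(fl_to_nits(22), 1))  # 22 fL ≈ 75.4 nits
```

So even the "bright" surveyed cinema preference of 22 fL sits well below the 100-nit SDR reference.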
That said - especially on OLEDs with black crush - stick to around 85 or 90 nits (otherwise darker areas get too grainy too often).
And just to make that clear, the calibration standard is a dark-room scenario at 100 nits (though probably assuming smaller screens), so anything that differs from that is considered non-standard. That said, luminance (the brightness portion of colors) impacts dE the least; hue and saturation impact it more.
So in my humble opinion, just turn brightness down (or calibrate to lower brightness targets).
I've actively recommended to a friend not to go with a Philips with Ambilight, which he wanted to use for bias lighting - and recommended he just turn the brightness down instead. 'Priming' your eyes with bias lighting in a dark room, so they can tolerate brighter images overall, always sounded a little backwards to me. (Outside grading suites.)