If you have to ask that question, it's not for you. Unless you just want it anyhow.
I provided source links so people read them. (I'm not copy-pasting them in here just for lolz.)
If the response to that is a smartphone user asking for a summary, chances are that bias lighting is not what you are looking for.
So here is the deal.
1. A colorist not wanting to work in a tomb-like environment is still the most likely/obvious reason for it being used at all.
2. SMPTE revising the spec down to 5 nits of bias lighting instead of a 10 nit maximum reads as "sh*t, 10 nits probably introduced a perceived color difference" (even though 10 nits was the spec for years).
3. Bias lighting being standard for HDR authoring in retrospect probably is also a good idea. While the average scene luminance level only increased a little, having to work with 2000+ nit highlight detail in a mastering scenario means faster eye fatigue. Also, you can't turn max brightness down, or touch gamma at all, in HDR mastering (the EOTF is absolute, not relative - see the sketch right after this list), so resorting to bias lighting is the only thing you can do, really, if you need color accuracy and higher mastering productivity in that sort of environment. (It becoming more prevalent with HDR mastering makes sense.)
4. The spec in this case is designed by getting 5 grey beards in a room to agree on something. (The spec for room light conditions varied all over the place in the past decade. The capabilities of TVs varied all over the place in the last decade. (Plasma TVs at one point weren't even capable of 100 nits. No one complained.))
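Since point 3 hinges on "the EOTF is absolute", here is a minimal sketch (Python) of the PQ curve (SMPTE ST 2084) that HDR10 mastering uses, with the constants straight out of the spec. Notice there is no display-peak knob anywhere in it: a signal value maps to one absolute luminance, full stop.

```python
# Minimal sketch of the PQ EOTF (SMPTE ST 2084), the curve HDR10 uses.
# There is no "display peak" parameter: a signal value maps to one
# absolute luminance in cd/m^2 (nits). Constants are from ST 2084.

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal (0..1) to absolute luminance in nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(0.508)))  # ~100 nits (SDR-ish reference white)
print(round(pq_eotf(0.75)))   # ~983 nits, i.e. the 1000 nit mastering ballpark
print(round(pq_eotf(1.0)))    # 10000 nits, the format's absolute ceiling
```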
So ask yourself the following questions:
- Do my eyes get tired when watching 100 nit SDR content in a dark room?
If yes: am I happy turning down screen brightness and recalibrating (the Backlight / OLED Light setting)?
If not: maybe invest in bias lighting.
- Do my eyes get tired when watching HDR content in a dark room? (Which is what I primarily watch.)
If yes: unless you are a god-tier calibrator with a current-gen OLED set, you can't and shouldn't change the HDR EOTF (roll-off/cutoff, let alone more of the curve), so the only option left is bias lighting.
All of this is not applicable to 99.999999(what's the world population again)% of users.
No one in the history of TV manufacturing is expecting you to watch in a darkened room, unless you are a nerd who is single.
No one in the history of TV manufacturing is expecting you to control room light conditions when turning your TV on.
No one in the history of TV manufacturing is expecting you to source a D65 bias light that you calibrate to 5 nits maximum, just so you can minimize iris contraction (less eye fatigue).
For effing 7 years, people could not decide whether 80 nits, 100 nits or 120 nits should be the SDR standard (you did what you wanted), even while some TVs were capable of all of them.
And all SMPTE ever did was arbitrarily raise the target for standard brightness to 120 nits once capable devices came onto the market - even though SDR was perfectly well defined already.
With stuff like pupil dilation (and how many rods and cones are active on your retina) you get into psychovisual perception territory - and no one can give you a clear answer there at all.
The truth is that no one other than color graders expects or works in that environment. No one cared to spec it out as anything but "best practice, with brightness dialed up to 120 nits as screen performance increased". And no one can say anything substantial against simply watching the material at 80 nits either.
Yes, there will be a color difference. No, no one is keen to objectively quantify and measure it. Subjectively it's fine (though probably above the JND (just noticeable difference)).
You must be a very special breed of person to get it. (Perfection-oriented, or anal-retentive.) The guys who came up with the spec did not have your best interest in mind (or they would have kept the standard unchanged for longer) - and for all I care they just wanted to get to a bar earlier when they defined the revision.
Vincent likes it though - and I can respect that as well.
But reasonably speaking - never, ever, ever get talked into something like that.
Especially if your mode of decision making is: SO I GOT ALL THIS INFO, but guys, I'm on a smartphone, can you break it down for me even more, what's best?
If you are that kind of person and are asking if you are 'missing out': no, you aren't.
Why can SDR be "relative" in that sense, while HDR cannot?
Our eyes can only cover a small range of contrast without adapting (the iris contracting, and the perceived contrast range shifting).
So for SDR you can shift that range around a little, because its intent is not to blind you or make you squint, and your eyes will adapt to that range accordingly. If you are watching in a dark room environment at 100 or 120 nits though, because the baseline is a black environment, your irises will pump regardless - causing contrast perception differences every time they do. To minimize that (also with an impact on contrast perception), you can turn down screen brightness (you do it on every laptop and every smartphone, every day, without even thinking about it), or, as an alternative, you can introduce bias lighting (so your eyes never adapt to "dark room" levels, taking that range out of the equation and making your irises pump less).
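For comparison, a minimal sketch of the SDR side, assuming a plain gamma 2.4 power law (BT.1886-ish): the peak is a display setting, not part of the signal, so dimming the backlight just slides the whole range down without touching the curve.

```python
# Sketch of a relative SDR EOTF (assuming a plain gamma 2.4 power law).
# The display peak is a user setting, not part of the signal, so turning
# the backlight down rescales everything without changing the curve.

def sdr_eotf(signal: float, display_peak_nits: float = 100.0) -> float:
    """Map a normalized SDR signal (0..1) to luminance relative to peak."""
    return display_peak_nits * signal ** 2.4

print(sdr_eotf(1.0, 100.0))           # 100.0 nits: reference white
print(sdr_eotf(1.0, 80.0))            # 80.0 nits: same signal after dimming
print(round(sdr_eotf(0.5, 80.0), 1))  # ~15.2 nits: mid-grey slides down too
```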
With HDR every brightness level, and the gamma, is fixed - because that wonderful standard wants to play with the viewer being blinded, or squinting, or being emotionally moved by a simple brightness effect. So the 'capability' to turn down screen brightness isn't in the standard. The flip side is that you now get hundreds of projector owners believing they can also do HDR, because marketing told them so, while their source device tops out at 120 nits (off of the canvas they are using) - and HDR (home user) material currently is mastered to 1000 nit or 2000 nit standards. No one is talking reality into those people either.
They then use relative brightness and turn down the average image brightness level, to be blinded by an effect once in a while - and call that "tha HDR experience".
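What those projector owners end up doing looks roughly like this - a hypothetical, deliberately crude roll-off (crude_tone_map and its 0.5 exponent are made up for illustration; real devices use proprietary tone maps):

```python
# Hypothetical crude tone map: squeeze a 1000 nit master into a 120 nit
# device. The trade it shows: average scene brightness drops so that a
# highlight can still "pop" relative to the rest of the image.

def crude_tone_map(master_nits: float,
                   master_peak: float = 1000.0,
                   device_peak: float = 120.0) -> float:
    """Scale scene luminance into the device range with a soft roll-off."""
    return device_peak * (master_nits / master_peak) ** 0.5

print(round(crude_tone_map(100.0), 1))   # ~37.9 nits: the "average" scene level
print(round(crude_tone_map(1000.0), 1))  # 120.0 nits: the blinding moment
```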
So they do on purpose, and willingly, what you are so reluctant to even try to see how you like it.
Because someone fed you a buzzword, and now you think you need bias lighting.
In short: I doubt it.
If you are a colorist, you currently are supposed to use it (especially on HDR - "Boss, I have a headache again" (dark-room mastering of 2000 nit highlights) goes into money real fast if you are a production facility), but you weren't in the past.
The first thing you have to learn about this scene is that someone is always trying to upsell you on something. And the goal is to get you to a point where you can make informed decisions on your own.