Originally Posted by dovercat
According to the Joe Kane DVE test disc's reference pattern and guide for maximum ambient light, the maximum surround brightness should be 10% of peak white. CinemaQuest also says "SMPTE recommends the level (brightness) of ambient lighting should be no greater than 10% of the brightest white produced in the video image on the display."
SMPTE RP 166 recommends a white level of 120 nits (35 fL) with an ambient light level <12 nits (<3.5 fL), so a maximum of 10%. But I have read that in practice they use white levels similar to those used in Europe. EBU TECH 3320 version 2 (Oct 2010) specifies 70 to at least 100 cd/m2 (20.4-29.2+ fL) white; the old EBU TECH 3320 version 1.1 (May 2008) used 80 cd/m2 (23.4 fL) as an example of a reference white point. That would make the surround 17-12% of peak white, with 15% for the example. EBU TECH 3320 states "If the viewing conditions are standard dim surround (15% as in ITU-R Rec. BT.500-11)", and ITU-R BT.500-11 (Methodology for the subjective assessment of the quality of television pictures) gives "Ratio of luminance of background behind picture monitor to peak luminance of picture: ~0.15".
So max 10% according to the SMPTE standard, maybe 12-17% using the EBU white level and possibly 15% in practice and according to the ITU.
I'm not sure how things work in the EBU, but I believe the "10% max" that Kane and SMPTE refer to is relative luminance, which would be the equivalent of an ~35% stimulus gray on a 2.2-gamma display. (The maximum ambient reference on the 2003 Component edition of DVE is an ~35% stimulus gray.)
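If you want to sanity-check that conversion yourself, here's a quick sketch assuming a simple power-law display gamma (the function name is just mine, not from DVE or SMPTE):

```python
# Find the stimulus (signal) level that produces a given relative luminance
# on a simple power-law display: luminance = stimulus ** gamma, so
# stimulus = luminance ** (1/gamma). Hypothetical helper for illustration.
def stimulus_from_luminance(luminance, gamma=2.2):
    """Signal level (0-1) needed to produce a relative luminance (0-1)."""
    return luminance ** (1.0 / gamma)

# 10% relative luminance on a 2.2-gamma display:
print(round(stimulus_from_luminance(0.10, 2.2), 3))  # ~0.351, i.e. ~35% stimulus
```

So a surround at 10% of peak white luminance matches roughly a 35% gray test pattern, which is why the DVE reference patch looks so much brighter than "10%" might suggest.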
According to DVE, video (in the US) has an average picture (stimulus) level of about 15% over time. In practice I've found a 15% stimulus gray to be a good starting reference for surround light levels with my 34" CRT. Depending on the content, display, etc., you may need to go somewhat higher, or slightly lower, for better results. I leave enough room in the contrast/white level adjustment on my display to accommodate a range of ambient references between about 13% and 20% stimulus.
If your display has poor (i.e., brighter) black levels, then you might want to err toward the brighter end of this range, perhaps closer to a 20% stimulus, to help make blacks look a bit deeper on the display.
As always, YMMV.
FYI, on a 2.45-gamma display (which is the reference I use), a 15% stimulus equals about 1% relative luminance, so the ratio of peak white to surround is roughly 100:1 in relative-luminance terms. In this PDF, Poynton proposes a surround level closer to 5% of peak white for video mastering, which would be a 20:1 ratio of peak white to surround luminance. He also suggests surround lighting that's much brighter than standard practice in the US (100 lux, vs. approximately 16 lux for US video mastering and 64 lux for sRGB encoding). IMO, most TV viewers in the US would find a 20:1 relative-luminance ratio very lacking in contrast.
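The arithmetic behind those two ratios, again assuming a plain power-law display (a simplification; real EOTFs differ near black):

```python
# luminance = stimulus ** gamma for an idealized power-law display.
def luminance_from_stimulus(stimulus, gamma=2.45):
    """Relative luminance (0-1) produced by a signal level (0-1)."""
    return stimulus ** gamma

surround = luminance_from_stimulus(0.15, 2.45)  # 15% stimulus on a 2.45 display
print(round(surround, 4))        # ~0.0096, i.e. about 1% relative luminance
print(round(1.0 / surround))     # ~104, so roughly a 100:1 peak-white:surround ratio

# Poynton's proposed 5%-of-peak-white surround, by contrast:
print(round(1.0 / 0.05))         # 20, i.e. a 20:1 ratio
```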
I'm not sure what the 15% figure you're quoting for the EBU above refers to (% stimulus? absolute luminance? relative luminance?). If it's relative luminance, though, then you're talking about a peak white to surround ratio of only about 7:1, which would be extremely low in contrast compared to video in the US. That would be getting more into the neighborhood of an "average surround" than a "dim surround."
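To put numbers on that reading (again assuming a simple 2.2 power-law display for the stimulus comparison):

```python
# If the EBU's 15% figure is relative luminance, the peak-white : surround
# ratio is just its reciprocal:
print(round(1.0 / 0.15, 1))  # ~6.7, i.e. roughly 7:1

# For comparison, 15% relative luminance corresponds to roughly a 42% stimulus
# gray on a 2.2-gamma display -- noticeably brighter than the ~35% gray that
# a 10%-of-peak-white surround would imply:
print(round(0.15 ** (1.0 / 2.2), 2))  # ~0.42
```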