
·
Registered
Joined
·
1,548 Posts
From the European Broadcasting Union (EBU) document: USER REQUIREMENTS FOR VIDEO MONITORS IN TELEVISION PRODUCTION


Grade 1 Monitor: 70 to at least 100 cd/m2
Grade 2 Monitor: 70 to at least 200 cd/m2



ITU-R BT.500-11 requires monitor brightness up to 200 cd/m2 for tests simulating domestic viewing conditions


So there is no single reference light output, only a minimum requirement for mastering of at least 100 cd/m2 (nits). Since tests simulating domestic viewing conditions allow up to 200 cd/m2, there should be no problem with that brightness or above. If you watch at a higher light output, just check a white-clipping test pattern to verify that nothing gets clipped.
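
If you want to build such a pattern yourself rather than use a test disc, here is a minimal sketch in Python (using numpy and Pillow; the resolution, bar levels, and file name are just illustration values). It draws near-white bars straddling the 8-bit video-range reference white of 235 on a reference-white background:

Code:
import numpy as np
from PIL import Image

W, H = 1280, 720
levels = list(range(230, 255, 2))            # bars straddling reference white (235)
bar_w = W // len(levels)

img = np.full((H, W), 235, dtype=np.uint8)   # reference-white background
for i, lv in enumerate(levels):
    img[H // 4 : 3 * H // 4, i * bar_w : (i + 1) * bar_w] = lv

Image.fromarray(img, mode="L").save("white_clip.png")

If the display is not clipping, the bars above level 235 stay distinguishable from the background; if they merge into it, whites are being clipped. (This assumes your playback chain treats the file as a video-range signal.)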


Note:


Grade 1 monitors are devices for high-Grade technical quality evaluation of images at key points in a production or broadcast workflow. They are used for critical evaluation during image capture, postproduction, transmission and storage. As a minimum requirement, these monitors shall have the quality properties of the image system they are used to evaluate. It is expected that all applied technologies are state-of-the-art at this level. This means that artefacts should not be unduly masked nor should additional artefacts be introduced.
As a reference device, the settings of this type of monitor should be adjustable as well as lockable (mechanically or electrically), so that only authorized access is possible.
The Grade 1 monitor is a ‘measuring instrument’ for visual evaluation of image quality. Therefore, it would be highly desirable for the monitor to be able to reproduce the native scanning mode of the presented signal (i.e. progressive or interlaced) or as it is intended to be viewed (e.g. 50 Hz presentation of 25p material).



A Grade 2 monitor may have wider specification tolerances than a Grade 1 monitor, and as such, can be priced significantly lower, or be smaller in size or weight than a Grade 1 monitor. Grade 2 monitors are used in applications where the tighter tolerances of a Grade 1 monitor (for example on accuracy of colour reproduction and stability) or the additional features of a Grade 1 monitor, are not necessary.
Grade 2 monitors are usually used for image preview, control walls, edit suites, and control rooms where no picture quality manipulation is carried out.
It should be possible for Grade 2 and Grade 1 monitors to be used together, for example in television production control walls.


 

·
Registered
Joined
·
9,575 Posts
From the European Broadcasting Union (EBU) document: USER REQUIREMENTS FOR VIDEO MONITORS IN TELEVISION PRODUCTION

Grade 1 Monitor: 70 to at least 100 cd/m2
Grade 2 Monitor: 70 to at least 200 cd/m2


ITU-R BT.500-11 requires monitor brightness up to 200 cd/m2 for tests simulating domestic viewing conditions
Hi, ITU-R BT.500-11 is pretty old (June 2002); the latest in-force revision is ITU-R BT.500-13 (January 2012). But the industry currently bases its work on this reference document: EBU Tech 3320 (September 2017), which also covers HDR.

Studios use Grade 1 monitors calibrated to 100 nits via 3D LUTs for SDR mastering, so if you want to maintain the artistic intent, you need a 100-nit, 3D-LUT-calibrated display in your own room. For more perfection: for the eye to see color accurately, the surround environment also needs to be chromatically neutral (studios have walls, and even desks, painted with a specific paint).
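
For anyone wondering what "via 3D LUT" means in practice: the LUT is a lattice of measured corrections, and every pixel is corrected by interpolating between the nearest lattice points. A minimal sketch, assuming a 33-point LUT (a common .cube size); the random array below is only a stand-in for real measurement data:

Code:
import numpy as np
from scipy.interpolate import RegularGridInterpolator

N = 33                                   # common 3D LUT resolution
lut = np.random.rand(N, N, N, 3)         # stand-in for a real calibration LUT
axes = [np.linspace(0.0, 1.0, N)] * 3    # R, G, B input axes

interp = RegularGridInterpolator(axes, lut)   # trilinear interpolation by default

pixels = np.array([[0.18, 0.18, 0.18],        # 18% gray
                   [1.00, 1.00, 1.00]])       # reference white
print(interp(pixels))                         # corrected RGB triplets

In a real workflow the LUT comes from profiling the display with a meter, and it is loaded into the monitor, a LUT box, or the grading software rather than applied in a script like this.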

There is an industry-specified neutral matte gray with 18% wall reflectance (Munsell N5), specifically formulated for critical color viewing conditions with a neutral surround, as specified by ISO 3664:2009 (Viewing Conditions - Graphic Technology & Photography) and SMPTE ST 2080-3:2017 (Reference Viewing Environment for Evaluation of HDTV Images). Not all grays are the same; you need a spectrophotometric measurement to be sure the paint reflects an equal mixture of the whole visible spectrum (r-o-y-g-b-i-v).
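
To make the "equal mixture across the spectrum" point concrete, here is a minimal sketch of how a spectrophotometer reading could be checked for neutrality. The wavelength grid is an assumption and the reflectance values are simulated, not real measurements:

Code:
import numpy as np

wavelengths = np.arange(400, 701, 10)             # nm, visible spectrum
reflectance = np.full(wavelengths.shape, 0.18)    # ideal Munsell N5: flat 18%
reflectance += np.random.normal(0, 0.002, wavelengths.shape)  # simulated noise

mean_r = reflectance.mean()
max_dev = np.abs(reflectance - mean_r).max()
print(f"mean reflectance {mean_r:.3f}, max deviation {max_dev:.4f}")

A genuinely neutral gray keeps that deviation small across the whole band; a tinted "gray" shows up as a slope or bump in the curve even when its average reflectance is still 18%.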

Below is the calibration report for the calibrated paint in my room:

 

·
Registered
Joined
·
1,548 Posts
Oh, WOW! Now I've got something to read :)


What about lighting? Is there a recommended colour temperature for the lighting used in the room?
 

·
Registered
Joined
·
9,575 Posts
Oh, WOW! Now I've got something to read :)

What about lighting? Is there a recommended colour temperature for the lighting used in the room?
Personally I don't use lights, but if you want to use one, the walls have to be treated first to minimize the 'color pollution' of the viewing area caused by reflections from chromatic surfaces.

When high-quality D65 standard illumination (the Ideal-Lume PRO by CinemaQuest is a great choice) is reflected from colored walls and other surfaces, its color quality changes, so it is no longer “standard”.

The application of a calibrated (Munsell N5) neutral gray to chromatic surfaces will eliminate such color pollution by providing spectrally neutral surfaces around the viewing area.

Room lighting is critical for color grading rooms, where colorists work 8-14 hours a day to finish a project. See here: https://mixinglight.com/color-tutorial/anatomy-of-a-grading-suite-design/
 

·
Registered
Joined
·
379 Posts
Discussion Starter #25
If 100 nits is the correct brightness for all content, why would I ever switch off HDR mode on Windows 10? Just for 8-bit RGB?
 

·
Registered
Joined
·
9,575 Posts
If 100 nits is the correct brightness for all content, why would I ever switch off HDR mode on Windows 10? Just for 8-bit RGB?
Movies for SDR home video release are mastered on 100-nit calibrated monitors.

What happens with other content, or what is recommended for desktop/web/design/photo editing under various lighting conditions at all hours of the day, is a completely different topic.

Calibration is about understanding and following references, not adjusting things to personal preference.
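
To put numbers on that 100-nit reference, here is a minimal sketch assuming an idealized gamma-2.4 power law with zero black level (real BT.1886 adds a black-level term):

Code:
peak, gamma = 100.0, 2.4      # cd/m² at reference white, display gamma

def code_to_nits(code8: int) -> float:
    """8-bit video-range code (16-235) -> approximate luminance in nits."""
    v = max(0.0, min(1.0, (code8 - 16) / (235 - 16)))
    return peak * v ** gamma

for c in (16, 126, 180, 235):
    print(f"code {c:>3} -> {code_to_nits(c):6.1f} nits")

On such a display a mid-gray code lands around 19 nits and reference white at exactly 100 nits; raising the light output scales everything together, which is why only the top of the range needs checking for clipping.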
 

·
Registered
Joined
·
379 Posts
Discussion Starter #27
Movies for SDR home video release are mastered on 100-nit calibrated monitors.

What happens with other content, or what is recommended for desktop/web/design/photo editing under various lighting conditions at all hours of the day, is a completely different topic.

Calibration is about understanding and following references, not adjusting things to personal preference.
It's also reasonable to assume that most web designers and game developers work at 100 to 120 nits D65 as well, so that basically covers almost everything.
 

·
Registered
Joined
·
1,746 Posts
It's also reasonable to assume that most web designers and game developers work at 100 to 120 nits D65 as well, so that basically covers almost everything.
I often wonder how reasonable that assumption is.
 

·
Registered
Joined
·
379 Posts
Discussion Starter #29
I often wonder how reasonable that assumption is.
Well, if they don't, your calibration just randomizes an already random source, so it doesn't matter. It's like trying to review speakers with an unknown response in a room with an unknown response.
 

·
Registered
Joined
·
186 Posts
Does anyone know if Sony's X-tended Dynamic Range distorts the HDR tone curve? I ask because, when HDR is detected, both brightness and XDR are automatically set to maximum. XDR, however, can be turned off manually. Personally, I think the image looks better with XDR off. But which setting is the most accurate? Thanks.
 

·
Registered
Joined
·
379 Posts
Discussion Starter #31 (Edited)
Does anyone know if Sony's X-tended Dynamic Range distorts the HDR tone curve? I ask because, when HDR is detected, both brightness and XDR are automatically set to maximum. XDR, however, can be turned off manually. Personally, I think the image looks better with XDR off. But which setting is the most accurate? Thanks.
I would switch off XDR for HDR content. Measurements have shown that Sony follows the PQ curve in HDR mode even for content mastered at 10000 nits, so XDR will only distort it. If you have an X930E or Z9D which goes well above 1000 nits, I see no reason to use any form of dynamic tone mapping.
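
For reference, "follows the PQ curve" means tracking the SMPTE ST 2084 EOTF, which maps the signal to absolute luminance. A minimal sketch of that transfer function (the constants are the published ST 2084 values):

Code:
import numpy as np

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Normalized PQ signal in [0, 1] -> absolute luminance in cd/m² (nits)."""
    p = np.power(np.clip(signal, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

print(pq_eotf(np.array([0.508, 0.752, 1.0])))   # ~100, ~1000, 10000 nits

A display that tracks this curve reproduces those absolute values up to its own peak; what happens above the peak is where tone mapping enters the picture.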
 

·
Registered
Joined
·
18,301 Posts
I would switch off XDR for HDR content. Measurements have shown that Sony follows the PQ curve in HDR mode even for content mastered at 10000 nits, so XDR will only distort it. If you have an X930E or Z9D which goes well above 1000 nits, I see no reason to use any form of dynamic tone mapping.
This is incorrect. Leave XDR set to its default setting in HDR and DV for Sony LCDs. It has zero impact on PQ tracking and is tied to the tone mapping and luminance output of Sony LCDs.
 

·
Registered
Joined
·
379 Posts
Discussion Starter #33
This is incorrect. Leave XDR set to its default setting in HDR and DV for Sony LCDs. It has zero impact on PQ tracking and is tied to the tone mapping and luminance output of Sony LCDs.
Tone mapping is the act of deviating from PQ tracking. I'd leave it on for OLEDs which are limited to 600-800 nits peak, but not for an LCD which can do 1800 nits peak, because most content is mastered for 1000 nits.
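
To illustrate what such a deviation looks like, here is a generic knee-and-roll-off sketch; this is not Sony's actual algorithm (which is not public), and the knee point and panel peak are arbitrary illustration values:

Code:
import math

def tone_map(nits: float, panel_peak: float = 1800.0, knee: float = 0.75) -> float:
    """Track PQ 1:1 up to the knee, then roll highlights off toward panel peak."""
    start = knee * panel_peak
    if nits <= start:
        return nits                  # exact PQ tracking below the knee
    headroom = panel_peak - start
    return start + headroom * (1.0 - math.exp(-(nits - start) / headroom))

for n in (100, 1000, 1350, 2000, 4000, 10000):
    print(f"{n:>6} nits mastered -> {tone_map(n):7.1f} nits displayed")

Below the knee the image is untouched; only highlights above it get compressed, so 2000+ nit content still differentiates instead of hard-clipping at the panel's limit.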
 

·
Registered
Joined
·
18,301 Posts
Tone mapping is the act of deviating from PQ tracking. I'd leave it on for OLEDs which are limited to 600-800 nits peak, but not for an LCD which can do 1800 nits peak, because most content is mastered for 1000 nits.
What??? Tone mapping is compression. But if you want to use the term 'deviation': what is a Sony like the Z9D going to do with 2000+ nit content?

Also XDR does not exist on OLEDs. When was the last time you saw a Sony OLED menu and/or actually calibrated one?
 

·
Registered
Joined
·
2,228 Posts
Tone mapping is the act of deviating from PQ tracking. I'd leave it on for OLEDs which are limited to 600-800 nits peak, but not for an LCD which can do 1800 nits peak, because most content is mastered for 1000 nits.


I suggest that you listen to the advice given. D-Nice is a trusted calibrator who knows his stuff, with many years of experience.
 

·
Registered
Joined
·
379 Posts
Discussion Starter #36
What??? Tone mapping is compression. But if you want to use the term 'deviation': what is a Sony like the Z9D going to do with 2000+ nit content?

Also XDR does not exist on OLEDs. When was the last time you saw a Sony OLED menu and/or actually calibrated one?
I meant dynamic tone mapping in general, on TVs that do above 1000 nits.
 

·
Registered
Joined
·
18,301 Posts
I meant dynamic tone mapping in general, on TVs that do above 1000 nits.
Unfortunately you still are not making sense. You do not touch any of the default values on any flat panel TV when it comes to HDR10/DV since those settings are tied to the tone mapping algorithm. If you change the setting, you are degrading the output. It does not matter if the TV can do 1800 nits. It will still tone map. XDR isn’t a dynamic tone mapping setting either.
 

·
Registered
Joined
·
379 Posts
Discussion Starter #38
Unfortunately you still are not making sense. You do not touch any of the default values on any flat panel TV when it comes to HDR10/DV since those settings are tied to the tone mapping algorithm. If you change the setting, you are degrading the output. It does not matter if the TV can do 1800 nits. It will still tone map. XDR isn’t a dynamic tone mapping setting either.
I see XDR behaves differently in HDR mode than in SDR mode. What exactly does XDR do in HDR mode?
 
