Originally Posted by Javs
Current MadVR Settings, tried all of them:
Tried every combination of settings on this page.
I'm pretty close to that, with one exception: I think I've got my "fix too bright" setting at 90% luminance reduction / 10% saturation reduction, though I really should play around with it more. I also haven't settled on a peak nits value. 200 resulted in an image that's way too bright, far brighter than SDR. I think I've got mine at 300 now, and that still seems too bright.
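For anyone curious what a luminance/saturation split like that trades off, here's a toy sketch. This is not madVR's actual algorithm; the blend weights and the mix-toward-gray desaturation step are my own stand-ins, just to show the two ways an over-bright pixel can be tamed:

```python
# Toy sketch: taming an over-bright pixel two ways - scaling it down
# (luminance reduction) vs. mixing it toward a gray of equal luminance
# (saturation reduction). NOT madVR's actual algorithm, just an illustration.

def luma(rgb):
    """Rec.709 relative luminance of a linear RGB triple."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def fix_too_bright(rgb, peak, lum_weight=0.9):
    """Bring max(rgb) down to `peak`, doing `lum_weight` of the work by
    scaling luminance and the remainder by desaturating toward gray."""
    m = max(rgb)
    if m <= peak:
        return rgb
    # Step 1: partial luminance reduction toward the target scale.
    scale = peak / m
    partial = 1 - lum_weight * (1 - scale)
    rgb = tuple(c * partial for c in rgb)
    # Step 2: desaturate (mix toward gray) just enough to reach the peak.
    y = luma(rgb)
    m = max(rgb)
    if m > peak and m > y:
        t = (m - peak) / (m - y)  # mix fraction toward gray
        rgb = tuple(c + t * (y - c) for c in rgb)
    return rgb

bright_red = (2.0, 0.2, 0.1)  # exceeds a display peak of 1.0
out = fix_too_bright(bright_red, 1.0)
print(out, "max =", max(out))
```

With a 90/10 split the pixel lands at the peak mostly by getting dimmer, with only a little hue-preserving wash-out; flip the weights and you keep more brightness but the color goes paler.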
Also, I've got my calibrated gamma set to 2.4, since that's what my projector is calibrated to.
Originally Posted by Dave Harper
Excellent write-up and info, Stanger89! There is a third option too, though. You can do what I did with HarperVision (and what the Germans are now doing for the Sonys with custom gammas) and still send full HDR to the projector, but then manually switch to SDR mode on the projector (while maintaining the HDR input signal). This forces you to make a custom gamma and really jerk some of the picture settings in the menus around, but it gets the image back looking normal and awesome, and better than straight HDR on projectors (and most other options, in my humble opinion!).
That's basically what the custom gamma is, it's just that Arve made a tool to compute/upload it for us. The first custom gammas we did were by hand adjusting the 11-point gamma settings to be correct.
Originally Posted by Bytehoven
Stanger... how are you calibrating your straight-line 2.4 gamma? Do you use the projector gamma controls, or do you use Arve's tool to design a traditional 2.4?
I just do an autocal, then select a Custom gamma, 2.4, and then +2 on the Dark Level, which seems to get pretty close to BT.1886.
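For reference, BT.1886 isn't a pure power curve: it lifts the shadows based on the display's black level, which is why a small Dark Level bump on top of a 2.4 power gamma can land close to it. A rough sketch of the difference (the Lw/Lb values here are illustrative, not from anyone's actual calibration):

```python
# Sketch: BT.1886 EOTF vs. a pure 2.4 power gamma (illustrative values only).
# BT.1886: L = a * max(V + b, 0) ** 2.4, with a and b derived from the
# display's white (Lw) and black (Lb) luminance.

def bt1886(v, lw=100.0, lb=0.05):
    """Map a normalized signal v (0..1) to luminance in nits per BT.1886."""
    k = lw ** (1 / 2.4) - lb ** (1 / 2.4)
    a = k ** 2.4
    b = lb ** (1 / 2.4) / k
    return a * max(v + b, 0.0) ** 2.4

def power_gamma(v, lw=100.0, gamma=2.4):
    """Pure power-law gamma: L = Lw * v**gamma."""
    return lw * v ** gamma

# Near black, BT.1886 sits above the pure power curve (lifted shadows);
# toward the top end the two curves converge.
for v in (0.05, 0.10, 0.50, 1.00):
    print(f"v={v:.2f}  BT.1886={bt1886(v):8.3f} nits  2.4 power={power_gamma(v):8.3f} nits")
```

The lower the display's black level, the closer BT.1886 collapses to a pure 2.4 power curve, which matches the intuition that projectors with deep blacks need less of a dark-level lift.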
Harper... can you comment on how you are adjusting the SDR gamma controls to shoehorn in the HDR source?
Not to speak for Dave, but from what I remember reading in the Epson threads, he's just cranking on the multi-point gamma controls, which is exactly the same thing we're doing with custom curves; we just have a tool to do it.
I am also curious if the madvr or oppo tone mapping is linear or can you tweak, stretch or compress areas of the map to suit a desired look?
Javs posted the madVR controls above, so that's pretty much what you've got to work with there.
I'd love to get that link again with the nit information for films. Also, is there a link that keeps a master list and updates it as more titles come out? It would be nice if the studios would provide the tech info right on the title to assist with playback setups. In that regard, is there a trend as far as films being mastered to 300, 1000, or 4000 nits? Or are they falling wherever right now? It seems a little bit like the Wild West just now, until maybe a new THX-esque sheriff comes to town. Maybe that will be Dolby?
Someone posted this, no idea if it's being kept up or not:
Originally Posted by Dominic Chan
Haven't people been using two custom tone mapping curves, one for 1000-nit master and the other for 4000-nit master?
The more posts/threads I read, the more I think Kris Deering is right: including the mastering monitor details in the metadata was a mistake. Knowing the mastering monitor details, whether the Max Luminance is 1000 or 4000, or whether the black level was 0.005 or 0, doesn't tell us anything useful, and it has led, and continues to lead, to confusion and wrong paths. Most notably, look at the recent discussion on calibrating black level. For a long time we thought it was "optimal" to calibrate to a black of 0.005 because of a couple of poorly mastered titles and the coincidence that they happened to come from mastering monitors with 0.005-nit blacks.
But the mastering monitor properties are utterly useless for this. What we should be paying attention to is MaxCLL: that's what says how bright the brightest pixel in the movie is, and that's what's relevant. Take Blade Runner 2049: its reported Max Luminance (mastering monitor) is 10,000, yet its MaxCLL is 181. If you were going to pick a curve for that, would it be better to pick a 4000-nit curve, because the Max Luminance is > 1100? Or should you pick an 1100-nit curve, because the MaxCLL is < 1100?
There are lots and lots of "1000 nit" titles with MaxCLL and MaxFALL brighter than "4000 nit" titles, so going by the mastering monitor properties is no more likely to lead to an "optimal" curve than just using one curve for everything, which is my recommendation. When I've tried comparing curves where the only thing I change is the white clipping point, the differences are vanishingly small; if I weren't watching the curve being uploaded, I wouldn't notice a difference at all.
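A toy curve makes it obvious why the clipping point barely matters for low-MaxCLL content. This is my own simple knee/roll-off sketch, not madVR's actual curve, mapping scene nits to a 300-nit display with two different clip points:

```python
# Sketch: why the white clipping point barely matters for low-MaxCLL content.
# A simple knee/roll-off tone curve (a toy curve, not madVR's), mapping
# scene nits onto a 300-nit display with a configurable clip point.
import math

def tone_map(nits, display_peak=300.0, clip=1100.0, knee_nits=200.0):
    """Pass through linearly below knee_nits, then roll off so that `clip`
    scene nits lands exactly at `display_peak`."""
    if nits <= knee_nits:
        return nits
    # Compress [knee_nits, clip] smoothly into [knee_nits, display_peak].
    t = (nits - knee_nits) / (clip - knee_nits)
    return knee_nits + (display_peak - knee_nits) * math.tanh(t) / math.tanh(1.0)

# Blade Runner 2049's MaxCLL is ~181 nits: below the knee, both curves
# pass it through untouched. The curves only diverge for highlights well
# above that, which this title simply never contains.
for clip in (1100.0, 4000.0):
    print(f"clip={clip:6.0f}: 181 nits -> {tone_map(181, clip=clip):6.1f}, "
          f"1000 nits -> {tone_map(1000, clip=clip):6.1f}")
```

With this shape, any two clip points agree exactly on everything below the knee, so for a title whose brightest pixel is 181 nits the choice between an "1100-nit" and a "4000-nit" curve changes nothing on screen.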