Originally Posted by Javs
When I was doing my quick pictorial analysis of the Arve curves a while back (I am still using the 650 curve and love it - so much so, I haven't been fussed enough yet to try the little bit extra all you guys have managed with the tool), I had set SDR to wide open to match at least the peak white of the HDR I was comparing against.
I have to say that the 1300 nit clipped SDR was very bright, but it clearly lost a crap ton of highlight finesse, and when compared directly, the HDR image was more contrasty and punchy - definitely HDR looked better with the curves.
BUT the 4000 nit SDR title, where I used the contrast slider only in the Panny to clip to 4000 nits, was very close to the HDR image curves, with a couple of notable differences: the majority of the midtones matched HDR, but the SDR highlights seemed totally muted. Odd, since the peak white points would have been fundamentally the same, but it was the range between, say, 200 nits and 4000 nits that looked a bit muted, and this can be seen in the images below.
If there is something specific you want me to try out, Manni - set for a certain peak white, using certain curves - I would be more than happy to spend an hour having a look and even taking a couple more snaps, if it helps...
But yeah, my take, since I actually did brightness-match SDR with HDR at about 80 nits: HDR just looked more 'correct' when they were both matched in peak white clipping points. No doubt about it. You do need to A/B them quite quickly, or better yet take some images, to see/remember how the highlights look different, but they definitely do, and not in favour of SDR WCG.
Going down to -10 for SDR WCG, clipping back at 1100 nits again - well, it's nice, but with those stress-test shots in 4000 nit films you just lose too much. For 1000 nit films it's still nice, I don't mind it, but I have to say I don't miss it either since getting the DI back and a completely natural HDR image. I also don't see the ANSI increase from going wide open amounting to much when you can hit >100k:1 intra-scene with lower APL films - I think that's a far more obvious advantage than, what is it, a 30:1 ANSI increase on a good day going from -10 to 0? Wish I could have my cake and eat it too. I do relish normal HD Blu-ray viewing actually, I like both. I am sure Planet Earth would be far less of an event in SDR HD, that's for sure, that's why I held out, but with older films and such I certainly don't sit there wishing they were HDR. MadVR does such a good job that for me it's all still 99% SDR HD.
I should give LDVD's Medium curve a go - sounds like everybody likes it. Despite clipping at 4000 nits in my curve, I would like just a tiny bit more finesse on the edge of that clipping point; the last few levels of the gradient are just a bit too close together, which almost gives the illusion of clipping earlier.
I know you have seen these, but check out the 4000 nit comparisons again - those are set for approx 80 nits clipping and brightness matched. You will also see where 4000 nit SDR falls apart: it's not in average brightness, it's actually the highlight info being a bit muted and losing a lot of that punch you get with HDR at the same exact peak white.
Thanks for this. I've been advising for a while now to use the actual peak brightness for SDR BT2020 up to 100 nits, so you did what was expected. I'm not surprised by your results, but you have twice as much peak brightness as Ric, so you have much more room for highlights than he does.
Also, Ric is clipping around 2500 nits (as I remember), so he's losing highlight detail in 4000 nit titles too, which you aren't with a 650 curve.
The reason you lost detail above 200 nits in SDR BT2020 might be that you used only contrast in the Panny to set the clipping point. As you might remember, that's not what I recommend. I use the SDR slider too, down to -4 or -5, with contrast at -1 in the Panny. I know that I'm clipping around 2000 nits, but I definitely have all the detail below that. For example, in a 1000 nit title such as Deadpool, there is no loss of detail in the highlights at all. I know that beyond -5 the slider changes gamma significantly, but down to -5 I find it helps get better highlights.
Of course, with 4000 nit titles whose content actually goes up to 4000 nits, such as Mad Max or BvS, there is a loss of detail, but not as much as what you describe.
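To put some numbers on these clipping points, here's a quick Python sketch using the standard SMPTE ST 2084 (PQ) inverse EOTF - this is just the published formula, nothing specific to madVR, the Panny or the Arve tool - showing what fraction of the PQ signal range each of the levels we keep mentioning sits at:

Code:

# SMPTE ST 2084 (PQ) inverse EOTF: absolute nits -> normalized signal.
# Constants straight from the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_signal(nits):
    """Fraction of the full PQ signal range used by `nits` (0-10000)."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (80, 100, 200, 1000, 1100, 2000, 2500, 4000):
    print("%5d nits -> %.3f of the PQ range" % (nits, pq_signal(nits)))

Everything from 200 nits (about 0.58) to 4000 nits (about 0.90) occupies roughly the top third of the PQ signal, so when that stretch gets compressed or clipped, a lot of encoded gradation goes with it - which lines up with what Javs saw between 200 and 4000 nits.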
So yes, when you have some time, it would help if you could try setting SDR BT2020 properly, with contrast at -1 in the Panny and SDR at -4 or -5 (that gives me a 1200-2000 nits clipping point). I might have the contrast slider at -1 in the JVC as well, I can't remember.
Still, you have a lot more headroom for highlights. With a 650 curve, you can keep reference white at around 16 nits in HDR, like most of us do, while Ric has to lower it to under 10 nits or so, and he's still clipping at 2500 nits, so still losing detail on 4000 nit titles whose content goes up to 4000 nits.
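Just to make the headroom point concrete, a quick back-of-the-envelope calculation with the figures quoted in this thread (the numbers are assumptions from the posts, not anything I've measured myself):

Code:

import math

# Peak brightness and reference white (in nits) as quoted in this thread.
setups = [("Javs, 650 curve", 80.0, 16.0),
          ("Ric",             42.0, 10.0)]

for name, peak_nits, ref_white in setups:
    ratio = peak_nits / ref_white
    print("%s: %.1fx over reference white (%.2f stops of highlight headroom)"
          % (name, ratio, math.log2(ratio)))

The stops come out similar (about 2.3 vs 2.1) only because Ric has to drag reference white down to around 10 nits to get there, i.e. a visibly dimmer image overall, whereas at 80 nits you keep reference white at 16 nits and get the headroom on top of it.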
As people get closer to 100 nits peak brightness, no doubt it makes sense to use a custom curve and get the Linker. Anything above 50 nits has potential for some improvement, but since up to 100 nits you can simply set peakY to your actual peakY in SDR BT2020 and enjoy, I find the benefit becomes stronger as you get close to 100 nits, and above it of course, because you can keep reference white at the same level (around 16 nits) and have more headroom for highlights.
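For anyone wondering what a custom curve actually buys near the top end, here's a deliberately crude toy curve - my own sketch, emphatically not madVR's or Arve's actual math, with the 80/650/4000 figures simply borrowed from this thread - comparing a straight line that clips at 650 nits with the same line rolled off to reach peak white at 4000 nits:

Code:

def hard_clip(content_nits, peak=80.0, clip_at=650.0):
    # Straight line through the origin that runs out of light at clip_at;
    # everything above slams into peak white.
    return min(content_nits, clip_at) * peak / clip_at

def soft_knee(content_nits, peak=80.0, clip_at=650.0, knee=0.6,
              content_max=4000.0):
    # Same straight line below the knee, then a Reinhard-style shoulder
    # that only reaches peak white at content_max. Real curves smooth
    # the transition; this toy doesn't bother.
    slope = peak / clip_at
    k_in = knee * clip_at    # content level where the shoulder starts
    k_out = k_in * slope     # display level at that point
    if content_nits <= k_in:
        return content_nits * slope
    x = min(content_nits - k_in, content_max - k_in) / (content_max - k_in)
    c = 0.1                  # smaller = harder shoulder
    return k_out + (peak - k_out) * x * (1 + c) / (x + c)

for nits in (100, 200, 650, 1000, 2000, 4000):
    print("%4d nits -> clip %5.1f, knee %5.1f"
          % (nits, hard_clip(nits), soft_knee(nits)))

With the hard clip, everything from 650 nits up lands on the same 80 nits of display light; with the shoulder, 650/1000/2000/4000 come out around 63/70/77/80 nits, so the highlights stay differentiated - at the cost of the shoulder region sitting below the straight line. That trade-off is exactly what the last few gradient levels near the clipping point come down to.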
My question was specifically for Ric, who with 42 nits peak brightness arguably doesn't have enough brightness for SDR. This is why I was asking him where the PQ improvement objectively lies in his HDR curve vs SDR BT2020. With 80 nits peakY, you're close enough to 100 nits to get a lot more benefit, and in any case you are at least above 50 nits, which is what we need for SDR. If you do any testing, please set your peakY to 42 nits; otherwise I already agree with you - with 80 nits I would use HDR with a custom curve and a Linker too.
By the way, I'm not saying that there are no objective PQ benefits in Ric's case, I'm only asking what they are, because I can't see how he could achieve a significantly better picture with such a low peak brightness.
Also, in SDR I would always privilege on/off over ANSI, especially for low APL titles, but in HDR I find that 30-35% better ANSI makes a huge difference; that's why I prefer to use the iris fully open in HDR, especially now that we've solved the black floor issue.