Originally Posted by BondDonBond
Hmmm, Ok I might be confused and I am not an engineer.
1) Is it fair to say that for good 4K HDR you need Lumens more than contrast? Yes I know you want both in perfect world.
It's a balance, but I find poor contrast pulls me out of the experience far sooner than missing "eviscerating" highlights.
2) For proper HDR you want 1000nits (about 10,000 lumens)
Not necessarily for projectors. Dolby Cinema peaks at 106 nits. Now, we don't have Dolby Cinema masters at home, but the consensus (at least among JVC owners) seems to be that about 100 nits is what it takes for great HDR; of course, more is better.
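For reference, the rough "1000 nits ≈ 10,000 lumens" equivalence depends entirely on screen size and gain. A quick sketch of the conversion for a Lambertian (matte) screen; the 100" 16:9 screen and unity gain here are my own illustrative assumptions, not anything from the posts above:

```python
import math

def nits_from_lumens(lumens, diag_in=100.0, aspect=16/9, gain=1.0):
    """Approximate on-screen luminance (nits) from projector lumen output,
    assuming a uniformly lit matte screen. Defaults (100" 16:9, gain 1.0)
    are hypothetical, chosen just to illustrate the ballpark."""
    diag_m = diag_in * 0.0254                       # diagonal in meters
    width = diag_m * aspect / math.hypot(aspect, 1)
    height = width / aspect
    area_m2 = width * height                        # screen area in m^2
    # For a Lambertian reflector: luminance = (flux x gain) / (pi x area)
    return lumens * gain / (math.pi * area_m2)

print(round(nits_from_lumens(10_000)))  # ~1150 nits on a 100" gain-1.0 screen
print(round(nits_from_lumens(1_000)))   # ~115 nits
```

So on a screen around 100", 10,000 calibrated lumens does land in the ~1000-nit neighborhood; a bigger screen or lower gain pulls that down fast.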
3) To most people if they have great HDR the contrast (Black) would be overlooked
I doubt it. For example, Manni can get either 130 nits in low lamp (RS500, I think) or 200 nits in high lamp, and has said he generally runs in low lamp for the better black levels and lower noise, only using high lamp for very bright titles. A lot of folks in the RS500 thread bought HDFury Integrals, and later Linkers, solely to be able to disable HDR (forcing the player to convert HDR -> SDR+Rec.2020) specifically because of the reduced contrast.
4) Color gamut of course is important but the article about Joe Kane's presentation was interesting about 2020 on the main page of AVS
If we're thinking of the same thing, that's irrelevant here: it's talking about Rec.2020 displays, and Rec.2020 is just a container as far as we're concerned currently. Today's UHD Blu-rays are only mastered to about DCI-P3, and good projectors can hit about that. At DCI-P3, none of the issues Joe talks about come up.
Soooo wouldn't you want laser with highest lumens you can afford since no one has high lumens high contrast under 100K and even then not to spec.
Just based on what I've seen in my own HT, I'd keep my RS600 rather than switch to the Sim2 Duo (9000 Lumens, but about 1/10th the contrast).
Let's put things in perspective...
Let's take a UHZ65 and assume it can do about 1,000 lumens calibrated and about 2,000:1 native CR, both of which are better than the UHD65s have measured. Say we calibrate it for HDR with a peak white of 100 nits. That puts the black level at 0.05 nits.
Now let's assume a hypothetical UHZ6500 with 10,000 lumens calibrated and the same CR. That would put it at 1,000 nits for peak white and a black level of 0.5 nits, meaning black would be about as bright as a 5% APL pattern.
Would you want to watch HDR where black was effectively 5% APL?
Here is a spec for a cinema projector that really confused me, and most are like this.
Resolution 4,096 x 2,160
Brightness Up to 56,000 lumens
Native contrast ratio 2,800:1 (typical) / 500:1 ANSI contrast (typical)
Cinema projectors have to light enormous screens, so they have to make sacrifices in other areas. But to address your question directly (brightness or contrast?), let's look at Dolby Vision, since it was built for Dolby Cinema, and compare it to a typical cinema.
Per DCI, a typical cinema has a peak white of about 50 nits (14-16 fL), and as you note, the typical DLP cinema projector is in the 2,000:1 range contrast-wise, maybe a bit higher. So what did Dolby actually design into Dolby Cinema? It has a peak white of "only" 106 nits, roughly double that. However, it uses a 6-chip, tandem (stacked) DMD design that can achieve over 1,000,000:1 contrast (theoretically).
So when Dolby went to build Dolby Cinema, HDR for cinemas, they boosted the brightness a bit but made a massive increase in contrast: roughly 500 times.
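The "500 times" figure falls straight out of the numbers above; a quick check (the ~48-nit / 2,000:1 DCI baseline is the approximate figure quoted earlier, not an exact spec):

```python
# Approximate DCI reference cinema vs. Dolby Cinema, per the figures above
dci_peak_nits, dci_cr = 48, 2_000            # ~14 fL peak, typical DLP native CR
dolby_peak_nits, dolby_cr = 106, 1_000_000   # Dolby Cinema peak and claimed CR

print(dolby_peak_nits / dci_peak_nits)  # brightness up only ~2.2x
print(dolby_cr / dci_cr)                # contrast up 500x
```

In other words, when Dolby defined HDR for the cinema, they bet on contrast, not lumens.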