Originally Posted by Verge2
It's a dark chip3 dmd. Why would you think it was any better than the other dark chip 3 projectors on the market?
Being a huge HDR proponent, I'm more interested in how its HDR performance stacks up, given the superior ANSI contrast of DLPs. This one apparently hits 500:1.
Also, this Optoma has 50% more lumens than the lowest-end 2017 JVC (RS420?), which should also help HDR performance: if you can't get blacks as deep, push peak whites higher instead. Judging by reviews of the other faux-K DLPs, their HDR performance seems to be up there with the best available.
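To put the blacks-vs-peak-whites trade-off in numbers: dynamic range in stops is just log2(peak / black), so a brighter projector with somewhat raised blacks can span the same range as a dimmer one with deeper blacks, only shifted up toward the highlights where HDR lives. A quick sketch (all luminance figures here are made up for illustration, not measurements of either projector):

```python
import math

def stops(peak_nits, black_nits):
    """On-screen dynamic range expressed in photographic stops."""
    return math.log2(peak_nits / black_nits)

# Hypothetical figures: a brighter DLP with raised blacks vs a dimmer
# high-contrast design with deeper blacks. Both span ~5000:1 on/off.
bright = stops(peak_nits=180.0, black_nits=0.036)
dim    = stops(peak_nits=120.0, black_nits=0.024)
print(f"bright projector: {bright:.1f} stops")  # same range in stops,
print(f"dim projector:    {dim:.1f} stops")     # but shifted toward the highlights
```

Same number of stops either way; the brighter unit just spends them higher up the luminance scale, which is where HDR highlights live.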
And as awesome as JVC's on/off contrast is, I do somewhat agree with the assertion that ANSI contrast matters just as much, if not more, because it reflects contrast within a single frame. I'd love to see both side by side across a variety of scenes in a controlled environment.
Here is a review from the excellent Anna and Flo:
"Before calibration every mode was measuring an ON-OFF contrast of about 13500:1 with Zoom Max / Iris closed. A bit of contrast got lost through the calibration process.
In our optimized room we measured at the screen a modified ANSI contrast value of 235:1 (for 50% ADL). At the lens, the modified ANSI contrast value reaches 257:1. The older JVC DLA-X500 showed similar results.
The ANSI contrast tells you how well a projector can display black next to white. The JVC DLA-X5000's ANSI contrast is good, but not the best. That is why we were able to see vertical streaking in the black parts above and below the white squares of our contrast patterns, as you can see on the image below. This behavior is common for projectors with a very high on-off contrast. Luckily it is not visible in a movie, except on white writing on a black background."
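For anyone curious where a "modified ANSI contrast" style number comes from: ANSI contrast is measured off a 4x4 black/white checkerboard, averaging the luminance of the white squares and dividing by the average of the black squares. A rough sketch with made-up patch readings (not Anna and Flo's actual data):

```python
# Hypothetical luminance readings (nits) from the 8 white and 8 black
# squares of a 4x4 ANSI checkerboard -- illustrative values only.
white_patches = [235.0, 240.0, 238.0, 233.0, 236.0, 239.0, 234.0, 237.0]
black_patches = [1.02, 0.98, 1.05, 0.95, 1.00, 1.03, 0.97, 1.00]

avg_white = sum(white_patches) / len(white_patches)
avg_black = sum(black_patches) / len(black_patches)
print(f"ANSI contrast ~ {avg_white / avg_black:.1f}:1")
```

Scattered light from the white squares raises the measured black level, which is exactly why high on/off machines can still post modest ANSI numbers.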
DLPs seem to have roughly twice the ANSI contrast of JVCs, and that figure apparently matters even more for HDR.
A similar argument can be made for FALD LCD vs OLED in HDR, given that superior peak brightness translates to higher dynamic range and more image "pop", which even entry-level DLPs are known for. I'm still amazed at how good my w1070's image is, and I have no doubt it would look better with a firmware update adding HDR10. There is no reason you couldn't modify the signal-handling curves on any DLP to support HDR10; they even handle 10-bit natively.
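For what it's worth, the HDR10 transfer function (the SMPTE ST 2084 "PQ" EOTF) is just fixed math applied to the decoded signal, which is why a firmware-level remap seems plausible. Here's the standard PQ curve, using the constants from ST 2084:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized HDR10 code value in [0, 1]
# to absolute luminance in nits (cd/m^2), peaking at 10,000 nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(n: float) -> float:
    """Normalized PQ signal -> luminance in nits."""
    p = n ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))   # 0.0 nits
print(pq_eotf(1.0))   # 10000.0 nits
```

A projector firmware "supporting HDR10" essentially has to run this curve (plus tone mapping down to the panel's real peak), so on a 10-bit-native DLP it's a signal-processing change, not a hardware one.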