Originally Posted by Vitus4K
Everything less than 10,000 nits is a compromise.
And any theater with a projector budget under a couple million dollars is also a compromise. That is as much a fact as your statement, since you can't get 10k nits in a screen over 85" for less (if at all), and since 85" is WAY too small for a home cinema, it is safe to say that ALL home theaters are a compromise. That makes your statement redundant at best, irrelevant at worst.
We all strive for "the best that we can get within our budgets" and anything that improves things within our budget as much as DTM does is not a gimmick, it is a game changer.
I run a 150" wide screen, and everyone with a screen this large knows from experience that HDR is a challenge at this size. When I started with an Epson 5040 projector, HDR was unwatchable because, as you have pointed out, 10k nits was not achievable. Sure, if I had $60k I could go to a Sony 5000es and get a couple hundred nits, but that is still a far cry from 10k nits, even though with the minimal tone mapping the 5000es would need, HDR would be pretty decent. But when I tried a custom luminance curve on my cheap Epson projector, it made some HDR movies not only watchable but an improvement over SDR. That opened my eyes to the possibility that tone mapping could make HDR a reality on larger screens without 10k nits, or even with less than a hundred nits.

Upgrading to the JVC RS2000 with adjustable tone mapping was a major increase in functionality and truly a game changer. Nobody else could compete at my screen size for less than $60k in a single unit, and at best other brands could only compete by adding an outboard processor that costs $5k or more by itself. I could now watch ANY HDR content, and the quality was drastically better than with a static luminance curve dialed in for one type of movie.
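To make the "custom luminance curve" idea concrete: a static tone map is just a fixed function from the content's mastered luminance (graded for up to 10,000 nits) down to what the display can actually produce. Here is a minimal sketch, assuming a simple Reinhard-style roll-off; this is illustrative only, since neither Epson nor JVC publishes their actual curves, and real implementations work on PQ-encoded signals per channel:

```python
def tone_map(nits_in: float, peak_display: float = 100.0) -> float:
    """Static Reinhard-style tone map: near-linear for dark content,
    asymptotically approaching the display's peak for bright highlights.
    Illustrative sketch only -- not any projector's actual curve."""
    # out = L / (1 + L/peak): shadows pass through almost unchanged,
    # while a 10,000-nit highlight lands just under peak_display.
    return nits_in / (1.0 + nits_in / peak_display)

# Shadows survive, highlights compress instead of clipping:
# tone_map(10)    -> ~9.1 nits (barely touched)
# tone_map(10000) -> ~99 nits  (rolled off, not clipped at 100)
```

The catch, as the post says, is that one fixed curve is a compromise: dial it in for bright highlight detail and dim movies look crushed, and vice versa.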
In fact, the only things that could make it better were more lumens, or tone mapping that analyzed each frame and adjusted accordingly. And when JVC added exactly that FOR FREE, it was once again a "game changer".
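The per-frame version can be sketched the same way: measure each frame's own peak and normalize the curve so that *that* peak, rather than the 10,000-nit container maximum, maps to the display's limit. This is a hypothetical illustration of the general idea behind dynamic tone mapping, not JVC's algorithm; real DTM analysis also uses histograms and HDR metadata, not just the single brightest pixel:

```python
def dtm_pixel(nits_in: float, frame_peak: float,
              peak_display: float = 100.0) -> float:
    """Extended-Reinhard tone map anchored to the current frame's peak.
    Hypothetical sketch of the idea behind dynamic tone mapping (DTM):
    dim frames keep most of their range; only bright frames get compressed."""
    if frame_peak <= peak_display:
        return nits_in                   # frame already fits the display
    l = nits_in / peak_display           # luminance in display-peak units
    w = frame_peak / peak_display        # frame peak in display-peak units
    # Extended Reinhard, scaled so nits_in == frame_peak maps exactly
    # to peak_display.
    return peak_display * l * (1.0 + l / (w * w)) / (1.0 + l)

# The same 1,000-nit highlight in two different frames:
# dtm_pixel(1000, frame_peak=1000)  -> 100.0 (frame peak fills the display)
# dtm_pixel(1000, frame_peak=10000) -> ~91   (compressed to leave headroom)
```

The point of the example: with a static curve the 1,000-nit highlight always lands in the same compressed spot, while the per-frame version spends the display's limited range on what is actually in the frame.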
See, the "game" is home theaters with sub-million dollar budgets and screens larger than 90", and in this game, only ONE sub-$20k projector line can provide the best possible HDR with any HDMI source, with a WIDE margin in quality over the competition, and that means the competition will have to change their game to compete in the future. You CAN have good HDR with 50-100 nits. And it's not really a compromise if it is the best you can get. Sure, you can get a Sony 4k, or even a faux-K projector, for less and add a multi-thousand-dollar processor to achieve this, but JVC has done it inside the projector for less money, and that is a massive win.
The game will continue to change, usually in small increments, but it will still change significantly over time LONG before we get to 10k nit large displays. For me, the next step is better upscaling while retaining the simplicity of a single source for ALL my content, from 480p all the way up to 4k HDR, without having to spend thousands on a processor and still manually switch modes in my single source device. The new Nvidia Shield Pro does that. Its AI upscaler offers a massive improvement over conventional upscaling, so I no longer have to manually change resolutions and run an external processor to improve the picture; an external upscaler might still be slightly better, but I will be above the bar I set for myself. This one doesn't "change the game" because most people are satisfied with mediocre upscaling, and the competition doesn't seem to care about having the best all-around media player on the market, so they won't change their game to compete. But for me it is huge because, for a mere couple hundred dollars, it saves me a bucketload of money and the hassle of changing resolutions for each type of media. That isn't a gimmick either, even if it isn't a game changer.
If your bar is set impossibly high, that is your prerogative, but telling everyone else that they are below the standard even when they are within a hair of the best you can possibly get is just unproductive. I would put my theater display up against any high-end LCD TV, and I feel it is almost as good as OLED; since you can't get a 150"-wide OLED in a 2.39:1 format for under $20k, that is good enough for me. I have no desire to spend $60k for a marginal improvement, or $200k for a 2-3% improvement, any sooner than I would spend a couple million for a 5-10% improvement. YMMV...