Originally Posted by Wizziwig
Is there an advantage to doing the HDR tone mapping down to typical projector brightness levels inside the projector vs. inside the player? If not, can an SDR projector with equal maximum light output (and madVR tone mapping) produce similar results to a projector with native HDR support? I don't remember what bit depths the older SDR JVC projectors were able to resolve. I know the new upper-tier models claim to have full 12-bit paths all the way down to the panel itself. All consumer flat panels I'm aware of seem to be limited to 10-bit.
All else being equal, in theory it shouldn't matter who performs the tone mapping. However, in real life everyone seems to do tone mapping differently. E.g. my Sony 4K TV looks *totally* different if I feed it HDR content untouched vs feeding it HDR content tone mapped via madVR. Obviously the Sony TV uses a completely different compression curve than the one madVR uses (the Sony's result pops a lot more, but also clips to some degree). The same is likely true for most other displays, too. There's simply no official standard for exactly how to do tone mapping (although SMPTE makes a suggestion in some of their standards; I don't recall the exact standard name/number).
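To illustrate why two tone-mapped results can look so different, here's a minimal Python sketch comparing two generic operators (this is not madVR's or Sony's actual algorithm, and the 100/1000-nit peak values are just assumptions for the example): a hard clip keeps mid-tones at full brightness but throws away all highlight detail above the display peak, while a soft extended-Reinhard-style roll-off preserves highlight gradations at the cost of dimming everything below the peak.

```python
# Sketch only: two ways to tone-map HDR luminance (in nits) onto a
# 100-nit display. Neither is what any particular display actually does.

DISPLAY_PEAK = 100.0   # assumed SDR display peak, in nits
SOURCE_PEAK = 1000.0   # assumed HDR content peak, in nits

def tonemap_clip(nits: float) -> float:
    """Pass through and clip: more 'pop', but highlights above peak are lost."""
    return min(nits, DISPLAY_PEAK)

def tonemap_reinhard(nits: float) -> float:
    """Extended-Reinhard roll-off: highlights keep detail, mid-tones dim."""
    x = nits / DISPLAY_PEAK                 # 1.0 == display peak
    white = SOURCE_PEAK / DISPLAY_PEAK      # input level that maps to peak
    return DISPLAY_PEAK * x * (1 + x / (white * white)) / (1 + x)

for nits in (10, 50, 100, 200, 500, 1000):
    print(f"{nits:5} nits -> clip {tonemap_clip(nits):6.1f}, "
          f"roll-off {tonemap_reinhard(nits):6.1f}")
```

Running this, 100 nits of input stays at 100 with the clip but drops to roughly 50 with the roll-off, while 200/500/1000 nits all collapse to an identical 100 with the clip but remain distinguishable (about 68/88/100) with the roll-off. Same content, very different picture, and both are "valid" tone mapping.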
In addition to the tone mapping algorithm itself, the display may also do other things differently in HDR mode, e.g. drive the backlight/lamp much harder, or use a cooler color temperature to squeeze more light out of the backlight/lamp.
In the end, doing tone mapping in the source device should work just fine, so watching HDR content on SDR displays should generally be no problem at all, provided the source device does proper tone mapping. However, since everyone does tone mapping differently, results will vary a lot depending on who does it. You may also have to manually tweak the display/projector settings (lamp power, color temperature etc.) for the highest possible lumen output, to match what a projector with native HDR support looks like.
Bit depth doesn't matter too much, IMHO, as long as proper dithering is used. My (older generation) JVC X35 is relatively noisy even with 10-bit input, so I think it probably dithers internally and the panel is probably limited to 8 bits, but it's hard to be sure.
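A quick Python sketch of why dithering lets a lower bit depth get away with it (generic illustration, nothing to do with what the JVC actually does internally): quantize a shallow gradient that 8 bits can't represent exactly, once with plain rounding and once with noise added before rounding.

```python
import random

random.seed(0)

def quantize(value: float, dither: bool) -> int:
    """Quantize a 0.0-255.0 value to an 8-bit integer."""
    if dither:
        value += random.uniform(-0.5, 0.5)   # simple rectangular dither
    return max(0, min(255, round(value)))

# A gradient with steps finer than one 8-bit code value:
gradient = [100 + i * 0.1 for i in range(20)]   # 100.0, 100.1, ..., 101.9

plain = [quantize(v, dither=False) for v in gradient]
dithered = [quantize(v, dither=True) for v in gradient]

print("plain:   ", plain)     # long runs of identical values = visible banding
print("dithered:", dithered)  # 100s/101s/102s mixed in proportion to the signal
```

Plain rounding turns the smooth ramp into flat bands, while the dithered version replaces those bands with fine grain whose local average still tracks the original values. That grain is exactly the kind of noise you can see on a panel that dithers internally, which is why a noisy 10-bit-input picture hints at a lower-bit-depth panel underneath.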