The w1070 supports a gamma of 2.8 max, so that's the setting that gives the highest-contrast, most HDR-like image. Watching HDR rips from a PC (or through an HD Fury Linker set to 2.8) would then have relatively higher contrast and take better advantage of the 10 bits these projectors have to offer.
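As a back-of-the-envelope check on why a higher gamma pairs well with 10-bit input, here's a quick count of how many 10-bit code values a pure power-law gamma spends on the shadows (nothing here is specific to the w1070; the 1% cutoff is just an illustrative choice):

```python
# Sketch: how many of the 1024 10-bit codes sit below a given linear
# luminance for different display gammas. A higher gamma spends more
# codes on the shadows, which is where banding shows up first.
def codes_below(lum, gamma, bits=10):
    n = 2 ** bits - 1
    # L = (c / n) ** gamma  =>  c = n * L ** (1 / gamma)
    return int(n * lum ** (1 / gamma))

for g in (2.2, 2.8):
    print(g, codes_below(0.01, g))  # codes in the bottom 1% of linear light
```

At gamma 2.2 roughly 126 of the 1024 codes land in the bottom 1% of linear light; at 2.8 it's about 197, so shadow gradations get noticeably finer steps.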
I would gladly add an HDR10 LUT if I could figure out how to do that; maybe the firmware can be hacked to modify the hex values. I'll look into it. E.g., modify the LUT for the 2.8 gamma setting so it corresponds to the brightness levels of a PQ signal, then upload that firmware as a "1.09 DIY" version. It could look pretty decent. Ideally, though, I want Windows to think my w1070 is a true HDR10 display, with a custom peak-nits value set somewhere, since the projector doesn't understand HDR metadata. That would probably require a "strip metadata" feature.
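To make that LUT idea concrete, here's a rough sketch of how the table could be computed. The PQ constants are from SMPTE ST 2084, but the 100-nit peak and the hard clip above it are assumptions for illustration, not measured values for the w1070 (a real version would tone-map rather than clip):

```python
# Sketch of a PQ -> gamma 2.8 LUT for a low-peak-brightness display.
# PQ (ST 2084) EOTF constants:
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a PQ-encoded value e in [0, 1] to absolute luminance in nits."""
    p = e ** (1 / M2)
    num = max(p - C1, 0.0)
    return 10000.0 * (num / (C2 - C3 * p)) ** (1 / M1)

def build_lut(peak_nits=100.0, bits=10):
    """Map each 10-bit PQ code to a 10-bit gamma-2.8 code."""
    n = 2 ** bits
    lut = []
    for code in range(n):
        nits = pq_to_nits(code / (n - 1))
        lin = min(nits / peak_nits, 1.0)               # hard clip above peak (crude)
        lut.append(round(lin ** (1 / 2.8) * (n - 1)))  # re-encode at gamma 2.8
    return lut

lut = build_lut()
```

The resulting table is monotone, starts at 0, and pins everything at or above the assumed peak to code 1023.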
I asked the folks over at HD Fury if they had a product that lets you enter a custom, 1:1 passthrough LUT, but they don't have one. That would at least allow arbitrary HDR curves to be converted to SDR, or let HDR10 masquerade as "SDR10" (without any of the actual signal values being changed at all) while you switch to the 2.8 gamma decoding inside the projector.
But out of the box, I believe the HD Fury Linker lets you pick the output gamma value, so setting it to 2.8 to match what the w1070 supports, with 10-bit input, would yield the highest-contrast, least-banded image. No mods required (just money). The only reason I haven't bought one myself (so I could at least watch Netflix in UHD on my w1070 with 10-bit colour) is that I much prefer to watch 1080p rips (even though I'm a paying Netflix customer) and use MPC-HC to view all my content with frame interpolation.
Last edited by RLBURNSIDE; 04-16-2017 at 11:31 PM.