Originally Posted by x43x
Retailers are anticipating an uptick in pricing as well, so I might have to decide whether the native 4K is worth the premium, or whether contrast wins and I get a 540. I really hope the real-world contrast is decent; I'd be okay with Sony's levels if they still accept the 18Gb/s video signal.
18Gb/s is so 2017...
You want 48Gb/s unless you swap PJs every year. HDMI 2.1 will be mainstream next year. First use for me: RGB 4:4:4 in 4K at 10/12-bit and 60fps for my HTPC, to make the most of MadVR's goodness with all content. And there will be more and more content over the next couple of years that benefits from HDMI 2.1.
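The arithmetic behind this is easy to check. A quick sketch, assuming the standard CTA-861 4K60 timing (4400×2250 total including blanking, i.e. a 594MHz pixel clock) and HDMI 2.0's 8b/10b TMDS coding, shows why 18Gb/s tops out at 8-bit for 4K60 RGB 4:4:4:

```python
# Why 18 Gb/s (HDMI 2.0) can't carry 4K60 RGB 4:4:4 at 10/12-bit.
# Timing is the standard CTA-861 4K60 format: 4400 x 2250 total
# (3840 x 2160 active plus blanking) at 60 Hz = 594 MHz pixel clock.

H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60
PIXEL_CLOCK = H_TOTAL * V_TOTAL * FPS          # 594 MHz

def tmds_rate_gbps(bits_per_component):
    """Wire rate for RGB 4:4:4 over HDMI TMDS (8b/10b coding => x10/8)."""
    data_rate = PIXEL_CLOCK * 3 * bits_per_component   # video bits/s
    return data_rate * 10 / 8 / 1e9                    # line rate in Gb/s

for bits in (8, 10, 12):
    rate = tmds_rate_gbps(bits)
    verdict = "fits" if rate <= 18.0 else "needs HDMI 2.1"
    print(f"4K60 RGB {bits}-bit: {rate:.2f} Gb/s ({verdict})")
```

8-bit lands right at ~17.8Gb/s (which is why HDMI 2.0 was pegged at 18Gb/s), while 10-bit needs ~22Gb/s and 12-bit ~27Gb/s, so HDMI 2.0 has to fall back to 4:2:2/4:2:0 chroma subsampling for those. HDMI 2.1's 48Gb/s FRL signaling clears all of them with headroom.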
I'm not investing close to 10K to get an already obsolete tech. Whatever I buy next has to be good for 3-5 years, same as my RS500 was.
My Denon X8500H is upgradable to HDMI 2.1 (that's why I bought it a few months ago instead of waiting until next year), and the next NVIDIA GPUs landing in September were expected to support HDMI 2.1 [edit 08-19-18: sadly it looks like the 2080 Ti won't support HDMI 2.1, so I have another 12-18 months before I need to upgrade], so there's no way I'm investing in a display that doesn't support it (or at least offer an upgrade path).
Given how quickly HDMI standards change, manufacturers should make the HDMI boards upgradable on anything they're asking more than 3K for.
Lumagen does it with the Radiance Pro, so it's not impossible. Where there's a will, there's a way.
By the way, before you say that HDMI 2.2 is around the corner: what makes the difference here is the bandwidth, because that's hardware-related. Most of the other features are f/w-upgradable.
So I waited until I could get the full 18Gb/s bandwidth before upgrading my A/V chain 2-3 years ago (GPU/source, AVR and display), and I'll do the same this time around.
Given that JVC never adds features in their f/w updates and doesn't always correct the bugs, that's the least you can do to, if not defeat, at least fight planned obsolescence.