Originally Posted by normandia
Yes, marketing...and I was very skeptical as well, but knew that they did not say 'with HDR'! So by omission they get away with confusing us...it is more realistic as it is 10-bit and not 8-bit! At the time, it was the best one could get at a (reasonable?) cost. Also, they offered more than the 10.2 Gbps chipset that was commonly available, and to my knowledge 18 Gbps chipsets were not available (or at best very limited) in 2016, and no content took advantage of that at the time. Getting 13+ Gbps was a very pleasant surprise.
Nevertheless, I hear you loud and clear...and am not a fan of marketing at all...
Disclaimer...I agree with you.
It implied HDR by default, as the signal has to be 10-bit to qualify for HDR, and HDR is the only mode that pertains to HDCP 2.2 and 4K with respect to current and future standards.
But yes, their wording has plenty of full stops and iffy structure, so they can argue 'you read it wrong'.
One part of it cannot be misconstrued though, regardless of the specs, and that is the 'smooth and realistic' part. Whilst that again is subjective, reality is what it pertains to, and banding or posterisation cannot be deemed realistic or smooth by any reasonable person.
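To put rough numbers on the banding point: 8-bit gives you 256 steps per channel against 1,024 at 10-bit, so a slow gradient (a sky, a dimly lit wall) ends up in visibly coarser steps once it is quantised down to 8 bits. A quick sketch of the arithmetic in Python, purely illustrative and nothing to do with Sony's actual processing:

```python
# Rough illustration of the banding point: fewer bits per channel = coarser gradient steps.
# Hypothetical numbers only - nothing measured from any particular device.

def quantise(value, bits):
    """Quantise a 0.0-1.0 signal level to the nearest code at the given bit depth."""
    max_code = (1 << bits) - 1        # 255 for 8-bit, 1023 for 10-bit
    return round(value * max_code) / max_code

# A slow ramp, e.g. a sky gradient: 1001 samples from black to full level.
ramp = [i / 1000 for i in range(1001)]

levels_8  = {quantise(v, 8)  for v in ramp}
levels_10 = {quantise(v, 10) for v in ramp}

print(len(levels_8))    # 256  - each visible step is ~4x wider, which reads as banding
print(len(levels_10))   # 1001 - every sample keeps its own level, so the ramp stays smooth
```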
Also, my call from Sony yesterday confirmed that, due to hardware limitations, the signal referred to is down-sampled to 8-bit anyway at high frame rates, and they said a firmware update would not fix it!
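For anyone wondering why the frame rate is the trigger, the back-of-envelope link maths lines up with the chipset numbers quoted above. Assuming standard CTA-861 4K60 timing with blanking and the usual 8b/10b TMDS overhead (my assumptions, not anything Sony have stated), a rough Python sketch:

```python
# Back-of-envelope HDMI link rates for 3840x2160 @ 60Hz.
# Assumptions (mine, not Sony's figures): CTA-861 4K60 timing including blanking
# (4400 x 2250 total pixels) and the 8b/10b TMDS coding overhead of x1.25.

PIXEL_CLOCK_HZ = 4400 * 2250 * 60      # ~594 MHz for 4K60 with blanking

def link_rate_gbps(bits_per_channel, samples_per_pixel):
    """TMDS bit rate: pixel clock x payload bits per pixel x 8b/10b overhead."""
    return PIXEL_CLOCK_HZ * bits_per_channel * samples_per_pixel * 1.25 / 1e9

print(link_rate_gbps(8, 3))     # 4:4:4  8-bit  -> ~17.8 Gbps (needs the full 18 Gbps link)
print(link_rate_gbps(10, 3))    # 4:4:4 10-bit  -> ~22.3 Gbps (beyond HDMI 2.0 altogether)
print(link_rate_gbps(8, 1.5))   # 4:2:0  8-bit  -> ~8.9 Gbps  (fits a 10.2 Gbps chipset)
print(link_rate_gbps(10, 1.5))  # 4:2:0 10-bit  -> ~11.1 Gbps (fits a 13+ Gbps chipset)
```

So even before HDR enters into it, 4K 60Hz at 10-bit with full chroma won't go down an 18 Gbps pipe, let alone a 13 Gbps one, which is presumably why the box has to drop bit depth or chroma once the frame rate goes up.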
If only the video engine could do a good job of full-level chroma up-conversion on HDR, we wouldn't even be discussing this, as even bit depth is not the actual issue.
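For anyone unfamiliar with the term: a 4:2:0 signal carries colour at a quarter of the luma resolution, so the video engine has to rebuild the full-resolution chroma planes before display, and how well it interpolates them makes a visible difference on coloured edges. A crude sketch of the two approaches in Python/NumPy, again purely illustrative rather than anything to do with Sony's pipeline:

```python
import numpy as np

def upsample_nearest(chroma):
    """Naive 2x chroma up-conversion: repeat each sample (blocky colour edges)."""
    return np.repeat(np.repeat(chroma, 2, axis=0), 2, axis=1)

def upsample_bilinear(chroma):
    """2x chroma up-conversion with linear interpolation (smoother colour edges)."""
    h, w = chroma.shape
    xs = np.linspace(0, w - 1, 2 * w)
    rows = np.array([np.interp(xs, np.arange(w), row) for row in chroma])
    ys = np.linspace(0, h - 1, 2 * h)
    return np.array([np.interp(ys, np.arange(h), col) for col in rows.T]).T

# A hard colour edge stored at half (subsampled) resolution:
chroma = np.array([[0.0, 0.0, 1.0, 1.0]] * 4)

print(upsample_nearest(chroma)[0])                # [0. 0. 0. 0. 1. 1. 1. 1.] - staircase / fringing
print(np.round(upsample_bilinear(chroma)[0], 2))  # graded transition across the edge
```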
In other words, the marketing may be full of bluster and open to many interpretations when you pick over it, but the actual problem is that the hardware in the device isn't capable of fully and correctly processing the image for display, and its attempt to do so makes it look unnatural. Either omit the mode and have done with it, if it isn't 'much in use yet', or at least get someone to proof-view it and see if it fulfils the promise of 'smooth and realistic' before releasing the item widely for sale.
I always set my sources and displays to 60Hz as I prefer the look of it. Personal choice, but that is my M.O. and one of the reasons that marketing hit home with me. I thought, great, this is now at a level where I can pull the trigger and use it as I prefer for the next half a decade or so without needing to look for anything else. That is why I avoided the 520: it had 8-bit written all over it, so I passed and waited. As I would have with this, had they been more honest in the spiel.
It is of no actual consequence whether content is readily available natively for that mode or not. Indeed, having content available would still not improve the rendering of the image. So if it is not a widely used mode, ditch it for a couple of years until the full-on chips are 'cheaper', but don't add it now and then half-bake the implementation for the sake of a flaky sales point.
Also, I bought a £399 Philips 4K (not UHD certified) HDR TV in 2016 that has a full 18 Gbps chipset and MHL too! Some might argue it only has an 8-bit panel (something I dispute at the moment but will look into), but even if it does, it shows 4K 60Hz HDR images from any signal you throw at it and they always look 'smooth and realistic'. All for only around 4% of the cost of this projector!
Oh, and my 2015 Pioneer (SC-LX89) receiver has a full-fat 18 Gbps HDCP 2.2 chipset too.
Incidentally, I am guessing this is why Sony doesn't subscribe to the UHD Premium alliance: it would mean their products would *have* to process and display at 10-bit without any internal trickery.
Marketing skulduggery be damned!