Originally Posted by gamermwm
Does it actually completely ignore the other layer of data though? Or could they have changed that too...
There are not multiple layers of video data. There is just one.
And again, there is no "downsampling"; that term is completely wrong for this case. HDR10Plus is backwards-compatible with HDR10. This means there is only "throwing away stuff you don't understand" and not using it, and using what you do understand (the HDR10 data), completely unaltered.
There's the HDR10 static metadata, supplied alongside the video (it is not part of it), which is still used just like it always was. There are only a few values (see here), describing things like the brightest pixel overall (MaxCLL) and the brightest frame-average light level (MaxFALL). Once they are calculated by analysing the whole stream, these numbers never change.
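To make "only a few values" concrete, here's a minimal sketch in Python. The class and function names are invented for illustration; the two values are the real MaxCLL/MaxFALL concepts (defined in CTA-861.3), computed once over the entire stream:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # Illustrative fields only; a real stream also carries the mastering
    # display primaries / white point / min-max luminance (SMPTE ST 2086).
    max_cll: int   # MaxCLL: brightest single pixel anywhere in the whole stream (nits)
    max_fall: int  # MaxFALL: average light level of the brightest frame (nits)

def analyse_stream(frames):
    """Scan every frame once; after this, the numbers never change."""
    max_cll, max_fall = 0, 0
    for frame in frames:  # 'frame' here = a list of per-pixel luminance values in nits
        max_cll = max(max_cll, max(frame))
        max_fall = max(max_fall, sum(frame) / len(frame))
    return HDR10StaticMetadata(max_cll=round(max_cll), max_fall=round(max_fall))
```

So for a toy two-frame stream like analyse_stream([[100, 400], [900, 50]]) you'd get MaxCLL=900 and MaxFALL=475, and those two numbers describe the whole stream forever after.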
Then, as a new addition, an HDR10Plus stream has dynamic metadata which is embedded into the video stream just like the packets of video and audio data (the technical term is "SEI messages"). That is how HDR10Plus works. It is definitely not another "layer". It's the same single layer of data.
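If it helps to picture "embedded into the video stream", here's a toy model in Python. It is purely illustrative; as far as I know the payload type and T.35 identifier values shown are the ones registered for ST 2094-40 metadata, but the packet shape itself is invented:

```python
# Toy model of a coded video stream: the HDR10Plus metadata is not a second
# layer or a separate track, just small SEI packets interleaved with the video.
toy_stream = [
    {"kind": "video", "data": b"...coded picture 1..."},
    {"kind": "sei",
     "payload_type": 4,        # SEI payload type 4: user data registered ITU-T T.35
     "country_code": 0xB5,     # USA...
     "provider_code": 0x003C,  # ...Samsung: together these mark ST 2094-40 data
     "data": b"...per-scene tone-mapping parameters..."},
    {"kind": "video", "data": b"...coded picture 2..."},
]
```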
- HDR10 TVs just throw away the SEI message packets containing HDR10Plus dynamic metadata when they arrive, and do not change their tone-mapping strategy during the stream; they stick to the single set of whole-stream static metadata values.
- HDR10Plus TVs obey the SEI message packets containing HDR10Plus dynamic metadata when they arrive, and change their tone-mapping strategy during the stream. This allows them to do less (or no) tone-mapping during dark scenes, bringing out extra detail which under HDR10 might have been squeezed down (loss of detail, brightness, or both; see the PDF below and Vincent's video below) because of a very bright scene elsewhere and the single static set of values having to be used. A sketch of the difference follows this list.
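In rough pseudocode, the difference between the two behaviours is just this. This is a Python sketch of the logic described above, not any real TV's firmware; all names are invented, and it reuses the toy packet shape from the previous sketch:

```python
def is_hdr10plus_sei(pkt):
    # Matches the registered ST 2094-40 identifiers from the toy model above.
    return (pkt.get("payload_type") == 4
            and pkt.get("country_code") == 0xB5
            and pkt.get("provider_code") == 0x003C)

def build_tone_curve(metadata):
    # Stand-in: a real TV derives an actual luminance mapping here.
    return f"tone curve from {metadata!r}"

def play(stream, static_metadata, supports_hdr10plus):
    # Both kinds of TV start from one curve derived from the static values.
    tone_curve = build_tone_curve(static_metadata)
    for pkt in stream:
        if pkt["kind"] == "sei":
            if supports_hdr10plus and is_hdr10plus_sei(pkt):
                # HDR10Plus TV: re-derive the curve per scene from the SEI payload.
                tone_curve = build_tone_curve(pkt["data"])
            # HDR10 TV: the unrecognised packet is simply thrown away.
        else:
            print("render frame using", tone_curve)
```

Run play(toy_stream, "static values", False) versus True and you can see that the HDR10 TV's output never depends on the extra packets at all, which is exactly what backwards-compatible means here.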
That's all it is. For fun reading material, with pretty pictures showing how dynamic tone-mapping can help, try >> this :-) << (HDR10Plus is "ST 2094-40 App 4 (Samsung)" in that document).
Also, for a more general video explaining what tone-mapping does, and why it is necessary in the first place, watch this:
Originally Posted by DarkKnight4K
The picture quality of content on Amazon can vary greatly even within the same season of the same show. They grade stuff differently or randomly improve the picture quality of some of the content in various ways. That's why he's seeing a difference.
Originally Posted by sshuttari
If that's the case, I just want to know why the HDR is better, and I've checked multiple movies and TV shows in HDR and they all looked much better
Yes, it has been the subject of MUCH speculation by many people over the last few weeks. Try asking Samsung or Amazon this question and you will get a variety of different answers!
It started with HDR10Plus's earlier soft-launch date in August, which came and went, but that didn't stop people saying "The Tick looks great. So, I think The Tick is HDR10Plus. So I think our TVs support HDR10Plus now."
We've been rehashing the same discussion for five months now. Forgive us if we sound bored of the question.
Originally Posted by ray0414
Placebo effect. Nothing more, nothing less. The placebo effect runs rampant on these forums every year.
Indeed. The root cause of all this is Amazon's stupid refusal to be clear in their user interface. But at least they are supporting HDR10Plus.