Originally Posted by Stereodude
I'm pretty sure there's a BT1886 topic that you can spam with your tinfoil conspiracy hat theories on the ITU and BT1886 instead of this one.
I'm sorry, but either this is science, which can be challenged by looking at the impact and reproducibility of the data,
or it is a belief system, where anyone finding flaws - especially flaws that affect the range of devices actually on the market - gets shouted down the way you just elegantly tried to shut down the criticism at hand, trollingly combined with the most severe of accusations > wrong thread. (Essentially doubling down on "don't talk about it here" and "don't talk about it anywhere.")
Sorry - I don't pander to:
- threads where the suggested explanation is STILL a linked video in which a member of the very body that formulates the recommendations that become the standard clearly doesn't know a thing about the standard he was booked to talk about for an hour.
- papers and (calibration software vendors') websites linked as alternatives, which give surface-level explanations that don't add up with the intent of the standard (at least for the vast majority of devices on the market).
And especially, no explanation is given of the results this produces on actual devices, which show huge variation depending on their black level alone. Peak brightness doesn't seem to affect the standard at all, and no explanation is given of how BT.1886 relates to room light levels.
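For anyone who wants to check the black-level dependence themselves, here is the BT.1886 Annex 1 EOTF in a few lines of Python. The 100 cd/m² peak and the 0.1 cd/m² black point are just example numbers I picked to contrast an OLED-like with an LCD-like display; the formula itself is straight from the recommendation:

```python
import math

def bt1886_eotf(v, lw=100.0, lb=0.0):
    """Screen luminance (cd/m^2) for a normalized signal v in [0, 1],
    per ITU-R BT.1886 Annex 1, given peak white lw and black level lb."""
    gamma = 2.4
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# Effective gamma at mid-signal for two black points, same 100 cd/m^2 peak:
for lb in (0.0, 0.1):
    l = bt1886_eotf(0.5, lb=lb)
    eff_gamma = math.log(l / 100.0) / math.log(0.5)
    print(f"Lb={lb}: L(0.5)={l:.2f} cd/m^2, effective gamma={eff_gamma:.2f}")
```

With a perfect zero black you get exactly gamma 2.4; with a 0.1 cd/m² black point the same formula lifts the midtones and the effective gamma drops noticeably - which is exactly the device-dependent variation I'm talking about.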
Add to that the _incredible_ fact that this standard is being shoehorned in ten years after the first Blu-rays were released. Which raises the _really_ sensitive question: what were you all calibrating your devices to before? (Oops, we totally forgot about gamma compensation..) And why should that compensation now be implemented on the end user's side rather than in post production?