Why better color volume is a pure marketing fabrication (second edition) - AVS Forum | Home Theater Discussions And Reviews
post #1 of 22 - 02-17-2017, 01:02 PM - harlekin (Advanced Member, Thread Starter)
Why better color volume is a pure marketing fabrication (second edition)

I'd STILL like to start this thread with a plea to remove Scott Wilkinson from the front lines of AVS Forum's "journalism" - as it is my personal conviction that his entire "critical process" consists of asking PR contacts whether "stuff is real" or "stuff is important" and then going with whatever the guy on the other end says. (I edited the last half of that sentence for political correctness.)

I've watched him fish for opinions in four-person panels just to copy them subsequently. I've caught him twice so far asking PR people and company engineers if "metamerism failure is real". And to be honest, the latest HOME THEATRE GEEKS is especially hard to watch, because he basically gets fed by a PR guy on the importance of a bunch of image processing features - to the point where the PR guy leads him to change the order of some slides so the sales speech gets set up correctly - at which point the "journalist" has the gall to apologize profusely for mistakenly having altered the order of the slides he was given to "produce" the show. During the rest of the sales pitch he offers thrilling insights such as "aha", "wow" and "that's especially great".

In the industry this ought to be considered an "industry-friendly environment" - and really, someone has to intervene if all his "journalistic work" turns out, more often than not, to be a sales session.

Also - this is the second time I have had to put this thread up, because the first one was taken down by a nameless mod for "having insulted another forum member".

Please note that this is criticism I bring forward against the public figure of Scott Wilkinson, journalist for this site - and not against an average user.

As a public figure, having to take public criticism is part of your role - this assertion is not meant as an insult, but as a statement against perceived malpractice by a journalistic entity.

I could write a letter to the editor instead - but why not make it public, as the amount of industry pandering has reached outrageous levels in the current podcast.
-

Now on topic.

Color volume - as a concept, is real.

But the way it currently is used to "sell" TVs is highly fraudulent.

Here are the aspects you need to know to understand this.

- Color volume is nothing new - every calibration has dealt with it since the beginning of time. So what's being sold to you is "increasing color volume". And that's problematic.

Here is why.

- There is no mastering standard for HDR. The material gets mastered on a variety of screens with varying "color volume". The variance is expected to increase in the coming years as the industry tries to inch towards a theoretical target (which it will probably never reach).

- Transfer functions that try to "mimic" the original intent in the more limited consumer TV space (== Dolby Vision) always alter color perception compared to the original master's intent. They are set up to limit color difference on a scene-by-scene basis, not on a color-by-color basis.

- There is no linearity at all in color error until you reach the intended luminance ("color brightness"), although the entire industry is trying to sell you on it. This is an important concept to understand.

Not hitting the intended target produces a wrong color match == a perceptible error. Missing the target by 1000 nits, or by 3000 nits - either way it's still a "different color".

The amount of perceptible difference also changes across the color space. So increasing screen brightness uniformly (towards a mastering target that's in flux as well) doesn't produce linear improvements in color accuracy - although the industry is selling it that way.
--

We have a measure for color accuracy. It's called dE. It already includes luminance ("color volume"). Color accuracy towards the defined rec2020 color space is currently WAY off on all consumer devices - so much so that calibrators don't even want to publish the figures, because they aren't anywhere near "presumed" visual-match territory.

What the industry does now is take one dimension of the dE formula (color luminance - coincidentally the one they are trying to increase the quickest, because it's their main differentiator between screen technologies) and run a size measurement contest with it - because they are afraid to show you the differences on the dE scale.

Color luminance, interestingly enough, is one of the less important "dimensions" in the dE formula to begin with - which has both good and bad implications for the argument they are making.

But you can think about it this way. If a random color shows a color difference of dE 16 on a 1000-nit TV and dE 12 on a 2000-nit TV, the only assessment you can make is that the color is still visibly wrong on both. It's not "better wrong" on one of them just because the color luminance is closer to the intended target.

dE is a measuring stick that effectively measures nothing but the point where people start to agree that colors look the same. It isn't set up to let you measure amounts of perceptible difference. In fact, even if, let's say, five colors all show a dE value of 10, the perceptible difference varies by color.
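To make the point above concrete, here is a minimal sketch (the Lab values are hypothetical, picked only for illustration) comparing CIE76 - plain Euclidean distance in L*a*b*, where luminance L* is just one of three terms - with CIE94, which re-weights chroma and hue error by the reference color's chroma. The same Lab offset yields the same dE76 everywhere, but a different dE94 depending on how saturated the color is; and halving only the luminance error of a badly-off color still leaves it far above a ~3 dE visibility threshold:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76: plain Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def delta_e94(lab1, lab2):
    """CIE94 (graphic-arts weights): down-weights chroma/hue error
    for more saturated reference colors."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1 = math.hypot(a1, b1)
    C2 = math.hypot(a2, b2)
    dC = C1 - C2
    dH2 = max((a1 - a2) ** 2 + (b1 - b2) ** 2 - dC ** 2, 0.0)
    sC = 1 + 0.045 * C1
    sH = 1 + 0.015 * C1
    return math.sqrt(dL ** 2 + (dC / sC) ** 2 + dH2 / sH ** 2)

# Two hypothetical targets: a near-neutral and a saturated color.
# Each rendered color misses by the same Lab offset, so dE76 is identical.
offset = (3.0, 3.0, 3.0)
neutral_target = (50.0, 2.0, 2.0)
saturated_target = (50.0, 60.0, 40.0)

for name, t in [("neutral", neutral_target), ("saturated", saturated_target)]:
    rendered = tuple(x + d for x, d in zip(t, offset))
    print(f"{name}: dE76={delta_e76(t, rendered):.2f}  dE94={delta_e94(t, rendered):.2f}")

# A luminance-only improvement: halving the L* error of a color that is
# also off in a* and b* still leaves the match far above a ~3 dE threshold.
target = (60.0, 20.0, -10.0)
before = (45.0, 30.0, -2.0)   # off in L*, a*, b*
after  = (52.5, 30.0, -2.0)   # only the L* error halved
print(f"before: {delta_e76(target, before):.1f}  after: {delta_e76(target, after):.1f}")
```

Both pairs come out at dE76 ≈ 5.2 while dE94 differs (≈ 4.8 vs ≈ 3.2), and the "luminance improved" color still measures dE76 ≈ 14.8 against ≈ 19.7 before - nowhere near a visual match in either case.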
--

So what's happening in effect right now is that "less wrong, according to numbers that don't tell you anything about perceptual differences until you hit dE 3 (I'd suggest an even lower threshold)" is sold to you by the industry as an "improvement", year after year.

While in fact none of the TVs will ever match the intended mastering screen even two years after the material is pressed on a UHD Blu-ray and shipped out - and every TV will show colors wildly differently. Even when Dolby Vision (the best kind of) color matching is applied.

The idea that "the TV with the highest nits output each year" is "closest to the intended target" is not tenable from a perception standpoint. The dE numbers might be closest - or they might not be (luminance is only one aspect of dE) - but for the foreseeable future they will be different enough to always end up as a perceptible difference.

And the "improvement" that gets sold to you year over year looks linear ("now 1000 nits more") but doesn't affect color errors linearly.
-

What the industry would do if it were honest is publish average ColorChecker values - unweighted (even though that is a problem with Dolby Vision's golden reference) - against the "presumed" standard (rec2020), or against the actual standard (DCI-P3).

They aren't doing that, because with that metric it becomes instantly obvious that "color accuracy" is all over the place - even if we presume a theoretical mastering display that doesn't have to be anywhere near rec2020 spec.
--

Looking at color volume in an isolated way - just to be able to show linear progress in any metric at all is a ruse.

And Scott Wilkinson is a naive individual (edit: SELF-CENSORED a previous, less PC descriptor) if he lets this be "sold" to him as the next marker for display excellence.

It doesn't tell you anything regarding color accuracy.

In fact, there are expected non-linearities in color chroma that you currently see in every display profile when measuring rec2020 (color saturation points curve off target). These are expected to introduce more "color error" than the difference between two luminance settings - but in the new "paradigm" those differences get masked by looking only at "color volume".
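A toy numeric sketch of that saturation roll-off (all numbers are hypothetical - a made-up panel whose chroma tracking soft-clips toward a maximum): the error is zero inside the gamut and then grows faster than linearly as the target saturation rises, independent of any luminance difference.

```python
# A toy model of gamut-edge compression: the display tracks target chroma
# up to a soft knee, then compresses toward its maximum chroma C_MAX.
C_MAX = 70.0   # hypothetical maximum chroma the panel can render
KNEE = 50.0    # hypothetical point where tracking starts to roll off

def rendered_chroma(c_target):
    if c_target <= KNEE:
        return c_target                      # in-gamut: tracks exactly
    # simple asymptotic roll-off toward C_MAX above the knee
    span = C_MAX - KNEE
    x = (c_target - KNEE) / span
    return KNEE + span * x / (1.0 + x)

for c in (20, 40, 60, 80, 100):
    err = c - rendered_chroma(c)
    print(f"target chroma {c:>3}: rendered {rendered_chroma(c):6.2f}, error {err:6.2f}")
```

In this sketch the chroma error is 0 at targets 20 and 40, then climbs steeply (roughly 3, 18 and 36 units at targets 60, 80 and 100) - errors of a size that dwarf a modest luminance miss, exactly the kind of thing a single "color volume" number hides.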
--

More input and opinions are welcome - I just can't bear the thought that the industry has agreed on selling sets that reproduce colors differently by design (open-ended peak and average brightness, a defined color space no TV will reach for the foreseeable future, a defined "color volume" no TV will reach in the foreseeable future) - and then jumps on "bigger color volume is beautiful", although we still don't even have a chance to reproduce the mastering displays - and those are a moving target by design.

The "improvement" the sets make year over year will not result in a "linear" reduction of color error, or even of "color difference across the entire display population" - so WHAT exactly are we announcing "bigger color volume every year results in more beautiful pictures every year" for?

It's a lie.

Why is "brighter, but still wrong" more beautiful?

And shouldn't we take the time to define a reasonable dE threshold to aim for in the HDR production/reproduction workflow?

Or is the ISF's idea to "reprogram" their calibrators to now jump on "it's beautiful to have every TV reproduce colors according to its individual capabilities, not reaching a common goal" already ingrained in people's brains?

Your measure of quality is "bigger number is better" - without wanting to look at perceptual differences that will always be present and all over the place.

It's basically conning people into believing "different wrong" is better, year after year.

Last edited by harlekin; 03-04-2017 at 10:02 AM.
post #2 of 22 - 02-17-2017, 01:04 PM - harlekin (Advanced Member, Thread Starter)
If a moderator wants to take this down again, please leave me a notice as to why.

I see my intentions and the public criticism as covered by free speech and the rules that apply to criticism of a public figure.

It is not like I am talking to the "private forum user" Scott Wilkinson.
post #3 of 22 - 03-03-2017, 12:48 AM - harlekin (Advanced Member, Thread Starter)
To keep the issue in public perception, I thought: why not update this thread with the factual lies from the current Home Theater Geeks podcast (on the "beliefs" of professional broadcast engineers about HDR, presented at an industry conference).

The overarching issue is that in the calibration field the public gets fed bull**** frequently, and neither journalists, nor calibrators, nor engineers are especially willing to say anything, for fear of it hurting business or showing how little they actually know about these topics - although, in this case, they give actual presentations on them to other engineers.

The topic of contention this time is that Ron Williams, in the current podcast, insists he has both an OLED and an LCD that can do "full rec2020" - and stands by it even when the moderator asks if he's mistaken.

Fact check: that's bull.

He mentions the BVM-X300, which does 80% of the rec2020 color space, and the Dolby Maui, which of course isn't there either - but I couldn't find its actual coverage in the usual 60 seconds of googling.
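For gamut claims like these, a quick sanity check is to compare triangle areas in CIE 1931 xy using the standard DCI-P3 and Rec.2020 primaries (via the shoelace formula). This is only a rough proxy - xy is not perceptually uniform, and u'v' gives different percentages - but it shows how far even a full-P3 display sits from "full rec2020":

```python
# Shoelace area of a gamut triangle in CIE 1931 xy chromaticity.
def tri_area(p):
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# Standard R, G, B primaries as (x, y) chromaticity coordinates.
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
DCI_P3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

ratio = tri_area(DCI_P3) / tri_area(REC2020)
print(f"DCI-P3 area / Rec.2020 area (xy): {ratio:.1%}")
```

This prints roughly 72% - which is why "full rec2020" claims for current panels deserve skepticism, and why a monitor quoted at 80% coverage already counts as unusually wide.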

At least Scott Wilkinson caught that this might be BS this time, but he was then convinced ("I didn't know that....!") within the usual 10 seconds of confidence game it takes him to adopt any industry opinion. (The issue is that the "journalist" never applies any actual filter - never does any research, never pushes back on statements, never fact-checks - but instead siphons the entire "stream of words" directly to the audience, providing only a "hosting and smiling" function along the way. If you have ever been in a panel discussion, you know that role well.)

The other thing that's highly suspect in the current episode: Ron Williams presented an HDR 2000-nit, an HDR 1000-nit and an SDR comparison at an industry conference that presumably showed "more dark detail in the HDR images" - with industry decision makers often commenting "why is this part 'out of focus' in SDR?".

That's BS as well. If the signal source is the same, no HDR-to-SDR processing cuts dark details - not on the TV side, not on the post-processing side. What they were looking at is just the HDR EOTF coming out of black versus an SDR monitor with a different gamma curve.

It's the bright details that get clipped, or "uniformly reduced" (Dolby Vision), not the dark details. If information is in the signal from, let's say, 0.1 to 20 nits ("dark detail"), it isn't touched. If it's there, you can make it visible by applying "the same gamma curve out of dark" as the HDR EOTF.
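A minimal sketch of that behavior - not any vendor's actual curve, just a generic knee-and-roll-off tone mapper of the kind HDR-to-SDR downconversion typically uses. Everything below the knee passes through untouched; only highlights get compressed:

```python
# Toy HDR->SDR tone map: identity below the knee, asymptotic roll-off
# above it toward the SDR peak. Values are display luminance in nits.
SDR_PEAK = 100.0
KNEE = 70.0  # hypothetical: pass-through up to 70 nits

def tone_map(nits):
    if nits <= KNEE:
        return nits                         # shadows and midtones untouched
    headroom = SDR_PEAK - KNEE
    x = (nits - KNEE) / headroom
    return KNEE + headroom * x / (1.0 + x)  # highlights compressed

for v in (0.1, 5, 20, 200, 1000, 4000):
    print(f"{v:>7} nits -> {tone_map(v):7.2f} nits")
```

The 0.1-20 nit "dark detail" range maps through 1:1; only values above the knee get squeezed toward (and never quite reach) the SDR peak.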

It's important that this is not just another round of BS that gets shoved into this community's mindset - although it "presumably" has gone through a "journalistic" filter -

because these things are important for people - even in the industry - to make important decisions going forward.

The tl;dr on FULL rec2020 color space is that it will never be reached on three-primary devices (devices that use only red, green and blue "pixels" or laser illuminants) - because manufacturers already hit the related metamerism-failure issue (where people start to perceive colors entirely differently) on current "close to DCI-P3" gamut devices - with SHARP, quote, "adding more green into the mix to combat that" (increasing peak width = lessening the ability to reach wider gamut triangles) on their projectors.

Everyone in the industry who is familiar with the tests on those devices knows about the significant, not just cost-related, issues in reaching rec2020.

The tl;dr on HDR is that if I have to hear one more comparison to iPhone photo HDR, I will tend to want to slap you in the face - because the iPhone doesn't have an HDR display mode in place. So whatever BS you are talking about (third obvious lie in the current podcast), it's HDR faked to be visible in an SDR display environment - and, again, dark detail representation will not change at all, considering the detail is in the image to begin with.

This is especially important as people all around this forum have started to talk about "dark details" they see in HDR Blu-rays over conventional material, and again, this is BS, because it's entirely reliant on the gamma curve used and on nothing else. We don't clip dark detail in production. Ever.

You don't need another 15 stops of dynamic range to show dark detail.

If we are talking about "starfield" scenarios with 2000 nits (first: why?), the opposite will actually be true - because of the specular highlight alone, your eyes will "dynamic range adjust" and dark detail will be harder to perceive. So the entire "benefit" people are talking about here comes down to not knowing how different gamma curves emphasize detail near black - the HDR EOTF being a new transfer curve that is not a simple power law.
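To illustrate the near-black point: the same 10% signal level lands at different luminances depending solely on the display curve. The sketch below compares simple power-law gammas (assuming a conventional 100-nit SDR peak) with the SMPTE ST 2084 PQ EOTF, whose constants come from the spec; the spread between them is entirely a property of the curves, not of any extra "dark detail" in the signal:

```python
import math

def gamma_eotf(signal, power, peak_nits=100.0):
    """Simple power-law SDR display curve (signal in 0..1 -> nits)."""
    return peak_nits * signal ** power

def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: normalized signal -> absolute nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

s = 0.10  # the same 10% signal level under three display curves
print(f"gamma 2.2 @100 nits: {gamma_eotf(s, 2.2):.3f} nits")
print(f"gamma 2.4 @100 nits: {gamma_eotf(s, 2.4):.3f} nits")
print(f"PQ (ST 2084):        {pq_eotf(s):.3f} nits")
```

Here gamma 2.2 renders the 10% level at roughly twice the luminance PQ does (about 0.63 vs 0.32 nits) - exactly the kind of curve-dependent difference viewers misread as HDR "revealing" shadow detail.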
-

Also - I think I will make this a regular "thing" whenever I watch the AVS Forum "Home Theater Geeks" podcast from now on, because the amount of BS and marketing fads that get shoved into people's minds through it, without any kind of perceivable journalistic filter, is a problem.

cheers

h.

PS: I really mean it - this isn't something that can just continue... One of the most direct (if not the only direct) sources of industry information out there has at the same time become a venue for unreflective and simply wrong BS, PR and misinformation.

Oh, and by the way? The reason this works is the same as always. The guy giving the demo has an investment in having bought "professional" screens (= what?), so he insists that he purchased the perfect product - then gives demos to an industry that hasn't applied the principles of critical thought to anything in decades, at least in the public sphere.

Thanks, but no thanks.

Last edited by harlekin; 03-03-2017 at 02:34 AM.
 
post #4 of 22 - 03-03-2017, 01:32 AM - Norixone (Member, Italy)
Interesting. I hope you do not get too many people following this topic, or the whole industry will sue you.
Personally, I am not liking what I am seeing with HDR, but I guess it was difficult to sell new appliances based only on the 4K wagon: if you do not have a projector throwing onto a 3-metre screen or greater, it is very hard to justify the investment.
I have seen a few demos with projectors and TVs displaying HDR material and I was less than impressed. Any detail in the blacks simply disappears, and the bright colours seem to receive a boost in brightness. Why should I lose detail in the blacks in favour of a more highlighted colour?
I was shown (and it can be found online) a sequence from The Martian where the protagonist is picking up some rocks and in the background there are larger rocks and the sun. In SDR you see the rocks and Matt Damon's shadow clearly, while in HDR you see a blob of light in the background that overpowers everything else. If this is HDR, then no thank you. There are other examples, like the scene in Sicario with the shed and police car in the foreground. The colours of the houses become very dull and nearly indistinguishable.
Glad I am not the only one to think (and I am no expert) that HDR is nice on paper but currently not working. Give me DCI-P3 at 100% and I will be more than happy to sit back and enjoy.
post #5 of 22 - 03-05-2017, 06:13 PM - HammerJoe (Senior Member)
Quote:
Originally Posted by Norixone
I was shown (and it can be found online) a sequence from The Martian where the protagonist is picking up some rocks and in the background there are larger rocks and the sun. In SDR you see the rocks and Matt Damon's shadow clearly, while in HDR you see a blob of light in the background that overpowers everything else. If this is HDR, then no thank you. There are other examples, like the scene in Sicario with the shed and police car in the foreground. The colours of the houses become very dull and nearly indistinguishable.
This is something that has been bugging me about the statement that we calibrate the display so that we can see what the director wanted us to see, and your example is a good one: how do we know that the director wanted us to see those rocks?
When the director was cutting the film, did he really care about those rocks? I mean, did he really see them and think "look at those rocks, I will not tolerate that they are visible, let's overexpose the sun so all we see is a white blob"?

And then, in the end, does it really matter that we are only seeing what the director wants us to see? What difference does it make whether we see those rocks or not? Does the film suddenly get a new meaning because of it?

Me, what I want to see on the screen are true colors and true blacks and whites. I don't want to see clipped or crushed elements.
I don't know enough about HDR to know if it offers that or not, but that's another discussion.

So if you tell me that calibration is done so that the TV has accurate colors/blacks/whites, I accept that; but if you tell me it's done so that we see what the director wants us to see, I roll my eyes a bit.
post #6 of 22 - 03-07-2017, 08:32 PM - Kamikaze_Ice (Senior Member)
Quote:
Originally Posted by Norixone
Interesting. I hope you do not get too many people following this topic, or the whole industry will sue you.
Personally, I am not liking what I am seeing with HDR, but I guess it was difficult to sell new appliances based only on the 4K wagon: if you do not have a projector throwing onto a 3-metre screen or greater, it is very hard to justify the investment.
I have seen a few demos with projectors and TVs displaying HDR material and I was less than impressed. Any detail in the blacks simply disappears, and the bright colours seem to receive a boost in brightness. Why should I lose detail in the blacks in favour of a more highlighted colour?
I was shown (and it can be found online) a sequence from The Martian where the protagonist is picking up some rocks and in the background there are larger rocks and the sun. In SDR you see the rocks and Matt Damon's shadow clearly, while in HDR you see a blob of light in the background that overpowers everything else. If this is HDR, then no thank you. There are other examples, like the scene in Sicario with the shed and police car in the foreground. The colours of the houses become very dull and nearly indistinguishable.
Glad I am not the only one to think (and I am no expert) that HDR is nice on paper but currently not working. Give me DCI-P3 at 100% and I will be more than happy to sit back and enjoy.
Please remember that not every movie gets the attention/quality it deserves, so don't judge anything by a small sample size. Do you feel... lucky?
This goes for all video, not just HDR. I mean, look how many SDR movies have wrong black levels, for example.

Tom Roper made this little HDR clip (Grayspeak8bhdr_.mkv) that I think is a more honest example of excellent HDR application, where "marketing" is kept from interfering and making the HDR-ness overblown, tacky and exaggerated (see the Samsung comment below).
There are some more decent examples at demo-uhd3d.com. I like the LG demos far more than the Samsung ones, which look like someone went to nexusmods and picked up one of the shaders (colors overblown, very artificial look, tone mapping too clean and synthetic). But this might be because I have the matching display the files are designed for. What you experienced with The Martian may be along these lines as well, and this will become less of an issue as the whole HDR thing matures - like color television, or high definition when it first started out (e.g. 1366x768 displays marketed as 1080p because they could get away with downscaling: "it supports 1080p resolution").
post #7 of 22 - 03-07-2017, 08:52 PM - Kamikaze_Ice (Senior Member)
Quote:
Originally Posted by HammerJoe
So if you tell me that calibration is done so that the TV has accurate colors/blacks/whites, I accept that; but if you tell me it's done so that we see what the director wants us to see, I roll my eyes a bit.
"Seeing the director's intent" statements are usually to be taken at face value. It means basically what you're wanting: if they don't want us to see it, it WON'T be there, period.
How well a movie is produced/mastered/graded is a completely separate issue and affects us end users in all mediums (this is NOT restricted to HDR; as in my prior post, there are many movies STILL being done poorly, a la raised black levels).

Calibration will (er, should) become more intuitive thanks to HDR, where luminance values are fixed and never changing (unlike SDR, which is really a sliding scale). This enables realism. For example, if something super bright flashed in your eyes at night it would temporarily blind you. Imagine you had a light-controlled room to watch a movie that does this: with SDR it might make you squint a little, but with HDR you would experience a similar temporary blinding effect. That adds immersion, because it's realistic behavior rather than dimming the content on the screen to simulate the same behavior without physically experiencing it.

None of this will matter if content isn't created properly, though. I'm sure you know how marketing likes to go overkill on things like "vivid" modes for displays on a showroom floor, to show how much better they are than the ones around them. Or the chromatic aberration effect in games (light separation as it passes through curved glass) where it doesn't belong (you're not looking through glass in a first-person shooter...).
post #8 of 22 - 03-07-2017, 08:56 PM - Kamikaze_Ice (Senior Member)
Quote:
Originally Posted by harlekin
The tl;dr on HDR is that if I have to hear one more comparison to iPhone photo HDR, I will tend to want to slap you in the face - because the iPhone doesn't have an HDR display mode in place. So whatever BS you are talking about (third obvious lie in the current podcast), it's HDR faked to be visible in an SDR display environment - and, again, dark detail representation will not change at all, considering the detail is in the image to begin with.
Um. Just let me leave this here.
https://support.apple.com/en-us/HT207470


*runs away*
post #9 of 22 - 03-07-2017, 09:04 PM - acras13 (Advanced Member, Los Angeles, CA)
I understand what you are trying to say in regard to Scott Wilkinson on the podcast; sometimes it bothers me that he seems a little soft and non-confrontational when something sounds fishy, but I don't think Home Theater Geeks is considered, or marketed as, hard-hitting journalism - it's an enthusiasts' podcast. How long do you think the program would last if he flat out called B.S. on people or was confrontational with guests? I listen to the show every week, and overall enjoy it, but I understand that the guests have a vested interest in promoting their products and concepts in a positive light; it's my responsibility to look deeper or research the given product or concept to determine the B.S. quotient, as should anyone. Remember, the "average" consumer has never even heard of Home Theater Geeks, and the misinformation coming from the salesperson at Best Buy is what they are basing their purchase on, not a podcast that isn't slamming guests over questionable claims.
I think calling for his removal because of the podcast is unwarranted, and similar to asking for his removal because he plays the tuba and you don't feel that tuba music has any place on AVS. The podcast is a separate venture from AVS - he was doing it before he was on staff here - and I don't think it has any bearing on his position with AVS.
post #10 of 22 - 03-07-2017, 09:59 PM - RLBURNSIDE (AVS Forum Special Member)
The title of this thread - that increasing colour volume is mere marketing BS - is inflammatory and, on its face, ignorant, no matter how many other truths or partial truths (or complete untruths) might have motivated it.

Scott goes easy on all his guests, that's pretty much his MO and that's likely a reason why he gets so many accomplished guests on.

Colour volume is a three-dimensional space encompassing, as we all know, chrominance on one plane (two dimensions) and luminance on the other. Increasing either colour gamut, or dynamic range, or both, will increase colour volume. It WAS a marketing gimmick when Sony TVs would add extra colour gamut support with no content to show on it other than artificially expanded gamut. But now that we do in fact have rec 2020 and HDR content, we can and should buy the displays which best meet those specifications. And you can hit rec 2020 with RGB lasers, or even a series of lasers a la Pointer's gamut. The incoming RGB-encoded video doesn't force displays to use only three primaries. There are 3P laser projectors coming close to rec 2020 today, as well as 6P projectors which hit somewhere between P3 and 2020 (on purpose), so they can also do 3P-per-eye 3D. I own some Dolby 3D glasses, and indeed the colour separation / stereo contrast is way better than what polarized filters achieve.

Also, whenever I hear any argument about "director's intent" I am instantly dubious and skeptical and my BS detector goes off. Which filmmakers and colorists prefer washed out, dull colours, exactly? I bet they are in the minority.

Not to bring up a dead horse, but "intent" is not a uniform "I win" trump card to derail any discussion about improving the state of the art of filmmaking.

It leads to circular reasoning similar to this one:

"24 frames per second is what films were recorded at historically, so it must be because all filmmakers prefer it, so let's keep using it forever because all filmmakers will always love it, so shut up, I win". Or, another way to put it: an appeal to tradition ("this is familiar, so it must be good. No reason to ever improve things.")

If you ask Doug Trumbull or Stanley Kubrick or Ang Lee or Peter Jackson or James Cameron - all highly respected and accomplished - you'll get a solid endorsement of high frame rates and an explanation of why 24 fps is not enough. Research has shown (I'll find the link if you want) that when people are in fact given the choice, they prefer high frame rates to lower ones. Big surprise, huh! You wouldn't know it from reading AVS, an often truly regressive forum of anti-HDR and anti-HFR cynicism if I've ever seen one.

Why do we chase this "film-like" fetish, when things like excessive motion blur and judder have an objective, scientific name: temporal aliasing. Aliasing is another way of saying error. This is a science forum, is it not? So let's use scientific terms for them without marketing phrases such as "film-like" which seems like propaganda to me. A sales pitch: "buy this crappy product because it's familiar".

I bet any money the OP is a fan of 24p. In that case, it would be mendacious to assert that somehow reducing dE is of utmost importance to them. Why exactly is a given pixel's colour value so important, if you allow it to be completely smeared into its neighbours via motion blur induced by extremely low framerates?

Or, equally bad, to have completely the wrong colour because the viewer shouldn't be seeing that colour held on the screen for so long, and it suddenly "jumps" to somewhere else, as if the actor or moving object teleported from point A to point B without using any fancy sci-fi transporter. Pretty fake, huh. You cannot talk about colour accuracy without also mentioning blur and oversaturated / clipped whites due to insufficient dynamic range in the cameras or the source material. Thankfully, there are plenty of film negatives around to harvest extra dynamic range from, in order to remaster old content and show it to us in glorious, true-to-the-camera form. Any artistic-intent adjustments can, and IMO should, be added in post, at least for digital film production pipelines and restoration projects.

HDR is the one thing that I and a lot of other game developers and filmmakers (and yes, I have worked in both industries; there is a ton of overlap, even within the same media conglomerates) think is NOT BS, but actually authentic and a true innovation.

An HDR content creation pipeline is still worth having even if you have SDR displays, i.e. delay dynamic range compression until the very last moment, post-compositing and post-effects. Videogame programmers figured this out fifteen years ago.

The idea that SDR is closer to "artistic intent" is also BS. Cinematographers put up with having poor contrast and dim projectors because they had to, not because they wanted to.

Scott goes easy on everyone, including "experts" like Joe Kane who spout nonsense like "gamma can be just as good as PQ", which is mathematically wrong. Anyone talking about colour volume without mentioning the Barton threshold isn't really being serious.

LCD TVs have long had 300-400 nits, which is 3-4x what SDR content is mastered to - which means, guess what: you're NOT seeing it the way the "director intended" either. But people like bright TVs, naturally, so who wins? Well, with HDR signals, both can win. People can get a bright TV AND content on it which isn't garishly expanded.

When Dolby did their study showing 20k nits was what users actually preferred when given the choice, they undershot by delivering 10k peak nits in ST 2084. What's my point? 10k nits is half of what users preferred, and yet people are complaining that it's somehow too much! Talk about irrational. So either the OP is ignorant of these facts and studies, or being irrational.

My guess is perhaps a bit of both. It's not my intent to insult, just to clarify: expanded colour volume in TVs will only become "gimmicky" once they surpass 10k nits and rec 2020, and not one second before then. We can easily go to full rec 2020 by using multiple lasers of different wavelengths, or even multiple LEDs or QD light sources plus appropriate processing. I've written many such algorithms in 3D graphics to span volumes using control points, and it's actually quite trivial. Like the OP said, there is nothing special about colour volumes per se, so it made me scratch my head to think that we will not reach rec 2020 in the consumer space. And soon. But the idea that we shouldn't even bother because it's hard? Bah, I laugh at such tepid and unambitious pessimism and cynicism masquerading as wisdom.
WiFi-Spy and ray0414 like this.

Last edited by RLBURNSIDE; 03-07-2017 at 10:08 PM.
RLBURNSIDE is offline  
post #11 of 22 Old 03-08-2017, 09:47 AM
AVS Forum Special Member
 
GregLee's Avatar
 
Join Date: Jul 2002
Location: Waimanalo HI
Posts: 4,181
Quote:
Originally Posted by RLBURNSIDE View Post
Colour volume is a three dimensional space, encompassing, as we all know, chrominance on one plane (two dimensions), and luminance on the other.
That is not my understanding, if you mean the standard measure reported as a percent figure of merit for displays. Instead, it is a percent ratio of color spaces, where the denominator is a low luminance part of a standard space, currently either the dci-p3 space or the rec.2020 space, the part having less luminance than the peak luminance of the display being measured.

Greg Lee

Last edited by GregLee; 03-08-2017 at 10:07 AM.
GregLee is offline  
post #12 of 22 Old 03-08-2017, 11:51 AM
AVS Forum Special Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 3,337
Spaces can be two-dimensional if you're only talking about gamut, right? So when they say a % of rec 2020 they're making no mention of the peak nits or luminance of a display but only the relative surface area of one triangle vs another.

A volume is by definition three-dimensional, spanned by exactly three orthonormal basis vectors, X, Y, and Z, or put another way: chrominance U, chrominance V, and luminance Y when talking about YUV colour space rather than XYZ. And as everyone knows, a volume is a scalar, the product of extents along X, Y and Z, not just X * Y (which would be just the contribution from the gamut).

When you talk about RGB coordinates, those are barycentric coordinates, i.e. points inside a triangle spanned by the primaries. So actual values are normalized to the colour gamut, which means that P3 and sRGB / rec 709 (which, as I'm sure you know, share the same primaries) are simply smaller triangles than rec 2020. But the coordinates relative to each gamut are always normalized, barycentric coordinates. That's actually what the term "gamut" means: it's just a coordinate system relative to certain fixed points.
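The "relative surface area of one triangle vs another" part is easy to check from the published CIE xy primaries. The chromaticity values below are the standard rec 709 / DCI-P3 / rec 2020 ones; the shoelace formula is just the generic triangle-area identity (with the usual caveat that xy area is a crude metric compared to u'v'):

```python
# Area of a gamut triangle in CIE 1931 xy space via the shoelace formula.
PRIMARIES = {
    # (x, y) chromaticities of the R, G, B primaries
    "rec709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "dci_p3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "rec2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def gamut_area(prims):
    (x1, y1), (x2, y2), (x3, y3) = prims
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

a2020 = gamut_area(PRIMARIES["rec2020"])
for name in ("rec709", "dci_p3"):
    pct = 100.0 * gamut_area(PRIMARIES[name]) / a2020
    print(f"{name}: {pct:.1f}% of rec 2020 (xy area)")
# rec709 lands near 53%, dci_p3 near 72% -- the familiar coverage figures.
```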

But a wider gamut doesn't necessarily mean a larger colour volume and vice versa, for example rec 2020 primaries with SDR luminance range is about five times smaller than HDR10. It's really the product of the two which delivers the final volume. Which is why, for example, HDR10 with rec 709 primaries represents a much wiser upgrade than rec 2020 with SDR. It's literally five times better to go with HDR without WCG than the other way around.
RLBURNSIDE is offline  
post #13 of 22 Old 03-08-2017, 12:51 PM
AVS Forum Special Member
 
GregLee's Avatar
 
Join Date: Jul 2002
Location: Waimanalo HI
Posts: 4,181
Quote:
Originally Posted by RLBURNSIDE View Post
A volume is by definition three dimensional, ...
Yes, I know. It's confusing, and perhaps I should have referred to Vcrc instead of color volume. Nonetheless, the percent figure given as a measure of color volume in reviews is not a volume. It's a percent:
Quote:
The ICDM (International Committee for Display Measurement) established an evaluation method for color volume called Vcrc (Volume Color Reproduction Capability) based on measuring luminance (L*) and two color axes, red-green (a*) and blue-yellow (b*), also known as CIELAB.
from http://www.avsforum.com/samsung-colo...o-at-ces-2017/
And see http://www.avsforum.com/samsung-and-...e-measurement/.
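The XYZ-to-CIELAB transform underlying a metric like Vcrc is standard colorimetry; a small sketch (D65 white point assumed; the Vcrc aggregation over the volume itself is not reproduced here):

```python
def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):
    """Standard CIE XYZ -> L*a*b* conversion (D65 white by default)."""
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    fx, fy, fz = (f(c / w) for c, w in zip((X, Y, Z), white))
    L = 116.0 * fy - 16.0   # lightness, 0..100
    a = 500.0 * (fx - fy)   # red-green axis
    b = 200.0 * (fy - fz)   # blue-yellow axis
    return L, a, b

# The white point itself maps to L*=100, a*=b*=0:
print(xyz_to_lab(95.047, 100.0, 108.883))
```

A colour-volume figure in this space is then some integral over the (L*, a*, b*) solid a display can actually reach.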

Greg Lee
GregLee is offline  
post #14 of 22 Old 03-08-2017, 08:52 PM
AVS Forum Special Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 3,337
Right, the colour coordinates in these volumes in question taper off towards the extremes of white or black, so they aren't merely extruded triangle wedges. That tapering is why (and you probably know this already) bright blue skies can't remain blue in an SDR image without crushing the midtones into grey. Anyone who is serious about colour reproduction must be a fan of HDR, otherwise they are just spreading noise.

I've read all those articles here on AVS; why do you think I'm such a fan of HDR? I was instantly convinced of the immense potential of HDR displays as soon as I knew they were working on them. Because I know what my eyes can see, and SDR is crap compared to my eyes. Compressed to smithereens. Deprecated: mathematically, perceptually, scientifically inferior, without a doubt. And that's why I laugh when people think they're being savvy against marketing-speak by pretending that if they don't upgrade their TVs they aren't missing anything special. They are. And thankfully the market has spoken already: it's a hit. HDR practically sells itself. You just need a pair of eyeballs and you're sold. I'd bet even someone with severe eyesight loss would benefit from an HDR TV; in fact, they might benefit even more.
RLBURNSIDE is offline  
post #15 of 22 Old 03-09-2017, 12:24 AM
Advanced Member
 
dovercat's Avatar
 
Join Date: Apr 2008
Posts: 722
Quote:
Originally Posted by RLBURNSIDE View Post
Right, the colour coordinates in these volumes in question taper off towards the extremes of white or black, so they aren't merely extruded triangle wedges. That tapering is why (and you probably know this already) bright blue skies can't remain blue in an SDR image without crushing the midtones into grey. Anyone who is serious about colour reproduction must be a fan of HDR, otherwise they are just spreading noise.
Using the same display technology, surely the SDR colour space is capable of images as bright as the HDR colour space. The wider the colour gamut, the more saturated and narrowband the primaries, so presumably the less bright the primaries. I would guess differences in nits (cd/m2) come down to the HDR standards requiring a brighter, higher-contrast display, not to a wider colour space inherently creating brighter whites and darker blacks.

I think what a wider colour space gets you is more saturated primaries that can appear brighter to the eye due to the Helmholtz–Kohlrausch effect. The Helmholtz–Kohlrausch effect however varies by individual. The more saturated the primaries the more metamerism failure issues, where different people perceive the display's colours differently.

Rec 2020 appears to have a design flaw. Any display using the full Rec 2020 colour space via primaries appears to fall foul of metamerism failure. A display system is not creating good colour reproduction if it is causing viewers to suffer metamerism failure.

Last edited by dovercat; 03-09-2017 at 12:28 AM.
dovercat is offline  
post #16 of 22 Old 03-09-2017, 01:33 AM
Advanced Member
 
dovercat's Avatar
 
Join Date: Apr 2008
Posts: 722
Quote:
Originally Posted by RLBURNSIDE View Post
When Dolby did their study showing 20k nits was actually what users preferred when given the choice, they actually undershot by delivering 10k peak nits in st.2084. What's my point? 10k nits is half of what users preferred, and yet people are complaining that it's somehow too much! Talk about irrational. So, either the OP is ignorant of these facts and studies, or being irrational.
Actual users did not prefer 20k nits. The figure is a guesstimate based on an assumption.

Using a Dual Modulation Research Display capable of 0.004 cd/m2 to 20,000 cd/m2:
Quote:
A dynamic range between 0.1 and 650 cd/m2 matched the average preferences. To satisfy 90% of the population, a dynamic range from 0.005 to about 3000 cd/m2
But the researchers came up with the figure of 20,000 cd/m2
Quote:
They claim that since a display should be able to produce values brighter than the diffuse white maximum, as in specular highlights and emissive sources, that the average preferred maximum luminance for highlight reproduction satisfying 50% of viewers is about 2500 cd/m2 increasing to marginally over 20000 cd/m2 to satisfy 90%
http://www.mediaandbroadcast.bt.com/...nal-v1.01-.pdf

S. Daly, T. Kunkel, S. Farrell & Xing Sun: Viewer Preferences for Shadow, Diffuse, Specular, and Emissive Luminance Limits of High
Dynamic Range Displays. SID Display Week 2013. http://onlinelibrary.wiley.com/doi/1...271.x/abstract

Also interesting is the simultaneous dynamic range of human vision when fully adapted, say viewing a display in a living room. It is apparently only around 5,000:1. If the room dictates viewer eye adaptation, then presumably a display capable of, and using, a far higher contrast ratio is just going to at times appear dazzlingly bright or too dark to see what is going on. Even if the display dictates viewer eye adaptation, adaptation is not instantaneous; it takes time, so fast changes from very bright to very dark, or very dark to very bright, are presumably going to be problematic.

Then there is resolution: according to ITU-R BT.1845-1 the optimal viewing distance for a 4K display is 1.6 x image height. For 1080p it's 3.2 x image height.
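Those ITU figures fall straight out of 20/20 visual acuity, i.e. one arc-minute per pixel; a quick check (assuming picture height equals the vertical line count):

```python
import math

def optimal_distance_in_heights(vertical_lines, arcmin_per_pixel=1.0):
    # Distance (in picture heights) at which one pixel subtends
    # `arcmin_per_pixel` of visual angle -- the 20/20 acuity limit.
    pixel_angle = math.radians(arcmin_per_pixel / 60.0)
    return 1.0 / (vertical_lines * math.tan(pixel_angle))

print(round(optimal_distance_in_heights(1080), 1))  # 3.2 picture heights
print(round(optimal_distance_in_heights(2160), 1))  # 1.6 picture heights
```

Sit further back than that and the extra pixels are, by this model, below your acuity limit.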

And then there is colour space with full Rec 2020 using colour primaries causing metamerism failure.

It may be just as well that HDR displays on sale are mostly not capable of reaching the maximum brightness, contrast and colour specs of the HDR standards.
mo949 likes this.
dovercat is offline  
post #17 of 22 Old 03-09-2017, 07:21 AM
AVS Forum Special Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 3,337
Mentioned: 10 Post(s)
Tagged: 0 Thread(s)
Quoted: 1609 Post(s)
Liked: 1281
If content producers abuse people's eyes with extreme brightness shifts from one frame to the next, rather than one scene to the scene, whose fault is that, the TV's?

HDR can safely be used for highlights and the rest is a question of them learning the ropes to avoid fatiguing people's irises.

But the fact that your eyes can get tired through large absolute changes in APL just means displays are catching up to what reality offers us every day, and that's good.

I can see way more than 5k:1, and so can everyone, even in a bright showroom the black level difference between an edge-lit VA LCD with 5K native CR and an OLED next to it is obvious. And HDR is also super obvious to see so I don't buy the 5K limit, compared to SDR, from any distance, close or far away.

Just using your eyes can dispel a lot of incorrect info floating around. Seeing is believing. That's why HDR is popular, it's stunning and doesn't even need a BestBuy salesman to sell it. It sells itself.
RLBURNSIDE is offline  
post #18 of 22 Old 03-10-2017, 12:37 AM - Thread Starter
Advanced Member
 
harlekin's Avatar
 
Join Date: May 2007
Posts: 583
To address some of the more general points in here: the criticism I provided specifically never went the "HDR at 5000 or 10000 nits is not necessary" route, because that's entirely open to interpretation.

Do I trust the Dolby "experimental results"? No, I do not. They weren't independently verified, so they are not science. It's as simple as that. That 10000 nits is a "target" at all (it's not as if we are going to reach it anytime soon) probably had more to do with the influence of the marketing department than anything else.

Please don't use this thread to drive the discussion towards "it all depends on the artistic intent" - as this is the same BS line we currently get fed by every participant in this industry. "We don't know how artists will use the capabilities."

Fine - but we know when and how we can't reproduce original intent in our living rooms: when the actual targets shift, when the way we deal with above-max-brightness information shifts, or when Samsung can go out and pull an entirely BS certification of "color volume matched!" out of a German certification outlet - and NONE of you is able to interpret what that actually means.

There is no standard for DCI-P3 HDR - so what color volume at "100%" are they getting certified? The 65 nits we usually see in commercial cinemas? And again - if they want to proclaim that they are orienting themselves towards the mastering displays used - first, they are not, because they don't deliver those capabilities in the home user segment, and second, even mastering "targets" are shifting all over the place, with every new nits increase studios are willing to spend money on. Why? Because presumably they have some dope colorists that want to try their hands at being artists.
--

But let's now turn to the current episode of AVSForum's Home Theatre Geeks, and let me give you another summary of what's becoming harder and harder to swallow as a somewhat intelligent person watching this field.

You have four "industry influencers/calibrators/journalists" who don't know a thing about what they are talking about, and instead are exchanging generalized set phrases to seem like they know what they are doing.

2017 LG OLEDs are supposed to show more black detail out of the box. According to the panel, that's because LG purposefully crushed black detail on their 2016 OLEDs to hide impurities out of black. None of them is offering up information on the gamma curves used. The entire industry has no concept at all of what gamma to stick to, thanks to bt.1886, and the "impurities" aren't specified in any way. Also, you have some people on the panel indicating that they "knew this all along and it's great that the issue finally gets addressed" -

Here is what happened for real. None of the people on the panel has probably ever looked at the curve that dictates what out-of-black looks like. Because they are using Calman workflows, and Calman tends to hide that setting stage from you - resulting in entirely botched display shootouts in the past, where all panels on display were calibrated to different gamma targets.

None of them had noticed the near black crush issue in the past. And none of them has put it in their "reviews" of previous generation panels.

None of them has played around with different gamma presets to see how they can adjust out-of-black (PLG 2.4 is not what looks right - regardless of what bt.1886 tells you about "perfect black level" devices).

None of them has seen artefacting near black on the previous generation in the past.

None of them is able to quantify it - because the dE 2000 formula tells us that all of those changes are "below the visible threshold" - which of course isn't true.

And none of them has been able to quantify it - because bt.1886 is constantly shifting targets around as well - so as an industry you literally have to pry calibration monitors out of color correction facilities and look at the gamma levels they are setting.
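A quick way to see how lightness-based dE metrics can bury near-black errors (the two black levels below are my own illustrative picks, not measurements from any shootout): convert luminance to CIE L* and compare the difference to the classic ~1.0 just-noticeable-difference.

```python
def lab_lightness(y_rel):
    # CIE L* from relative luminance Y/Yn; linear segment near black.
    return 903.3 * y_rel if y_rel <= 0.008856 else 116.0 * y_rel ** (1.0 / 3.0) - 16.0

# Two near-black levels: 0.05% vs 0.15% of peak white -- a 3:1
# black-level difference that is obvious in a dark room.
dL = lab_lightness(0.0015) - lab_lightness(0.0005)
print(dL)  # ~0.9: "below threshold" by the dE book, yet plainly visible
```

Full dE 2000 adds chroma/hue terms and further weighting, but the lightness axis alone already shows why a 3x black-level error can score under one "just noticeable difference".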
-

"I noticed a slight green tint on the 2016 OLEDs" is driven by aggregated criticism that the out-of-the-box calibration on LG OLEDs was botched - sometimes with no way to correct for it (the Dolby Vision preset is locked down) - and what Scott Wilkinson is doing there is called mirroring or pandering, depending on your standpoint. The "green tint" out of the box was always above dE 10 on most devices, so everyone with a keen eye was able to spot it instantly - even in showroom conditions, if they had to.

But then - Scott Wilkinson didn't, until it was thrown around in forums - and he read it as part of criticism he hadn't expected. Probably. Good news is that this is all on tape - because they taped an entire calibration where "perfect, oh - so great" was thrown around by them while Calman showed them dEs exceeding 15.
-

None of the bunch except Robert Heron seems to be interested in ColorChecker values with the new 3D LUT (meaning they didn't conceptualize at all what this is about); instead they are very, very grateful for being able to adjust the "brightness slider" more gradually.

For all I care it could be taped to the TV - because you don't mess with "true black" once you've got it. No one cares if 3 or 7 IRE look "better" if true black is gone. More granularity in this slider is almost useless - if you want to adjust low IRE values independent of the black level, you adjust the gamma curve. As there is no real target for gamma in bt.709 anyhow, you are practically free to do anything in there.

But the four celebrate a more granular brightness control like it would bring any improvements - it doesn't, at all. At least not for people that can calibrate a gamma curve. And how those four aren't able to talk about what gamma looked like on the LG 2017 models, we've already discussed.

The next highlight is how all of a sudden viewing angle stability improved because of the filter layer on new TVs (that's probably correct), and it is used again - without any attempt to quantify it - just to divert from the fact that they have a hard time conceptualizing that there can be a green tint overall and a red tint on black at the same time - better move in a plausible-deniability excuse of "oh, it must have been the viewing angles at that point".

Oh, and by the way - the red tint on black in 2016 OLEDs? It's closer to neutral black (chroma-wise, in the CIE triangle) than on any other TV I've ever measured. Granted, other TVs usually tend to move "panel black" more towards blue than towards red - and that is no remedy for you being able to see the tint in a lit room (can you?) - but then, why has no one ever talked about those things in the past?

Answer - because according to the dE 2000 formula those issues all don't exist (they all fall below the visually perceptible threshold).

But the overarching point is still there - they get fed their list of "improvements" from the manufacturer - then proclaim that they have always known about the "issue" in the past - although none of them has written about it. (Presumably, but with a close to 100% likelihood.)
-

SpectraCal trying to gain dominance of interpretation over the "golden reference value" is at least an interesting move (because up until now that guaranteed that Dolby Vision had "secret knowledge" on how to calibrate a display, which came into being through private talks with a display manufacturer - so there was no comparability, and you couldn't look at the math and possibly criticise it for what it was doing). So maybe that's a positive.

But then comes the part where three of them talk about being fed a 10000 nits test pattern (on OLEDs, right...) and being able to make out details up to 5000 nits (2016 model) or 7000 nits (2017 model) - without realizing that both numbers are the result of transformations. Because none of the TVs can display content that bright.

The subsequent discussion where "clipping is bad" is so full of openly displayed ignorance that it physically pains me - and I had to pause watching to write this. If you are watching a 10000 nits test pattern - guess what - it clips on an OLED. AND IT'S A FREAKING GOOD THING IT DOES - because if you try to cram in the bright details, you either have to do so non-uniformly (not according to the EOTF), or move all other colors away from target by so much that you are finally able to get the entire "contrast range" mapped onto a display a tenth as bright.

So no - clipping is not bad, considering that that's what your eyes would do in the first place (until they accommodate), and that the TVs' capabilities aren't there.

Yes - you probably want to scale a little before you clip (that's the "we saw details up to 5000 nits" (no, you didn't) or "7000 nits" (no, you didn't)), but at some point you probably want to clip. And the point you are choosing is "magic". Meaning - either the Dolby "golden reference value" we can't look at, or whatever Calman tries to cut out of it as their "trademarked magic point" in the future.

To then jump to "director's intent" when you are talking about peak brightness detail is laughable. Tell me that peak brightness detail ("you have to reach 7000 nits to appreciate my intent") is used as a stylistic device - and I will laugh at you. Currently it's used for effect - and if you can't see the sunspots on the sun directly after exiting a cave - "oh my" - let's say the director most likely won't obsess over it when the movie gets screened on a Stewart film screen at 65 nits tops.

Clipping high-brightness detail (above a reasonable threshold (= "planes shouldn't vanish out of the sky")) is what EVERYONE DOES. And if they don't, your entire scene's color representation gets out of whack quickly. The golden reference dictates where clipping starts and how aggressive it is in practice. Calman is trying to move in on that and take the dominance of interpretation away from Dolby. But in the panel we have four people who agree that they would rather see a 5000 or 7000 nits detail on a 700 nits display - even if it means that the entire luminance range has to be compressed just to make that possible.
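The "scale a little before you clip" behaviour can be sketched as a simple knee curve. The 80% knee point and the linear roll-off below are my illustrative assumptions, not the Dolby or Calman reference math - which is exactly the point: that math is proprietary.

```python
def tonemap(nits, display_peak, knee_frac=0.8):
    # Track the EOTF 1:1 up to the knee, roll highlights off linearly
    # into the remaining headroom, then hard-clip at panel peak.
    knee = knee_frac * display_peak
    if nits <= knee:
        return nits                    # faithful region: no remapping
    headroom = display_peak - knee
    rolloff_span = 10000.0 - knee      # everything up to the PQ max
    compressed = knee + headroom * (nits - knee) / rolloff_span
    return min(compressed, display_peak)

# On a 700-nit panel: midtones pass through untouched, a 5000-nit
# highlight is compressed (not clipped), and 10000 nits clips.
print(tonemap(300, 700))    # 300.0 -- untouched
print(tonemap(5000, 700))   # between 560 and 700
print(tonemap(10000, 700))  # 700.0 -- clipped at panel peak
```

Moving `knee_frac` up trades highlight detail for midtone fidelity; that trade-off is precisely what the "golden reference" hides.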

The idiocy is close to boundless.

And I'm not even speaking about the fact that current movies are mastered on 2000 and 4000 nits displays - and then (ideally) go through a process where they are viewed again on consumer grade devices to spot whether any "significant image detail" gets lost - but leave it to our panel of four morons (I think that's fair) to be impressed by a 10000 nits test pattern on some OLED TVs. And under the influence of that, proclaim that "clipping is always bad" - because that's what they learned with SDR.

Honestly - what do you do if none of the journalists out there have the smarts to even fathom what they ought to be reporting on? You can't put them in school - because that stuff gets made up as we go along - and watching them trying to deduce valuable information from set pieces made up by a marketing department is painful to watch.

Heck - they apparently had hours to "calibrate those TVs" but were unable to come up with a gamma curve that shows us what black detail they actually saw.

And the only statement any one of those four would make about color accuracy was "that ColorChecker colors all were below the visible threshold" - no numbers, no screens - nothing. So maybe - instead of just hawking the PR line of how much more granular this year's 3D LUT in some TVs is - show us some representation of that in action.

You were let loose in a room of TVs with measuring equipment, and all you have to tell us are the same marketing lines the PR department fed you?

Another week, another "great" showcase of where this industry is lacking in transparency, knowledge and the ability to trust the "influencers" (they talk about a display being perfect until there is a public consensus that it is not - then they jump on that, without knowing what it indicates - or jump on the next improvement of a thing they haven't noticed in the past - that's your "Home Theatre journalism" of 2017). Also, the sheer lack of knowledge about stuff they are supposedly reporting on "objectively" is still outrageous.

And the same pundits then try to condense the information down even further to give people a sense of what to buy. It's a joke - really. In reality they are happy for all the pointers manufacturers can give them to copy - because they wouldn't be able to navigate "what changed" without them. They aren't even able to sort or weigh those changes afterwards. (Heron at least tries - he at least is able to apply some knowledge about calibration to the aspects he is supposed to test - but the majority of "HiFi journalists" have seemingly long abandoned the notion of trying to understand what's going on.)

Wilkinson, for all that it's worth, is still stuck in the cognitive dissonance loop of "last time they said it was perfect - and this time they said it was more perfect - I don't know what to tell my peers... Haha - Ha..." - which prompts the question: are you a panelist in a format like "The View", or are you an actual journalist - one that at some point isn't primarily "shocked" that marketing does its job? Get your act together. Please.

Last edited by harlekin; 03-10-2017 at 12:57 AM.
harlekin is offline  
post #19 of 22 Old 03-10-2017, 02:14 AM - Thread Starter
Advanced Member
 
harlekin's Avatar
 
Join Date: May 2007
Posts: 583
Oh, and by the way: you have permission to forcibly "active facepalm" anyone that tries to explain any aspect of the whole HDR proposition, and where it currently sits, by comparing it to "something that went on with Dolby Audio...".

First - I know that's a hard one to get right - but audio is not video. Try to read that with your ears...

...and second - compressing the whole container according to the brightest detail information "as mastered" down to the capabilities of every TV means nothing more or less than introducing a large amount of differing colors for little reason - even down to the point where all the colors that could have been displayed correctly get displayed wrongly because of one scene in the movie.

The dynamic metadata approach - which prevents that - is more than just format-war politics. Once HDR10 has its equivalent of dynamic metadata in place, you can pull that argument out of the drawer again - but up to that point, maybe don't think about "how it was with sound a few years ago" - but actually try to understand what's happening here.
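The static-vs-dynamic difference is easy to show with a toy tone mapper. The linear scaling and the specific nit values below are deliberate oversimplifications of mine (real mappers use knee curves), but the metadata point survives the simplification:

```python
def scale_to_display(pixel_nits, source_max, display_peak):
    # Toy tone mapper: compress only when the source exceeds the display.
    if source_max <= display_peak:
        return pixel_nits
    return pixel_nits * display_peak / source_max

DISPLAY_PEAK = 600.0
pixel = 100.0  # a face lit to 100 nits in a dim indoor scene

# Static metadata: one movie-wide MaxCLL (say 4000 nits) governs the
# WHOLE film, so this dim scene gets compressed for a bright scene
# that happens somewhere else entirely.
static = scale_to_display(pixel, source_max=4000.0, display_peak=DISPLAY_PEAK)

# Dynamic metadata: a per-scene max (100 nits here) -> no compression.
dynamic = scale_to_display(pixel, source_max=100.0, display_peak=DISPLAY_PEAK)

print(static, dynamic)  # 15.0 vs 100.0 -- same pixel, wildly different output
```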

Again - those guys are supposed to be journalists for G's sake...

The logic in those panel discussions right now is on the level of "he killed the suspect with a frozen pudding, that then melted, and then put syrup on the corpses and ate them all so we could not find them - that's why we call him the syrup killer". And as non sequitur as this comment seems, it's actually a plotline in a Game Boy game I played yesterday - so it stuck with me as an example of "freeform logic". You could also go back to the example of Greek gods and how the world worked according to consensus back in those days.

Last edited by harlekin; 03-10-2017 at 02:19 AM.
harlekin is offline  
post #20 of 22 Old 03-10-2017, 08:53 AM
AVS Forum Special Member
 
ConnecTEDDD's Avatar
 
Join Date: Sep 2010
Location: Athens, Greece
Posts: 4,697
Quote:
Originally Posted by harlekin View Post
Answer - because acording to the dE 2000 formula those issues all don't exist (they ll fall below the visually perceptable threshhold).
CalMAN 5.x applies perceptual filtering to the calculated dE numbers for low-end / low-luminance measured patches; this has been reported since 2012 (by Chad as well) here: http://www.spectracal.com/forum/view...hp?f=92&t=4471

Huge dE94 differences between ChromaPure and CalMAN

ConnecTEDDD is online now  
post #21 of 22 Old 03-11-2017, 12:50 AM
Advanced Member
 
dovercat's Avatar
 
Join Date: Apr 2008
Posts: 722
Quote:
Originally Posted by RLBURNSIDE View Post
I can see way more than 5k:1, and so can everyone, even in a bright showroom the black level difference between an edge-lit VA LCD with 5K native CR and an OLED next to it is obvious. And HDR is also super obvious to see so I don't buy the 5K limit, compared to SDR, from any distance, close or far away.
In a bright showroom, due to screen reflectance, I expect the OLED has an effective contrast ratio of under 5,000:1 and the LCD under 1,000:1. In a brightly lit showroom, for typical-luminance video content it could be an effective contrast ratio under 1,000:1 for the OLED and under 100:1 for the LCD. A typical display has what, 5% screen reflectance; a good one 2.2%; an outstanding one 1.2%. In a bright showroom you are not going to see the display producing anything in the ballpark of the spec-sheet native contrast ratio.
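Screen reflectance dominates quickly; here is a back-of-envelope ambient-contrast calculation. The 500 lux showroom, 1.5% reflectance and panel numbers are my illustrative assumptions, and the reflected-luminance term assumes a roughly Lambertian screen:

```python
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance):
    # Reflected ambient light raises both the white and the black level.
    # Lambertian approximation: reflected luminance = lux * rho / pi.
    reflected = ambient_lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

LUX = 500.0  # bright showroom (assumed)

oled = effective_contrast(700.0, 0.0005, LUX, 0.015)  # "infinite" native CR
lcd = effective_contrast(400.0, 0.08, LUX, 0.015)     # 5000:1 native CR

print(round(oled), round(lcd))  # both collapse to a few hundred : 1
```

Even the OLED's effectively infinite native contrast ends up in the hundreds once the room is bright, which is the point being made above.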

Last edited by dovercat; 03-11-2017 at 12:56 AM.
dovercat is offline  
post #22 of 22 Old Today, 06:46 PM
Advanced Member
 
rak306's Avatar
 
Join Date: Oct 2004
Location: Syracuse NY
Posts: 861
Quote:
Originally Posted by RLBURNSIDE View Post
...
If you ask Doug Trumbull ... you'll get a solid endorsement of high frame rates and an explanation of why 24 fps is not enough. Research has shown (I'll find the link if you want) that when people are in fact given the choice, they do in fact prefer high frame rates to lower ones. Big surprise, huh! ...
Good for sports, lousy for movies - unless you like soap operas. I'm sure there is a compromise, where 120 fps capture can be temporally filtered to give a 24 fps movie look without the motion judder. But there is no doubt: high frame rate has a video-studio look, and takes you out of the movie.

Quote:
Originally Posted by RLBURNSIDE View Post
Why do we chase this "film-like" fetish, when things like excessive motion blur and judder have an objective, scientific name: temporal aliasing. Aliasing is another way of saying error. This is a science forum, is it not? So let's use scientific terms for them without marketing phrases such as "film-like" which seems like propaganda to me. A sales pitch: "buy this crappy product because it's familiar".
I was about 15 when I asked myself this question, why does video look like video, and why does film, even when shown on video, look like film. The answer is (mostly) the frame rate.

Yes motion blur and judder are very annoying, but not as annoying as the soap opera look. If you look at the C.M.D. function of the JVC projectors, (set to Low), the judder is (mostly) eliminated while retaining that 'film look', which is real.

Without some motion blur, video starts to look strobed, which does not look natural.

Quote:
Originally Posted by RLBURNSIDE View Post

Scott goes easy on everyone, including "experts" like Joe Kane who spout nonsense like Gamma can be just as good as PQ which is totally wrong, mathematically.
I am not a fan of Mr. Kane, but he was favoring 12 bit gamma. Compared to the 10 bit PQ being delivered in UHD HDR, 12 bit gamma has similar step sizes.
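The step-size comparison is straightforward to check from the ST 2084 EOTF itself. The constants below are the published SMPTE values; comparing the relative luminance step between adjacent codes at a fixed luminance (0.1 nits here, my choice of a deep-shadow point) is a simplified stand-in for a full Barton-threshold analysis:

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(e):
    # PQ EOTF: normalized code value e in [0, 1] -> absolute nits.
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def gamma(e, peak=10000, g=2.4):
    # Plain power-law EOTF stretched over the same 0..10000 nit range.
    return peak * e ** g

def rel_step_at(eotf, target_nits, codes):
    # Find the code closest to target_nits and report the jump to the
    # next code as a fraction of luminance (the quantity threshold
    # curves like Barton's are about).
    n = min(range(1, codes), key=lambda i: abs(eotf(i / codes) - target_nits))
    return (eotf((n + 1) / codes) - eotf(n / codes)) / eotf(n / codes)

for label, eotf, codes in [("10-bit PQ", pq, 1023),
                           ("10-bit gamma", gamma, 1023),
                           ("12-bit gamma", gamma, 4095)]:
    print(label, f"{rel_step_at(eotf, 0.1, codes):.1%}")
```

At 0.1 nits the 10-bit gamma steps are coarse, the 12-bit gamma steps come much closer to 10-bit PQ (which supports the "similar step sizes" point), and PQ is still the finest of the three in the shadows.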

Quote:
Originally Posted by RLBURNSIDE View Post
When Dolby did their study showing 20k nits was actually what users preferred when given the choice, they actually undershot by delivering 10k peak nits in st.2084. What's my point? 10k nits is half of what users preferred, and yet people are complaining that it's somehow too much!
How large was the screen used in those Dolby studies? I bet if they had used a 100" wide, the preferred peak brightness would have been much lower.
rak306 is online now  