AVS › AVS Forum › Display Devices › Display Calibration › The Official ChromaPure thread

The Official ChromaPure thread - Page 129

post #3841 of 5348
I just listened to Scott Wilkinson's response to manufacturers' boasts about their displays having a much wider gamut. Scott correctly points out that content (broadcast and Blu-ray) is mastered to a narrower gamut than what wide gamut displays are capable of, so he asks, what good is a wide gamut display? The manufacturers' response is that the content was originally filmed using a much wider gamut, which was shrunk only when transferred to Blu-ray or mastered for broadcast. The wide gamut display, the argument goes, merely restores these missing colors.

Let me explain why this is nonsense. In the first place this is nothing new. Manufacturers have been offering wide gamut displays for a long time. The original JVC LCoS projector, the RS1, had an extremely wide gamut. This is an old argument, but manufacturers keep recycling it as a way of convincing gullible consumers to purchase their products.

Second, while it is true that content is originally filmed using a wider gamut, most of the colors we see don't take advantage of the extra colors. Skin tones, trees, grass, sky are all composed of colors that fall well within the Rec. 709 HD standard. Only a relatively small percentage of the colors are lost when the gamut is shrunk to fit the HD standard.

Third, and most importantly, when remapping colors from a wide gamut source to a narrower gamut medium, the ONLY colors that are affected are the small number that fall outside the HD gamut. The rest of the colors--the vast majority--stay the same. Think for a moment why this is true. If a skin tone naturally falls well within the HD gamut, you wouldn't want to change that color when remapping wide gamut source material to HD. That would visibly distort the skin tone. Only the relatively few very saturated colors will be remapped to fall within the HD boundary.

This explains why attempting to restore the original gamut by playing HD content on a wide gamut display is a bad idea. What the wide gamut display does is oversaturate ALL colors--including the skin tones I mentioned in the previous example. The display has no way of distinguishing between those colors that were compressed during the mastering process and those that weren't, so it indiscriminately increases the saturation of all of the colors.

BTW, there is one practical advantage to a wide gamut display for some users. If you have a good color management system, then it is useful to have as a starting point oversaturated colors that can then be desaturated to the correct HD standards. Other than this, manufacturer claims about wide gamut displays are unhelpful marketing hype that should be ignored.
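The third point lends itself to a quick numerical check. The sketch below (an illustration, not anyone's actual display pipeline; the skin-tone triplet is made up) builds RGB-to-XYZ matrices from the Rec. 709 and DCI-P3 primaries and decodes the same in-gamut triplet with each. The chromaticity shifts even though the color is nowhere near the gamut boundary, which is exactly the indiscriminate oversaturation described above.

```python
# Sketch: decoding Rec. 709-encoded RGB with wider (DCI-P3) primaries
# shifts the chromaticity of EVERY color, not just the saturated ones.
# Linear-light values only; display gamma is ignored for simplicity.

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def solve3(M, v):
    # Cramer's rule for a 3x3 system M @ s = v
    d = det3(M)
    out = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = v[i]
        out.append(det3(Mj) / d)
    return out

def rgb_to_xyz(primaries, white):
    """Build an RGB->XYZ matrix from (x, y) chromaticity coordinates."""
    def xyz(x, y):                 # XYZ of a chromaticity at Y = 1
        return (x / y, 1.0, (1 - x - y) / y)
    cols = [xyz(*p) for p in primaries]
    M = [[cols[j][i] for j in range(3)] for i in range(3)]
    s = solve3(M, xyz(*white))     # scale so RGB = (1,1,1) lands on white
    return [[M[i][j] * s[j] for j in range(3)] for i in range(3)]

def chromaticity(M, rgb):
    X, Y, Z = (sum(M[i][j] * rgb[j] for j in range(3)) for i in range(3))
    return (X / (X + Y + Z), Y / (X + Y + Z))

D65 = (0.3127, 0.3290)
M709 = rgb_to_xyz([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)], D65)
MP3  = rgb_to_xyz([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)

skin = (0.80, 0.55, 0.45)   # made-up skin-tone-ish triplet, well inside 709
x709, y709 = chromaticity(M709, skin)
xp3,  yp3  = chromaticity(MP3, skin)
# x moves by roughly 0.008 toward red -- a visible shift on a color that
# never left the Rec. 709 gamut.
```

White still maps to D65 on both (each matrix is normalized to the same white point), which is why greyscale can measure perfectly on a wide-gamut display even while every color is oversaturated.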
Edited by TomHuffman - 2/7/13 at 11:42am
post #3842 of 5348
Quote:
Originally Posted by TomHuffman View Post

Small windows are fine. What is not fine is full fields.

Thank you.
What is the latest version of ChromaPure?
I'm using the 2.3.2.4xxx version.

And as far as your post just above is concerned, are you talking about the 4:4:4 option in some BRD players?
Do you recommend staying at 4:2:2 instead?

Remi.
post #3843 of 5348
Quote:
Originally Posted by TomHuffman View Post

I just listened to Scott Wilkinson's response to manufacturers' boasts about their displays having a much wider gamut. [...] Other than this, manufacturer claims about wide gamut displays are unhelpful marketing hype that should be ignored.

Well said Tom, as usual.
post #3844 of 5348
Quote:
Originally Posted by TomHuffman View Post

I just listened to Scott Wilkinson's response to manufacturers' boasts about their displays having a much wider gamut. [...] Other than this, manufacturer claims about wide gamut displays are unhelpful marketing hype that should be ignored.

Tom: Thank You
post #3845 of 5348
Quote:
Originally Posted by mlg33 View Post

Thank you.
What is the latest version of Chromapure ?
I'm using the 2.3.2.4xxx version.

And as far as your post just above is concerned, are you talking about the 4:4:4 option in some BRD players ?
Do you recommend staying at 4:2:2 instead ?

Remi.
It depends on the Blu-ray player; some have good color space conversions, others have bad conversions...

It's better to test and choose the color space that does not have chromatic errors...

See Doug's post: http://www.avsforum.com/t/1454748/blurry-player-settings-for-calibration#post_22891380
Quote:
If there is no obvious difference between YCbCr modes and RGB modes, YCbCr 4:2:2 is the best mode to use... period. It delivers 12-bit video while the other options are all limited to 8 bits (discussed in other forum threads and is NOT what most people THINK happens -- most people think all format options are 8-bits unless you enable a higher-bit mode in the player... but that's not true). Even Lumagen, makers of the best video processors available in the consumer market, recommend 4:2:2 unless there is a compelling reason to use some other mode (like RGB looks better with some specific video displays).

In my experience, about 60% of the time, every mode looks identical. Somewhere around 5%-10% of the time, RGB looks better (video display determines this) and the remainder of the time YCbCr looks better. So you have a 90-95% chance of YCbCr 4:2:2 being the best option.
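For context on the bit-depth point in Doug's quote, the quantization step shrinks by half with each extra bit, so 12-bit video has far finer gradation steps than 8-bit. A quick sketch:

```python
# Quantization step size as a fraction of full scale for common HDMI
# bit depths. Smaller steps mean less visible banding in gradients.
steps = {bits: 1 / (2 ** bits - 1) for bits in (8, 10, 12)}

# 12-bit steps are about 16x finer than 8-bit steps.
ratio = steps[8] / steps[12]
```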
post #3846 of 5348
Yes Tom, once that color info is discarded/remapped to fit in a Rec. 709 colorspace, it can never be recreated properly. Hopefully, if 4K comes along, it will have a wider colorspace with more bit depth (personally, I'd rather see 10-12 bit color before a wider CS though).
post #3847 of 5348
Quote:
Originally Posted by TomHuffman View Post

I just listened to Scott Wilkinson's response to manufacturers' boasts about their displays having a much wider gamut. [...] Other than this, manufacturer claims about wide gamut displays are unhelpful marketing hype that should be ignored.
hello Tom
Nice info.
What about the case where we use the display to view pictures and movies that we take ourselves with a digital camera? I do not know digital cameras that well. Do they have wider gamut capability besides standard sRGB? If yes, would that make a case for wide gamut displays?
post #3848 of 5348
Quote:
Originally Posted by realzven View Post

It depends on the Bluray players, some have good color space conversions, other have bad conversions...

it's better to test and choose the color space that not have chromatic errors...

see doug post : http://www.avsforum.com/t/1454748/blurry-player-settings-for-calibration#post_22891380
Thank you.
Too bad I calibrated my TV in a 4:4:4 format!

Let's do this again then!!! This is so much fun anyway...
post #3849 of 5348
Quote:
Originally Posted by mlg33 View Post

And as far as your post just above is concerned, are you talking about the 4:4:4 option in some BRD players?
Do you recommend staying at 4:2:2 instead?

Remi.
4:2:2 and 4:4:4 are different types of chroma subsampling. In effect, color information is compressed during transmission and then uncompressed for display. In the overwhelming number of cases, your eyes should not be able to see the difference between them (Blu-ray is compressed even more at 4:2:0). Nonetheless, a lot of displays expect 4:2:2, so sending it in that format may reduce by one the number of format conversions required. The bottom line is that it shouldn't make any difference.
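A minimal sketch of what 4:2:2 does on one scanline (the pixel values are arbitrary): luma keeps full horizontal resolution, while the chroma pair (Cb, Cr) is stored once per two pixels and simply repeated on reconstruction.

```python
# 4:2:2 chroma subsampling on one scanline of (Y, Cb, Cr) pixels:
# all Y samples survive; Cb/Cr are kept for every other pixel only.
def subsample_422(line):
    ys  = [p[0] for p in line]
    cbs = [p[1] for p in line[::2]]
    crs = [p[2] for p in line[::2]]
    return ys, cbs, crs

def reconstruct_422(ys, cbs, crs):
    # Nearest-neighbor reconstruction: each pixel pair shares one chroma.
    return [(y, cbs[i // 2], crs[i // 2]) for i, y in enumerate(ys)]

line = [(60, 100, 140), (62, 102, 141), (64, 104, 143), (66, 106, 144)]
ys, cbs, crs = subsample_422(line)
out = reconstruct_422(ys, cbs, crs)
# Luma is untouched; the chroma of odd pixels is approximated by its
# neighbor -- a difference the eye rarely resolves at normal viewing.
```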
post #3850 of 5348
Quote:
Originally Posted by JimP View Post

Nice writeup Tom.

What colors would benefit from a Wide colorspace system?

On my 2012 Panasonic 65VT50, there is a menu setting to change the color space to "Wide". The description is to increase greens and blues. Sounds like an artificial way to pump up football games and nature content.
As long as the content we feed into displays--Blu-ray and HD broadcast--is mastered to Rec. 709, there is NO benefit to a wide color space, other than its practical usefulness for CMS adjustments. The VT50's Standard gamut is very, very close to the correct gamut.
post #3851 of 5348
Quote:
Originally Posted by Geof View Post

Yes Tom, once that color info is discarded/remapped to fit in a REC709 colorspace it can never be recreated properly. Hopefully if 4K comes along it will have a wider colorspace with more bit depth (personally I'd rather see 10-12 bit color before a wider CS tho).
There is a new standard that may be adopted for 4K material--Rec. 2020. You can see the specs here. Rec. 2020 requires at least 10-bit color.

For those who don't know, this just means that within the defined gamut there are a larger number of individual colors, each defined by a unique RGB triplet. 8-bit color, which is what we have been using for HD, spans 16-235 (I am ignoring above-white and below-black codes). This constitutes over 10,000,000 colors (220^3) and 220 shades of gray. 10-bit color spans 64-940, which is about 674 million colors (877^3) and 877 shades of gray. Note: this has nothing to do with the size of the gamut, but the number of discrete colors within the prescribed gamut.
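These counts are easy to verify; note that counting both endpoints of the legal range inclusively gives 220 and 877 levels:

```python
# Distinct gray steps and RGB triplets in the legal video range
# (excluding below-black and above-white codes).
def video_levels(lo, hi):
    levels = hi - lo + 1          # inclusive code range
    return levels, levels ** 3    # gray shades, distinct colors

g8, c8 = video_levels(16, 235)    # 8-bit video range
g10, c10 = video_levels(64, 940)  # 10-bit video range
```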
post #3852 of 5348
Quote:
Originally Posted by turboman123 View Post

hello Tom
Nice info.
What about the case where we use the display to view pictures and movies that we take ourselves with a digital camera.. I do not know digital cameras that well. Do they have wider gamut capability, besides standard sRGB? If yes, would that make a case for wide gamut displays?
Sure. I am just talking about commercially produced video content. Photographic images are a whole different issue and are not limited by the HD standard.
post #3853 of 5348
I agree with everything you say Tom. But I would clarify that the Rec. 2020 standard, besides going to a longer bit length, does have a color space roughly double the area of the Rec. 709 space. The increase in bit length is really needed to get rid of banding, but the greater gamut area is necessary for the marketing of 4K and making the consumer want it, especially when the consumer is married to 55-inch flat panel displays.
post #3854 of 5348
Quote:
Originally Posted by TomHuffman View Post

4:2:2 and 4:4:4 are different types of chroma subsampling. In effect, color information is compressed during transmission and then uncompressed for display. In the overwhelming number of cases, your eyes should not be able to see the difference between them (Blu-ray is compressed even more at 4:2:0). Nonetheless, a lot of displays expect 4:2:2, so sending it in that format may reduce by one the number of format conversions required. The bottom line is that it shouldn't make any difference.
Thank you Tom.
post #3855 of 5348
hello
I have a question about clipping. When setting contrast, clipping should be avoided. But I always wondered how to recognize clipping in ChromaPure.

Here is how I do it. I measure greyscale and adjust offset and gain to get a good greyscale. Let's assume gain was adjusted to get a low dE deviation at and around 80 IRE. If I see that the R, G, or B greyscale curve deviates downwards at 100%, then that would be an indication of clipping.
The question is what happens when R, G, and B are clipping at the same time, and in the right proportion. Then I assume greyscale would stay perfect at 100%, but gamma goes up at 100%.
But I think this is only a theoretical case; in reality, R, G, and B do not clip at the same time.
So greyscale is the one to watch.

That is how I would check for clipping. Is that OK? Any comments anyone?
post #3856 of 5348
Quote:
Originally Posted by TomHuffman View Post

There is a new standard that may be adopted for 4K material--Rec. 2020. You can see the specs here. Rec. 2020 requires at least 10-bit color.

[...]
That will be progress!
Thanks for the info and link Tom.
post #3857 of 5348
Quote:
Originally Posted by turboman123 View Post

I have a question about clipping. When setting contrast, clipping should be avoided. But I always wondered how to recognize clipping in ChromaPure. [...]

That is how I would check for clipping. Is that OK? Any comments anyone?
I've used the Spears & Munsil disc to check for clipping...
post #3858 of 5348
The AVS HD 709 disc is even easier, and you can decide how much, if any, RGB clipping you allow. An excellent disc for this and many other reasons.
post #3859 of 5348
Quote:
Originally Posted by Geof View Post

I've used the Spears & Munsil disc to check for clipping...

+1
post #3860 of 5348
Quote:
Originally Posted by T3b_vat View Post

The AVS HD 709 disc is even easier and you can decide how much, if any, RGB clipping you allow. An excellent disc for this and many reasons.
hello
Thanks for the reply. Unfortunately, I am a bit visually impaired, so I would like to rely on measurements.
Tom: can you please give me input on how to use the capabilities of ChromaPure to check for clipping. See my original post #3855. Thanks.
post #3861 of 5348
Quote:
Originally Posted by turboman123 View Post

hello
Thanks for the reply. Unfortunately, I am a bit visually impaired, so I would like to rely on measurements.
Tom: can you please give me input on how to use the capabilities of ChromaPure to check for clipping. See my original post #3855. Thanks.
The only objective measurement for clipping is a dramatic drop-off in gamma at 90%.
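That criterion can be turned into a simple numeric check: compute the point gamma at each stimulus level from measured luminance and flag a sharp fall near the top. The luminance readings below are invented for illustration (roughly a 2.2-gamma display that clips at 90%).

```python
import math

# Point gamma: the exponent g such that (stimulus)^g = luminance,
# with both normalized to 0-1 (luminance relative to 100% white).
def point_gamma(stim_pct, norm_lum):
    return math.log(norm_lum) / math.log(stim_pct / 100.0)

# stimulus % -> hypothetical measured luminance relative to peak white
readings = {50: 0.218, 70: 0.456, 80: 0.612, 90: 0.845}
gammas = {s: point_gamma(s, l) for s, l in readings.items()}

# A "dramatic drop-off" at 90% relative to the 80% reading flags clipping;
# the 0.4 threshold here is an arbitrary heuristic, not a standard.
clipped = gammas[90] < gammas[80] - 0.4
```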
post #3862 of 5348
Yes, if you measure R, G, and B gamma (using small windows if you have a plasma) you will see one of them start diverging from the others if it starts to clip. You'll need to run patterns up to 109%, however, if you want to test the above-white clipping position. Not quite sure how to do a reference for the above-white levels, though; you'd probably have to calculate one based on 100%.
post #3863 of 5348
Quote:
Originally Posted by TomHuffman View Post

The only objective measurement for clipping is a dramatic drop-off in gamma at 90%.

How dramatic are we talking about? I also have this frustration with gamma.

Hello. What is one of the best ways to go about doing gamma? I use ChromaPure with an i1 Display 3 Pro calibrated by Tom Huffman. Last night I did multiple tweaks to the gamma, one by one: measure, adjust, repeat... Every time, I noticed that even when using the gamma module with the 100% white value in place and setting my gamma at 2.22, when I take a post greyscale run my gamma at 90% white drops to 2.16. 80% and 70% drop as well. I have done multiple adjustments and then post runs, but the gamma still drops once a new white value is calculated from the post run. What I am trying to get at is: what is the best approach to calibrating gamma when you adjust one interval but it could affect another? Say, adjust 90, then 80, then check 90? Any tips to get it more linear and near target, especially at 70, 80, and 90% white? It's a Samsung D6500 plasma.

This is a reply from a message exchange with Michael Chen.

Gamma is always calculated based on what 100 is. If you get it perfect in the module, it gets messed up in the post because the program wants an all new read of 100. Is this a bug?

What you can do is forget about the module and just do gamma in the post cal page. This one you know the reading will stick ... until you take another 100 reading of course.

Add to that, sometimes the formulas in one module get messed up a bit in another module. Calman has the same issues. Numbers don't translate the same when generating a report.

Try this first.

Gamma ... I like to set 100 first .. then get 50 right ... then 60 ... then 40 ...then 70 ... then 30 ...work out from the middle. Some TVs say start at 100 and work down ... 90 80 70 ... so forth ...
post #3864 of 5348
Windows 8

I just finished Windows 8 compliance testing using a Display 3 colorimeter and encountered no issues. CP runs fine using the new operating system.
post #3865 of 5348
Quote:
Originally Posted by hungro View Post

How dramatic are we talking about?
Say from 2.2 to 1.6.
post #3866 of 5348
Quote:
Originally Posted by TomHuffman View Post

Windows 8

I just finished Windows 8 compliance testing using a Display 3 colorimeter and encountered no issues. CP runs fine using the new operating system.
I also have been using Chromapure with Win8 and i1Display3PRO without problems.
post #3867 of 5348
Quote:
Originally Posted by TomHuffman View Post

The only objective measurement for clipping is a dramatic drop-off in gamma at 90%.
Thanks Tom for the reply. I tried out a few things.
Following is a greyscale on my panel with Contrast set to 84 compared to 80. Clearly 84 causes red clipping, and 80 does not. There is just a small change in gamma, but clipping is easier to recognize in the greyscale, I think.
CalibrationSummaryDetailed-Clipping.pdf 1323k .pdf file

Of course, these are only measurements in 10% steps. So the question is, where exactly does clipping begin?
For that I made measurements with the AVS HD near-white patterns: 96%, 97%, etc., up to 100%. For convenience, these measurements were put on the graph at 60%, 70%, up to 100%.
First a measurement with clipping at Contrast 84. Then another measurement with Contrast 82, which shows that 99% peak white does not show clipping, but 100% does. That would show me that Contrast 81 would just completely avoid clipping.
CalibrationSummaryDetailed-Clipping-Fine.pdf 1303k .pdf file

Tom: what do you think of this procedure? Does it do the job as well as S&M or AVS HD visual clipping pattern observation? I think this measurement procedure is better: less subjective, less dependent on panel uniformity or viewing angle.
Edited by turboman123 - 2/15/13 at 5:39am
post #3868 of 5348
Hello,

I have tried to get my White balance as neutral as possible.

I focused on the 30% and 80% IRE.
Should I insist in getting a better dE at 20% ?

What do you think of my results ?



Thank you.

( i1 Display 3 PRO & CP)
post #3869 of 5348
Your greyscale is good. Like other low-end meters, the D3 PRO's results are not totally accurate below 30%...
post #3870 of 5348
Quote:
Originally Posted by realzven View Post

your greyscale is good, like low end meters the results of the D3 PRO is not totally accurate below 30%... wink.gif
The D3 PRO has no problem with accuracy at low light levels. It can have a problem with repeatability, but only on plasmas and only below 1 cd/m2, which is typically 10%.
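The distinction matters in practice: random repeatability error averages out across repeated reads, while an accuracy (offset) error does not. A toy illustration with made-up low-light samples:

```python
from statistics import mean, stdev

# Hypothetical repeated readings of a dim 0.50 cd/m2 patch: noisy but
# unbiased, so averaging recovers the true level.
samples = [0.46, 0.55, 0.48, 0.53, 0.47, 0.51]
avg = mean(samples)
spread = stdev(samples)
# Averaging n reads shrinks the random spread by roughly sqrt(n); it
# would do nothing for a systematic accuracy offset.
```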