
Theory About Intel's HDMI Quantization Range Setting (Full 0-255) - Page 3

post #61 of 184
Quote:
Originally Posted by Foxbat121 View Post

The only reason we need this override is for LCD monitors that have an HDMI port but don't support limited range (many < $200 cheap LCDs are like this). If you have a TV or AVR, there is no reason to use this fix (it doesn't give you any benefit). Out of the box, Intel's choice works for the majority of TVs and AVRs.

Actually, the 16-235 setting, while giving you correct black & white levels, will probably introduce banding artifacts. So it is probably not a good choice if you want good image quality. It's a better idea to switch your TV to full range (or whatever the option is named in your TV) and force the Intel GPU to output 0-255.

Why banding artifacts? Well, it's easy enough to understand. When using 0-255, there are 256 different shades of gray available. If the GPU stretches this to 16-235 and outputs the result in 8bit, there are only 220 different shades of gray left. So if you have an image/video with very smooth gradients, you will see visible banding artifacts. E.g. instead of gray shades "0, 1, 2, 3, 4, 5, 6, 7, 8, 9" you will get "16, 17, 18, 18, 19, 20, 21, 22, 22, 23" or something like that. This is visible to the human eye, especially in motion (video).
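A quick Python sketch of that stretch (my own illustration, not any driver's actual code; the exact positions of the duplicated shades depend on the rounding):

```python
# Illustration: stretching full-range 8-bit values into limited range
# collapses neighboring shades, which shows up as banding in gradients.
def full_to_limited(v):
    """Map a full-range (0-255) value to limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

gradient = [full_to_limited(v) for v in range(10)]
print(gradient)  # -> [16, 17, 18, 19, 19, 20, 21, 22, 23, 24]

# Only 220 distinct output shades remain out of 256 inputs:
print(len({full_to_limited(v) for v in range(256)}))  # -> 220
```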
post #62 of 184

Off topic: madshi, your constant wish to bring knowledge to the wider masses is pleasantly surprising. But, IMHO, it would be better if you concentrated on your main target :p As a user of madVR with multiple monitors, can I ask for an option to show the statistics not over the video surface, but in a separate window (with a text edit control, maybe)?

post #63 of 184
That would be quite a bit of extra work, and you're the first to ask for this. So I'll have to say "no" for now. In the time I would need to implement your wish, I could implement other features that a bigger number of madVR users would benefit from. I only have limited resources, so I have to implement those features first which are useful to the highest number of users.
post #64 of 184
Quote:
Originally Posted by madshi View Post

Actually, the 16-235 setting, while giving you correct black & white levels, will probably introduce banding artifacts. So it is probably not a good choice if you want good image quality. It's a better idea to switch your TV to full range (or whatever the option is named in your TV) and force the Intel GPU to output 0-255.

Why banding artifacts? Well, it's easy enough to understand. When using 0-255, there are 256 different shades of gray available. If the GPU stretches this to 16-235 and outputs the result in 8bit, there are only 220 different shades of gray left. So if you have an image/video with very smooth gradients, you will see visible banding artifacts. E.g. instead of gray shades "0, 1, 2, 3, 4, 5, 6, 7, 8, 9" you will get "16, 17, 18, 18, 19, 20, 21, 22, 22, 23" or something like that. This is visible to the human eye, especially in motion (video).

Not entirely true. Videos are originated with the 16-235 limited range in the majority of cases (100% if you consider only commercially available video sources). If you use 0-255 full range, the video will need to be stretched to full range and artifacts may occur. A lot of TVs internally use the 16-235 range anyway. So even if a full-range video goes into the display, there is nothing to say that the TV itself won't convert it into 16-235 again. Of course, PC desktop and games are typically in 0-255 full range, and it can be a degradation if you squeeze them into limited range. There is no perfect solution for this. It all depends on which type of sacrifice you are willing to make.

Not to mention, in full range 0-255 mode, there is no way to calibrate your display because BTB and WTW information is clipped.

The film industry does not like 0-255 full range. Videos are already produced on the basis that limited range will be used.
post #65 of 184
The key is to keep everything the same, and ensure compatibility with your gear.

For me, my monitors are full range, so on my PC I require full range for everything.

Most quality gear (displays or projectors) do indeed accept full range BTW.

If you stretch or change ranges, you need to be mindful of dithering and artifacts.
post #66 of 184
Quote:
Originally Posted by Foxbat121 View Post

Not entirely true.

Actually, yes, it is.

Quote:
Originally Posted by Foxbat121 View Post

Videos are originated with the 16-235 limited range in the majority of cases (100% if you consider only commercially available video sources). If you use 0-255 full range, the video will need to be stretched to full range and artifacts may occur.

Videos are encoded in YCbCr. HTPCs and displays "think" in RGB. At some point in the processing chain, your HTPC will usually convert the YCbCr video to RGB, and at that point you have a matrix multiplication with floating point values. Which means that you will end up with floating point RGB values. These will almost never match exactly any of the 0-255 or 16-235 range steps. Even black & white movies are often slightly "colored". Artifacts will occur if you just take the floating point RGB values, round or truncate them (to either 0-255 or 16-235) and then send them as 8bit to the display. The only way to avoid artifacts is to dither the floating point RGB values down to the output bitdepth, or to send them in a higher bitdepth to the display. And once you do that, it doesn't matter much whether the output range is 0-255 or 16-235.
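To make the floating point point concrete, here's a minimal sketch of a limited-range BT.709 YCbCr -> RGB conversion (the matrix coefficients are the standard BT.709 ones; the function itself is just my illustration):

```python
def ycbcr709_to_rgb(y, cb, cr):
    """One limited-range BT.709 YCbCr pixel -> full-range RGB floats."""
    yn = (y - 16) / 219.0        # luma normalized to 0.0-1.0
    pb = (cb - 128) / 224.0      # chroma normalized to -0.5..+0.5
    pr = (cr - 128) / 224.0
    r = yn + 1.5748 * pr                     # BT.709 decoding matrix
    g = yn - 0.1873 * pb - 0.4681 * pr
    b = yn + 1.8556 * pb
    return tuple(255.0 * c for c in (r, g, b))

# A perfectly ordinary video level lands between the 8-bit steps:
r, g, b = ycbcr709_to_rgb(100, 128, 128)
print(r)  # ~97.81 -- not exactly representable in 8 bits
```

Rounding or truncating values like that ~97.81 without dithering is exactly where the quantization artifacts come from.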

So why did I mention problems with the GPU outputting 16-235? Because the GPU takes the carefully rendered Windows output and stretches it from 0-255 to 16-235 behind the back of Windows and behind the back of the video renderer, and it (probably) does that without applying dithering. The end result will be pretty bad, as you might imagine.

Or let me make it even clearer: If you set the GPU to limited range output, that automatically means that the video renderer must render to 0-255, because the GPU will take the video renderer's output and stretch it from 0-255 to 16-235 behind the video renderer's back. This double stretching is not lossless (when done in 8bit, which it usually is). So having the GPU stretch from 0-255 to 16-235 is a bad idea for quality in any case. It would be a better idea to tell the GPU to output 0-255 and then to tell the video renderer to render to 16-235. This way no stretching would be involved, but you'd still send 16-235 to the display.
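A toy numeric sketch (again my own illustration, not anyone's real code) of why the double stretch differs from rendering to 16-235 directly:

```python
# Compare two paths for the same ideal floating point video level f in
# 0.0-1.0. Both end up in limited range, but the double-stretch path
# rounds twice without dithering.
def render_direct(f):
    """Renderer outputs limited range directly: one rounding step."""
    return round(16 + f * 219)

def render_double(f):
    """Renderer outputs 0-255, then the GPU restretches to 16-235."""
    return round(round(f * 255) * 219 / 255 + 16)

mismatches = [i / 1000 for i in range(1001)
              if render_direct(i / 1000) != render_double(i / 1000)]
print(len(mismatches) > 0)  # -> True: the two paths disagree for many levels
```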

Quote:
Originally Posted by Foxbat121 View Post

A lot of TVs internally use the 16-235 range anyway. So even if a full-range video goes into the display, there is nothing to say that the TV itself won't convert it into 16-235 again.

You need to step away from the ranges a bit and see the "bigger picture". Converting between different ranges is only a problem because HTPCs usually work in 8bit (per channel). If we were using floating point values, we could convert back and forth between different ranges in the HTPC a thousand times, send the data in any range to the display, and have the display convert ranges another thousand times without any quality degradation. The problem is not the range conversion in itself. The problem is doing such a range conversion at a low bitdepth without dithering, because doing that will introduce very big quantization errors.
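A little sketch of that point (my own illustration): floating point conversions round-trip losslessly, undithered 8-bit ones don't:

```python
def full_to_limited(v):           # 8-bit, no dithering
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    return round((v - 16) * 255 / 219)

# 8-bit: a single round trip already merges shades...
lost = [v for v in range(256) if limited_to_full(full_to_limited(v)) != v]
print(len(lost))  # -> 36 of 256 shades don't survive one round trip

# ...while in floating point even a thousand round trips are harmless:
f = 0.123
for _ in range(1000):
    f = (16 + 219 * f) / 255      # full -> limited
    f = (255 * f - 16) / 219      # limited -> full
print(abs(f - 0.123) < 1e-9)  # -> True
```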

As long as the display takes the input and converts it to another range in high enough internal bitdepth, the range conversion is not a problem.

Quote:
Originally Posted by Foxbat121 View Post

Of course, PC desktop and games are typically in 0-255 full range, and it can be a degradation if you squeeze them into limited range. There is no perfect solution for this.

Yes, there is: Use full range for everything. Render the video with proper dithering at the end of the processing chain. This will result in "perfect" image quality for everything, including desktop, games, photos and video playback.

Quote:
Originally Posted by Foxbat121 View Post

Not to mention, in full range 0-255 mode, there is no way to calibrate your display because BTB and WTW information is clipped.

You don't need BTB and WTW for calibration. Of course BTB and WTW test patterns/videos make it easier to adjust the black and white levels correctly. But it's possible without BTB/WTW, too.
post #67 of 184
You forgot most GPUs, Intel's IGP included, have the option to choose how to render videos. So you don't have to render video into 0-255 and then into 16-235 at all.

As for how to convert YCbCr into RGB, that is a fixed algorithm based on which color space you choose. The result, although it can be floating point, is within 16-235. Stretching into 0-255 will cause degradation regardless.

Full-range RGB is a PC problem. The rest of the video world adopted limited range and is perfectly tuned for it.
post #68 of 184
Quote:
Originally Posted by Foxbat121 View Post

You forgot most GPUs, Intel's IGP included, have the option to choose how to render videos. So you don't have to render video into 0-255 and then into 16-235 at all.

That depends on the GPU, the renderer, the presentation mode the renderer is using, and which API the renderer is using to do the color conversion. I know all about it. I even know which APIs are affected by these settings by which GPU manufacturers (it varies!). For example I know that when using the DXVA APIs to do the color conversion, Intel drivers ignore the flag which defines whether you want to get 0-255 or 16-235 levels, while NVidia and AMD drivers respect those flags. You don't seem to know who you're discussing with... tongue.gif

But yes, you're right that with *some* renderers and *some* GPUs, depending on the renderer and driver settings, it is possible to get 16-235 output without rendering to 0-255 first. It is very hard to get this, though! Basically the only way is to use hardware overlay mode, and that's not what EVR is normally using. Furthermore AMD hardware doesn't even support hardware overlay in Windows Vista, 7 or 8.

Let me give you an example of what happens with NVidia: There are 2 range modes. There's the "global" mode, which defines whether the GPU stretches the Windows desktop output from 0-255 to 16-235 or not. And then there's the "Video -> Adjust video color settings -> Advanced -> Dynamic range" setting, which modifies how the YCbCr -> RGB color conversion is done by the driver. Many renderers are using the "D3D9->StretchRect()" API to do color conversion. So here's what StretchRect() does with current NVidia drivers, depending on the "global mode" and the "dynamic range" setting:

(a) global mode = Full; dynamic range = Full ---> StretchRect produces 0-255
(b) global mode = Full; dynamic range = Limited ---> StretchRect produces 16-235
(c) global mode = Limited; dynamic range = Full ---> StretchRect produces 0-255
(d) global mode = Limited; dynamic range = Limited ---> StretchRect produces 0-255

As you can see, as soon as the GPU stretches the Windows desktop from 0-255 to 16-235, the StretchRect() API automatically always converts YCbCr to RGB with the 0-255 range. So any renderer using StretchRect() will result in double stretching in this situation. And that is what most users will get, when they set the "global mode" to Limited Range. And that is why I strongly recommend to not use this mode, because it seriously harms image quality.

Quote:
Originally Posted by Foxbat121 View Post

As for how to convert YCbCr into RGB, that is a fixed algorithm based on which color space you choice. The result, although can be float point, is within 16-235. Stretch into 0-255 will cause degradation, regardless.

Have you read the BT.601, BT.709 and BT.2020 standards? That "fixed algorithm" you're talking about is usually a floating point matrix multiplication based on the assumption that the YCbCr values are in the range 0.0 - 1.0 (black = 0.0; white = 1.0).
post #69 of 184
You are correct about the complexity of drivers, renderers and applications. There is no easy way to ensure all of that. But that doesn't mean full range is the best choice all the time.

You are also correct on BT.601, 709 etc. that the output is normalized to 1.0, but I assume the input to that is from a video capture device that produced 16-235 range RGB values, processed that way and converted to YCbCr at the end. So, when you reverse back, 16-235 gives you the least amount of distortion.

My point is you can choose either full range or limited range. There are pros and cons to both depending on what is most important to you.
post #70 of 184
madshi -

What happens when you record an HD OTA signal? I'm under the impression the MPEG2 signal gets recorded to the HDD as a 16-235 MPEG2 signal.

I know you said a PC works in 0-255, but why would it have to change a 16-235 image?

IOW if I look at a photo (jpeg, tiff, etc.) on my PC & it has no black or dark pixels, the PC does not stretch or change the picture to make parts of it black. Why can't the PC just leave a video image the way it is?
post #71 of 184
On a PC, 0 typically means darkest black. On a video, 16 means darkest black (0 to 15 is called Blacker-Than-Black and can't be displayed). How do you propose to display it correctly on your screen if the PC just leaves it alone?
post #72 of 184
@Mike99,

the MPEG2 video stream after decoding is in YCbCr format (separate brightness and color information). You can't just leave the video image the way it is because displays work in RGB and not in YCbCr. Furthermore the MPEG2 stream has Y at 16-235, but Cb and Cr are at 16-240. Also Cb and Cr have a lower resolution than Y. There are different YCbCr -> RGB conversion matrixes available, depending on the decoding matrix (BT.601, BT.709, BT.2020), the source range (YCbCr full range, YCbCr limited range) and the target range (RGB 0-255 or RGB 16-235). After YCbCr -> RGB conversion you end up with either limited or full range RGB, depending on which color decoding matrix you're using. Furthermore you end up with floating point RGB values, but displays usually need 8bit (or 10bit or 12bit) integer RGB values. So you also need to convert the floating point values to integer.

As you can see, displaying a video is much more complicated than just showing it "the way it is". There are several conversion steps necessary to turn the decoded MPEG2 video stream into something your display can show.
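For the curious, here's a compressed Python sketch of those steps for a single pixel (my own illustration, not any real renderer's code; it assumes BT.709 and a limited-range source, and leaves out chroma upsampling and dithering):

```python
def decode_pixel(y, cb, cr, full_range_out=True):
    """Decoded limited-range BT.709 YCbCr -> 8-bit integer RGB."""
    yn = (y - 16) / 219.0            # Y is 16-235
    pb = (cb - 128) / 224.0          # Cb/Cr are 16-240, centered on 128
    pr = (cr - 128) / 224.0
    r = yn + 1.5748 * pr             # BT.709 decoding matrix
    g = yn - 0.1873 * pb - 0.4681 * pr
    b = yn + 1.8556 * pb
    lo, hi = (0, 255) if full_range_out else (16, 235)
    # Final float -> integer step; a good renderer dithers here
    # instead of plain rounding.
    return tuple(min(255, max(0, round(lo + c * (hi - lo)))) for c in (r, g, b))

print(decode_pixel(235, 128, 128))         # -> (255, 255, 255)
print(decode_pixel(235, 128, 128, False))  # -> (235, 235, 235)
```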
post #73 of 184
Quote:
Originally Posted by Foxbat121 View Post

On a PC, 0 typically means darkest black. On a video, 16 means darkest black (0 to 15 is called Blacker-Than-Black and can't be displayed). How do you propose to display it correctly on your screen if the PC just leaves it alone?

I’m presuming the same way the HD OTA signal is handled by the HDTV.
post #74 of 184
Quote:
Originally Posted by madshi View Post

@Mike99,

the MPEG2 video stream after decoding is in YCbCr format (separate brightness and color information). You can't just leave the video image the way it is because displays work in RGB and not in YCbCr. Furthermore the MPEG2 stream has Y at 16-235, but Cb and Cr are at 16-240. Also Cb and Cr have a lower resolution than Y. There are different YCbCr -> RGB conversion matrixes available, depending on the decoding matrix (BT.601, BT.709, BT.2020), the source range (YCbCr full range, YCbCr limited range) and the target range (RGB 0-255 or RGB 16-235). After YCbCr -> RGB conversion you end up with either limited or full range RGB, depending on which color decoding matrix you're using. Furthermore you end up with floating point RGB values, but displays usually need 8bit (or 10bit or 12bit) integer RGB values. So you also need to convert the floating point values to integer.

As you can see, displaying a video is much more complicated than just showing it "the way it is". There are several conversion steps necessary to turn the decoded MPEG2 video stream into something your display can show.

I’m not trying to challenge anyone, I’m just trying to learn more about this.

If digital video does not contain any image information in the 0-15 range, and HDTVs display 0-255, then the TV has to convert 16-235 to 0-255 in order to display the image. If so wouldn’t that mean that OTA HD would be stretched & therefore also be subject to the banding issue?
post #75 of 184
HDTVs are designed to display the 16-235 limited range (your cable box, DTV box, TiVo, BD player and DVD player all output limited range). You have to specifically configure some HDTVs to accept 0-255 for full range support. Some HDTVs don't support full range at all. Only PC monitors natively use full range RGB, which is what this thread is all about.
post #76 of 184
This is a photo of a cheap Vizio 26" TV fed by an i7-2600 (HD2000 graphics). I've had no trouble getting 0-255 on either of my Vizios, both of which are fed by Intel HD graphics.


Here's the whole testchart:
testchart.zip 122k .zip file
Edited by olyteddy - 10/4/13 at 6:51pm
post #77 of 184
The Intel IGD will always convert 0-255 into 16-235 on your HDTV via HDMI (read this thread). You will have to use the registry key override posted on the last page to force Intel not to convert. So, did you use that registry hack when you posted that?
post #78 of 184
Quote:
Originally Posted by Foxbat121 View Post

The Intel IGD will always convert 0-255 into 16-235 on your HDTV via HDMI (read this thread). You will have to use the registry key override posted on the last page to force Intel not to convert. So, did you use that registry hack when you posted that?
No. I simply left it in RGB mode:



If I change to YCbCr it goes into 16-235 mode and the blacks get crushed.
post #79 of 184
Quote:
Originally Posted by Mike99 View Post

I’m not trying to challenge anyone, I’m just trying to learn more about this.

If digital video does not contain any image information in the 0-15 range, and HDTVs display 0-255, then the TV has to convert 16-235 to 0-255 in order to display the image. If so wouldn’t that mean that OTA HD would be stretched & therefore also be subject to the banding issue?

Depends on the quality of the video renderer. If the video renderer is using proper dithering, there should be no banding.
post #80 of 184
Quote:
Originally Posted by olyteddy View Post

No. I simply left it in RGB mode:



If I change to YCbCr it goes into 16-235 mode and the blacks get crushed.

It doesn't matter if you choose RGB mode or not. It will output and convert everything into 16-235 unless you use the registry hack. There is no other way. Please read the thread. The easiest way to verify is to connect to a cheap PC monitor with HDMI, and you will find your black is actually just hazy gray.
post #81 of 184
Quote:
Originally Posted by olyteddy View Post

I've had no trouble getting 0-255 on either of my Vizios, both of which are fed by Intel HD graphics.

 

 

No, you only had "no trouble" getting the values 0 and 255 with this test, not the full 0-255 range ;) Take a picture with a _gradient_ from 0 to 255 and try to view it with the "Pixel Exact" mode of your TV (the so-called "IT mode" of the HDMI specs); maybe then you will understand madshi's top post on this page.

post #82 of 184
Thread Starter 
Wow... lots of great developments since I started this thread almost 4 months ago. smile.gif

I can confirm that the registry setting works on both Sandy Bridge and Haswell iGPU. Glad this has finally been resolved after all these years.

Combined with the best available refresh rate accuracy for 23.976 and 59.940 content, the Haswell iGPU is now unbeatable for most HTPC usage.
post #83 of 184
Quote:
Originally Posted by Wizziwig View Post

...after all these years...

 

;) You are slightly unlucky - I first encountered the Intel+HDMI pairing in April 2013 and became interested in this topic's problem about a month ago.

 

To generalize possible solutions (not only Intel and HDMI), it can also be done in 3 other ways:

1. at the sink side
  a. by firmware patching
  b. by EDID reprogramming in some cases

2. at the source side (e.g. PC+Win)
  a. driver updates
  b. driver settings
  c. EDID override by common (like the MS EDID override) or driver-specific (note my first post) ways

3. by an intermediate hardware device

I tried 1b, 2c (both ways), 2b.

post #84 of 184
Quote:
Originally Posted by sneals2000 View Post

It does annoy me that consumer gear almost always "just works" - but those of us using HTPCs always end up in a world of hurt when it comes to 16-235 vs 0-255 video.

16-235 is a standard we've had, for very good reason, for over 30 years.

There hasn't been a good reason for it for 20 years. Once we went digital, it should have been done away with.
post #85 of 184
Quote:
Originally Posted by Wizziwig View Post

Combined with the best available refresh rate accuracy for 23.976 and 59.940 content, the Haswell iGPU is now unbeatable for most HTPC usage.

Ivy Bridge rate accuracy is not as bad as you may think :)

post #86 of 184
Thread Starter 
Quote:
Originally Posted by SweetLow View Post

Ivy Bridge rate accuracy is not as bad as you may think smile.gif

Is that a custom resolution or the default 23Hz preset? In any case, since the audio clock is almost never exactly 48kHz, what matters most is the time between frame repeats or drops. Your Ivy results look excellent!
post #87 of 184
Quote:
Originally Posted by SweetLow View Post

Ivy Bridge rate accuracy is not as bad as you may think smile.gif


What program are you using that displays this data?
post #88 of 184
I am seeing only very old Intel driver GUIs here. Does that mean that this only works with older Intel drivers, i.e. before they introduced the 'new' Metro style?
Also, all these reports seem hardly reproducible without knowing exactly what HW is being used (mobo, chip, AVR, TV).
post #89 of 184
It has nothing to do with the driver GUI or older vs. new drivers. Read the thread. If you don't know of or don't see any problem, there is no need to mess with it.

It really only affects a small portion of people who use the HDMI port on the motherboard to connect to a new LCD PC monitor (not an HDTV) with an HDMI port.
post #90 of 184
Actually, it affects all madVR users who use HDMI (at least those who look for the best image quality).