
Discussion Starter · #2 · Registered · 283 Posts
1. Theories

Movies are normally encoded in YUV 4:2:0 with a luma range of 16-235, but most graphics cards work in the RGB 4:4:4 colour space with a 0-255 range, so both chroma and luma need to be converted.
1.1 Luma resampling - why WTW and BTB are important

Luma resampling here refers to expanding video levels (16-235) to PC levels (0-255). As the table below shows, the expansion produces fractional numbers, which even HDMI 1.4 cannot carry. In order to send the luma over HDMI, the fractional numbers are rounded to integers:

Orig   Expanded       Rounded
16     0              0
17     1.163636364    1
18     2.327272727    2
19     3.490909091    3
20     4.654545455    5
21     5.818181818    6
22     6.981818182    7
23     8.145454545    8
24     9.309090909    9
25     10.47272727    10
26     11.63636364    12
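
For reference, here is a minimal Python sketch that reproduces the table above. It uses the example's 256/220 scale factor; the more commonly cited formula, (x - 16) * 255 / 219, shows the same rounding problem.

for orig in range(16, 27):
    expanded = (orig - 16) * 256 / 220     # expand video level to PC level
    print(orig, expanded, round(expanded)) # rounding is what HDMI forces on us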

There are several problems with this approach:

1) BTB and WTW are cut off before the luma expansion.

2) Banding is introduced. In the example above, the transition from 19 to 20 is mapped to 3 to 5, skipping 4.

3) Some graphics cards later convert 0-255 back to 16-235 on the HDMI output, which can cause further information loss (see the quick check below).
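
As a quick check of point 3 (an illustration only, not any driver's exact math): squeezing an already expanded 0-255 signal back into 16-235 forces 256 input values into just 220 output codes, so neighbouring values merge and can't be recovered.

def compress_to_16_235(y):
    return round(16 + y * 219 / 255)   # map 0-255 down to 16-235

distinct = len({compress_to_16_235(y) for y in range(256)})
print(distinct, "distinct codes remain out of 256")   # prints 220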


The luma information outside the video range (0-16 and 235-255) is also important, even though you don't normally see it in a movie. IMO it provides the following benefits:

1) It gives you a baseline when you calibrate the brightness and contrast of your display.

2) It shows that the video luma has not been cut off or expanded somewhere along the video pipeline.

3) Some movies contain information above 235, so it's important to preserve whites up to 240.


Finally, some suggestions:

1) Avoid luma expansion/compression if possible. It's fine for your HTPC to output 0-255 without expanding luma and to let your display clip BTB and WTW.

2) If you have to do luma conversion, make sure dithering is enabled to reduce banding. You can use FFDshow with dithering enabled or MadVR to achieve this; a rough sketch of the idea follows.
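
This is purely an illustration of the principle, not ffdshow's or MadVR's actual algorithm: plain rounding turns a smooth gradient into long runs of identical values (banding), while adding a little random noise before rounding trades the banding for a faint grain and keeps the average level correct.

import random

def quantize(x, dither=False):
    noise = random.uniform(-0.5, 0.5) if dither else 0.0   # real dithering uses better-shaped noise
    return round(x + noise)

gradient = [i / 10 for i in range(100, 120)]          # smooth ramp from 10.0 to 11.9
print([quantize(v) for v in gradient])                # long runs of 10s, 11s, 12s
print([quantize(v, dither=True) for v in gradient])   # 10s/11s/12s interleaved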

1.2 Chroma upsampling - why more bits are good

Chroma upsampling applies when converting YUV 4:2:0 to RGB/YUV 4:4:4. You can find more information about how it works below:
http://en.wikipedia.org/wiki/Chroma_subsampling

Chroma upsampling generates colour information that doesn't exist in the original video, so the results differ between renderers. Madshi compares several chroma upsampling algorithms here:
http://forum.doom9.org/showthread.php?t=146228
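
As a toy 1-D illustration of why renderers can disagree (this is not any renderer's actual code): 4:2:0 carries one chroma sample per 2x2 block of luma samples, so the missing samples have to be invented. Nearest neighbour simply repeats each sample, linear interpolation creates in-between values that never existed in the source, and better kernels (bicubic and beyond) weigh even more neighbours.

def upsample_nearest(chroma):
    return [c for c in chroma for _ in (0, 1)]   # duplicate every sample

def upsample_linear(chroma):
    padded = chroma + chroma[-1:]                # repeat the last sample at the edge
    out = []
    for a, b in zip(padded, padded[1:]):
        out.extend([a, (a + b) / 2])             # original sample, then the midpoint
    return out

print(upsample_nearest([100, 120, 80]))   # [100, 100, 120, 120, 80, 80]
print(upsample_linear([100, 120, 80]))    # [100, 110.0, 120, 100.0, 80, 80.0]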

MadVR is unique in that it uses 16-bit processing, while the other renderers use only 10 or even 8 bits; ATI's internal video processing pipeline, for example, uses 10 bits. Below is Madshi's explanation of why more bits matter:

"I've seen many comments about HDMI 1.3 DeepColor being useless, about 8bit being enough (since even Blu-Ray is only 8bit to start with), about dithering not being worth the effort etc. Is all of that true?

It depends. If a source device (e.g. a Blu-Ray player) decodes the YCbCr source data and then passes it to the TV/projector without any further processing, HDMI 1.3 DeepColor is mostly useless. Not totally, though, because the Blu-Ray data is YCbCr 4:2:0 which HDMI cannot transport (not even HDMI 1.3). We can transport YCbCr 4:2:2 or 4:4:4 via HDMI, so the source device has to upsample the chroma information before it can send the data via HDMI. It can either upsample it in only one direction (then we get 4:2:2) or into both directions (then we get 4:4:4). Now a really good chroma upsampling algorithm outputs a higher bitdepth than what you feed it. So the 8bit source suddenly becomes more than 8bit. Do you still think passing YCbCr in 8bit is good enough? Fortunately even HDMI 1.0 supports sending YCbCr in up to 12bit, as long as you use 4:2:2 and not 4:4:4. So no problem.

But here comes the big problem: Most good video processing algorithms produce a higher bitdepth than you feed them. So if you actually change the luma (brightness) information or if you even convert the YCbCr data to RGB, the original 8bit YCbCr 4:2:0 mutates into a higher bitdepth data stream. Of course we can still transport that via HDMI 1.0-1.2, but we will have to dumb it down to the max HDMI 1.0-1.2 supports.

For us HTPC users it's even worse: The graphics cards do not offer any way for us developers to output untouched YCbCr data. Instead we have to use RGB. Ok, e.g. in ATI's control panel with some graphics cards and driver versions you can activate YCbCr output, *but* it's rather obvious that internally the data is converted to RGB first and then later back to YCbCr, which is a usually not a good idea if you care about max image quality. So the only true choice for us HTPC users is to go RGB. But converting YCbCr to RGB increases bitdepth. Not only from 8bit to maybe 9bit or 10bit. Actually YCbCr -> RGB conversion gives us floating point data! And not even HDMI 1.3 can transport that. So we have to convert the data down to some integer bitdepth, e.g. 16bit or 10bit or 8bit. The problem is that doing that means that our precious video data is violated in some way. It loses precision. And that is where dithering comes for rescue. Dithering allows to "simulate" a higher bitdepth than we really have. Using dithering means that we can go down to even 8bit without losing too much precision. However, dithering is not magic, it works by adding noise to the source. So the preserved precision comes at the cost of increased noise. Fortunately thanks to film grain we're not too sensitive to fine image noise. Furthermore the amount of noise added by dithering is so low that the noise itself is not really visible. But the added precision *is* visible, at least in specific test patterns (see image comparisons above).

So does dithering help in real life situations? Does it help with normal movie watching?

Well, that is a good question. I can say for sure that in most movies in most scenes dithering will not make any visible difference. However, I believe that in some scenes in some movies there will be a noticeable difference. Test patterns may exaggerate, but they rarely lie. Furthermore, preserving the maximum possible precision of the original source data is for sure a good thing, so there's not really any good reason to not use dithering.

So what purpose/benefit does HDMI DeepColor have? It will allow us to lower (or even totally eliminate) the amount of dithering noise added without losing any precision. So it's a good thing. But the benefit of DeepColor over using 8bit RGB output with proper dithering will be rather small."

The MPC-HC internal YV12 chroma upsampling shader also produces a very similar result; to make it work, you need to feed NV12 to MPC-HC and select the EVR renderer.

1.3 Resizing algorithms

Different renderers also offer different resizing/scaling algorithms, for example bicubic in EVR or VMR9 and nearest neighbor in overlay and VMR7. In general, bicubic provides better quality than the simpler scaling algorithms, which is an advantage of the EVR renderer over VMR/overlay.

A comparison of different scaling algorithms can be found here: http://audio.rightmark.org/lukin/gra...house_more.htm


Since EVR, Haali and MadVR provide superior scaling algorithms, I would suggest using these renderers instead of the others. If for some reason you have to stick with overlay, you can use ffdshow to do the scaling instead.
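
To see why the kernel matters, here is a toy 1-D comparison; it is an illustration only, not any renderer's code, and real scalers use bicubic, Lanczos and similar kernels that weigh four or more neighbours.

def resize_nearest(src, new_len):
    # repeat the closest source sample
    return [src[int(i * len(src) / new_len)] for i in range(new_len)]

def resize_linear(src, new_len):
    # blend the two nearest source samples
    out = []
    for i in range(new_len):
        pos = i * (len(src) - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

print(resize_nearest([0, 100, 200], 9))   # blocky steps: 0,0,0,100,100,100,200,200,200
print(resize_linear([0, 100, 200], 9))    # smooth ramp: 0,25,50,...,200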

1.4 De-Interlacing

I don't have much knowledge in this area, and the ATI hardware de-interlacing satisfies my requirements.

If someone knows more about it, please help me out here; I think it's also quite an important factor.

1.5 Smooth 24P playback - avoid 3:2 pulldown

The FPS (frames per second) of different video files is not always the same; BBC content, for example, is normally 25P. Most movies, however, are 24P, which in practice means 23.976 frames per second (23.976 comes from 24/1.001). To play such video at 60Hz, ATI introduces 3:2 pulldown: the first frame is shown for 3 refreshes and the second for 2, so every 2 frames produce 5 (24/60 = 2/5). The potential problem with 3:2 is that some frames stay on the screen longer than others, which can cause noticeable judder. True 24P playback doesn't need 3:2 pulldown and is much smoother. To enable true 24P playback you need to make sure that:

1) Your display accepts 1080/24P input

2) You can choose either a 23Hz or 24Hz refresh rate in your display setup. Be aware that you should choose 23Hz if both 23 and 24 are listed in your display settings.

When your TV or projector receives a 24P signal, it normally either does 5:5 pulldown (displays each frame 5 times) or uses frame creation to generate intermediate frames (4 new frames between every 2 original frames). Personally I prefer frame creation, which is available in my Panasonic projector, but nevertheless both options should give you smooth playback (compared to 3:2 pulldown). If your display doesn't support a 24Hz refresh rate, try 48Hz (2:2 pulldown) or 72Hz (3:3 pulldown) instead.
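
Here is a toy illustration of the cadences discussed above (not driver code): with 3:2 pulldown at 60Hz alternate frames are held for 3 and 2 refreshes, which is where the uneven motion comes from, while with 5:5 pulldown (say on a 120Hz panel) every frame is held equally long.

def pulldown(frames, pattern):
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * pattern[i % len(pattern)])   # repeat per the cadence
    return out

print(pulldown(["A", "B", "C", "D"], [3, 2]))   # A A A B B C C C D D  (3:2)
print(pulldown(["A", "B", "C", "D"], [5]))      # every frame shown 5 times (5:5)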

I choose 23.976Hz (23Hz in CCC) when I play 24P, 50Hz when I play 25P, and 59.94Hz (59Hz in CCC) when I play 30P or 30i.

End of theory. Please leave your feedback so that I know if I should continue :)

Thanks for reading and bearing with my poor English.
 

Discussion Starter · #3 · Registered · 283 Posts
2. HTPC calibration
2.1 Possible output from your graphics card
2.2 Decoders and Renderers - which one to choose
2.3 Check luma output
2.4 Check banding - minimize luma resampling
2.5 Check red, green and blue colours - make sure colour conversion is correct
2.6 Check chroma upsampling quality
2.7 Smooth playback - Why ReClock
2.8 Check lip-sync


I have to stop writing for a while because I noticed that my graphics card (HD5770) behaves differently from others. Some people get 0-255 output when RGB Full is chosen as the pixel format, while I only get 16-235. It would be nice to find a solution to this before I post wrong information. Any discussion regarding ATI HD5xxx or HD4xxx cards is welcome here; please post your observations so that I can try to analyse the problem. When you test, please use the test video here:
http://www.avsforum.com/avs-vb/showthread.php?t=948496

In the Misc section there is a grey dump video, and that video shows two things:

1) BTB and WTW, if they're preserved.

2) Banding, which indicates that the luma has been compressed or expanded somewhere in the processing pipeline.

One example frame from the video is attached. As you can see, there is serious banding in it.
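
If you want to generate an extra pattern of your own alongside the linked clip, a hypothetical helper like the one below (assuming NumPy; the filename and dimensions are arbitrary) writes a raw 8-bit grey ramp with one luma step per column, which makes clipped BTB, clipped WTW and banding easy to spot:

import numpy as np

width, height = 1920, 1080
ramp = np.linspace(0, 255, width).astype(np.uint8)   # 0 on the left, 255 on the right
frame = np.tile(ramp, (height, 1))                   # repeat the ramp on every row
frame.tofile("grey_ramp_8bit.raw")                   # raw greyscale frame, width x height bytes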

Please use the ATI AVIVO DXVA decoder and set Dynamic range to 16-235; also check that the pixel format is RGB 4:4:4 Full. See the test steps below:

1. If your TV supports 0-255, choose that and watch the test video. Do you see WTW and BTB? If not, something is configured incorrectly; please check that Dynamic range is set to 16-235. Do you see grey blacks or dull whites? If you do, your HTPC is outputting compressed 16-235 rather than 0-255. Last question: do you see banding?

2. If your TV supports 16-235, choose that and watch the test video. Do you see WTW and BTB? If you do, your HTPC is outputting compressed 16-235. If not, try raising the brightness and check whether you can see BTB; if you can, your TV is getting 0-255 and nothing in between is clipping BTB.


In both cases, if you're sure that your HTPC outputs 0-255 rather than compressed 16-235, could you please post your configuration:

Do you have a dual-screen setup?
Do you use the HDMI port on the graphics card, or the ATI HDMI adaptor connected to the DVI output?
Do you go through an AMP? If so, which model, and does it have any video processing that might convert 16-235 to 0-255?
What is your display device?
Do you use the EDID overwrite trick?
Do you have the Realtek HDMI driver?
Which version of CCC/graphics driver do you use?
What does your desktop look like, i.e. do you also get 0-255 output for your desktop?


If you don't get 0-255, you're also welcome to post your setup here. I don't get 0-255; my setup is as follows:

XFX 5750 HDMI port -> Onkyo TXNR-906 (HDMI through) -> Panasonic PT-AE3000 projector

I suspect that the EDID in my projector tells my graphics card that it prefers 16-235 YCbCr, but I haven't confirmed this.

Please post your results in this thread, thank you very much!

More details:

There are three places to control video levels:

Dynamic range: 0-255 means clip BTB/WTW and expand 16-235 to 0-255; 16-235 means output as-is with NO expansion. Dynamic range is only relevant to AVIVO and takes effect ONLY when you use a DXVA decoder.

Pixel format: RGB Limited outputs RGB with levels 16-235, and RGB Full outputs 0-255 (at least for some users). As you can see, the output level has nothing to do with the video level.

TV input: you can probably choose between video level and PC level, i.e. whether the TV treats 16 or 0 as black and 235 or 255 as white when it receives input from your HTPC.


You should try choosing Dynamic range 16-235 (so that the decoder passes the full 0-255 code range, including BTB/WTW, through untouched) and then set RGB Full in CCC (so that, according to ATI, 0-255 is output without compression), then play with your TV input settings. There can be two cases:

1) When you choose 16-235 you don't see BTB/WTW, and when you choose 0-255 you see both BTB and WTW without grey blacks or washed-out whites.

2) When you choose 16-235 you see BTB/WTW, and when you choose 0-255 you also see BTB/WTW but you get grey blacks or washed-out whites.


Basically, 1) means the graphics card outputs 0-255 correctly according to the CCC setting, and 2) means the graphics card compresses 0-255 to 16-235 YUV, so you still get BTB/WTW but the range is COMPRESSED. In both cases BTB/WTW should not be clipped by any component. For the above test, it's important to reset the brightness and contrast on your TV to make sure it doesn't re-expand 16-235 to 0-255.


Andy is in case 1 and I'm in case 2.

My guess

For some of us, ATI compresses all RGB pixels to 16-235 YUV; only when 0-255 YUV pixels are output directly do you get 0-255 at the display. This means:

1) No matter which pixel format you choose, the desktop looks the same (always compressed to 16-235).

2) When you play a BD with PowerDVD 9 and choose YCbCr output (both work), you get 0-255 output.
 

Registered · 1,650 Posts

Quote:
Originally Posted by somy /forum/post/18187884


Please let me know if anyone is interested, thanks.

Just about the perfect timing for me.

I started reading up on display calibration a few days ago and have been looking into calibration meters and whatnot.

Your guide would fit the bill perfectly.

Keep it up; I'll be following it every day.
 

Registered · 115 Posts
Very, very, very interesting thread on a subject I've never had a clear picture of.


I have read your posts in the ATI 5xxx thread on this forum and in the MadVR thread on doom9, and found the distinction between BTB/WTW passthrough and 0-255 levels very interesting: you can have BTB/WTW even with 16-235 because of luma compression!


Go on with the guide.
 

Registered · 1,650 Posts

Quote:
Originally Posted by somy /forum/post/18187879


End of theory. Please leave your feedback so that I know if I should continue :)

Thanks for reading and bearing with my poor English.

Definitely continue!

Looks good enough to be a 'sticky', maybe soon.
 

Registered · 216 Posts
After moving to bitstreaming with an ATI 5670 (and getting it all working), this is perfect timing. I've never had a clear understanding of this topic, and the other endless related threads here have been brain-melting.
 

Registered · 442 Posts
I'm using Windows 7 and an ATI 5870. I have a Samsung 52A850 (1080p HDTV) with the HDMI input set to 16-235. I'm sending 1920x1080 24Hz YCbCr 4:2:2 to the screen. Playing HD MKVs in MPC-HC, it looks correct.


I guess I should choose 23Hz and RGB 4:2:2 in CCC instead?


It's just that my TV's HDMI input has been calibrated using my Panasonic Blu-ray player (set to send 1080p/24p YCbCr 16-235), and both my computer and the Panny go through my Pioneer receiver to the same HDMI input on my TV.
 

Registered · 129 Posts
Somy, as you asked me in the other thread, I'm posting my setup here too:


OS: Windows7 x64

No dual screen

Card: HD4850

TV: Pioneer KRP-500A

Connection: HDMI. My graphics card doesn't have a direct HDMI port, so I have to use the ATI DVI-HDMI adaptor. The cable goes from the adaptor to the TV directly (no AMP).

Video Driver: Provided with Catalyst 9.12

Audio Driver: Realtek HDMI Audio 2.42

Output Pixel Format (configured in Catalyst): Full RGB 4:4:4

Input Pixel Format (configured in TV): Full RGB 0-255

EDID overwrite trick?: Not using it. BTW, I've heard about it but I don't know exactly what it is. Can you tell me?

Player: MPC-HC

Renderer: madVR set to TV Levels (Full Range)


And using your test pattern:


1. With 0-255 set on my TV, I can see BTB/WTW.

I see total black at 0 and total white at 255 (not grey at those levels).

I don't see banding. I only get it if I enable ffdshow's levels filter to calibrate to 16-245, but that banding is produced by the filter (which is 8-bit), not by the card or the TV.

2. Yes, my TV also supports 16-235, and when I choose it, WTW and BTB are NOT shown with default brightness/contrast at the display.

If I raise the brightness, then I can see BTB, but only down to level 7-8; below that, everything is clipped.
 