AVS Forum

ATI Pixel Format

83K views 44 replies 13 participants last post by  kocikoc  
#1 ·
What should I choose?

I've got a Sony KDL-40W2000 and an AMD 780G chipset,

and I can see this option in ATI Catalyst Control Center, so what is it?
[Screenshot of the Pixel Format option]
 
#2 ·
Think of it like this.


A BD movie is basically 8-bit 4:2:0, but your display is capable of showing more than this, so the signal gets converted somewhere before it is displayed. So which does the better conversion, your player or your display?


It's not so different from upconversion of DVD: which does the better upconversion, the player or the display?


Since I don't know much about your display, I can only recommend testing yourself and seeing if you can spot any difference.


EDIT


Try YCbCr 4:4:4. Then the computer handles the chroma upsampling. In 4:2:0 there is just one chroma sample per four luma samples, so basically the computer recreates one chroma sample per luma sample.
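To make that concrete, here is a toy sketch of 4:2:0 → 4:4:4 chroma upsampling (my own illustration, not what the Catalyst driver actually does) using simple nearest-neighbour replication; real scalers use smarter interpolation filters, and the sample values here are made up:

```python
import numpy as np

# Toy 4x4 luma plane: full resolution, one sample per pixel.
luma = np.arange(16, dtype=np.uint8).reshape(4, 4)

# In 4:2:0 each chroma plane is half resolution in both directions:
# one chroma sample covers a 2x2 block of luma samples.
chroma_420 = np.array([[100, 110],
                       [120, 130]], dtype=np.uint8)

# Nearest-neighbour upsampling to 4:4:4: replicate each chroma
# sample over its 2x2 block, giving one chroma sample per pixel.
chroma_444 = chroma_420.repeat(2, axis=0).repeat(2, axis=1)

assert chroma_444.shape == luma.shape  # now one chroma sample per luma sample
print(chroma_444)
```

Replication is the crudest possible filter; a real decoder would interpolate between neighbouring chroma samples instead, which is exactly where "good" and "bad" chroma upsampling differ.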
 
#3 ·
This is a simple option with a very complex set of issues behind it. The video on Blu-ray, like other digital video, is in the YUV (YCbCr) domain. Computer screens/graphics, however, are RGB. One can convert from one domain to the other (as set up in this menu), but you lose something in translation.


Now, if you were just going to play movies on your PC, you would want the fewest conversions, so the best choice would be one of the YUV options. Unfortunately, they do not offer what is on the disc, 4:2:0, so you are stuck with interpolation to 4:4:4 on the graphics card and have to hope that it is done right.


Alas, the situation is more complicated. You need to know what pipeline the video player is using. If it is creating a YUV overlay plane for the video (because that is what is on the disc), then the above works. But the player may operate in the RGB domain when it mixes, say, menus over video. In that case, you would want to output RGB, and not convert back to YUV.


Now, your display may take that YUV and convert it to RGB. So maybe you want to let the graphics card do it. Or maybe not.


Of course, if you deal with pictures and other graphics work on your PC, you want to run in RGB mode, not YUV. And for sure RGB 4:4:4, as 4:2:2 would reduce the color resolution, so, for example, red text on a blue background would have softer edges.


Then there is the issue of "setup." PC video has 8-bit samples which go from 0-255. Video, however, can have "setup" so that it goes from 16-235 (what ATI is calling Limited RGB). If you try to expand this to 0-255 and don't dither, you get banding. So for video you may want 16-235, but for graphics/PC use you want 0-255.
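As a sketch of the arithmetic involved (my own illustration, assuming the conventional 219-step video range; not ATI's actual code), the two range conversions are just a scale and an offset:

```python
def expand_to_full(v):
    """Map a video-level sample (16-235) to PC levels (0-255)."""
    return round((v - 16) * 255 / 219)

def compress_to_video(v):
    """Map a PC-level sample (0-255) back to video levels (16-235)."""
    return round(v * 219 / 255) + 16

print(expand_to_full(16), expand_to_full(235))       # video black/white map to 0, 255
print(compress_to_video(0), compress_to_video(255))  # PC black/white map to 16, 235
```

The scale factor 255/219 is not an integer, so almost every input lands between two output codes; that forced rounding is exactly what creates banding unless you dither.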


Quite a mess, no? It is very difficult to get a proper pipeline to work, and to work for all applications. So if you want one answer, I say use the one you have highlighted.


If you want the right answer for video, put in a test disc and see which mode produces the best ramps, etc.
 
#8 ·
Wouldn't it be better to choose limited RGB when watching Blu-ray movies on your PC, as choosing full RGB will result in black levels being off and details being clipped?


I know on my PlayStation 3 that limited RGB is for movies and full RGB is intended for games, although I choose not to use RGB... Is this not the same for your PC?
 
#9 ·

Quote:
Originally Posted by FoxyMulder /forum/post/15069685


Wouldn't it be better to choose limited RGB when watching Blu-ray movies on your PC, as choosing full RGB will result in black levels being off and details being clipped?


I know on my PlayStation 3 that limited RGB is for movies and full RGB is intended for games, although I choose not to use RGB... Is this not the same for your PC?

If you use limited RGB, the player cuts out information; it is better to let your display do that when you calibrate the display.
 
#10 ·

Quote:
Originally Posted by MovieSwede /forum/post/15070864


If you use limited RGB, the player cuts out information; it is better to let your display do that when you calibrate the display.

Yes, you cut out information, but for movies you need to do this, as the display will not be correct otherwise. Full RGB is for games and possibly graphic design programs, but you don't use full RGB for movie watching. There was a full explanation in the PlayStation 3 thread about this and how black levels are incorrect when using full RGB for movie watching. Is it different for computers? I thought a PlayStation was basically a computer.


My display is calibrated.
 
#11 ·
For ATI users there has been a lot of talk about color spaces, the HDMI dongle and a series of regtweaks. PhaedrusGalt's post here pretty much sums it up:

http://www.avsforum.com/avs-vb/showt...5#post14370685


So the solution for ATI users wanting a consistent color space for everything has been to apply this regtweak, which expands everything to PC levels. The HDMI dongle then compresses (note: compresses, not "cuts off") the signal to video levels, which can be displayed correctly on a TV set to video levels.


What I wonder now is where this Pixel Format setting in CCC fits into this equation. Does the compression in the HDMI dongle still happen independently of it, or does the Pixel Format setting actually control what the dongle does? If I still want to let the dongle perform the compression, I guess I should set the pixel format to full range, that is 4:4:4, correct? But which one of the two, RGB or YCbCr? This is very confusing...
 
#12 ·

Quote:
Originally Posted by Seeco /forum/post/15076516


I guess I should set the pixel format to full range, that is 4:4:4, correct? But which one of the two, RGB or YCbCr? This is very confusing...

If your sources are only going to be Blu-rays and DVDs, then staying in the YCbCr format means the fewest conversions.


It gets more tricky when you have mixed sources on your display.
 
#15 ·
Yeah, that's what's mentioned in the post I linked to earlier. The DVI-to-HDMI converter that comes with modern ATI graphics cards supposedly compresses PC levels to video levels (without clipping them).

Quote:
if you use the DVI->HDMI dongle supplied with the 4000 series cards, the dongle will compress (not clip) the colorspace from 0-255 down to 16-235, so by combining that with the registry tweak, you expand all sources on the PC to 0-255, then the dongle compresses all output on the line to 16-235, and your HDTV is happy
 
#17 ·
Well, the video material is 16-235 to begin with, so I have always assumed that first converting it to 0-255 and then back to 16-235 would not make a very big impact; maybe this is wrong? Anyway, I don't have many alternatives, since my HTPC is never going to be dedicated to video playback only. I guess it might result in some lost tonality in material which is originally 0-255 (games, photos?), but that's the price I have to pay, as far as I can understand.


ATI is a company which is all about producing graphics hardware for gaming, video editing, 3D modelling, etc. The HDMI format is pretty much a standard nowadays. Wouldn't it be strange if they supplied a DVI-to-HDMI converter with their cards that ruins tonality in all material which isn't 16-235 video?


At least the way it is now, I can't see any banding, and my blacks and whites look alright. I have checked with both the DVE calibration DVD and a 720p x264 calibration video, and I can see all the black and white levels fine with both.
 
#19 ·

Quote:
Originally Posted by Seeco /forum/post/15076516


For ATI users there has been a lot of talk about color spaces, the HDMI dongle and a series of regtweaks. PhaedrusGalt's post here pretty much sums it up:

The goals are good in that post, but there is not enough explanation to know whether the right solution was found. For example, it is NOT sufficient to see levels below 16 to know that the conversion to video levels has been done correctly. Let me explain in detail.


To go from 16-235 video levels to 0-255, you first subtract the "setup" value of 16 and then multiply by about 1.16 (the ratio of the full 0-255 range to the 16-235 video range). Let's do the math for a series of values starting from black. Since we are doing floating-point (fractional) math but the target has to be a fixed, integer value, I am going to show the math both ways: once in floating point and once rounded to an integer:

Code:
video   expanded        rounded
16      0               0
17      1.163636364     1
18      2.327272727     2
19      3.490909091     3
20      4.654545455     5
21      5.818181818     6
22      6.981818182     7
23      8.145454545     8
24      9.309090909     9
25      10.47272727     10
26      11.63636364     12
Do you see the problem? If I have a smooth ramp from black to white, then between samples 19 and 20 in video I get a jump from 3 to 5 in graphics/RGB mode! The result is banding, as we no longer have a proper gradient.


The solution is to dither. That is, add a small random value to the samples so as to make that step between 19 and 20 fuzzy. The end result is that we get our ramp back, but we wind up with noise added to the video.
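A toy sketch of that idea (my own illustration, using the conventional 255/219 scale factor and simple random dithering before rounding; real hardware typically uses more structured dither patterns):

```python
import random

random.seed(0)  # fixed seed just to make the demo repeatable

def expand(v, dither=False):
    """Expand a 16-235 sample to 0-255, optionally dithering before rounding."""
    x = (v - 16) * 255 / 219
    if dither:
        x += random.uniform(-0.5, 0.5)  # smear the hard rounding steps
    return max(0, min(255, round(x)))

ramp = list(range(16, 27))
print([expand(v) for v in ramp])               # hard steps (e.g. 19 -> 3 but 20 -> 5): banding
print([expand(v, dither=True) for v in ramp])  # steps broken up at the cost of noise
```

Averaged over an area, the dithered output lands on the right level, which is why the eye sees a smooth ramp with a little noise rather than a visible band.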


Now, if the dongle converts this back to 16-235, then it needs not only to perform the reverse math but also to add more dither. Not adding dither causes banding in that conversion, but adding dither means even more noise.


This is why I said if your main goal is to watch video on the PC, then damn the graphics mode accuracy and put the system end-to-end in YUV video mode.


The alternative is to have the conversion from YUV 16-235 to RGB 0-255 happen in one step. In that scenario, you still have to add dither, but at least you are doing it once, not twice (i.e. once in the conversion to RGB and a second time in the level-range conversion).


The only way to know whether any of this is occurring correctly is with a DVE-like test disc, looking at the ramps on an accurate display that doesn't produce banding itself.
 
#20 ·
There's just no end to it, is there? I definitely accept your explanation, although it is quite discouraging. Going from 16-235 to 0-255 must be, as Bilbo says in LOTR, "like butter spread over too much bread".



This dithering you mention, does that mean that if it (the codec?) didn't apply dithering when going from 16-235 to 0-255, then it could convert back to 16-235 without any loss whatsoever (it would just have to "do the math")?


Right now what I'm seeing is banding in PC games on my HTPC; what does this mean? If it works as it should, the games (which are 0-255, like everything else) should be converted to 16-235 via the HDMI dongle. How does this produce banding?


According to you and to what I can see myself, what I have now is video with noise and games with banding. I guess that if I applied the tweak setting everything to 0-255, didn't use the HDMI dongle and set my TV to 0-255, then everything would be perfect except for video, which would show banding, right? I guess PCs really aren't meant for this...
 
#21 ·

Quote:
Originally Posted by Seeco /forum/post/15077633


There's just no end to it, is there?

That is what I said in my first reply. Most people are not watching accurate video on their PCs. Sad, given where I used to work.

Quote:
This dithering you mention, does that mean that if it (the codec?) didn't apply dithering when going from 16-235 to 0-255, then it could convert back to 16-235 without any loss whatsoever (it would just have to "do the math")?

Yes and no. If, as I showed, rounding was used, then the inverse conversion works. But if the values were simply truncated, then accuracy is lost and the inverse conversion gives you something different.
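A quick sketch of that difference (my own illustration, using the conventional 255/219 scale factor): with proper rounding the expand/compress round trip is lossless, while truncation loses a step:

```python
SCALE = 255 / 219  # ratio of the 0-255 range to the 16-235 range

def expand_round(v):
    return round((v - 16) * SCALE)

def expand_trunc(v):
    return int((v - 16) * SCALE)  # int() truncates the fraction instead of rounding

def compress(v):
    return round(v / SCALE) + 16

# Rounded expansion inverts exactly for every legal video level.
assert all(compress(expand_round(v)) == v for v in range(16, 236))

# Truncated expansion does not: 20 expands to 4 (4.66 truncated),
# and 4 compresses back to 19, not 20.
print(compress(expand_trunc(20)))
```

This works because the scale factor is greater than 1: rounding keeps each expanded value within half a step of the exact result, close enough for the reverse math to land back on the original code, whereas truncation can be almost a full step off.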
Quote:
Right now what I'm seeing is banding in PC games on my HTPC, what does this mean?

Well, banding could be a function of your display not handling the source properly. That is above and beyond the case we are talking about here.


Now, if the display is set to 16-235 and proper conversion with dithering is not done, then that could be responsible for it.
Quote:
If it works as it should, the games (which are 0-255 as everything else) should be converted to 16-235 via the HDMI dongle. How does this produce banding?

If you do any conversion involving fractional values, as in this case, you are going to need to dither; if you don't, you get banding. In this case, you may also be going to the YUV domain and, as such, have color-space compaction to boot. Dithering is required there too for proper operation.
Quote:
According to you and to what I can see myself, what I have now is video with noise and games with banding. I guess that if I applied the tweak setting everything to 0-255, didn't use the HDMI dongle and set my TV to 0-255, then everything would be perfect except for video, which would show banding, right? I guess PCs really aren't meant for this...

PC is a multi-function device. Unfortunately, the people who designed video standards and PCs did not talk to each other much. The result is a complete mess as we have discussed.


In an ideal case, your display would switch into RGB 0-255 for graphics/desktop/game work and YUV 16-235 for video. Even if it did this, your video had better be full screen, as otherwise either the graphics or the video would be wrong.


From memory, Windows Media Center gets around this problem by rendering its graphics in the same mode as the video. That way, you can set up your display for video and have it all be right. Of course, that only fixes the graphics displayed while inside MCE, not outside.


So the best solution really is to dedicate a PC to video and make that path correct. Or just don't let the errors bother you.
 
#22 ·
I see. Well, I have fiddled about some more, and I have come up with even more strange phenomena. On my TV (Pioneer PDP-5080XA) I can set the colour space to 4:2:2, 4:4:4, 16-235 or 0-255. Because I have assumed that the HTPC (via the HDMI dongle) delivers a 16-235 colour space, I have been using the corresponding colour space on my TV. There is also an "auto" setting, which is supposed to read the incoming colour space and match it.

When I tried setting it to auto, all of a sudden I had grey blacks in DTV, better low levels in x264 material, and a brighter picture (although still with good blacks, and unfortunately still with banding) in games. I then tried setting it to the 0-255 mode manually, and it looks exactly the same; obviously this is the mode it activates when set to auto. This just amazes me! Doesn't this imply that everything isn't really converted the way it should be with the regtweak? And that the signal coming out of the HTPC isn't really 16-235?
 
#24 ·
This is so strange; I'll have to track down the information about whatever effect the dongle is supposed to have.


I'm having a hard time wrapping my head around all this, so many variables... If I switch between 16-235 and 0-255 while watching something, the effect is very different depending on what I'm watching:


- If I watch TV

16-235 produces banding and deep blacks

0-255 produces banding and grey blacks


- If I watch x264 material

16-235 produces a rather dark picture with deep blacks, much detail is lost

0-255 produces equally deep blacks, but the picture is otherwise elevated and much more detail comes out


- If I play games, it is the same as with x264 material: setting 0-255 just makes the picture better (but here I also have banding).


But the whole purpose of the regtweak was to make everything equal to begin with! If now even the dongle isn't doing its job, then what can I count on? I'm not even upset anymore; I can only laugh at all this.
 
#25 ·

Quote:
Originally Posted by Seeco /forum/post/15078177


- If I watch TV

16-235 produces banding and deep blacks

0-255 produces banding and grey blacks

This is as it should be. TV has black = 16. If you force your TV to think levels go from 0 to 255, then "16" means something well above black, i.e. grey.

Quote:
- If I watch x264 material

16-235 produces a rather dark picture with deep blacks, much detail is lost

0-255 produces equally deep blacks, but the picture is otherwise elevated and much more detail comes out

You never know how someone has encoded such material. Mistakes are made all the time when people don't understand the differences here.


Putting that aside, expansion of the range may cause such detail to be amplified, but that is not necessarily how the material is supposed to be viewed.


For example, if you turn up the brightness above reference, you also see more detail in darker areas, but that doesn't make it correct. When the material was produced, it was viewed in a reference environment and as such, that detail was invisible to the person making the creative decision.

Quote:
- If I play games it is the same as with x.264 material, setting 0-255 just makes the picture better (but here I also have banding).

In this case, you definitely want 0-255. Setting it otherwise forces compression of the range, and detail is lost. The banding here, I believe, is due to your display not being able to handle the full range of what a PC game can create on the fly.


Graphics can do things that never happen in real life. For example, the pixels you see on your computer can go from zero to full brightness in one pixel. Such a thing never happens with a video/film camera. So what looks good as a video signal may not look perfect as synthetic graphics. Flat panels are still not perfect in how they handle gradients.

Quote:
But the whole purpose of the regtweak was to make everything equal to begin with! If now even the dongle isn't doing its job, then what can I count on? I'm not even upset anymore; I can only laugh at all this.

It is a tough problem. I think the solution starts with understanding what the sources are, and then searching for answers. People may have come up with configurations of PC/GPU/display that do best here.
 
#26 ·
Yes, hmm, but I suspect that your answer is overly relativistic. Even if there are limits to how good it can get in the end, I know it can get a lot better than this. At least I would like to see what compressing everything to 16-235 really looks like.


I have now tried re-applying the regtweak, and this time it actually seemed to do its job: now I have black crush everywhere, in HD material, TV, pictures and games. As before, though, setting the TV to 16-235 produces black (and I suppose white) crush, while setting it to 0-255 brings out much more detail without affecting the deepest black. I also have banding everywhere, even in HD material, although it isn't as noticeable there for some reason.


Surely this must point to the fact that SD material is now indeed expanded to PC levels, but that the dongle isn't doing its job. Nothing gets re-compressed; therefore the 0-255 TV setting brings out all the detail but produces banding. I guess spreading the original 16-235 video spectrum across a 0-255 one would do that.


Of course, this doesn't explain why I get banding in PC games, which should be happy with 0-255 all the way. I know for a fact that it isn't a matter of what my TV can reproduce; before this I used an Nvidia 9600GT, and I never had any banding problems there (although, due to the settings I used, video looked off).


It is also still a mystery to me what the "Pixel Format" setting actually does. Does it control the actual output from the card, or does it control some pre-output conversion? To be honest, I haven't been able to tell any difference at all when playing around with it.
 