Originally Posted by doveman
Originally Posted by sawfish
Just to be clear, replacing 16 and below with 0 wouldn't quite do the trick. It would need to expand Video Levels (16-235) to PC Levels (0-255). Then the pixel values in Bar 16 and below would be 0, Bar 17 would contain 1, and so forth, up to Bar 235 and above containing 255. (Of course, it can't strictly increase by 1 in between Bars 16 and 235, as the target range has 256 values while the source range only has 220.) Then it would match the XBMC GUI in the screenshot, and it would work on a display configured for PC Levels.
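That expansion can be sketched in a few lines of Python (the function name and exact rounding are my own; real players may round or dither slightly differently):

```python
def video_to_pc(v):
    """Expand a Video Levels (16-235) pixel value to PC Levels (0-255).

    Values at or below 16 (BTB) clip to 0; values at or above 235 (WTW)
    clip to 255. The 219 source steps are stretched across 255 output
    steps, so the output can't increase by exactly 1 per input step.
    """
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(video_to_pc(16))   # 0
print(video_to_pc(17))   # 1
print(video_to_pc(235))  # 255
```

Note that BTB (below 16) and WTW (above 235) both get thrown away by the clipping, which is exactly why Full range loses that information.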
OK, I understand that. From what you said before (quoted below) regarding the way it expands Video Levels to PC Levels when playing video, it seems that is doing what you've described above, so I assume that's being done correctly?
OTOH, the screenshot taken while the video was playing is at PC Levels, because examining the pixel values, Bar 16 and below have been mapped to zero, 17 has been mapped to 1, etc. This is the bottom end of what is meant by expanding Video Levels (16-235) to PC Levels (0-255). It looks like it should on my PC monitor, which again is at PC Levels. If the desktop were at Video Levels, blacks would appear crushed.
If the screenshot you took of XBMC playing the file matches what you observe on the TV, and Brightness is set close to zero, then that would suggest that XBMC is outputting PC Levels, and the TV is configured for PC Levels (HDMI/DVI Range = Nonstandard). That would be a valid configuration.
A possible complication is that the screenshot may not reflect what the video card is actually putting out on the wire. I'm assuming it is the same.
I'm still a bit confused about this:
The Black Clipping thumbnail screenshot has black at 16, whereas the XBMC UI all around it has black at zero. So it would seem that the XBMC thumbnail generator captured the image using Video Levels, and something about it eliminated BTB, converting all values below 16 to 16. That is not a useful thing for it to do, as it makes it more difficult to set Brightness. Viewed on my PC monitor, the pattern appears grey, which is expected, for my PC desktop is at PC Levels, where black is 0. The desktop would have to be using Video Levels for this pattern to appear correct.
as I believe my RPi/PC is using Video Levels now, but the pattern still doesn't appear correctly: it's much brighter/greyer all over than it should be.
I'm confused, too.
Forget the thumbnail. If XBMC is playing the video, and it appears gray all over with TV Brightness close to 0, that could happen if the RPi is outputting Video Levels (black = RGB(16)), and the TV is set to PC Levels (black = RGB(0)).
Going back to another thing you said:
As for "full range", what that does is expand the normal video levels from 16-235 to 0-255, throwing away all the BTB and WTW information. You won't see any bars below 16 or above 235 when using full range, because that information is gone. So in a sense, "full range" is the one that is "limited", as it throws away information. Based on your description of the missing bars, your RPi is outputting full range, not limited range, or the player is clipping BTB for some stupid reason.
I assume you were referring just to video, as surely PC games use Full range (PC Levels) and aren't expanding Limited range (Video Levels) or throwing away any information?
Yep. That's why people who want to configure their PC for both video and desktop usage (including games) typically compromise by configuring video to use PC Levels. That has much less deleterious effect on video than configuring the desktop to use Video Levels would have on desktop programs.
Do all DVDs/Blu-Rays and other formats use Video Levels, so that setting the source to Full range would always make it do this expansion before outputting to the TV (which would also have to use Full range to display it properly)? Or are there some video sources that are encoded using Full range?
Yes, the video standard is Video Levels, and I don't have any counter-examples. That said, I'm sure that for some definition of "video", there are video files out there that are encoded with PC Levels, and they won't look right on a TV set to the standard Video Levels.
I think one important thing I've learnt from this discussion is that whilst Limited range (Video Levels) only really uses 16-235 for the material, it actually still outputs 0-255 so that the BTB and WTW detail is sent to the display
That's the way it should work, yes. Again, BTB never appears in real material, but WTW can. I'd recommend reading the Spears and Munsil articles on Brightness and Contrast for more on all this:
and that any clipping that is done which stops 0-16 and 235-255 being output (or cut at the display side, with the same effect) has nothing to do with what range/levels are selected on the source/TV. Does that sound right?
I don't think I would say that. There's what they should do, and then there's what they actually do.
An RGB Limited source should take what's in the video and pass it through untouched. The display should display it untouched. This includes the standard video range 16-235, BTB, and WTW, for the full 0-255 range. However, because BTB never appears in real material, it would be legitimate though unhelpful for either source or display to clip BTB. Naughty devices may even clip WTW; IIRC, my 2008 Sony LCD does this. It's not a huge sin, but still undesirable.
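The three behaviours described here can be sketched like this (Python, pixel values as 0-255 integers; the function names are my own labels, not anything from XBMC or the TV firmware):

```python
def pass_through(v):
    # Well-behaved RGB Limited source/display: pass the full
    # 0-255 range untouched, including BTB and WTW.
    return v

def clip_btb(v):
    # Legitimate but unhelpful: eliminate blacker-than-black
    # by raising everything below 16 up to 16.
    return max(v, 16)

def clip_wtw(v):
    # Naughty: also clip whiter-than-white down to 235.
    return min(v, 235)
```

A device that does `clip_btb` would produce exactly the effect seen in the thumbnail earlier in the thread: every bar below 16 reads as 16, making it impossible to use those bars to set Brightness.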
For RGB Full, the source device has to expand Video Levels to PC Levels, which sacrifices BTB and WTW. The TV then has to display it untouched, since the full 0-255 range is now being used for video.
The good news is, the 2013 Panasonics that are the subject of this thread behave as they should. So does my HTPC. So does my BD player. The bad news is, in general, YMMV.
OK. I guess I thought that if bar 16 and below are all being output at 0, there should never be any dither visible in that range, regardless of what I set the Brightness to. But perhaps the appearance of dither is just related to the panel and the brightness being too high, has nothing to do with the input, and could appear with no input at all. It still seems a bit strange that tons of dither would suddenly appear when increasing Brightness +1 from the correctly calibrated setting, but if you say that sounds normal to you, I guess it's just something I'll have to live with.
You said, in part, "Notching it up to +1 makes bar 17 visible but at +2 the dither appears everywhere, right down to 2."
If your TV is in RGB Limited mode (HDMI/DVI RGB Range = Standard (16-235)), that behavior would be consistent with bars below 17 all containing 16, like in the thumbnail in the screenshot you posted.
If your TV is in RGB Full mode (Nonstandard), that behavior would be consistent with bars below 17 all containing 0, like I believe they did in the screenshot of the actual video you posted.
If all the bars contain the pixel values they are labeled with, and they do for my system, I see the lower numbered bars progressively light up as I increase Brightness. It's not all or nothing like in your case. I use RGB Limited, and my system preserves BTB, for both the HTPC and Sony BD player. I have to jack Brightness up to +31 before I start seeing dithering in RGB(0), and that's because the TV is in RGB Limited, where reference black is RGB(16). If I put my TV in RGB Full, where reference black is RGB(0), the dithering in RGB(0) would show up right away if I were to bump Brightness up by 1, assuming it was properly adjusted to begin with.