Originally Posted by geearf
Do you mean with proper settings or with any settings?
I meant that with some settings I could see a lot more, whereas before using limited range it was impossible whatever the settings.
This message explains Video and PC Levels and what you should and shouldn't see in test patterns like Black Clipping, both when source and destination levels match (as they must in a properly configured system) and when they don't:
AVS HD 709 - Blu-ray & MP4 Calibration
Originally Posted by alluringreality
My PC has an AMD card, and I'm unable to get it to output both video and the desktop at video levels without scaling the video. I think the pixel format setting can force a video level output on my computer, but the video gets scaled to full-range and then back to video levels, so I just let my PC output the full-range default.
I keep hearing about this levels round trip. Can anyone tell me how to demonstrate it? (I ask every time it comes up.) Is it just an AMD thing, or does it affect Nvidia and Intel, too?
In particular, if the end result preserves BTB and WTW, then the first leg of the round trip (the expansion from Video to PC Levels) is not discarding that information, and in that case the round trip is not at all like actually outputting PC Levels, which does discard it. I would also think/hope the expansion is not limited to integers when mapping the 220 values in the 16-235 range onto the 256 values in the 0-255 range, as must be done when actually outputting PC Levels.
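For what it's worth, the round trip can be sketched numerically. This is a hypothetical model, not AMD's (or anyone's) actual pipeline: it assumes the driver expands with the standard 8-bit formula and contracts with its inverse. The interesting result is that as long as the intermediate value is not clamped to 0-255, even BTB and WTW survive the integer round trip; the clamp is where the loss would come from:

```python
def expand(x):
    """Video levels (16-235) -> PC levels, intermediate NOT clamped to 0-255."""
    return round((x - 16) * 255 / 219)

def contract(y):
    """PC levels (0-255) -> Video levels (16-235)."""
    return round(y * 219 / 255) + 16

# Every 8-bit code, including BTB (1-15) and WTW (236-254), survives
# the round trip when the intermediate is allowed outside 0-255:
assert all(contract(expand(x)) == x for x in range(256))

# Clamping the intermediate to 0-255 (i.e., a true PC Levels signal)
# is what destroys BTB/WTW:
clamp = lambda y: max(0, min(255, y))
print(contract(clamp(expand(10))))  # BTB sample comes back as 16, not 10
```

So at least in this simplified integer model, the round trip itself is lossless; whether a given driver clamps or keeps extra precision between the two legs is exactly the question I keep asking.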
Originally Posted by alluringreality
If you have a display that can accept PC levels I would suggest probably using a full-range (PC) signal with a computer, unless there's a specific reason you need to use video levels. My Sony can accept either a video-level input or a full-range input without issue, so I just let the PC output full-range.
Until somebody can tell me how to demonstrate the presumably hugely negative effects of the levels round trip, I will continue to believe the only reason to use PC Levels is to get consistency between desktop and video output, so one collection of TV settings works for both.
For the OP, here are several specific reasons I use Video Levels with my HTPC and its Nvidia GT430, along with a potentially important caveat.
1. Outputting PC Levels sacrifices BTB and WTW, which makes it harder to set Brightness and Contrast, and losing WTW is arguably bad because, on rare occasions, there is valid information above 235.
2. For Nvidia cards and the ST60 and Sony LCDs I've hooked up to them, using PC Levels required me to adjust Brightness and Contrast of video in the Nvidia Control Panel, whereas using Video Levels did not; I could do everything on the TV, which is what I want.
3. Using Video Levels also achieved consistency with my Sony S5100 BD player, which, unlike my S350 from 2008, does not support PC Levels; consistency with other devices is a consideration for people going through an AVR or other switch that has only one output.
4. Finally, being able to leave the Nvidia Control Panel Video section at "With the video player" avoids the problem I described here:
All that said, using Video Levels on the TV does create a levels mismatch for my Nvidia card's desktop output, which causes some color fidelity issues for desktop graphics. Those issues don't affect the programs I use on my TV, WMC and XBMC, in any way, since I use them solely for video; more on that here:
NVIDIA HTPC 0-255 or 16-235?
I find it's a good trade-off if you don't care about gaming, photo display, etc. I do it on my HTPC for the TV connection, but I do use PC Levels on my gaming machine. YMMV with your specific hardware and software, but these are the sorts of things to be thinking about.