Originally Posted by orion2001
Okay, a few things:
On HTPCs, you want to always use 0-255 output. There are a couple of reasons for doing so.
1) All GPUs process video data as 0-255 internally. So, if you start off with source content at video levels, it is already going to be converted to 0-255 internally. Then, to output at Video Levels, it has to be converted back to 16-235 inside the GPU, and this results in some inaccuracies because the GPU has less precision than if the conversion were performed in software (i.e., on the CPU).
2) A lot of GPUs will just clip video levels so you only get 16-235 levels and this will result in clipping of whiter-than-white (WTW) content.
For (1), I again ask for a way to demonstrate this round trip. If it were happening, I would expect blacker-than-black (BTB) and WTW to be lost in the expansion from Video to PC Levels, and I would expect to see anomalies such as banding in gradients or inconsistencies with my standalone BD player. I don't see any of those things. That said, I recently figured out how to force a round trip that had the expected effect on BTB and WTW by setting my Nvidia GT430 to output YCbCr444, but it's just a curiosity:
Note that the Nvidia YCbCr444 option compresses all output to 16-235, including the desktop. This does not happen when outputting RGB. I can think of only one reason to use YCbCr444, and that is for a display that cannot use the full RGB range but is limited to 16-235. It would be a way to get the desktop looking sort of right.
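For anyone who wants to see the arithmetic, here's a rough sketch of the textbook 8-bit range conversions (this is just illustrative math to make the point, not a model of what the Nvidia driver actually does internally). Squeezing the 256 desktop codes into the 220 steps of 16-235 and then expanding them again can't give every code back, which is the kind of damage you'd look for on a compressed desktop:

```python
# Textbook 8-bit range conversions (illustrative only; not Nvidia's actual pipeline).

def full_to_limited(v):
    # Compress PC levels (0-255) into video levels (16-235).
    return round(v * 219 / 255 + 16)

def limited_to_full(v):
    # Expand video levels (16-235) back out to PC levels (0-255).
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Full-range desktop squeezed into 16-235 on output, then expanded again:
# 256 input codes have to share 220 output codes, so some can never come back.
lost = [v for v in range(256) if limited_to_full(full_to_limited(v)) != v]
print(f"{len(lost)} of 256 desktop codes don't survive the round trip")
```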
For (2), simply no. You may be confusing the "clipping" with the expansion to PC levels you're advocating, which irreversibly destroys BTB and WTW. The messages I linked to earlier explain this.
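To make the BTB/WTW point concrete, here's the same sort of textbook arithmetic for the expansion to PC Levels (again, just an illustration assuming a simple round-and-clamp, not any particular renderer's implementation): reference black and white land where they should, but everything below 16 and above 235 is clamped away for good:

```python
# Textbook expansion of video levels to PC levels (illustrative round-and-clamp).

def expand_to_pc(v):
    # Map reference black 16 -> 0 and reference white 235 -> 255, clamping the rest.
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(expand_to_pc(16), expand_to_pc(235))   # 0 255  (reference black / white)
print(expand_to_pc(4), expand_to_pc(250))    # 0 255  (BTB and WTW clipped away)
```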
As for outputting video at PC Levels, I've found one reason to do so: consistency with the desktop, so that one calibration handles everything on the PC. As I've written earlier, I've found several reasons to leave video alone and output it at its native Video Levels, which actually spans nearly the full RGB range (1-254 are the strictly legal values), with reference black at 16 and reference white at 235, and I'm going to repost them here:
Until somebody can tell me how to demonstrate the supposedly large negative effects of the levels round trip, I will continue to believe the only reason to use PC Levels is to get consistency between the desktop and video output, so one set of TV settings works for both.
For the OP, here are several specific reasons I use Video Levels with my HTPC and its Nvidia GT430, along with a potentially important caveat.
1. Outputting PC Levels sacrifices BTB and WTW, which makes it harder to set Brightness and Contrast, and losing WTW is arguably bad because, although it's rare, there can be valid picture information above 235.
2. For Nvidia cards and the ST60 and Sony LCDs I've hooked up to them, using PC Levels required me to adjust Brightness and Contrast of video in the Nvidia Control Panel, whereas using Video Levels did not; I could do everything on the TV, which is what I want.
3. Using Video Levels also achieved consistency with my Sony S5100 BD player, which, unlike my S350 from 2008, does not support PC Levels; consistency with other devices is a consideration for anyone going through an AVR or other switch that has only one output.
4. Finally, being able to leave the Nvidia Control Panel Video section at "With the video player" avoids the problem I described here:
All that said, using Video Levels on the TV does create a levels mismatch for desktop graphics on my Nvidia card, which causes some color fidelity issues on the desktop, but they don't affect the programs I use on my TV, WMC and XBMC, in any way, as I use them solely for video; more on that here:
NVIDIA HTPC 0-255 or 16-235?
I find it's a good trade-off if you don't care about gaming, photo display, etc. I do it on my HTPC for the TV connection, but I do use PC Levels on my gaming machine. YMMV with your specific hardware and software, but these are the sorts of things to be thinking about.