HTPC XBMC Calibration - AVS Forum | Home Theater Discussions And Reviews
post #1 of 29 - 10-08-2014, 07:34 AM - drpete12 (Senior Member, Thread Starter)
HTPC XBMC Calibration

Hi All,

I know your first thought is that I should post this in the HTPC section, but it deals more with calibration.

Here is what I was wondering: you go ahead and have your projector calibrated (professionally) and get it all set up. If I then use an HTPC running XBMC as my main Blu-ray source, how is that part of the chain calibrated (or is it)?

Do you run a calibration disc at that point and make any adjustments in the computer and/or in XBMC?

Just wondering how others are doing this.

Thanks
post #2 of 29 - 10-08-2014, 10:34 AM - sawfish (AVS Forum Special Member)
Ideally, you won't have to mess around with Brightness/Contrast/etc in the video card settings, and the PC's output will be consistent with whatever source was used for the calibration. The primary consideration is whether you're going to output Video or PC Levels. I described what I do here:

https://www.avsforum.com/forum/139-di...l#post25938273

See also end of this message:

https://www.avsforum.com/forum/139-di...l#post26956129

More about Video and PC Levels here:

https://www.avsforum.com/forum/139-di...l#post26449881

I think you will find XBMC to be unsatisfactory as your "main Blu-ray source". I would use a standalone player for physical BDs.
WiFi-Spy likes this.
post #3 of 29 - 10-08-2014, 06:21 PM - |Tch0rT| (Advanced Member)
From what I remember, there aren't many options in XBMC for video calibration; at least nothing you couldn't already do with the controls built into your display. If you already have a meter and you want to calibrate, I'd suggest abandoning XBMC (unless you can tolerate having XBMC launch an external player) and checking out JRiver or MediaBrowser 3, which incorporate madVR. madVR can use 3D LUTs and calibrate your display far beyond what you can do with its internal controls.

madVR & 3DLUT info:
https://www.avsforum.com/forum/139-di...argyllcms.html

XBMC can't use madVR unless you set it up to launch an external player like MPC-HC, but IMO that kills a lot of the user-friendly aspects of playback if you use an IR remote.

post #4 of 29 - 10-10-2014, 01:02 AM - harlekin (Advanced Member)
XBMC always outputs full RGB (0-255), and that's it. Apart from that, it behaves like every other image source.

On some Linux versions it's even possible to get the output converted to limited (16-255) via a menu option if you need to. This option is not available in the Windows or macOS versions atm, AFAIK.
-

The answer to your question is that you should use a software pattern generator, like the one built into HCFR, because it's much faster than manually switching patterns if you plan on doing multiple measurement runs, which you eventually will.

Before you do, you have to verify that you have configured the signal chain correctly AND that the PC's video card doesn't introduce color errors, by checking against a known correct source (a Blu-ray player, or a PS3, which is widespread enough among consumers to be worth mentioning) playing a known correct pattern source (the AVS HD 709 disc).

Once you have verified the output of the software pattern generator (like the one built into HCFR) to be correct, you calibrate your TV for a full RGB signal chain: 0-255 settings in the PC's graphics card and in the software pattern generator (HCFR, for example), AND in the TV _if needed_ (most nowadays have an auto setting to handle input-specific limited/full RGB switching). That should be it.

You can later verify your calibration by measuring patterns (AVS HD 709) played through XBMC in windowed mode (Alt+Enter). They should (and do) line up with the normal full RGB calibration. ATTENTION: if you want to verify this for yourself and are using HCFR, you HAVE TO move each "OK" prompt window out of the center of the screen before you take ANY measurements (that's 10 times for greyscale, 6 times for primary/secondary colors, and so on). Otherwise HCFR will WRONGLY measure the fading "OK" prompt in the center of the screen before it measures the color field, which produces WRONG readings. This only happens if you run HCFR on the same PC whose XBMC patterns you are measuring. If you run HCFR on a second computer (a laptop, for example), the OK prompts will not be on the source (XBMC) screen.

SPECIAL NOTICE: if you use integrated Intel graphics, update to THE LATEST DRIVERS. They _just_ fixed an issue with brightness/gamma being totally wrong when using hardware acceleration for video decoding.
-

As mentioned before, the madVR renderer cannot (and should not) be used when XBMC is the main video source, so stick to your on-device (TV menu) calibration options.

Last edited by harlekin; 10-10-2014 at 01:17 AM.
post #5 of 29 - 10-10-2014, 01:57 AM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by harlekin View Post
XBMC always outputs full RGB (0-255), and that's it.
Nope. That wasn't true even for XBMC pre-Gotham if you were using DXVA2 and chose Limited RGB in the Nvidia Control Panel, which would cause XBMC to use Video Levels. Fortunately, Gotham implements a proper 16-235 option that doesn't require messing around with the Nvidia Control Panel, which I observed to have some subtle deleterious effects, and I linked to my xbmc.org message on that in one of the posts I linked to earlier.

Quote:
On some Linux versions it's even possible to get the output converted to limited (16-255) via a menu option if you need to. This option is not available in the Windows or macOS versions atm, AFAIK.
Again, nope. It's definitely in Windows, and "limited", AKA Video Levels, is 16-235, not 16-255. Of course, "limited" is a misnomer: when implemented correctly, it's really passthrough of the RGB values that can legally be encoded in video, which, at 1-254, is almost the full RGB range. It's not "converted"; the actual "conversion" happens when you output video at PC Levels, 0-255, which expands 16-235 unevenly into 0-255 and throws away BTB and WTW. See the messages I linked to for more.
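
For anyone who wants to see the arithmetic behind "expansion" versus "passthrough", here is a minimal sketch in plain Python. It is purely illustrative (not any player's or driver's actual code), but it shows why expanding limited-range video to full range necessarily discards BTB and WTW, while passthrough keeps them:

Code:
# Minimal sketch of 8-bit level handling (illustrative only, not any renderer's actual code).

def expand_video_to_pc(v):
    """Expand limited-range video (16-235) to full-range PC levels (0-255).
    Anything below 16 (BTB) or above 235 (WTW) gets clipped in the process."""
    out = round((v - 16) * 255 / 219)
    return max(0, min(255, out))

def passthrough(v):
    """Video-levels passthrough: the code value is left untouched,
    so BTB (1-15) and WTW (236-254) survive all the way to the display."""
    return v

for v in (1, 16, 17, 120, 234, 235, 236, 254):
    print(f"code {v:3d} -> expanded {expand_video_to_pc(v):3d}, passthrough {passthrough(v):3d}")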
post #6 of 29 - 10-10-2014, 07:07 AM - orion2001 (Advanced Member)
Okay, a few things:

On HTPCs, you want to always use 0-255 output. There are a couple of reasons for doing so.
1) All GPUs process video data as 0-255 internally. So, if you start off with source content at video levels, they are already going to be converted to 0-255 internally. Then to output at Video levels, they need to be converted back to 16-235 again internally in the GPU and this results in some inaccuracies as the GPUs have less precision than if this conversion was performed in software (i.e. in CPU).

2) A lot of GPUs will just clip video levels so you only get 16-235 levels and this will result in clipping of whiter-than-white (WTW) content.

For these reasons, it is always recommended to operate your GPU to output in full RGB mode. This is also why YPbPr mode should not be used on GPUs. Most TVs can be set accordingly to expect 0-255 content and it would be preferable for the TV to handle the display of 0-255 content. If not, you can set the conversion back to video levels in Software.

And that brings us to the second main item: your playback software.
I'd highly recommend using MPC-HC with madVR for Blu-ray playback. XBMC is nice, and I usually use its internal player for TV shows, but not for 1080p Blu-ray rips. MPC-HC is far more powerful, has amazing PQ, and offers a lot of customization. It handles all the colorspace/video level conversions beautifully. It even lets you set custom video levels if needed, and it can also switch the display refresh rate to match the source content (if your display can show judder-free 24p content) for smooth movie playback.

The other advantage of using MPC-HC with madVR is that you can then use madTPG (the madVR test pattern generator) as your pattern generator in HCFR for performing your display calibration. This ensures that whatever calibration you achieve using HCFR and madTPG exactly matches the videos being played back via MPC-HC. Moreover, madVR allows for 3D LUT based calibrations that can take things to the next level if you have a powerful enough GPU to handle the 3D LUT correction in real time.

And as a final note, madVR also has a SmoothMotion option, which uses frame blending to reduce judder on displays that cannot handle 4:4-type pulldown of 24p content. It works really well and has only a very small amount of soap-opera-effect (SOE)-like artifacting, far less than what you see with frame interpolation. Unfortunately, I am too sensitive to SOE, so even the SmoothMotion frame blending bugs me. Fortunately, my TV can handle 4:4 pulldown processing of 24p content (CinemaSmooth on Samsung TVs).
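
For readers who haven't seen frame blending before, here is a toy sketch of the basic idea (my own simplified illustration; madVR's actual SmoothMotion algorithm is more sophisticated): each display refresh is synthesized as a weighted mix of the two nearest source frames.

Code:
# Toy illustration of blending a 24 fps source onto a 60 Hz display.
# This is NOT madVR's implementation; it only shows the weighting idea.

src_fps, disp_hz = 24.0, 60.0

for n in range(6):                      # first few display refreshes
    t = n / disp_hz                     # display time in seconds
    pos = t * src_fps                   # position in source-frame units
    a, b = int(pos), int(pos) + 1       # the two surrounding source frames
    w_b = pos - a                       # blend weight for the later frame
    # output pixel = (1 - w_b) * frame[a] + w_b * frame[b]
    print(f"refresh {n}: blend source frames {a} and {b} with weights {1 - w_b:.2f} / {w_b:.2f}")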

post #7 of 29 - 10-10-2014, 09:06 AM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by orion2001 View Post
Okay, a few things:

On HTPCs, you want to always use 0-255 output. There are a couple of reasons for doing so.
1) All GPUs process video data as 0-255 internally. So, if you start off with source content at video levels, they are already going to be converted to 0-255 internally. Then to output at Video levels, they need to be converted back to 16-235 again internally in the GPU and this results in some inaccuracies as the GPUs have less precision than if this conversion was performed in software (i.e. in CPU).

2) A lot of GPUs will just clip video levels so you only get 16-235 levels and this will result in clipping of whiter-than-white (WTW) content.
For (1), I again ask for a way to demonstrate this round-trip. If it were happening, I would expect BTB and WTW to be lost in the expansion from Video to PC Levels. I would expect to see anomalies such as banding in gradients or inconsistencies with my standalone BD player. I don't see any of those things. That said, I recently figured out how to force a round-trip that had the expected effect on BTB and WTW by setting my Nvidia GT430 to output YCbCr444, but it's just a curiosity:

https://www.avsforum.com/forum/139-di...l#post26970873

Note that the Nvidia YCbCr444 option compresses all output to 16-235, including the desktop. This does not happen when outputting RGB. I can think of only one reason to use YCbCr444, and that is for a display that cannot use the full RGB range but is limited to 16-235. It would be a way to get the desktop looking sort of right.

For (2), simply no. You may be confusing the "clipping" with the expansion to PC levels you're advocating, which irreversibly destroys BTB and WTW. The messages I linked to earlier explain this.

As for outputting video at PC Levels, I've found one reason to do so, and that is to achieve consistency with the desktop, so that one calibration handles everything on the PC. As I've written earlier, I've found several reasons to leave video alone and output it at its native Video Levels, which is actually the full RGB range (well, 1-254 being strictly legal values) with black at 16 and reference white at 235, and I'm going to repost it here:

https://www.avsforum.com/forum/139-di...l#post26956129

Until somebody can tell me how to demonstrate the presumably hugely negative effects of the levels round trip, I will continue to believe the only reason to use PC Levels is to get consistency between desktop and video output, so one collection of TV settings works for both.

For the OP, here are several specific reasons I use Video Levels with my HTPC and its Nvidia GT430, along with a potentially important caveat.

1. Outputting PC Levels sacrifices BTB and WTW, which makes it harder to set Brightness and Contrast, and losing WTW is arguably bad because, on rare occasions, there can be valid information at 235 and above.

2. For Nvidia cards and the ST60 and Sony LCDs I've hooked up to them, using PC Levels required me to adjust Brightness and Contrast of video in the Nvidia Control Panel, whereas using Video Levels did not; I could do everything on the TV, which is what I want.

3. Using Video Levels also achieved consistency with my Sony S5100 BD player, which unlike my S350 from 2008, does not support PC Levels, and achieving consistency with other devices is a consideration for people going through an AVR or other switch that has only one output.

4. Finally, being able to leave the Nvidia Control Panel Video section at "With the video player" avoids the problem I described here:

http://forum.xbmc.org/showthread.php?tid=180884

All that said, using Video Levels on the TV does create a levels mismatch for my Nvidia card for desktop graphics, which causes some color fidelity issues for desktop graphics, but they do not impact the usage of the programs I use on my TV, WMC and XBMC, in any way, as I use them solely for video; more on that here:

NVIDIA HTPC 0-255 or 16-235?

I find it's a good trade-off if you don't care about gaming, photo display, etc. I do it on my HTPC for the TV connection, but I do use PC Levels on my gaming machine. YMMV with your specific hardware and software, but these are the sorts of things to be thinking about.
post #8 of 29 - 10-10-2014, 09:37 AM - orion2001 (Advanced Member)
Quote:
Originally Posted by sawfish View Post
For (1), I again ask for a way to demonstrate this round-trip. If it were happening, I would expect BTB and WTW to be lost in the expansion from Video to PC Levels. I would expect to see anomalies such as banding in gradients or inconsistencies with my standalone BD player. I don't see any of those things. That said, I recently figured out how to force a round-trip that had the expected effect on BTB and WTW by setting my Nvidia GT430 to output YCbCr444, but it's just a curiosity:

https://www.avsforum.com/forum/139-di...l#post26970873

Note that the Nvidia YCbCr444 option compresses all output to 16-235, including the desktop. This does not happen when outputting RGB. I can think of only one reason to use YCbCr444, and that is for a display that cannot use the full RGB range but is limited to 16-235. It would be a way to get the desktop looking sort of right.

For (2), simply no. You may be confusing the "clipping" with the expansion to PC levels you're advocating, which irreversibly destroys BTB and WTW. The messages I linked to earlier explain this.

As for outputting video at PC Levels, I've found one reason to do so, and that is to achieve consistency with the desktop, so that one calibration handles everything on the PC. As I've written earlier, I've found several reasons to leave video alone and output it at its native Video Levels, which is actually the full RGB range (well, 1-254 being strictly legal values) with black at 16 and reference white at 235, and I'm going to repost it here:

https://www.avsforum.com/forum/139-di...l#post26956129

Until somebody can tell me how to demonstrate the presumably hugely negative effects of the levels round trip, I will continue to believe the only reason to use PC Levels is to get consistency between desktop and video output, so one collection of TV settings works for both.
Well, if you don't trust me, I think you should trust the knowledge and expertise of @madshi, the author of madVR, on this matter:
https://www.avsforum.com/forum/139-di...l#post26583769
Quote:
Then how do you suggest an HTPC user who wants to use the PC for video playback, photo display, games, etc., should set up his playback chain? Do you recommend setting the GPU to 16-235 output? If you really recommend that, then you seemingly don't know much about how HTPCs work inside. Windows internally thinks/works in PC levels. If you set the GPU to 16-235 output, the GPU will stretch all the rendered desktop/video/game pixels from PC levels to TV levels behind the back of the applications and behind the back of Windows itself, and usually the GPU does this in 8bit without using dithering. Which means you'll get banding artifacts.

There's nothing wrong with using PC output levels, as long as your display properly supports that. Ok, so maybe HCFR needs to be tweaked a bit to fix the issues we discovered, but I'm sure this will be fixed soon enough.

Btw, you're saying "by scaling Video to PC". But in reality it's rather the other way round: Since the PC natively works in PC levels, if you set the GPU to TV output, video will first be rendered to PC levels, and then stretched afterwards to TV levels. So this way you have more stretching and more conversions going on than when using PC output levels. The thing is: Video is encoded in YCbCr. This needs to be converted to RGB first, anyway, and this YCbCr -> RGB conversion consists of a matrix multiplication. You can in one step convert YCbCr to either TV RGB or PC RGB. It's the same operation, just with a different matrix multiplication. So basically you can render video in either TV RGB or PC RGB without stretching or scaling anything at all. Rendering a YCbCr video in PC RGB does not require any additional scaling or stretching operations. However, if you set the GPU to TV levels output, you introduce an additional stretching operation (and that often in low bitdepth without dithering).
The point is that you avoid one extra lossy conversion back to video levels at 8-bit depth. Moreover, a good renderer like madVR will use dithering to represent floating-point level values, which preserves quality with respect to the source much better than having levels rounded off to the nearest integer when the conversion is done internally by the GPU.

It may be that some GPUs handle the conversion better than others, but all the same, outputting PC levels, as long as your TV can handle it, is the safer and better option when using a capable video renderer like madVR.
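
To make madshi's "same matrix multiplication, different target" point concrete, here is a rough Python sketch of a BT.709 YCbCr-to-RGB conversion that goes straight to either PC-range or TV-range RGB in a single step. The coefficients are the standard BT.709 ones, but the function itself is only an illustration; real renderers work at higher precision and dither the result.

Code:
# Illustrative one-step YCbCr -> RGB conversion (BT.709, limited-range YCbCr input).
# Not madVR's code; it only shows that TV RGB and PC RGB are the same matrix step
# with different scale factors and offset.

def ycbcr_to_rgb(y, cb, cr, target="pc"):
    kr_c, kg_cb, kg_cr, kb_c = 1.5748, 0.1873, 0.4681, 1.8556   # BT.709 chroma coefficients
    if target == "pc":            # map Y 16-235 onto RGB 0-255
        sy, sc, off = 255 / 219, 255 / 224, 0
    else:                         # "tv": keep RGB at video levels, 16-235
        sy, sc, off = 1.0, 219 / 224, 16
    yd, cbd, crd = y - 16, cb - 128, cr - 128
    r = sy * yd + sc * kr_c * crd + off
    g = sy * yd - sc * kg_cb * cbd - sc * kg_cr * crd + off
    b = sy * yd + sc * kb_c * cbd + off
    clip = lambda x: max(0, min(255, round(x)))
    return clip(r), clip(g), clip(b)

print(ycbcr_to_rgb(235, 128, 128, "pc"))   # reference white -> (255, 255, 255)
print(ycbcr_to_rgb(235, 128, 128, "tv"))   # reference white -> (235, 235, 235)
print(ycbcr_to_rgb(16, 128, 128, "tv"))    # black           -> (16, 16, 16)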

Last edited by orion2001; 10-10-2014 at 09:40 AM.
post #10 of 29 - 10-10-2014, 10:14 AM - madshi (AVS Forum Special Member)
This is truly a complicated topic. To make things even more complicated, there are 2 different GPU settings for PC vs. TV levels, one of which is exposed in the NVidia GPU driver control panel and one of which is not. One affects only certain graphics APIs and one affects everything (even the desktop). And then, depending on which APIs a given video renderer uses, some video renderers are affected by both of these settings and some just by one of the two. <sigh>

Furthermore video playback can run through hardware overlay or through fullscreen exclusive mode or through normal windowed desktop composition. All 3 can behave differently in terms of video vs. PC levels. It really is complicated, isn't it?

There is one important thing I'm always trying to point out: the normal desktop composition used by Windows, by the desktop, by all applications, by games and by most video renderers ends up in the same GPU frame buffer, after all is said and done. In this frame buffer, usually 0 means black and 255 means white. If you set the GPU to global 16-235 mode, it will stretch this frame buffer in such a way that 0 is moved to 16 and 255 is moved to 235. So the output is 16-235, and a normal TV can handle this data just fine as TV RGB data. This logic works well enough in the sense that black and white levels are correct. However, the GPU does this 0-255 -> 16-235 conversion with its own algorithms, behind the back of the applications, even behind the back of Windows. This conversion used to be done in 8bit, without dithering, which introduces some banding artifacts. It's possible that newer GPUs with newer drivers do this in higher quality, but I can't say for sure. If you use a video renderer which uses hardware overlay, things can behave quite differently, though.
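
A toy numpy sketch of the mechanism described above; it is purely illustrative (no GPU actually works like this), but it shows why an undithered 8-bit 0-255 to 16-235 stretch produces banding, and how dithering spreads the rounding error instead of creating flat steps:

Code:
import numpy as np

# 256 full-range input codes have to fit into 220 output codes (16-235),
# so with plain rounding some neighbouring inputs collapse onto one output
# value -> visible bands in smooth gradients.
ramp = np.arange(256, dtype=np.float64)
scaled = ramp * 219 / 255 + 16              # exact (fractional) target values

undithered = np.round(scaled).astype(np.uint8)
merged = int(np.sum(np.diff(undithered) == 0))
print(f"input codes that share an output code when simply rounded: {merged}")

# Dithering randomizes the rounding so the error averages out over the image
# area instead of forming flat steps.
rng = np.random.default_rng(0)
x = scaled[100]                             # one example input code
dithered_pixels = np.floor(x + rng.random(10000))
print(f"exact {x:.3f} -> rounded {round(x)}, dithered average {dithered_pixels.mean():.3f}")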

When using madVR I usually recommend telling the GPU to leave the levels alone, by forcing the GPU into 0-255 mode. If you do that, you can still switch between PC and TV levels in madVR. madVR will then render black either to 0 or to 16, just as you like. However, if you set madVR to 16-235 output and the GPU is set to 0-255, you will get TV levels output from madVR, but the desktop, games and other applications will produce PC levels, so the levels will be inconsistent. I suppose that the XBMC 16-235 option works similarly to what madVR does, so it might also produce a black level inconsistency between video/XBMC and the desktop. In order to avoid this problem I usually recommend setting everything to 0-255, including the display.

If you're concerned about losing WTW then just using TV levels alone is *not* good enough. Why? Because normally a display is calibrated in such a way that the highest possible brightness (peak white) is assigned to 235. So if a pixel with value 236 comes along, the display can't show it any brighter than it can show a 235 pixel. So if you set up your source to output TV levels without clipping WTW, and you want to see some of the WTW data, you also have to change your display calibration so that peak white is not assigned to 235 but to e.g. 245 instead. Or alternatively you can tell madVR to render some more of WTW. Either way, you may be gaining some highlight detail with *some* (few) movies, but you'll lose peak brightness for *every* movie. Hope you're aware of that...
orion2001 likes this.
post #11 of 29 - 10-10-2014, 10:36 AM - sawfish (AVS Forum Special Member)
I think you're misunderstanding both of us.

Quote:
Originally Posted by madshi View Post
Then how do you suggest an HTPC user who wants to use the PC for video playback, photo display, games, etc., should set up his playback chain? Do you recommend setting the GPU to 16-235 output? If you really recommend that, then you seemingly don't know much about how HTPCs work inside. Windows internally thinks/works in PC levels. If you set the GPU to 16-235 output, the GPU will stretch all the rendered desktop/video/game pixels from PC levels to TV levels behind the back of the applications and behind the back of Windows itself, and usually the GPU does this in 8bit without using dithering. Which means you'll get banding artifacts.
I'm not recommending setting the GPU to 16-235 to compress all output from 0-255 to 16-235. Outputting video at native Video Levels doesn't imply that. If the GPU is outputting 0-255, which mine is, it can output video at Video Levels without touching the video. It is passthrough, in the sense that the bars in AVS HD 709 clipping patterns contain the pixel values they are labeled with, including BTB and WTW, and the video card outputs those values consistently with my standalone BD player as measured by my i1D3. I wrote about all this in detail in the messages I linked to.

Though as I said it's just a "curiosity", read what I wrote in my last message about Nvidia YCbCr444. I believe it's what madshi is describing, and it has nothing to do with the rest of what I wrote. It indeed causes the video card to do things "behind the back", which is what I was talking about in the supporting message I linked to when I described AVS HD 709 Black Clipping Bar 17 containing RGB 1 yet mysteriously flashing at my normal TV Brightness setting, with the TV set to Video Levels, where black is of course 16:

https://www.avsforum.com/forum/139-di...l#post26970873

I described it in the part, "Nvidia Pixel Format set to YCbCr444, which I don't use, but otherwise the Nvidia Control Panel is at its defaults: WMC".

So I believe I agree completely with madshi. However, AFAICT, he's talking about what I called a "curiosity", and it has nothing to do with the other 99.9% of what I've been talking about. My video card outputs 0-255. You and many others appear to think that even when the card is outputting 0-255, outputting video at Video Levels means converting video from Video Levels to PC Levels and back to Video Levels, and if that's happening, I have no idea how to demonstrate it, and I've presented several pieces of evidence to suggest it isn't happening when I'm not forcing it by using Nvidia YCbCr444. When you convert video to RGB, you get values in the range 1-254. Output it untouched to a TV configured for Video Levels, and you'll get BTB, WTW, and calibration consistency with video devices like BD players. It's what I get and what I've been talking about. I also talked about the inconsistency it introduces with the desktop and why if I cared about one calibration to rule them all, I would output video at PC Levels.

Last edited by sawfish; 10-10-2014 at 10:46 AM.
post #12 of 29 - 10-10-2014, 10:40 AM - madshi (AVS Forum Special Member)
Quote:
Originally Posted by sawfish View Post
I'm not recommending setting the GPU to 16-235 to compress all output from 0-255 to 16-235. Outputting video at native Video Levels doesn't imply that. If the GPU is outputting 0-255, which mine is, it can output video at Video Levels without touching the video. It is passthrough, in the sense that the bars in AVS HD 709 clipping patterns contain the pixel values they are labeled with, including BTB and WTW
Agreed.
post #13 of 29 - 10-10-2014, 12:04 PM - orion2001 (Advanced Member)
Quote:
Originally Posted by sawfish View Post
So I believe I agree completely with madshi. However, AFAICT, he's talking about what I called a "curiosity", and it has nothing to do with the other 99.9% of what I've been talking about. My video card outputs 0-255. You and many others appear to think that even when the card is outputting 0-255, outputting video at Video Levels means converting video from Video Levels to PC Levels and back to Video Levels, and if that's happening, I have no idea how to demonstrate it, and I've presented several pieces of evidence to suggest it isn't happening when I'm not forcing it by using Nvidia YCbCr444. When you convert video to RGB, you get values in the range 1-254. Output it untouched to a TV configured for Video Levels, and you'll get BTB, WTW, and calibration consistency with video devices like BD players. It's what I get and what I've been talking about. I also talked about the inconsistency it introduces with the desktop and why if I cared about one calibration to rule them all, I would output video at PC Levels.
Ah, ok, in that case I did misunderstand what you are saying. I found this post confusing: https://www.avsforum.com/forum/139-di...l#post26956129

In any case, let me summarize my understanding of what you are stating to see if we are on the same page:

1) You have your GPU set to output 0-255. This makes sense, and as you mentioned YCbCr isn't a good idea.
2) Software renderer setting -> This is where I am hazy. What software are you using for video playback and how is it setup to decode YCbCr to RGB? What setting are you using to essentially leave the video levels untouched and outputted to the GPU?
3) However you achieve your video passthrough, you now have your GPU outputting 0-255, but since this is at video levels, black is at 16 and white at 235. So now you set your TV to an appropriate setting so it assumes that the HDMI input from your GPU is at TV level. This will force the TV to interpret level 16 as black (rather than level 0, if it had been set to PC mode).

In this scenario above, obviously you will have a significant discrepancy between colors and black levels between your desktop, other apps v/s video content. The reason being that all other applications and Windows OS are going to send Black as level 0 and white as level 255 to your GPU, but your TV is still in TV Level mode and will essentially crush black levels (levels 0-16). I guess in your case, you choose to operate in this manner because you don't care about the mismatch and you only care about your video content calibration and this way your calibration settings are better matched with native devices like BluRay players.

Assuming I haven't totally misunderstood things so far, my question is with regards to why you wouldn't want to just have your video renderer scale video levels to 0-255 and set the TV to expect PC levels so that your desktop and video experience is consistent. This also makes life a lot easier if you use photoshop or view/display pictures on your TV via your HTPC. At least with MadVR, you can set the scaling so that you don't clip WTW content (in fact, you can customize the scaling so that you don't necessarily go all the way to level 255, but perhaps clip at 245 instead, so you don't sacrifice too much overall contrast to preserve whiter than white content). Clipping of BTB content is a non-issue since BTB content is never supposed to be displayed in any case. Personally, I think WTW content is overrated and I'd much rather take the improved contrast by calibrating to level 235 white and sacrificing some minor WTW content than reduce overall contrast across the board just to accommodate WTW content.
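
To put a rough number on that trade-off (my own back-of-the-envelope estimate, assuming a simple 2.2 power gamma rather than a real display's response): if the display is calibrated so that code 245 rather than 235 hits peak white, reference white at 235 ends up several percent below peak for all content.

Code:
# Back-of-the-envelope estimate of the brightness cost of reserving WTW headroom.
# Assumes a plain 2.2 power gamma on video levels; real displays / BT.1886 differ.
gamma = 2.2

def rel_luminance(code, peak_code):
    """Relative luminance of a video-level code when peak_code maps to peak white."""
    return ((code - 16) / (peak_code - 16)) ** gamma

ref_white_245 = rel_luminance(235, 245)
print(f"reference white when 245 is peak: {ref_white_245:.3f} of peak "
      f"(about {(1 - ref_white_245) * 100:.0f}% less light than calibrating 235 to peak)")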


Last edited by orion2001; 10-10-2014 at 12:08 PM.
post #14 of 29 - 10-10-2014, 12:44 PM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by orion2001 View Post
In any case, let me summarize my understanding of what you are stating to see if we are on the same page:

1) You have your GPU set to output 0-255. This makes sense, and as you mentioned YCbCr isn't a good idea.
2) Software renderer setting -> This is where I am hazy. What software are you using for video playback and how is it setup to decode YCbCr to RGB? What setting are you using to essentially leave the video levels untouched and outputted to the GPU?
For WMC, I do nothing. It outputs Video Levels by default. For XBMC, I use the new Gotham option for 16-235. These are the only programs I use for video. The Nvidia Control Panel is at its defaults. I have no codec packs or renderers installed.

Quote:
3) However you achieve your video passthrough, you now have your GPU outputting 0-255, but since this is at video levels, black is at 16 and white at 235. So now you set your TV to an appropriate setting so it assumes that the HDMI input from your GPU is at TV level. This will force the TV to interpret level 16 as black (rather than level 0, if it had been set to PC mode).
Yes, and to be painfully clear, the "this" in the first sentence refers to the "video passthrough".

Quote:
In this scenario above, obviously you will have a significant discrepancy between colors and black levels between your desktop, other apps v/s video content. The reason being that all other applications and Windows OS are going to send Black as level 0 and white as level 255 to your GPU, but your TV is still in TV Level mode and will essentially crush black levels (levels 0-16). I guess in your case, you choose to operate in this manner because you don't care about the mismatch and you only care about your video content calibration and this way your calibration settings are better matched with native devices like BluRay players.
Exactly. I spoke in detail about all that in the messages I linked to. As I said, I don't do this for my gaming PC, where obviously I want the desktop to look right.

Quote:
Assuming I haven't totally misunderstood things so far, my question is with regards to why you wouldn't want to just have your video renderer scale video levels to 0-255 and set the TV to expect PC levels so that your desktop and video experience is consistent. This also makes life a lot easier if you use photoshop or view/display pictures on your TV via your HTPC. At least with MadVR, you can set the scaling so that you don't clip WTW content (in fact, you can customize the scaling so that you don't necessarily go all the way to level 255, but perhaps clip at 245 instead, so you don't sacrifice too much overall contrast to preserve whiter than white content). Clipping of BTB content is a non-issue since BTB content is never supposed to be displayed in any case. Personally, I think WTW content is overrated and I'd much rather take the improved contrast by calibrating to level 235 white and sacrificing some minor WTW content than reduce overall contrast across the board just to accommodate WTW content.
Just a couple of messages back, I reposted several reasons I do this and explained the scenarios in which it makes sense; see the section beginning with "here are several specific reasons I use Video Levels with my HTPC and its Nvidia GT430, along with a potentially important caveat":

https://www.avsforum.com/forum/139-di...l#post28109394

I gave a more detailed example of the consequences of the desktop levels mismatch here, which may or may not be reachable in the chain of messages I linked to:

Quote:
Originally Posted by sawfish View Post
...The downside is that the desktop is crushed for RGB < 16 and > 235; a practical example is the purple Windows 8.1 background, which appears blue on the TV due to the low level red being crushed. It's not a problem for video, and it's not a problem for how I use my HTPC, which extends the desktop to the TV, on which I only watch video; the XBMC, WMC, and Windows UIs are perfectly usable even if the colors are a little out of whack for the latter two. Now if you want the desktop to be perfect for games, photo display, etc, then you will want to use PC Levels for everything.
As for the contrast issue, I don't think it is one, as I would be targeting the same light output regardless of the levels choice. I don't see how it matters whether the pixel value that produces that light output value is 235 or 255.
post #15 of 29 - 10-10-2014, 12:46 PM - zoyd (AVS Forum Special Member)
Quote:
Originally Posted by sawfish View Post
You and many others appear to think that even when the card is outputting 0-255, outputting video at Video Levels means converting video from Video Levels to PC Levels and back to Video Levels, and if that's happening, I have no idea how to demonstrate it, and I've presented several pieces of evidence to suggest it isn't happening when I'm not forcing it by using Nvidia YCbCr444.
I find this sentence structure convoluted and hard to understand. I agree there is no "there and back" when you output RGB video levels through a GPU set at 0-255. But what does that have to do with Y'CC? There will be rounding errors if you do RGB->Y'CC for either video or full levels.

Last edited by zoyd; 02-09-2015 at 04:07 AM.
post #16 of 29 - 10-10-2014, 12:55 PM - harlekin (Advanced Member)
To clear things up a bit: my answers were (hopefully) accurate for Frodo builds of XBMC. I neglected to mention that I haven't even updated to Gotham, in order to keep versions consistent between multiple machines (some of which showed problems with HW-accelerated decoding on newer builds). This is my fault.

XBMC Frodo, to my understanding, outputs full RGB only:
- when left at default settings
- when using the software renderer
- when using "allow HW acceleration" with DXVA2 checked and the renderer left at Auto
- when telling the graphics driver to let the program choose levels
-

Thank you for pointing out that with Gotham the limited/full RGB toggle is in the Windows version as well; I'll definitely start a comparison in the next few weeks.

Also 16-255 should have read 16-235 (superwhite clipped), so thank you for correcting me there.
--

Two additional comments on things mentioned in this thread:

- XBMC is also capable of outputting 24p correctly; this is not a feat only accomplished by using MPC-HC.

Quote:
For Nvidia cards and the ST60 and Sony LCDs I've hooked up to them, using PC Levels required me to adjust Brightness and Contrast of video in the Nvidia Control Panel, whereas using Video Levels did not; I could do everything on the TV, which is what I want.
- This, from my personal experience, is wrong. If the TV is able to identify that it is being fed a full RGB signal (and I've tested this positively with two different Sony LCDs, last year's models), brightness and contrast do not have to be changed; they "fall into place" accordingly. Driver issue? (edit: see below; the TVs not being capable is more likely)

Last edited by harlekin; 10-10-2014 at 01:18 PM.
post #17 of 29 - 10-10-2014, 01:06 PM - orion2001 (Advanced Member)
Quote:
Originally Posted by sawfish View Post
https://www.avsforum.com/forum/139-di...l#post28109394

I gave a more detailed example of the consequences of the desktop levels mismatch here, which may or may not be reachable in the chain of messages I linked to:
So it looks like the main issues are due to WTW and BTB clipping (which isn't an issue if using madVR) and the other issue seems to be due to the case that your TV doesn't support a PC mode. Would that be correct? I can see how that would be an issue for TVs that do not know how to handle PC level input.

In the case of recent TVs that pretty much all support PC levels via a menu setting, and for those using MPC-HC+MadVR, it would seem that there shouldn't be a need to do a video level passthrough. FWIW, on my Samsung Plasma, my calibration settings in PC mode with 0-255 output seem to match very well with settings other folks have arrived at when calibrating the same set via a BluRay player. So in my case, things still work as expected and I am able to enjoy the benefit of having all applications, Windows OS and videos appearing the same and being calibrated.

Quote:
For WMC, I do nothing. It outputs Video Levels by default. For XBMC, I use the new Gotham option for 16-235. These are the only programs I use for video. The Nvidia Control Panel is at its defaults. I have no codec packs or renderers installed.
Does 16-235 in Gotham clip BTB/WTW content or does it similarly act as a pure pass through while preserving BTB and WTW content? I'd like to add that the fact that WMC or another software renderer does "nothing" by default and passes video levels through untouched is unfortunately not always the case, and that is what leads to a lot of issues and inconsistencies. Some software scales video levels to PC levels by default. Other software passes video levels through but clips WTW/BTB content. Care has to be taken that the playback software is set up correctly (and allows you) to either pass video levels through untouched or scale to PC levels while preserving WTW content.

Quote:
As for the contrast issue, I don't think it is one, as I would be targeting the same light output regardless of the levels choice. I don't see how it matters whether the pixel value that produces that light output value is 235 or 255.
Ah, I guess the caveat is that this is more applicable to plasma panels, which are typically set to output at max luminance since they aren't very bright displays to begin with. For plasma panels, setting level 255 to the max luminance the set can achieve without color clipping results in an overall lower contrast ratio for 99.9% of content. Some would choose to clip the rare WTW content in exchange for improved overall image contrast for the majority of video content. I find that customizing which WTW levels get clipped via the madVR settings is a decent tradeoff in this situation if you still want to retain some WTW capability.

post #18 of 29 - 10-10-2014, 01:09 PM - orion2001 (Advanced Member)
Quote:
Originally Posted by zoyd View Post
I find this sentence structure convoluted and hard to understand. I agree there is no "there and back" when you output RGB video levels through a GPU set at 0-255. But what does that have to do with YCC? There will be rounding errors if you do RGB->YCC for either video or full levels.
zoyd, I think he's referring to the fact that if you set up your GPU driver to output YCC instead of RGB, it still does the processing internally in RGB 0-255 and then re-converts back to 16-235 YCC, while also potentially clipping BTB/WTW content (I know the Intel drivers do this).

post #19 of 29 - 10-10-2014, 01:10 PM - zoyd (AVS Forum Special Member)
Quote:
Originally Posted by madshi View Post
If you're concerned about losing WTW then just using TV levels alone is *not* good enough. Why? Because normally a display is calibrated in such a way that the highest possible brightness (peak white) is assigned to 235.
This is not always true; plasmas, for example, usually start to clip one of the channels prior to peak white. Also, 235 white should be set based on environmental conditions as well as to avoid eye strain, and this often ends up with a lower-than-peak white assignment to 235. In any case, though, I agree that capturing WTW information should not be the driver in the calibration.
post #20 of 29 - 10-10-2014, 01:13 PM - orion2001 (Advanced Member)
Quote:
Originally Posted by harlekin View Post
- XBMC is also capable of outputting 24p correctly; this is not a feat only accomplished by using MPC-HC.
Is there a setting I am missing somewhere in XBMC? I wasn't aware that you could have XBMC change display refresh rate to 24p for playback.
Quote:
Originally Posted by harlekin View Post
- This, from my personal experience, is wrong. If the TV is able to identify that it is being fed a full RGB signal (and I've tested this positively with two different Sony LCDs, last year's models), brightness and contrast do not have to be changed; they "fall into place" accordingly. Driver issue?
I agree with this. That being said, perhaps his TV doesn't support a full RGB signal, in which case he would have to resort to the setup he described.

post #21 of 29 - 10-10-2014, 01:27 PM - harlekin (Advanced Member)
@orion2001: There is a setting to change the display refresh rate according to the video source; I'm using it in Frodo right now (and the TV's info key confirms 24p with the right source material). Search for "Adjust display refresh rate to match video" on the following wiki page: http://wiki.xbmc.org/index.php?title=Settings/Videos
post #22 of 29 - 10-10-2014, 01:41 PM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by orion2001 View Post
So it looks like the main issues are due to WTW and BTB clipping (which isn't an issue if using madVR) and the other issue seems to be due to the case that your TV doesn't support a PC mode. Would that be correct? I can see how that would be an issue for TVs that do not know how to handle PC level input.
Not really. When I set my Panasonic ST60's HDMI/DVI Dynamic Range to Non-standard (aka PC Levels), it gives me PC Levels with black at 0 and white at 255. After setting the players to PC Levels, to get AVS HD 709 clipping patterns looking correct, I have to make minor changes to Brightness/Contrast in the Nvidia Control Panel, which is inconvenient and causes the problems for video I linked to in an earlier message (that would be the xbmc.org message I linked to). With players and TV at Video Levels, I can do everything on the TV, and it looks the same as my Sony S5100 BD player, which BTW doesn't support PC Levels. (My old S350 did, so they dropped support at some point.)

Quote:
In the case of recent TVs that pretty much all support PC levels via a menu setting, and for those using MPC-HC+MadVR, it would seem that there shouldn't be a need to do a video level passthrough. FWIW, on my Samsung Plasma, my calibration settings in PC mode with 0-255 output seem to match very well with settings other folks have arrived at when calibrating the same set via a BluRay player. So in my case, things still work as expected and I am able to enjoy the benefit of having all applications, Windows OS and videos appearing the same and being calibrated.
If desktop/video consistency is important to you, by all means, that is what you should do, and I have been careful to point this out every time I've talked about the pros and cons of the two alternatives and why I do what I do.

Quote:
Does 16-235 in Gotham clip BTB/WTW content or does it similarly act as a pure pass through while preserving BTB and WTW content?
Pure passthrough, same as WMC.

Quote:
I'd like to add that the fact that WMC or another software renderer does "nothing" by default and passes video levels through untouched is unfortunately not always the case, and that is what leads to a lot of issues and inconsistencies. Some software scales video levels to PC levels by default. Other software passes video levels through but clips WTW/BTB content. Care has to be taken that the playback software is set up correctly (and allows you) to either pass video levels through untouched or scale to PC levels while preserving WTW content.
All that complexity is not an insignificant part of why I stick to WMC (CableCard) and XBMC (all other video). I'm happy with the output quality and the calibration results (Avg dE, Max dE) I've achieved with my i1D3 profiled against a ColorMunki Photo and Calman:

10 point grayscale (.4, .69)
21 point grayscale (.47, 1.53)
25% 10 point Saturation Sweep (.61, 1.48)
50% 10 point Saturation Sweep (.61, 1.4)
75% 10 point Saturation Sweep (.59, 1.89)
100% 10 point Saturation Sweep (.95, 2.69)
ColorChecker (.69, 1.94)
ColorChecker Full, the one with a gazillion points (.81, 2.23)

Even if my players supported MadVR and its 3D LUT, I don't know that there would be a huge benefit anyway.
post #23 of 29 - 10-10-2014, 02:22 PM - harlekin (Advanced Member)
Quote:
Not really. When I set my Panasonic ST60's HDMI/DVI Dynamic Range to Non-standard (aka PC Levels), it gives me PC Levels with black at 0 and white at 255. After setting the players to PC Levels, to get AVS HD 709 clipping patterns looking correct, I have to make minor changes to Brightness/Contrast in the Nvidia Control Panel, which is inconvenient and causes the problems for video I linked to in an earlier message (that would be the xbmc.org message I linked to).
I can confirm that with the current Intel drivers (Haswell integrated graphics), this is NOT the case (anymore; it was fixed in the Intel drivers a short while ago), at least not when using Frodo builds. When examining the AVS HD test patterns, brightness (and contrast) do _not_ have to be changed.

Also, the greyscale and color output (for me) measures exactly as it should (confirmed against HCFR's integrated patterns, and HCFR against AVS HD patterns on a PS3 set to Full RGB, a while back).

Sony TVs can be left at Auto or set to Full for that input.

edit: Also, using this signal path, the whole greyscale graph is visible (min/maxing the brightness slider), so there is no obvious signal clipping introduced by XBMC. [edit: WRONG. There actually still IS BTB clipping caused by XBMC; double-checked this morning.]

Last edited by harlekin; 10-11-2014 at 12:23 AM.
post #24 of 29 - 10-10-2014, 02:27 PM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by zoyd View Post
I find this sentence structure convoluted and hard to understand. I agree there is no "there and back" when you output RGB video levels through a GPU set at 0-255. But what does that have to do with YCC? There will be rounding errors if you do RGB->YCC for either video or full levels.
I'm not sure what you're asking. I guess your YCC comment refers to what I said about the Nvidia YCbCr444 option. As I described it earlier, it is a "curiosity" that does cause the Video->PC->Video Levels round trip I've heard so many others claim happens when you output video as Video Levels. I was just glad to finally be able to demonstrate it a few weeks ago after hearing it for years, and I brought it up after hearing it again in this thread. It has no relevance to what I've been talking about, as the option compresses all card output to 16-235, and I don't do that. I don't think I implied I do in anything I've written, either. My video card outputs 0-255.
post #25 of 29 - 10-10-2014, 02:41 PM - harlekin (Advanced Member)
I am actually second guessing myself about the BTB clipping in XBMC, I will have to double check this tomorrow and edit this thread accordingly.

edit: Double checked - and there still IS BTB clipping in my device chain, caused by XBMC.

Last edited by harlekin; 10-11-2014 at 12:19 AM.
post #26 of 29 - 10-13-2014, 05:57 AM - orion2001 (Advanced Member)
Quote:
Originally Posted by sawfish View Post
Not really. When I set my Panasonic ST60's HDMI/DVI Dynamic Range to Non-standard (aka PC Levels), it gives me PC Levels with black at 0 and white at 255. After setting the players to PC Levels, to get AVS HD 709 clipping patterns looking correct, I have to make minor changes to Brightness/Contrast in the Nvidia Control Panel, which is inconvenient and causes the problems for video I linked to in an earlier message (that would be the xbmc.org message I linked to). With players and TV at Video Levels, I can do everything on the TV, and it looks the same as my Sony S5100 BD player, which BTW doesn't support PC Levels. (My old S350 did, so they dropped support at some point.)
That's strange. Seems like some issue either with the Nvidia card/driver or the way the TV handles PC levels. On my setup which involves Intel HD2000 integrated GPU and my Samsung PN60F5300 plasma, I can leave all GPU controls untouched and the output I view on my TV is consistent for the same TV settings irrespective of whether I have things setup to go via PC Levels or Video levels.

post #27 of 29 - 10-13-2014, 09:51 AM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by orion2001 View Post
That's strange. Seems like some issue either with the Nvidia card/driver or the way the TV handles PC levels. On my setup which involves Intel HD2000 integrated GPU and my Samsung PN60F5300 plasma, I can leave all GPU controls untouched and the output I view on my TV is consistent for the same TV settings irrespective of whether I have things setup to go via PC Levels or Video levels.
By "minor adjustments", I mean tweaking for the very ends of the ranges, around 17 and 234. I don't doubt it's a YMMV situation.
post #28 of 29 - 10-13-2014, 09:57 AM - sawfish (AVS Forum Special Member)
Quote:
Originally Posted by harlekin View Post
I am actually second guessing myself about the BTB clipping in XBMC, I will have to double check this tomorrow and edit this thread accordingly.

edit: Double checked - and there still IS BTB clipping in my device chain, caused by XBMC.
I promise you, XBMC does not innately clip BTB and WTW, at least not for the configuration I described (Nvidia cards, ST60 TV). You could try playing with the DXVA2 and rendering options; through experimenting, I found they can make a difference. Of course, if your video is being expanded to PC Levels, then BTB and WTW are eliminated.
post #29 of 29 - 10-13-2014, 10:57 AM - Brian Hampton (AVS Forum Special Member)
I recently started using XBMC on a Chromebox and thought this thread would be fun to read.

Oh well...guess I thought wrong.

I've always used my BDP for calibration so far and the XBMC setup looks just as perfect.