Hey guys. I'm attempting to write a sort of one-stop guide for the gaming subforums on HT component selection, setup and calibration, specifically for those who primarily want to play games on their HT. There are still a few parts I want to run by you, just to make sure I'm getting everything right. It doesn't help anyone if I'm giving out bad information.
I've searched through all the previous threads in this subforum on the topic. I've got my own setup calibrated using an i1, and generally, I think most games look great. There are a few standouts that don't, and certain things like gamma tend to be particular problems. I'm well aware that the game dev community lacks an official, codified standard like film has, and I'm pretty sure that many, possibly most, game studios simply aren't following any standard at all. But I believe that if there were to be a standard, the only sane direction would be to adopt the current standards we already have for film/HT. I believe this is happening (slowly), and as more gamers begin to take HT seriously, they'll expect the same standards of quality that videophiles do. Still, there are a few issues stemming from the fact that even console games are fundamentally PC software, and have to be forced into video standards at some point in the output chain.
I've written a few paragraphs on 0-255 vs 16-235, color space etc.
HDMI Black Level and RGB/YCbCr – At their heart, game consoles and modern displays are derivatives of PC technology. Game consoles can output PC settings, and displays can accept them. First, let's clear up the terminology:
0-255 vs. 16-235 (Grayscale Range) – In the digital domain, 24-bit color is broken into three 8-bit channels for red, green and blue. 8 bits of precision give us 256 discrete steps: 0-255. As software, games are rendered internally at RGB 0-255. As a legacy of the CRT days, video continues to be mastered at 16-235 – values below 16 are considered “blacker than black” and values above 235 are considered “whiter than white”.
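For the curious, the mapping between the two ranges falls straight out of the numbers above – reference black moves from 0 to 16 and reference white from 255 to 235, a 219-step range. A minimal sketch (the clamping step is my assumption about what a display does with out-of-range values it doesn't pass through):

```python
# Sketch of the 8-bit "full" (PC, 0-255) <-> "limited" (video, 16-235)
# range mapping. The 219 scale factor comes from 235 - 16.

def full_to_limited(v):
    """Map a full-range value (0-255) into video range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a video-range value back to 0-255, clamping blacker-than-black
    (<16) and whiter-than-white (>235) values, as a display typically would."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

assert full_to_limited(0) == 16      # reference black
assert full_to_limited(255) == 235   # reference white
assert limited_to_full(16) == 0
assert limited_to_full(235) == 255
```

Note that 219 steps are being stretched back over 256, which is one reason a chain that converts back and forth repeatedly can introduce banding.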
RGB vs. YCbCr (Color Space) – RGB creates color by directly mixing red, green and blue. YCbCr creates color using a grayscale luminance value (Y) and two color difference channels, Blue (Cb) and Red (Cr). PCs use RGB, but video is stored in YCbCr. When using the same targets for color primaries, the range of colors that can be created is exactly the same. (I'm not 100% sure on this one in particular... is this correct?)
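To illustrate why the two encodings cover the same gamut, here's a sketch of the Rec. 709 R'G'B' → Y'CbCr conversion using the published luma coefficients (Kr = 0.2126, Kb = 0.0722). Values are normalized to 0.0-1.0; real hardware additionally quantizes, and video usually subsamples the chroma channels (4:2:0), which is where actual losses come from, not the color model itself:

```python
# Rec. 709 R'G'B' <-> Y'CbCr conversion (normalized, no quantization
# or chroma subsampling). KG falls out of the other two coefficients.

KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB  # 0.7152

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b      # luma: weighted sum of R'G'B'
    cb = (b - y) / (2 * (1 - KB))     # blue-difference channel
    cr = (r - y) / (2 * (1 - KR))     # red-difference channel
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

# Round-tripping reproduces the color up to float precision - the two
# representations describe the same colors, just in different coordinates.
r, g, b = ycbcr_to_rgb(*rgb_to_ycbcr(0.25, 0.5, 0.75))
assert abs(r - 0.25) < 1e-9 and abs(g - 0.5) < 1e-9 and abs(b - 0.75) < 1e-9
```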
sRGB vs. Rec. 709 (Color Primaries) – Rec. 709 defines the exact primary colors expected for HDTV – red, green and blue. sRGB, the most common PC target, uses the exact same primaries and white point. Despite sounding radically different, the two are functionally equal in terms of the colors they can describe. The difference lies in how gamma is defined (Are they actually equal? And if so, can anyone explain how sRGB and 709 differ regarding gamma?), and arguably this is one of the reasons why gamma remains such an issue with games.
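For what it's worth, my understanding is that the two documents define gamma from opposite ends of the chain: sRGB specifies how a display should decode the signal into light, while Rec. 709 only specifies how a camera should encode scene light – the display side was left open until BT.1886 later pinned it to an effective 2.4 power law. Both curves below are copied from the published piecewise definitions, and the divergence near black is where I'd expect visible differences:

```python
# sRGB display-side (EOTF) vs. Rec. 709 camera-side (OETF) curves,
# per their published piecewise definitions. They are not inverses of
# each other, which is part of why "the" gamma for games is ambiguous.

def srgb_decode(v):
    """sRGB EOTF: encoded signal (0-1) -> linear light (0-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def rec709_encode(l):
    """Rec. 709 OETF: linear scene light (0-1) -> encoded signal (0-1)."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

# Both curves pin black to black and white to white; the shapes in
# between (especially the linear toes near black) differ.
assert srgb_decode(0.0) == 0.0 and rec709_encode(0.0) == 0.0
assert abs(srgb_decode(1.0) - 1.0) < 1e-9
assert abs(rec709_encode(1.0) - 1.0) < 1e-9
```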
So how do we reconcile this? Should we stick with the video standard or use native output? The answer is tricky, and it comes up time and time again in the forums as a source of endless confusion. For color space, it is essentially a distinction without a difference – the end result should be the same. Calibrating for Rec. 709 primaries is essentially calibrating for sRGB, and choosing to output RGB or YCbCr will not throw this off. (Of course, this assumes my understanding is straight, and that the consoles can perform this conversion without degradation.)
For black level, there is a clear answer – 16-235. As always, calibrate to the only standard we have, the one for video. You may be tempted to use 0-255/RGB, reasoning that the most native mode will give you the best image quality and the least input lag. The reasoning may be sound, but the internal conversion is so fast that input lag is not a concern, and image quality is not degraded. You also risk having different settings across your various other devices, which can completely throw off your calibration. The most frustrating part about getting this right is that consoles and displays frequently use entirely different nomenclature for the same settings:
Resolution – Set to the native resolution of your display
Reference levels – Standard (16-235)
HDMI Color Space – Auto (This will output YCbCr for video, and RGB for games)
Resolution – Automatic – this will detect all resolutions that your display supports
RGB Full Range – Limited (16-235)
YPbPr Super-White – On (This setting does not affect games, only video) (Sony's documentation confirms this... but I'm still not 100% clear on how super-white actually pertains to video. I believe it has something to do with passing blacker-than-black and whiter-than-white... but then why is it just "super-white" and not "super black & white"?)
Deep Color Output – Off (This setting does not affect games, nor does any commercially produced video use it)
Thoughts? Particularly if anyone has actual game dev experience, I'd love to know how this is handled from the inside. Not really the most accessible bunch. :P
As a side question... I've heard it both ways: either a) gamma is a user preference based on your lighting conditions, or b) "standard" gamma for video is 2.2, a specific target you must match for accurate video. I personally prefer 2.4 for game, video and film content in a pitch-black, light-controlled room... so, am I ruining the director's intent, or is there really some element of preference/psychovisual adaptation that needs to be accounted for?
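To put a number on how much the 2.2 vs. 2.4 choice actually matters, here's the arithmetic for a simple pure power-law display gamma (light = signal ** gamma) at a mid-gray signal – the higher exponent pushes mid-tones and shadows noticeably darker, which is roughly why it tends to look right in a dark room and crushed in a bright one:

```python
# Pure power-law display gamma: output light as a fraction of peak white.
def display_light(signal, gamma):
    return signal ** gamma

mid = 0.5  # mid-gray input signal (normalized 0-1)
print(display_light(mid, 2.2))  # ~0.218 of peak white
print(display_light(mid, 2.4))  # ~0.189 of peak white
```

So at the same mid-gray input, gamma 2.4 puts out roughly 13% less light than 2.2 – a real, but not enormous, shift concentrated in the lower end of the curve.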