Calibrating for gaming - a few Qs, want to make sure I've got everything right. - AVS Forum
post #1 of 5 - 06-04-2013, 03:05 PM - bd2003 (Thread Starter, AVS Addicted Member, Long Island, NY)
Hey guys. I'm attempting to write a sort of one-stop guide for the gaming subforums covering HT component selection, setup and calibration, specifically for those who primarily want to play games on their HT. There are still a few parts I want to run by you guys, just to make sure I'm getting everything right. It doesn't really help anyone if I'm giving out bad information.

I've searched through all the previous threads in this subforum on the topic beforehand. I've got my own setup calibrated using an i1, and generally, I think most games look great. There are a bunch of standouts that don't, though, and certain things like gamma tend to be particular problems. I'm well aware that the game dev community lacks an official, codified standard like film has, and I'm pretty sure that many, possibly most, game studios simply aren't following any standard. But I believe that if there were to be a standard, the only sane direction for them to move in would be to adopt the current standards we have for film/HT. I believe this is happening (slowly), and as more gamers begin to take HT seriously, they'll expect the same standards of quality that videophiles do. Still, there are a few issues stemming from the fact that even game consoles are fundamentally PCs, and have to be forced into video standards at some point in the output chain.

I've written a few paragraphs on 0-255 vs 16-235, color space etc.
Quote:
HDMI Black Level and RGB/YCbCr – At their heart, game consoles and modern displays are derivatives of PC technology. Game consoles can output PC-style signals, and displays can accept them. First, let's clear up the terminology:

0-255 vs. 16-235 (Grayscale Range) – In the digital domain, 24-bit color is broken down into three 8-bit channels for red, green and blue. 8 bits of precision give us 256 discrete steps: 0-255. As software, games are rendered internally at RGB 0-255. Due to legacy from the CRT days, video continues to be mastered at 16-235 – values below 16 are considered "blacker than black" and values above 235 are considered "whiter than white".
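To make the scaling concrete, here's a rough sketch (my own illustration, assuming a simple linear remap, not anything pulled from a console's firmware) of how full-range values get squeezed into 16-235 and back:
Code:
# My own illustration (assuming a simple linear remap) of full range vs. limited range.

def full_to_limited(v):
    """Map an 8-bit full-range value (0-255) onto video range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a video-range value back to full range, clipping BTB/WTW."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))    # 16 235 - full-range black/white land on video black/white
print(limited_to_full(16), limited_to_full(235))   # 0 255
print(limited_to_full(4), limited_to_full(240))    # 0 255 - "blacker than black" / "whiter than white" clip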

RGB vs. YCbCr (Color Space) – RGB creates color by directly mixing red, green and blue. YCbCr creates color using a grayscale luminance value (Y) and two color difference channels, blue (Cb) and red (Cr). PCs use RGB, but video is stored as YCbCr. When using the same targets for color primaries, the range of colors that can be created is exactly the same. (I'm not 100% sure on this one in particular...is this correct?)
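For illustration, here's roughly what the conversion looks like using the Rec. 709 luma coefficients (my own sketch with ideal math, so treat the details as an assumption rather than gospel) – the point being that YCbCr is just a different encoding of the same R'G'B' signal:
Code:
# My own sketch of the Rec. 709 RGB -> YCbCr conversion (ideal math, 8-bit limited-range output).

KR, KB = 0.2126, 0.0722        # BT.709 luma coefficients
KG = 1.0 - KR - KB             # 0.7152

def rgb_to_ycbcr_709(r, g, b):
    """r, g, b are gamma-encoded values in 0.0-1.0; returns limited-range Y, Cb, Cr."""
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)

print(rgb_to_ycbcr_709(1.0, 1.0, 1.0))   # white -> (235, 128, 128): all luminance, no color difference
print(rgb_to_ycbcr_709(0.5, 0.5, 0.5))   # gray  -> chroma still sits at 128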

sRGB vs. Rec. 709 (Color Primaries) – Rec. 709 defines the exact primary colors expected for HDTV – red, green and blue. sRGB, the most common PC target, uses the exact same primaries. Despite sounding radically different, the two are functionally equal as far as color is concerned. The difference lies in how gamma is defined (are they actually equal? And if so, can anyone explain how sRGB and 709 differ regarding gamma?), and arguably this is one of the reasons why gamma remains such an issue with games.
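For what it's worth, here's my rough understanding sketched out (my own example, based on my reading of the sRGB curve, so correct me if I've got it wrong): sRGB defines a piecewise curve with a linear toe near black, while displays are commonly calibrated to a straight power law like 2.2 or 2.4, and the two diverge mostly in the shadows:
Code:
# My rough understanding: sRGB is a piecewise curve with a linear toe near black,
# while a plain power law (2.2 or 2.4) keeps bending all the way down.
# They track closely up top and diverge mostly in the shadows.

def srgb_to_linear(v):
    """sRGB decode: encoded 0.0-1.0 -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def power_to_linear(v, gamma=2.2):
    """Plain power-law display gamma."""
    return v ** gamma

for v in (0.05, 0.2, 0.5, 0.9):
    print(f"{v:.2f}: sRGB {srgb_to_linear(v):.4f} | 2.2 {power_to_linear(v):.4f} | 2.4 {power_to_linear(v, 2.4):.4f}")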

So how do we reconcile this? Should we stick with the video standard or use native output? The answer is tricky; it comes up time and time again in the forums and is a source of endless confusion. For color space, it is essentially a distinction without a difference – the end result should be the same. Calibrating for Rec. 709 primaries is essentially calibrating for sRGB, and choosing to output RGB or YCbCr will not throw this off. (Of course, this is based on the assumption that my understanding is straight, and that the consoles can perform this conversion without degradation.)
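As a quick sanity check on the "distinction without a difference" claim (again my own sketch with ideal floating-point math – a real console or display works in fixed point, so it can cost a code value or so of rounding), converting a color to YCbCr and straight back lands on the same values:
Code:
# Quick sanity check, ideal math: RGB -> YCbCr -> RGB lands back on the same values.

KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

def to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b
    return y, (b - y) / (2 * (1 - KB)), (r - y) / (2 * (1 - KR))

def to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

rgb = (0.80, 0.35, 0.10)    # an arbitrary gamma-encoded color
print(rgb, "->", tuple(round(c, 4) for c in to_rgb(*to_ycbcr(*rgb))))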

For black level, there is a clear answer – 16-235. As always, calibrate to the only standard we have, the one for video. You may be tempted to use 0-255/RGB, reasoning that the most native mode will give you the best image quality and the least input lag. That reasoning sounds plausible, but the internal conversion is so fast that input lag is not a concern, and image quality is not degraded. You also risk having different settings across your various devices, which can completely throw off your calibration. The most frustrating part about getting this process right is that consoles and displays frequently use entirely different nomenclature for the same settings:

Xbox 360
Resolution – Set to the native resolution of your display
Reference levels – Standard (16-235)
HDMI Color Space – Auto (This will output YCbCr for video, and RGB for games)

Playstation 3
Resolution – Automatic – this will detect all resolutions that your display supports
RGB Full Range – Limited (16-235)
YPbPr Super-White – On (This setting does not affect games, only video) (Sony's documentation confirms this...but I'm still not 100% clear on how super-white actually pertains to video. I believe it has something to do with passing blacker-than-black and whiter-than-white...but then why is it just "super-white" and not "super black & white"?)
Deep Color Output – Off (This setting does not affect games, nor does any commercially produced video use it)

Thoughts? Particularly if anyone has actual game dev experience, I'd love to know how this is handled from the inside. Not really the most accessible bunch. :P

As a side question...I've heard it both ways: that gamma is either a) a user preference, based on your lighting conditions, or b) that "standard" gamma for video is 2.2, a specific target that you must match for accurate video. I personally prefer 2.4 for game, video and film content in a pitch-black, light-controlled room...so, am I ruining the director's intent, or is there really some element of preference/psychovisual adaptation that needs to be accounted for?
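To put some rough numbers on what's at stake (my own back-of-envelope math, nothing more): the same signal simply comes out darker on a 2.4 display than on a 2.2 one, with the biggest relative gap down in the shadows:
Code:
# Back-of-envelope: how much darker the same signal looks at gamma 2.4 vs. 2.2.

def output_level(signal, gamma):
    """Fraction of peak white for a 0.0-1.0 input signal."""
    return signal ** gamma

for s in (0.1, 0.25, 0.5, 0.75):
    l22, l24 = output_level(s, 2.2), output_level(s, 2.4)
    print(f"signal {s:.2f}: {l22:.1%} at 2.2 vs {l24:.1%} at 2.4 ({l24 / l22:.0%} as bright)")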

Steam/PSN/Xbox Live: Darius510
post #2 of 5 - 06-04-2013, 07:17 PM - bd2003 (Thread Starter, AVS Addicted Member, Long Island, NY)
A thought...if sRGB and 709 share the same color points, and RGB and YCbCr are two different methods of reaching the same result...wouldn't it be better to recommend sending RGB, 16-235? I guess it would all depend on whether they actually are equal, or whether there's any chance a display would interpret them differently?

Steam/PSN/Xbox Live: Darius510
post #3 of 5 - 06-04-2013, 08:03 PM - Michael TLV (AVS Special Member, THX/ISF Calibrationist/Instructor, Calgary, AB, Canada)
Greetings

You do realize that there is no standard that is applied to video games ... (unless they are THX certified) the game companies don't care. Their test displays are all set to dynamic mode and they only care about overscan issues and little else.

Regards

Michael Chen @ The Laser Video Experience
ISF/THX/TLV Video Instructor
The Video Calibration Education Hub - www.TLVEXP.com

post #4 of 5 - 06-04-2013, 08:33 PM - bd2003 (Thread Starter, AVS Addicted Member, Long Island, NY)
Quote:
Originally Posted by Michael TLV View Post

Greetings

You do realize that there is no standard that is applied to video games ... (unless they are THX certified) the game companies don't care. Their test displays are all set to dynamic mode and they only care about overscan issues and little else.

Regards

I'm sure that's the case for a great many developers; however, I have an extremely difficult time believing it applies to all of them. I think you're selling an entire industry far too short with that comment. I find it difficult to believe any digital artist makes it through their training in 2013 without at least a rudimentary understanding of the basics. Should they ever decide to get their act together as an industry, I can't think of a single reason they'd want to settle upon anything but the standard we already have, especially given the built-in Blu-ray players...they're legitimate video devices. Therefore, the only sane option for end users as well would be to calibrate to video standards.

Until then, I'm not really trying to come at this from a "make your games perfect" standpoint, because I agree that seeking the same level of exactitude you'd get from film is, at this point, not going to happen. But for someone who has already calibrated to video standards, there are quite a few places and settings within the devices themselves that could potentially throw off that calibration, such as the different color spaces, black level settings, etc. I'd like to fully understand the consequences of those choices, and avoid throwing off the video calibration at all costs.

Steam/PSN/Xbox Live: Darius510
post #5 of 5 - 06-04-2013, 09:50 PM - sotti (AVS Special Member, Seattle, WA)
No reason not to simply configure your console to work like every other CE device (i.e. the out-of-the-box configuration).

The developers that do care, and I've talked to some of them, are testing on TVs calibrated to Rec. 709, typically with a gamma of 2.2.

Joel Barsotti
SpectraCal
CalMAN Lead Developer