Originally Posted by FromPlasma2LCD
Sweet answer, thank you. I'm still on firmware 03.51.30 out of the box, and I'm not sure which firmware is the one that broke HDR mode. Should I be console gaming in PC mode on the C7? I feel like color accuracy in video games is subjective, and brightness, gamma, and shadow detail matter more to me than anything else. If I rename the input to PC, will I have to change it back when watching a UHD Blu-ray on the Xbox One S, or will my current firmware be fine for UHD HDR Blu-ray playback?
I can't remember what firmware I was on before updating. But I can say that updating to the latest firmware as of July 31, 2017 broke PC Input HDR, as well as HDR in the YouTube app, no matter what I tried. It was still worth it, though: before the update, my Blu-ray player (a Sony UBP-X800) was dropping video at random, and updating fixed that.
Frankly, it's all subjective; one could watch in Vivid mode if they prefer (bleck!). But I always prioritize calibrating as close to reference quality as possible, because the bottom line is that movies, games, and photos are all shot and graded to look a certain way. There are objectively compelling reasons to stick with a display calibrated to a given standard's demands, in as dark an environment as is comfortable.
While video games don't seem to adhere to as strict a color standard as cinema and photos, they are generally created from assets produced under video standards, so they benefit from a display calibrated for accurate cinema viewing in a dark environment. A good example is Quantum Break, which intercuts gameplay with well-shot, highly produced live-action cinematics. That footage adheres to video standards, and the gameplay's colors and look benefit just as much from being calibrated to them. It may take a few weeks to get used to, though, if you're coming from very blue, overly bright uncalibrated images, like phones or out-of-the-box PC monitors.
Personally, I calibrate to 120 nits for a dark room (OLED Light 28), V and H sharpness at 0 for no added sharpening, Warm 2 with minor RGB High color temp adjustments (-6, 0, -8), and BT.1886 gamma to "mimic" CRT behavior.
Gamma, I want to stress, has become a fairly confusing setting. I've always calibrated my IPS PC monitors to 2.2 for photo and video editing, gaming, and movie watching, but with OLEDs it seems ideal to go with BT.1886, which behaves like, but is not exactly, a 2.4 power curve. Here's the source that leads me to think BT.1886 is ideal for these OLEDs: http://www.spectracal.com/Documents/...rs/BT.1886.pdf
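If it helps to see why BT.1886 is "like but not exactly" 2.4, here's a minimal Python sketch of the EOTF from that paper. The 120-nit white point and the 0.05-nit black level are illustrative assumptions, not measured values:

```python
# BT.1886 EOTF compared with a plain power-law gamma.
# Assumed figures for illustration: 120-nit white, and either a
# true-black OLED (0 nits) or an LCD-like black of 0.05 nits.

def bt1886(v, lw=120.0, lb=0.0, gamma=2.4):
    """Luminance in nits for a normalized video signal v in [0, 1]."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

def power_gamma(v, lw=120.0, gamma=2.2):
    """Plain power-law response for comparison."""
    return lw * v ** gamma

for v in (0.1, 0.25, 0.5, 0.75):
    print(f"signal {v:.2f}:  BT.1886 @ 0 black = {bt1886(v):7.3f} nits,"
          f"  BT.1886 @ 0.05 black = {bt1886(v, lb=0.05):7.3f} nits,"
          f"  pure 2.2 = {power_gamma(v):7.3f} nits")
```

On a perfect-black display the formula collapses to a straight 2.4 power curve; the "not exactly 2.4" behavior only kicks in when the black level is above zero, where BT.1886 lifts the shadows the way a CRT would.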
With your firmware, you should be fine using PC Input and an ISF Dark or ISF Bright preset across the board. I'm tempted to say that for video games it may be worth creating a preset calibrated to 2.2 gamma.
Here are some good test patterns that anyone can use to calibrate their set with basic settings by eye: http://diversifiedvideosolutions.com/
Go to Products and choose whichever option is most convenient for you; there are SDR and HDR10 patterns. Put the files on the device you intend to watch content through, so you can confirm the calibration holds across devices when using the same settings. For instance, PC video cards sometimes have different output settings than, say, a Blu-ray player, which can dramatically change how a calibration affects the image.
Using the SDR patterns off a USB drive on the TV, I have found that contrast at 85 is best for no clipping, and that brightness of 50 is best: raising it any higher lifts all the other levels more than it reduces the slight crush at level 17, just above 0 black.
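For anyone wondering why level 17 counts as "just above 0 black": in 8-bit limited-range video, reference black is code 16 and reference white is code 235, so 17 is the first step above black. A quick sketch, assuming the 120-nit / roughly 2.4-gamma target from above:

```python
# 8-bit limited-range ("video level") codes: black = 16, white = 235.
def normalize(code, black=16, white=235):
    """Map an 8-bit limited-range code to a 0.0-1.0 signal."""
    return max(code - black, 0) / (white - black)

for code in (16, 17, 18, 25):
    v = normalize(code)
    nits = 120.0 * v ** 2.4  # assumed 120-nit peak, 2.4 power gamma
    print(f"code {code:3d}: signal {v:.4f} -> {nits:.5f} nits")
```

Those first steps above black are vanishingly dim, which is why a little crush there is easy to miss, and why raising brightness to fix it costs more across the rest of the curve than it gains.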
The HDR patterns are cool because you can see just how low on the totem pole of HDR brightness all current TVs are -- even the brightest 1,400-nit LCDs. 2017 OLEDs hit, at best, around 700 nits, which is a small fraction (less than a quarter) of where HDR is headed in the near future. For reference, a clear sunny day in ambient light equates to about 30,000 nits.
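As a back-of-the-envelope check (the only figures added here are the 4,000-nit high-end mastering target and the 10,000-nit ceiling of the PQ curve, SMPTE ST 2084; the rest come from above):

```python
# Rough fractions of where HDR displays sit today versus where the
# format can go. 10,000 nits is the PQ encoding ceiling; 4,000 nits
# is a high-end mastering target; 30,000 nits is the sunny-day figure.
peaks = {"2017 OLED": 700, "bright 2017 LCD": 1400}
targets = {"4,000-nit master": 4_000, "PQ ceiling": 10_000, "sunny day": 30_000}

for name, peak in peaks.items():
    line = ", ".join(f"{peak / t:.1%} of {tname}" for tname, t in targets.items())
    print(f"{name} ({peak} nits): {line}")
```

Even against the 4,000-nit mastering target, a 2017 OLED's 700 nits works out to 17.5%, so "less than a quarter" is, if anything, generous.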