If anyone cares, here are my settings after spending a few weeks tweaking and trying a variety of different settings on my 65x850c.
I'm curious if anyone disagrees with anything, can correct any obvious misunderstandings, or just wants to chime in with an opinion.
It should be noted that my sources include:
- Mac Mini PLEX HTPC for TV and Movies
- PS4 for gaming and Bluray
I don't watch Cable/Satellite/Broadcast TV.
Unless otherwise noted, settings for both sources are the same.
I've compared the Standard, Cinema (Pro/Home) and Game/Graphics modes. Don't be fooled by each mode's initial settings; they can all be made to look the same. However, there are some subtle differences in behaviour and in the settings available. Game and Graphics modes allow you to connect a computer that outputs 4:4:4 or 24-bit RGB. You can verify this by displaying single-pixel red or magenta lines (there are lots of test patterns out there); 4:4:4 will preserve these lines perfectly. The other picture modes seem to convert the signal to YCbCr 4:2:0, which causes dithering and smearing of those single-pixel red or magenta lines. Unfortunately, the Game and Graphics modes have limited Motionflow settings (which probably keeps input lag to a minimum), so they're not really suitable for 24fps material. The only other difference between picture modes that I could find is that "Mastered in 4K" is only available in one of the Cinema modes. Hence, I use Cinema Pro (not sure why it's called "Pro" instead of Cinema 1?) for viewing content and Game mode for PS4 gaming.
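If you want to see why chroma subsampling smears those single-pixel lines, here's a toy Python sketch. It's a big simplification (it only averages chroma horizontally over pixel pairs, and the TV's real processing is more involved), but the BT.709 matrix constants are the standard ones:

```python
# One scanline: black pixels with a single pure-red pixel at index 3.
row = [(0.0, 0.0, 0.0)] * 8
row[3] = (1.0, 0.0, 0.0)

def to_ycbcr(px):
    r, g, b = px
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # BT.709 luma
    return y, (b - y) / 1.8556, (r - y) / 1.5748

cr = [to_ycbcr(px)[2] for px in row]  # red-difference chroma per pixel

# Chroma subsampling averages Cr over pixel pairs, then repeats the average.
cr_sub = []
for i in range(0, len(cr), 2):
    avg = (cr[i] + cr[i + 1]) / 2
    cr_sub += [avg, avg]

# The red line's chroma is halved and bleeds into its neighbour: ~0.5 -> ~0.25, ~0.25
print(round(cr[3], 3), round(cr_sub[3], 3), round(cr_sub[2], 3))
```

With full 4:4:4 the line's chroma stays at 0.5; after subsampling it drops to 0.25 and an adjacent pixel picks up the other 0.25, which is exactly the smearing you see on the test patterns.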
Auto Picture Mode:
I'm not sure if this works as expected. I will experiment with this more later.
Brightness:
This controls the backlight level, and the difference between settings is subtle; the range is not vast, and the difference between 20 and 30 is not that significant. The good thing is that this setting does not affect blacks, whites, or gamma; it's simply a backlight level. So I think it really comes down to personal taste for any given ambient light level. I prefer a bright picture, so I set mine at 30.
Light Sensor:
I tried this, but it seemed to dim the backlight even under the brightest conditions, so it's a failure in my opinion. I currently have it disabled.
Gamma:
(UPDATED) A negative setting (higher gamma) will crush your blacks; increasing the setting (lower gamma) will wash out your picture, reduce contrast, and blow out your whites. The TV appears to default to a gamma of 2.4 (-2), which I find too dark (it can crush blacks). A gamma of 2.2 better matches the other displays in my life, so I have it set at zero and instead use Black Level to make sure shadow details come through properly.
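To put numbers on the black crush: with a simple power-law display model (light out = signal ^ gamma, everything normalised 0-1), the same near-black signal comes out noticeably darker at 2.4 than at 2.2, while mid-grey barely moves. This is a toy model (the set's real response is closer to BT.1886 and depends on the panel's black level), but the arithmetic shows the trend:

```python
# Toy power-law display model: light_out = signal ** gamma (both 0..1).
for signal in (0.05, 0.10, 0.50):
    out_22 = signal ** 2.2
    out_24 = signal ** 2.4
    print(f"signal {signal:.2f}: g2.2 -> {out_22:.4f}, g2.4 -> {out_24:.4f}, "
          f"g2.2 is {out_22 / out_24:.2f}x brighter")
```

At a 5% signal, gamma 2.2 puts out roughly 1.8x the light of gamma 2.4; at 50% it's only about 1.15x. That's why the difference shows up almost entirely in the shadows.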
Black Level:
I spent a lot of time fussing with this before finding the ideal setting, and you may want to do the same. You can use the Black Level control to determine exactly how much shadow detail you want vs. how contrasty you want the image. A lower black level means darker shadows and more contrast; a higher black level means more detail in the shadows at the expense of a more washed-out picture. To adjust it I use a combination of test patterns (like this one) and source material (the opening scene of Transformers AoE is perfect; the opening scene of Star Wars might be another, but I don't have a copy of that film). My preferred setting for movies, which offers inky blacks without crushing shadow detail, is 30. On a black level test chart, that makes 3% black barely visible (or just above "Reference Black" at about 22). For gaming I set it at 47, which provides a bit more detail in the shadows without compromising contrast too much, since games seem to have a higher contrast ratio (dynamic range?) anyway.
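If you don't have a test pattern handy, you can generate a crude near-black step chart yourself. This sketch writes an 8-bit grayscale PGM with bars from 0% to 5% above reference black on a limited-range (16-235) scale; the file name and bar dimensions are just my choices:

```python
# Write a near-black step chart as an 8-bit binary PGM (P5).
# Bars at 0%..5% of the limited-range video scale above reference black (16).
width, height, bar = 120, 60, 20
levels = [round(16 + p / 100 * (235 - 16)) for p in range(6)]
pixels = bytes(levels[x // bar] for _ in range(height) for x in range(width))
with open("black_steps.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (width, height))
    f.write(pixels)
print(levels)  # code values for the six bars, 16 = reference black
```

With Black Level set correctly, the 16 bar should be indistinguishable from the 18 bar only if you're deliberately crushing; ideally everything above reference black stays faintly visible.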
Black Adjust:
This does exactly the same thing as lowering Black Level, but in bigger steps. I suggest turning it off and just adjusting Black Level with the slider.
Advanced Contrast Enhancer:
I think this is the good old "dim the backlight for the credits" setting. I leave it on "Low", but I've yet to see it play a role. Perhaps it's improved so much since my old set that it's working and I just can't tell... that would be cool.
Color Temperature:
I'll probably annoy purists with this, but a D65 calibration is just way too warm for my liking. It's probably because on a normal day I spend almost all my waking hours looking at a display, whether it's my 4K Dell monitors at home, my Apple Thunderbolt Display at work, my MacBook display, my iPad, or my iPhone... and ALL of them are set up from the factory to what I'm guessing is somewhere between 7000-7500K. So anyone who works with mobile devices or Apple displays is going to think 6500K is too warm. For me, and my environment full of 7000-7500K screens, "Neutral" is the perfect color temperature for the TV in my household: whites look white, just like they do on all my other displays.
Color:
If you're a purist, you'll probably want to set this at 50 and ignore "Live Color". However, if you like punchy color like I do, see the next setting. (BTW, if you want more saturation, I think you get better results using "Live Color" than "Color".)
Live Color:
I mention this here because, to my eyes, it has a huge impact on the colors of the display. Purists will probably ignore this setting, but I purchased a Triluminos display for its vibrant colors, and this setting is what seems to bring them to life. It offers Low, Medium, and High, and Low is definitely enough to get some much-needed color pop. To my eyes, without Live Color, red looks kind of orange; with Live Color, reds look the way they should. Pull up a color test pattern (like this rainbow one) and see for yourself what it can do to red and magenta in particular.
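My guess (and it's only a guess; Sony doesn't document the processing) is that Live Color is essentially a saturation boost, possibly applied selectively by hue. As a rough illustration of why a boost pushes an orange-ish red toward pure red, here's a toy version using the standard library's colorsys:

```python
import colorsys

# Toy "Live Color"-style saturation boost -- my speculation, not Sony's algorithm.
def boost_saturation(r, g, b, amount=1.3):
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb(h, min(1.0, s * amount), v)

# A slightly desaturated, orange-leaning red...
dull_red = (0.9, 0.25, 0.2)
punchy = boost_saturation(*dull_red)
print(punchy)  # same brightness, but the green/blue contamination drops away
```

The boosted pixel keeps its brightness (the max channel stays at 0.9) but sheds the green and blue contamination that made it read as orange, which matches what I see Live Color doing to reds on screen.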
Other color settings:
I leave Hue alone and haven't done any advanced color temperature adjustments. I think copying settings from people who've calibrated their displays is probably going to make things randomly worse, not better, unless every TV ships from the factory out of calibration by exactly the same amount (possible, I suppose, but unlikely). As for color space, Auto is probably best; it should pick sRGB/BT.709 unless the source indicates otherwise.
Sharpness:
I think it's common knowledge from reviews etc. that a setting of 50 means neither sharpening nor blurring, so that's the desired setting. While sharpness may sound like a good thing, raising it almost immediately introduces unwanted halos around edges. If you want to sharpen some details, use Reality Creation instead.
Reality Creation:
I played around with this a lot, and as far as I can tell it adjusts the amount of micro-contrast. Anyone familiar with photo post-processing knows there's edge-enhancement-type sharpening (the control above) and micro-contrast enhancement; this control seems to be the latter. It's more subtle and provides a more pleasing result than Sharpness, which can quickly cause halo artifacts on edges. The best way I found to adjust it is to pause the video on a frame full of detail that is clearly in focus and note what it does to the contrast of the fine details. Setting it to Manual, maxing out the Resolution slider, and then switching between "Manual" and "Off" was the best way to observe the effect. Once I could see what it was doing, I adjusted the Resolution slider down until it was helping without overdoing it; to my eyes that was about 25. I seriously doubt many (maybe even any?) people will notice this setting on video unless they really start looking for it, which means the movie is horrible and you should probably watch something else.
Mastered in 4K:
This appears to be a setting that can make some Sony Blu-rays mastered in 4K look better, but I don't own any such Blu-rays, so I can't comment on how good it is. I leave it on, for no particular reason. (Not available in Game mode.)
Random Noise Reduction:
This will undo any micro-contrast addition you make through Reality Creation. It effectively blurs fine edges to eliminate noise. So if you like what Reality Creation is doing, turn this off.
Digital Noise Reduction:
This is probably a great setting for crappy macro-block happy cable signals, but I don't watch cable anymore so I turn this off.
Motionflow:
I've gone on at length about this here lately; I suggest reading back through the last few pages of posts from me and others if you want the background. My conclusion after exhaustive testing with multiple types of moving objects in 24fps material is the following:
- Standard is the best setting I've found. It offers smooth motion interpolation without losing much sharpness.
- Smooth overdoes it; moving objects can show artifacts or look blurry.
- Clear lowers the backlight and offers no visible improvement to jerky 24fps motion that I can see.
- True Cinema is the worst on a 24fps source, so it's a real misnomer. rtings.com mentioned that this is the setting to use for reverse 3:2 pulldown (material that was originally 24fps but telecined using 3:2 pulldown for 30fps playback). This is old-school; maybe use it for old DVDs or movies on cable?
- Custom allows you to set your own Smoothness and Clearness. Standard seems to correspond to a bit of Smoothness (2 or 3).
- Off also looks horrible on 24fps material. It's great for Gaming.
Update: I've since found this great post on Sony Motion Control Settings.
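For reference, the 3:2 pulldown that True Cinema is meant to reverse works like this: alternating 24fps film frames are held for 3 fields and 2 fields respectively, so 4 film frames become 10 interlaced fields = 5 video frames, i.e. 24 x (5/4) = 30fps. A quick sketch of the cadence:

```python
# 3:2 pulldown cadence: film frames A,B,C,D -> fields AAABBCCCDD.
def telecine(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (3 if i % 2 == 0 else 2)  # 3 fields, then 2, repeating
    return fields

fields = telecine(list("ABCD"))
print(fields)           # ['A','A','A','B','B','C','C','C','D','D']
print(len(fields) / 2)  # 5.0 video frames per 4 film frames -> 30 fps
```

Reverse pulldown just detects that repeating 3-2 pattern and reassembles the original 24 film frames from the fields, which is why the setting only helps on telecined material.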
CineMotion:
This was called Film Mode until a recent firmware update. Based on the on-screen description, it appears to help with interlaced material. I have no such material, but there really is no "Off" setting, as even "Off" says it's doing something. I've left it on "High" out of ignorance more than anything; I'd love a clear explanation of what it's doing.
Update: There is an explanation here, but I'm not sure it helps. It would be nice if Sony or someone else could offer best-practices settings advice based on content type (24fps movies vs. TV movies vs. sports, etc.). Maybe it's worth looking at the default motion settings in each of Sony's predefined picture modes (Standard vs. Sports vs. Cinema, etc.)?
The only other settings I've changed (through the Home Screen Settings app) that might be worth mentioning are Dynamic Range (set to Full) and Screen (set to Full Pixel).