Originally Posted by dkersten
My understanding of the colorimetry setting is that, if turned on, an app can tell the Shield to switch down to rec709 if the content is rec709 and your HDMI setting is BT2020. Prior to this, rec709 would work on a bt2020 setting, but it used tone mapping and was not perfect. And a couple of years back, rec709 wouldn't work at all in a bt2020 mode. Now it switches to an actual rec709 mode. But from what I understand, the Shield never switches to a higher mode than what is set in the HDMI settings. So if you are set to rec709, bt2020 content will be blown out even with colorimetry on.
So you should set your HDMI to the highest resolution/frame rate/color depth/luminance you can and let the app tell it what to use.
So, for example, set the HDMI on the Shield to 4k, 60p, bt2020, 10bit, 4:2:0 (technically the highest you can get because of 18gbps limitations), and if you send a 4k 24p rec709 8bit video, it will play at 4k, 24p, rec709, 8bit. Then if you play Netflix 4k HDR, it will switch to the 60p BT2020 setting automatically. No need to touch the HDMI modes, and it's the closest thing to a "set it and forget it" mode. But it will ONLY go down, not UP.
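If you're curious why 4:2:0 is the ceiling at 4k 60p 10bit over an 18gbps link, here's a rough back-of-envelope sketch. The 594 MHz pixel clock for 4k 60p and the 600 MHz HDMI 2.0 TMDS limit are standard figures; the helper function is just my own illustration:

```python
# Back-of-envelope HDMI bandwidth check: why 4k 60p 10bit tops out at
# 4:2:0 on an 18gbps (HDMI 2.0) link. 594 MHz is the standard CTA-861
# pixel clock for 4k 60p (it includes blanking, not just active pixels).

HDMI_20_MAX_TMDS_MHZ = 600.0  # "18 gbps" HDMI 2.0 ports

def tmds_clock_mhz(pixel_clock_mhz, bit_depth, chroma="4:4:4"):
    """Effective TMDS clock: deep color scales it up, 4:2:0 halves it."""
    clock = pixel_clock_mhz * bit_depth / 8.0
    if chroma == "4:2:0":
        clock /= 2.0  # 4:2:0 carries two pixels per TMDS clock
    return clock

FOUR_K_60 = 594.0  # MHz pixel clock for 4k 60p

# 4k 60p 10bit 4:4:4 -> 742.5 MHz: over the 600 MHz limit, won't work
print(tmds_clock_mhz(FOUR_K_60, 10, "4:4:4"))  # 742.5

# 4k 60p 10bit 4:2:0 -> 371.25 MHz: fits with room to spare
print(tmds_clock_mhz(FOUR_K_60, 10, "4:2:0"))  # 371.25
```

Same math shows why 4k 60p 8bit 4:4:4 (exactly 594 MHz) just barely squeaks under the 600 MHz ceiling, which is why it's the fallback when 10bit won't fit.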
This is important to understand: If you are set to rec709 in HDMI settings, nothing will switch up to bt2020. An app can play 24p, 23.976, 30, 59.94, or 60p if you have the HDMI set to 60p, but if you have it set to 30p, it will only scale down, it won't play 60p content (it will try but fail and you will get more like 1 frame every 3 seconds). 10 bit color on 8 bit settings will result in banding, but 8 bit color on 10 bit settings works fine.
Resolution doesn't seem to work that way though because the shield will upscale all content to 4k if you are in a 4k HDMI mode, so you can't play 1080p content on a 4k mode and get 1080p output. If you set the Shield HDMI to 1080p, apps like Netflix won't even display 4k content because it will think you can't play it, and from what I can tell, 4k will either downscale or not play at all. This is really only an issue if you want to use an external upscaler for 1080p content. In this case you will have to switch the HDMI settings manually each time. Not an issue for most, but some feel the shield's upscaling isn't good. Personally I think it does just fine, but I also haven't compared it to something like a Lumagen Pro.
Keep in mind, this is all restricted by your display. If your display cannot do over 10gbps, you won't even have the option to go over 4k, bt2020, 10 bit 30p. This is a problem for 60p content, even 1080p 60p content. I had an Epson projector for a while that was 10gbps max HDMI, and if I ran a BT2020 HDMI setting on the Shield, I couldn't watch PSVue because it streams at 60p. However, Netflix seemed to switch to 24p or 30p streams just fine. Now that I have a JVC with 18gbps HDMI, I can set it at 4k bt2020 10bit 60p and everything works perfectly, and even Netflix streams some content at 60p.
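The 60p pain point is easy to see in raw numbers: frame rate scales bandwidth linearly, so 60p needs exactly twice the data of 30p. This is illustrative math only (active pixels, ignoring blanking and link encoding overhead; the function name is mine):

```python
# Rough uncompressed video payload rates, to show why 60p content is
# the problem child on ~10gbps HDMI inputs while 24/30p streams fit.

def data_rate_gbps(width, height, fps, bits_per_pixel):
    """Raw payload rate for active pixels only (no blanking/overhead)."""
    return width * height * fps * bits_per_pixel / 1e9

# 4:2:0 averages 1.5 samples per pixel, so 10bit 4:2:0 = 15 bits/pixel
print(data_rate_gbps(3840, 2160, 60, 15))  # ~7.46 gbps of payload
print(data_rate_gbps(3840, 2160, 30, 15))  # ~3.73 gbps, exactly half
```

Once you add blanking intervals and HDMI's encoding overhead on top of the payload, the 60p figure blows past what a 10gbps input can carry, while the 24/30p figure still fits, which matches the Epson-vs-JVC behavior above.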
Don't just assume the auto HDMI mode will go to the max. If it can't detect that your display (and cables) support 18gbps, it will go to the most compatible mode, which is 4k, 60p, rec709, 8bit for 10gbps displays. So 10bit sources will have banding, and bt2020 will be blown out. Sometimes you need to set your other devices (e.g. the display's HDMI input) to enhanced/advanced HDMI mode in order for 18gbps to work.
Finally, in EVERY case, it is up to the app to do the switching. So if you are set at 4k, bt2020, 60p, 10bit and your source is 24p but your display says it is playing at 60p, then your app is not written to tell the Shield to switch to 24p. All my apps tell the Shield the right frame rate, color space, and bit depth, so I have had no issues. I run Plex, Netflix, Amazon, and PSVue. I can't personally speak for Kodi, but I hear it switches the Shield just fine.
+1. This is pretty much dead-nuts right, based on my own experiences and comparisons (early-adopter, had a Shield in my living room for about 3.5 years now, and liked it so much I've got one on almost every tv in the house now)... Just wanted to throw out the +1 for any new Shield owners out there, this is very good guidance for both the "how" and the "why" for Shield setup.
Originally Posted by chanc
So I set the HDMI on my Shield to "4k, 60p, bt2020, 10bit, 4:2:0 (technically the highest you can get because of 18gbps limitations)" with colorimetry on, as suggested by dkersten. My TV is a Sony XBR75-X940E with an 18gbps hdmi cable, and to my great surprise the Shield's picture on Netflix is much better than with the recommended setting of 4k, 59.94, rec709. I haven't had time to really test it against the picture on the Apple TV 4K, but much appreciated, and thank you dkersten for sharing. Up until now the Apple TV 4K picture has been better than the Shield's on Netflix and Amazon video.
I've also got a x940E in my main living room, and have a Shield, ATV4K, and OPPO UDP-203 hooked up to it for content... the AppleTV does a *couple* of things *a little bit* better than the Shield, so it gets some use in my house. But the Shield is just a much more versatile, capable, "swiss-army-knife," and the vast majority of the living room usage in my household is via it. Wife prefers it. Kids prefer it. I prefer it. They're great!