This post will attempt to clarify exactly WHAT different confusing settings on the XBOX One X do to the signal it's putting out to your display, with information on how the PS4 Pro and Nintendo Switch output signals with their somewhat simplified settings as well.
For a while, I've been trying to figure out the actual behavior of my XBOX One X with some of the less conventional video settings (which also made me wonder what the PS4 Pro and Switch output). Specifically, there's a lot of confusion out there over the Color Depth setting as well as Color Space and the Allow 4:2:2 setting (which deals with Chroma). It was driving me crazy, as I couldn't seem to find a definitive, non-contradictory answer to all I wanted to know, and a LOT of the information out there is disputed between one source and another. Different people in different places recommend completely different things, with very few really knowing what effects the various settings have in the real world. I finally decided I needed to work this out for myself so I didn't lose my mind, even if I had to buy the hardware to run the tests. =oP
I need to give props to TOrangeJuice, who made the initial post on this and had it right: https://www.reddit.com/r/Xbox_One_X/...or_4k_is_8bit/
Vincent Teoh at HDTV Test also reached similar conclusions (https://www.youtube.com/watch?v=SHVJYB-Qews), albeit with a few of his signal findings differing, possibly attributable to different XBOX software or something odd with the projector he was using to detect the signal. The people over at RTINGS also recommended similar settings. Much of this information has been out there, but buried amidst a lot of misinformation from other sources.
I appreciated their information, but wanted to test some additional things and provide a bit more detail. Also, there have been some updates to the XBOX since that post that revise some graphics settings, which could have changed how a few features worked. Like him, I'm using a HDFury Vertex with the latest firmware updates to confirm the signals. (And while probably overkill for me, it's a very cool piece of kit if you ever want to do signal analysis or convert signals from one thing to another).
Other equipment used includes a Marantz SR7011 Receiver and Sony XBR-75Z9F television, with the Vertex in between the receiver and TV (this shouldn't matter, as the Marantz is set to pass on everything as received from the source device [unless the overlay is up, which wasn't the case]). All equipment tested has the latest software/firmware as of 04/11/19.
All resolution settings are assumed to be set to 4K in the console unless otherwise stated, though most of this should still apply to 1080P signals.
One last disclaimer: I believe all information here to be accurate, especially the signal information being reported by the HDFury; that said, I'm still learning myself about their interpretation, so I welcome and appreciate any feedback/corrections/etc.
XBOX One X (Should Apply to One, One S)
TL;DR - From my testing, for most people, the best XBOX settings should be 8-bit (24 bits per pixel) Color Depth with Standard (Recommended) Color Space and "Allow 4:2:2" ON (Checked). Details below.
Color Depth Setting:
In SDR, if you select 8-bit (24 bits per pixel), you get true RGB (no Chroma subsampling necessary, which is the full uncompressed signal, equivalent to 4:4:4 - it can't get any more accurate than this, regardless of display type/bits).
In SDR, if you select 10-bit (30 bits per pixel), you get 10-bit 4:2:0 [This surprised me; I expected 4:2:2 when "Allow 4:2:2" was enabled, but this was not the case. It's 4:2:0 in the menus and SDR games from what I can tell.]
In SDR, if you select 12-bit (36 bits per pixel), you get 12-bit 4:2:0.
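To make the difference between these formats concrete, here's a little sketch (my own illustration, not anything from the console) of how much color data each chroma format actually carries per pixel. It also shows where the XBOX menu's "24 bits per pixel" label for 8-bit comes from:

```python
# Hypothetical illustration: average bits per pixel for each chroma format,
# computed over a 2x2 pixel block (the unit chroma subsampling works on).

def samples_per_2x2_block(fmt):
    """Return (luma_samples, chroma_sample_pairs) for a 2x2 pixel block."""
    # 4:4:4 / RGB: every pixel carries full color information.
    # 4:2:2: each Cb/Cr pair is shared by 2 pixels horizontally.
    # 4:2:0: each Cb/Cr pair is shared by all 4 pixels in the block.
    return {"4:4:4": (4, 4), "4:2:2": (4, 2), "4:2:0": (4, 1)}[fmt]

def bits_per_pixel(fmt, bit_depth):
    luma, chroma_pairs = samples_per_2x2_block(fmt)
    total_samples = luma + 2 * chroma_pairs  # each pair = one Cb + one Cr
    return total_samples * bit_depth / 4     # averaged over the 4 pixels

for fmt in ("4:4:4", "4:2:2", "4:2:0"):
    print(fmt, "at 8-bit:", bits_per_pixel(fmt, 8), "bits/pixel")
# 4:4:4 at 8-bit: 24.0 (matching the menu's "24 bits per pixel")
# 4:2:2 at 8-bit: 16.0
# 4:2:0 at 8-bit: 12.0
```

So even 12-bit 4:2:0 (18 bits/pixel) carries less color information per pixel than 8-bit RGB/4:4:4, which is part of why 8-bit ends up being the better choice for SDR.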
I have seen a small number of anecdotal reports of less banding visible at higher bit depths. This could be attributed to placebo effect OR something to do with how the XBOX upscales the RGB colors. I am not able to perceive a difference on my TV, though I do use Low Gradation Smoothing (a Sony TV feature). The only thing I could find was an old quote by someone who supposedly worked on the XBOX saying essentially that higher bit depths just pad the beginning of the signal with 0's (source: https://www.avsforum.com/forum/141-x...t-depth-3.html
). That's quite old, though, so it's certainly possible it's no longer accurate. Either way, if you have a good TV, 8-bit should allow the TV to do the upscaling on the uncompressed signal.
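If that old quote is accurate, the zero-padding behavior would look something like this (a sketch of the claim, not confirmed XBOX internals), and it shows why padding alone can't reduce banding:

```python
# Sketch of the "zero padding" claim: widening an 8-bit code to 10 bits by
# shifting in zeros adds no new gradation steps; the values just scale up.
def pad_to_10bit(value_8bit):
    return value_8bit << 2  # append two zero bits

codes_8bit = [16, 17, 18]             # three adjacent 8-bit video levels
codes_10bit = [pad_to_10bit(v) for v in codes_8bit]
print(codes_10bit)                    # [64, 68, 72]
# The step between adjacent levels is still one full 8-bit step (now 4 in
# 10-bit units); the intermediate 10-bit codes 65-67 are never produced,
# so any extra smoothness has to come from the display's own processing.
```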
While using 4:2:0 won't likely make a huge visible difference, it's still compressing the color data, which is undesirable if it can be avoided. Therefore, as others have found before, as far as I can tell, 8-bit is still the best setting to use here. In fact, it confuses me why this is an option at all. I can't think of a use, as to my knowledge there aren't any games that even use the 10-bit+ feature that aren't HDR. Color Depth does *NOT* appear to affect HDR or Dolby Vision at all, so you'll still get 10-bit or 12-bit for those, even if set to 8-bit. Neither the Nintendo Switch nor the PlayStation 4 Pro allows for setting Color Depth, so it's puzzling why the XBOX does, with no clear reason or use that I can find.
Color Space Setting:
Not all displays support PC RGB, and whether you'll have to manually change any settings on your display for it to work properly will depend on your equipment. As an example, my current Sony Z9F television recognizes and automatically switches seamlessly between Standard (Limited) signals and PC RGB. I haven't noticed anything display incorrectly, as the Auto Color Space setting works great. My previous TV, however, an LG OLED, would display things incorrectly if the wrong Color Space was selected in conjunction with its Black Level setting on Low or High (no Auto was available), resulting in either a washed-out picture or lots of black crush, depending on the combination. Therefore, it's important to have the same setting on both the TV and the console for displays that do not correctly auto-detect. You also have to have an HDMI port with Enhanced HDMI (Sony) or Deep Color (LG) enabled - it may be called other things on other brands.
In theory, PC RGB allows a wider range of blacks to whites, from 0-255 instead of the reduced 16-235 for Standard (aka Limited). However, there's some evidence I'll cite in links below that the XBOX treats ALL games as Standard (Limited) and simply expands the range artificially to Full when PC RGB is selected. The XBOX itself also warns of "lost highlights during video playback". I was puzzled by this at first, but it appears to relate to Blacker than Black and Whiter than White data, which exist outside of the 16-235 Limited range; Full Range can't display that data, as it can't recognize anything outside 0-255. This is a good article for more details on that: https://www.howtogeek.com/295569/sho...ation-or-xbox/
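Here's a rough sketch of what that Limited-to-Full expansion does to the signal (my own simplified math, ignoring the per-channel details a real scaler would handle), including why Blacker than Black and Whiter than White data gets clipped:

```python
# Simplified 8-bit Limited (16-235) -> Full (0-255) range expansion.
# Anything outside 16-235 (blacker-than-black / whiter-than-white data)
# has no place to go in Full range and is clipped away.
def limited_to_full(y):
    y = max(16, min(235, y))              # BTB/WTW detail is lost here
    return round((y - 16) * 255 / (235 - 16))

print(limited_to_full(16))    # 0   - reference black maps to full black
print(limited_to_full(235))   # 255 - reference white maps to full white
print(limited_to_full(240))   # 255 - whiter-than-white clips to white
print(limited_to_full(10))    # 0   - blacker-than-black clips to black
```

This lines up with the XBOX's "lost highlights during video playback" warning: once the range is stretched to Full, the out-of-range detail that some video content carries simply can't be represented.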
Here's another post detailing how Standard vs. PC RGB worked a year or so ago, though it's hard to say if any changes have been made in the more recent update that moved some menu settings around and added Dolby Vision: https://www.reddit.com/r/xboxone/com...s_information/
I can confirm if I use 8-bit Color Depth, the sun in the Calibrate tool disappears when set to PC RGB, which seems to line up with their Whiter than White observations. Switching to Standard Color Space resolves this. Oddly, staying at PC RGB but upping Color Depth to 10-bit or 12-bit also seems to resolve this, though I'm not entirely sure why...maybe for PC RGB in higher bit depths, the Whiter than Whites are compressed down to something now usable in PC RGB mode, whereas with the uncompressed signal, they aren't.
I used to always use PC RGB, but given XBOX's own recommendation and the observations/resources mentioned here, I think sticking with Standard makes the most sense, especially with 8-bit Color Depth selected. Otherwise, you could be clipping Whiter than Whites or Blacker than Blacks, as the Calibrate screen shows.
UPDATE 05/21/19: I got a request to see if Standard vs. PC RGB Color Space affected HDR output at all. While my assumption was that it should not, I hadn't thought to test this at the time. I tested an HDR game with RGB set to Standard and then to PC RGB, and I got BT2020 4:2:2 12-bit for both, regardless of how Color Space was configured. I believe BT2020 specifies Color Space, so for HDR, it's a non-issue. I also tested both settings with Allow 4:2:2 Off, and I still got BT2020, but with 4:2:0 10-bit instead, again for both regardless of Color Space setting. Therefore, it can be concluded the XBOX Color Space setting does NOT seem to affect HDR and only applies to SDR RGB sources. Please note that I don't have an HDR or Dolby Vision Blu-Ray movie to test, but I'm 99% sure the same should apply to those as well since HDR and Dolby Vision seem to handle Color Space via their BT2020 signal format instead of RGB.
Allow 4:2:2 Setting:
This setting doesn't seem to affect SDR menus or games from anything I tried. You still get RGB (so the equivalent of 4:4:4) either way.
With Allow 4:2:2 ON, HDR displays as 12-bit 4:2:2. [This surprised me, as I expected 10-bit 4:2:2, not 12-bit...the reduced color data of 4:2:2 must free up enough bandwidth to carry 12-bit.]
With Allow 4:2:2 OFF, HDR displays as 10-bit 4:2:0.
As 4:2:2 retains more color data than 4:2:0, it's generally considered the better format to use.
Even when set to 8-bit Color Depth, DVDs play at 12-bit BT709 4:2:2. If "Allow 4:2:2" is OFF, they instead play at 10-bit BT709 4:2:0.
For HD SDR Blu-rays, it's 10-bit BT709 4:4:4 regardless of the "Allow 4:2:2" setting. (I do not have any native 4K or HDR/Dolby Vision Blu-rays to test at this time, though I believe XBOX One S and X can indeed play 4K/UHD Blu-rays, though I don't think the original XBOX One can.) [I initially was surprised to see 4:4:4 for Chroma as I knew that was a challenge at higher bit rates, but with feedback from the people at HDFury, realized it must be possible with Blu-rays because of the fewer frames per second (~24 as opposed to ~30-60 in games).]
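The pattern in these results can be sanity-checked with a back-of-envelope bandwidth calculation. This is my own rough math (assumed CTA-style pixel clocks including blanking, and HDMI 2.0's ~14.4 Gbps of actual video payload after encoding overhead on its 18 Gbps link), not anything from the consoles themselves:

```python
# Rough check of which signal formats fit within HDMI 2.0's video payload.
HDMI20_PAYLOAD_GBPS = 14.4                    # ~18 Gbps minus 8b/10b overhead
PIXEL_RATE = {"4K60": 594e6, "4K24": 297e6}   # total pixel clock in Hz,
                                              # including blanking intervals
CHROMA_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # samples/pixel

def fits(timing, fmt, bit_depth):
    gbps = PIXEL_RATE[timing] * CHROMA_FACTOR[fmt] * bit_depth / 1e9
    return gbps <= HDMI20_PAYLOAD_GBPS, round(gbps, 2)

print(fits("4K60", "4:4:4", 8))    # (True, 14.26)  - SDR RGB games: just fits
print(fits("4K60", "4:2:2", 12))   # (True, 14.26)  - HDR with Allow 4:2:2 ON
print(fits("4K60", "4:4:4", 12))   # (False, 21.38) - too much for HDMI 2.0
print(fits("4K24", "4:4:4", 10))   # (True, 8.91)   - ~24fps Blu-ray playback
```

By these numbers, 4K60 has just enough room for 8-bit 4:4:4 or 12-bit 4:2:2 (but not 10-bit+ 4:4:4), while ~24fps film content has bandwidth to spare, which is consistent with the HDFury feedback about Blu-rays managing 4:4:4 thanks to the lower frame rate.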
PlayStation 4 Pro (Should Apply to Non-Pro Versions):
PS4 is a lot easier to configure in some senses as the options aren't quite so confusing and the Automatic settings work well (depending on display, I'm sure). With everything (Resolution, RGB Range, HDR, and Deep Color Output) set to Automatic, things work great on my TV, though as with the XBOX, if your TV requires changing settings for proper Color Space (called "RGB Range" here), there might be good reason to lock it to Limited (Standard on the XBOX).
Interestingly, by default when set to Automatic (which is listed as Recommended), menus and SDR games display in 8-bit Full Range RGB (PC RGB on the XBOX), so it doesn't seem to have the same concerns with Full Range clipping anything, though there aren't any built-in Calibration patterns, so I'm not sure how I could test this.
HDR games display as 12-bit 4:2:2 (the same as XBOX with the "Allow 4:2:2" Checked).
When playing a DVD, I get an 8-bit BT709 signal at 4:4:4 Chroma.
With an HD SDR Blu-ray, I get 12-bit BT709 at 4:4:4. (I do not have any native 4K or HDR/Dolby Vision Blu-rays to test at this time, though from my understanding, PS4s CANNOT play 4K/UHD Blu-rays.)
Nintendo Switch:
Not a lot of options here. It's a 1080P signal (instead of 4K like the others) at 8-bit. This is with everything (TV Resolution and RGB Range) set to Automatic (again, no warnings not to use it, and it chooses Full by default).
I find it really interesting how differently Sony's and Nintendo's consoles treat Automatic settings, including preferring Full (PC) RGB Range (not to mention the lack of a Color Depth option on those two altogether). Preferring Standard (Limited) range and having that Recommended is unique to the XBOX.
Honestly, I had a really tough time at first believing that 8-bit was the proper Color Depth setting. My display is natively 10-bit (and is made to handle a 12-bit signal well; i.e., Dolby Vision), so perception-wise, it just *feels* wrong to use 8-bit, but data and analysis are data and analysis; 8-bit has the full color range of SDR games and is the only true uncompressed option!
I also was surprised by the differences in DVD/Blu-ray playback between the XBOX and PS4 Pro. [The same DVD/Blu-ray was used to test both systems to avoid any disc differences, by the way.] I'm a bit puzzled why they output anything but 8-bit 4:4:4 as from my understanding, without HDR or Dolby Vision, those sources should ALL be 8-bit natively. Still, I think the PS4 probably has the slight edge for those uses since it preserves 4:4:4 Chroma in all cases.
Interestingly, when Vincent Teoh at HDTVTest did his XBOX testing in late 2017 (https://www.youtube.com/watch?v=SHVJYB-Qews), his Color Depth setting didn't seem to affect SDR menus and games the same way mine did. Either something changed with the XBOX software since then (the menus are a bit different now) or the projector didn't refresh properly at the new setting. Since a Blu-ray was running in the background, that may have had something to do with it as well. His conclusions were right in the end either way, and he has several more observations about 4K Blu-ray and HDR playback than I do, as I focused primarily on gaming.
I've found this experiment pretty interesting, and it's nice to be able to easily verify what a source is. If anybody has any questions, I'd be glad to do my best to answer. Similarly, I'd enjoy hearing any constructive feedback, and if there's anything anybody would like me to test, I'd be glad to within my ability to do so. =) I hope a few people find this useful and interesting!