A lot of devices are capable of displaying deep color. The only problem is that there aren't any sources encoded in deep color, so my suggestion would be to leave it off or disabled. Otherwise you may have video problems, because the TV will try to display information that isn't there. This is one of those specs, like HDMI with Ethernet, that never really came into play.
My Xbox's setting for color depth doesn't just turn on and off. It has options for 24-bit, 30-bit, and 36-bit. My TV only says it supports 10-bit? I figured, being only six months old and a nice brand, it would support higher.
Blu-rays are encoded in 8-bit RGB / 24-bit depth. Unless that has changed, there aren't any sources encoded in Deep Color. You can certainly upscale to 30- or 36-bit (as most newer Blu-ray players can), and quite a few TVs can display Deep Color (I know my three-year-old LG can), but all you're doing is adding information that isn't there in the first place. What can happen is you either get a very bad-looking picture or, depending on the quality of the upscale, banding in the picture in a lot of cases. It's a feature I wouldn't worry about or even use.
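To see why upscaling can't add detail, here's a rough sketch (my own illustration, not anything a player actually runs) of the standard bit-replication trick for stretching an 8-bit channel value to 10 bits. No matter how you do it, an 8-bit source only ever has 256 levels per channel, so only 256 of the 1024 possible 10-bit levels ever get used:

```java
// Hypothetical illustration: expanding an 8-bit color channel to 10 bits.
// Bit replication stretches the same 256 levels across a wider range --
// it adds zero new picture information.
public class DeepColorDemo {
    // Standard bit-replication: shift left by 2, copy the top 2 bits
    // into the new bottom 2 bits so 0 maps to 0 and 255 maps to 1023.
    static int expand8to10(int v8) {
        return (v8 << 2) | (v8 >> 6);
    }

    public static void main(String[] args) {
        // Count how many distinct 10-bit values an 8-bit source can produce.
        java.util.Set<Integer> distinct = new java.util.HashSet<>();
        for (int v = 0; v <= 255; v++) {
            distinct.add(expand8to10(v));
        }
        // Only 256 of the 1024 possible 10-bit levels are ever hit.
        System.out.println("Distinct 10-bit levels from 8-bit source: "
                + distinct.size()); // prints 256
    }
}
```

The gaps between those 256 used levels are exactly where banding shows up if the display or player tries to smooth things over badly.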
Java developers: when I saw what has been put into Java 8, I was immediately reminded of how much of my life I've spent trying to protect engineers from themselves. Lambda expressions are a horrible idea. Gentlemen, the goal isn't to make code readable for a competent mid-level engineer. The goal is to make code readable for a competent mid-level engineer who is exhausted and hopped up on caffeine at 3 a.m. What a disaster Java 8 is!
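For what it's worth, here's the kind of thing I mean (a made-up snippet, names purely for illustration): the same sort written as a Java 8 lambda and as the old anonymous class. The lambda is terser, but the anonymous class spells out every type and method name for the 3 a.m. reader:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class LambdaExample {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Carol", "alice", "Bob");

        // Java 8 lambda: compact, but parameter types and intent are implicit.
        names.sort((a, b) -> a.compareToIgnoreCase(b));

        // Pre-Java-8 anonymous class: verbose, but everything is explicit.
        names.sort(new Comparator<String>() {
            @Override
            public int compare(String a, String b) {
                return a.compareToIgnoreCase(b);
            }
        });

        System.out.println(names); // prints [alice, Bob, Carol]
    }
}
```

Whether the extra ceremony is a cost or a safety net is exactly the argument here.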
Yeah, I know about that. I think the OP was wondering if enabling Deep Color on his Xbox would make for a better picture because his Vizio is Deep Color capable (as most TVs are). Other than the above-mentioned camcorders, I think it would cause more problems (banding) than anything because of source encoding, and it isn't really a good idea to enable. Or has that changed?