So I have my 4K Blu-ray player hooked up to my 1080p plasma, which I use to watch 4K movies. I just finished calibrating it with the Spears & Munsil UHD HDR calibration disc and noticed one of the test patterns is Quantization Rotate. From the research I've done, this pattern is supposed to help decide whether movies will look better output in 8-bit or 10-bit color depth, but the part I'm confused about is whether I used the pattern correctly. I viewed the pattern with my player's output set to 8-bit and then again at 10-bit and compared the two. The 8-bit setting shows obvious banding, while the 10-bit setting still shows banding but to a far lesser extent. So my question is: since 10-bit looks smoother, does that mean 10-bit is the optimal color depth for my setup? I also read that because my plasma has HDMI 1.3 ports it supports Deep Color, so does that mean it supports 10-bit?
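For anyone wondering why the two settings band differently, here's a quick sketch of my own (nothing to do with the disc itself): quantizing the same smooth ramp to 8-bit vs 10-bit leaves far fewer distinct code values at 8-bit, and fewer values means coarser steps, which is what shows up as banding.

```python
# Illustrative only: quantize a smooth 0-1 luminance ramp to 8-bit and
# 10-bit and count the distinct code values each bit depth can produce.
ramp = [i / 99999 for i in range(100000)]

levels_8bit = len({round(v * 255) for v in ramp})    # 8-bit: 2^8 levels
levels_10bit = len({round(v * 1023) for v in ramp})  # 10-bit: 2^10 levels

print(levels_8bit, levels_10bit)  # prints: 256 1024
```

So 10-bit gives four times as many steps across the same brightness range, which is why the gradient in the pattern looks smoother at that setting.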