I tried that already, but for them, 4 pixels on a UHD screen are a new "super pixel" that is worth 10 bits. And don't forget that with SD, where 16 screen pixels cover one source pixel, these become 12-bit "ultra pixels"!
The theory is very simple: you now have four 8-bit pixels showing one source pixel, so each of them can be slightly different, giving 4 times more pixel states, i.e. 2 extra bits.
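A minimal sketch of that idea, just to make it concrete (the function names are my own, not from any real player or driver): one 10-bit value is spread over a 2x2 block of 8-bit pixels so that the block average reproduces it.

```python
def encode_super_pixel(v10: int) -> list[int]:
    """Spread one 10-bit value (0..1023) over four 8-bit pixels."""
    base = v10 // 4   # the 8-bit part, 0..255
    r = v10 % 4       # the 2 bits that 8 bit alone cannot show
    # bump r of the four pixels up by one level
    return [min(base + 1, 255) if i < r else base for i in range(4)]

def decode_super_pixel(block: list[int]) -> int:
    """What the eye (ideally) averages: block sum equals the 10-bit value."""
    return sum(block)

v = 517                        # 517 / 4 = 129.25, impossible in plain 8 bit
block = encode_super_pixel(v)  # -> [130, 129, 129, 129]
assert decode_super_pixel(block) == v
# note: values above 1020 clip at 255, one small way this is
# already worse than true 10 bit
```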
But if you resize, you get floating-point values for every pixel, because a resized pixel needs a lot more bit depth to be shown with as little error as possible. So a resized picture needs more bit depth than the original, and that counts for downscaling too.
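A toy example of why (my own numbers): even the simplest resize, averaging two neighbors, immediately lands between the 8-bit steps.

```python
row = [100, 101, 103, 104]  # 8-bit source pixels
halfway = [(a + b) / 2 for a, b in zip(row, row[1:])]
print(halfway)              # [100.5, 102.0, 103.5]
# 100.5 and 103.5 have no exact 8-bit representation; you either
# round (and lose information) or keep more bit depth / dither.
```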
Now let's take a step back and feed 10-bit 1080p into a UHD screen.
If this gets upscaled by point resize and then dithered on a 4-pixel 8-bit basis, we can use the super-pixel idea. That's a lot of "if"s, it needs a different source, and it's still not as good as true 10 bit (which needs no dither at all).
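For illustration, here is what that pipeline could look like (a sketch under my own assumptions, all names mine): nearest-neighbor 2x upscale of a 10-bit frame, then a per-2x2-block dither down to 8 bit that preserves each block's average.

```python
import numpy as np

def point_resize_2x(frame10: np.ndarray) -> np.ndarray:
    """Nearest-neighbor 2x upscale: each source pixel becomes a 2x2 block."""
    return frame10.repeat(2, axis=0).repeat(2, axis=1)

def dither_blocks_to_8bit(uhd10: np.ndarray) -> np.ndarray:
    """Per 2x2 block, bump r of the four pixels, r = 10-bit remainder."""
    base = uhd10 // 4                      # 8-bit part of each value
    r = uhd10 % 4                          # 2-bit remainder (same across a block)
    bump = np.zeros_like(base)
    bump[0::2, 0::2] = r[0::2, 0::2] >= 1  # bump 0..3 pixels per block
    bump[0::2, 1::2] = r[0::2, 1::2] >= 2
    bump[1::2, 0::2] = r[1::2, 0::2] >= 3
    return np.clip(base + bump, 0, 255).astype(np.uint8)

frame = np.random.randint(0, 1021, size=(1080, 1920))  # toy 10-bit frame
uhd8 = dither_blocks_to_8bit(point_resize_2x(frame))
# every 2x2 block sum (in 10-bit units) matches the source exactly:
blocks = uhd8.astype(int).reshape(1080, 2, 1920, 2).sum(axis=(1, 3))
assert np.array_equal(blocks, frame)
```

This is an ideal case on purpose: the source pixel grid lines up exactly with the 2x2 blocks. A real screen gets arbitrary content where that alignment doesn't exist.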
UHD has more pixels to spread the dither error over, but on the other hand it has more error to spread. Not an easy topic... and who says that they even dither, and how good is the dithering? It can be totally different from screen to screen.
At least, that is my inexpert idea of what goes on. It's not a 2k picture with 8 bit color -- it's a 4k picture with 8 bit color. If the resolution were reduced to 2k, those 2 bits of color information would be lost. If the original content were 2k at 8 bit color depth, the extra 2 bits per pixel of color information wouldn't be there at all.
So each time we downscale by a factor of 2 and lose 3/4 of all pixels, we lose 2 bits of color information? We try to put 96 bits of color information (24 per pixel, times 4 pixels) into 24 bits (8 per channel). Is it just me, or did we lose more than 2 bits per pixel?
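To put numbers on that (my own back-of-envelope, per channel): the four source pixels carry 32 raw bits, but their average can only take 1021 distinct values, about 10 bits, and rounding that into one 8-bit pixel drops 2 of those. So yes, far more than 2 bits per pixel is gone, but the other ~22 bits were spatial detail, not color depth.

```python
import math

raw_bits_in  = 4 * 8                  # four 8-bit pixels feed one output pixel
raw_bits_out = 8                      # the single 8-bit pixel we keep
avg_levels   = 4 * 255 + 1            # block sums 0..1020 -> 1021 distinct averages
avg_bits     = math.log2(avg_levels)  # ~10.0 bits survive in the average
print(f"{raw_bits_in} raw bits -> {avg_bits:.1f} bits in the average -> {raw_bits_out} kept")
# 32 raw bits -> 10.0 bits in the average -> 8 kept
```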