Originally Posted by GregLee
I suppose. But why assume no dithering? More pixels gives more dither patterns gives smoother transitions.
Well, they could - though I think interpolating up to a fake 10-bit colour (not dithering) would be better. But the reason is that you might want it to display what is actually in the source, without changing it (without introducing fake new colours/noise).
E.g. if I make a 3840x2160 video with 8-bit-per-channel colour and I have a 3840x2160 TV, then by dithering you're adding noise to something that wasn't in the source. The video may already have dithering in the encode. What if it was computer graphics/text that I didn't want dithering added to? Basically dithering is like picture noise - while it can reduce the appearance of banding (or make it appear as though there are more colours), it's better not to have to use it, especially if content could look worse as a result. Dithering in an encode also increases the bandwidth - it's better to have 10-bit encoding and no dithering. Would dithering make the text of this post any clearer, or make it worse? I'd think worse. (There's a rough illustration of this below.)
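Just to show what I mean by "dithering an 8-bit source just adds noise" - this is only a rough sketch with made-up values (a flat grey patch like rendered text, plus a simple +/-1 random dither), not how any particular TV actually does it:

```python
import numpy as np

# Assumed example: a flat 8-bit grey patch, like you'd get with rendered
# text or graphics. It's already exactly representable in 8 bits.
rng = np.random.default_rng(0)
flat_8bit = np.full((4, 16), 128, dtype=np.uint8)

# Dithering here can only add noise - there's no finer detail in the source
# for it to preserve or simulate.
dithered = flat_8bit.astype(np.int16) + rng.integers(-1, 2, flat_8bit.shape)
dithered = np.clip(dithered, 0, 255).astype(np.uint8)

print(flat_8bit[0])   # every pixel is 128
print(dithered[0])    # pixels wobble around 128 - visible as noise on text
```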
When they create video encodes from a 10-bit source and produce a dithered 8-bit one, that's dithering from a higher number of colours down to a lower one. But if the source (e.g. what a UHDTV receives) is only 8-bit, it may or may not already have dithering in it, and because the source is no higher than 8-bit, the TV can't really dither to simulate a higher number of colours. It can only dither to reduce the appearance of banding (or aliasing?) etc.
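The 10-bit-down-to-8-bit case is the one where dithering actually earns its keep. Again just a sketch with illustrative numbers (a shallow 10-bit ramp, a simple random dither before rounding - real encoders use fancier patterns):

```python
import numpy as np

# Assumed example: a shallow 10-bit gradient quantised to 8 bits,
# once by plain rounding (banding) and once with random dither added first.
rng = np.random.default_rng(0)
ramp_10bit = np.linspace(512, 520, 64)

plain = np.round(ramp_10bit / 4).astype(np.uint8)
dithered = np.round((ramp_10bit + rng.uniform(-2, 2, 64)) / 4).astype(np.uint8)

print(plain)      # long runs of identical values -> visible bands
print(dithered)   # values flicker between neighbours; on average they follow the ramp
```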
Upscaling may interpolate the colours, but you're also really blurring the image. Basically I think they should adopt at least 10-bit from source to display; then (even with other colour enhancements) we shouldn't get banding (or should at least get less - you still might if the bitrate is too low).
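And on the "interpolation creates new colours but blurs" point, a last made-up example (a single row of 8-bit pixels doubled with plain linear interpolation - real scalers use better filters, but the trade-off is the same):

```python
import numpy as np

# Assumed 1-D example: doubling a row of pixels with linear interpolation.
# New in-between values appear, but the hard 100 -> 200 edge gets softened.
row = np.array([100, 100, 100, 200, 200, 200], dtype=np.float64)

x_old = np.arange(row.size)
x_new = np.linspace(0, row.size - 1, row.size * 2)
upscaled = np.interp(x_new, x_old, row)

print(row)       # sharp edge
print(upscaled)  # intermediate values like ~127 and ~173 appear: smoother, but blurrier
```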