Originally Posted by Mrorange303
The more pixels, the sharper an image can appear to the eye. Color transitions can also look better, since there are more pixels available to complete each transition.
For every one 1080p pixel there are four 4K pixels doing the same job.
So where only one pixel on a 1080p set may light up green for grass,
on a 4K set those four pixels may be more like two lighter green and two darker green. This creates detail within the color, shadow, hair, etc.
First of all, resizing reduces sharpness and adds artifacts like ringing, aliasing and so on. True UHD content on a UHD panel can be sharper, but that's not the point here.
This sub-topic isn't about native UHD; it's about FHD on a UHD panel, and the claim that UHD delivers higher "bit depth" with that content.
But it's the opposite.
Think about a 256-step grey ramp sent, rounded, as UHD to a UHD panel: with 3840 pixels across, a new step arrives every 15 pixels (3840 / 256 = 15), so the chance of seeing banding is much higher. It also shows that UHD can take full advantage of true 10 bit, where a new step is needed only every 3.75 pixels (3840 / 1024 = 3.75).
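The step-width arithmetic above can be sketched in a few lines (the 3840-pixel width is an assumption for a standard 16:9 UHD panel):

```python
def pixels_per_step(width_px: int, steps: int) -> float:
    """Horizontal width of one grey step in a full-width ramp."""
    return width_px / steps

uhd_width = 3840
print(pixels_per_step(uhd_width, 256))   # 8-bit ramp:  15.0 px per step
print(pixels_per_step(uhd_width, 1024))  # 10-bit ramp: 3.75 px per step
```

At 15 pixels per step the 8-bit bands are wide enough to see; at 3.75 pixels per step a 10-bit ramp is close to the limit of what the panel can resolve.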
Now we enter the world of dithering and high-bit-depth processing.
The grey ramp picture now enters the TV as 1080p and gets scaled to UHD. The scaler produces floating-point values that need to be quantized back down to 8 bit, which can be done with different types of dithering or simple rounding.
Dithering spreads the quantization error over more pixels, so the TV doesn't create big blocks from the upscaling (the steps are now about 15 pixels wide). With dither the error is spread over multiple pixels to make the banding less obvious. But this adds noise; it doesn't work without noise.
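A minimal sketch of the rounding-vs-dithering trade-off, in plain Python. Simple random dither stands in for whatever the TV's processing actually uses; this is an illustration, not the set's real algorithm:

```python
import random

def quantize_round(values):
    """Plain rounding: every pixel in a flat region snaps to the same
    8-bit level, so neighboring regions form visible bands."""
    return [round(v) for v in values]

def quantize_dither(values, rng):
    """Simple random dither: add noise in [-0.5, 0.5) before rounding,
    trading banding for noise spread across many pixels."""
    return [round(v + rng.uniform(-0.5, 0.5)) for v in values]

# A fractional grey level, as produced by upscaling (e.g. 100.4 / 255).
flat = [100.4] * 10000
rng = random.Random(0)

rounded = quantize_round(flat)
dithered = quantize_dither(flat, rng)

print(set(rounded))                    # {100}: one level, i.e. banding
print(sum(dithered) / len(dithered))   # average stays close to 100.4
```

Rounding throws the fractional part away everywhere at once; dither keeps it as the *ratio* of 100s to 101s, which is exactly the "error spread over multiple pixels as noise" described above.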
And you call this a quality improvement?
Making up new pixels, creating a ton of errors along the way, and then spreading them around?
So this sounds like UHD is reducing picture quality. That is of course not true.
A native 10-bit UHD panel with a true 10-bit UHD input source, like a 1024-step grey ramp, would show a theoretically huge improvement over an 8-bit FHD screen.
Of course I know an FHD screen needs to scale a BD input too, but that is "only" chroma.
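For context on that last point: Blu-ray video is 4:2:0, so the chroma planes are stored at half resolution in each dimension and must be upscaled even on an FHD panel. A quick sketch of the plane sizes:

```python
# BD luma is stored at full 1080p resolution; 4:2:0 chroma is stored
# at half resolution horizontally and vertically.
luma = (1920, 1080)
chroma = (luma[0] // 2, luma[1] // 2)
print(chroma)  # (960, 540): what the FHD set must upscale to 1920x1080
```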