Originally Posted by PlasmaPZ80U
Will enabling the deep color setting on a BD player or PS3 reduce any banding when watching BD movies?
It can potentially reduce banding and/or image noise. Whether it results in a visible change depends on the source player and the display. I would expect little to no benefit on a plasma display, for example, as plasmas already use a lot of dither to display images.
Originally Posted by Smackrabbit
Likely not. The example being shown is dealing with larger internal bit-depth calculations and then dithering down to 8-bit output. Since any player could already do this without needing Deep Color support, it's really making no difference in this example.
Also, this example is using a grayscale gradient, which is already being passed at full resolution in 8-bits per pixel. If there are large bits of banding occurring instead of dithering, as seen in the example, that can be fixed at the mastering stage instead of in the processing stage.
To be clear, the gradient source is not 1080p native; it is an upscaled image. And yes, it is illustrating the benefits of higher internal bit-depth processing.
This shows that even though the source material was 8-bit, 8 bits is clearly not enough precision to do things like image scaling, and all images on Blu-ray require chroma to be upscaled. So if all Blu-ray (and MPEG video in general) requires that chroma be upscaled to 4:2:2, 4:4:4, or RGB, then there must be more than 8 bits of internal precision for good image quality.
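To make that concrete, here is a minimal sketch (Python/numpy, my own toy illustration with made-up ramp values and a simple linear filter, not the processing chain of any actual player) of what happens when a shallow 8-bit gradient is upscaled with only 8-bit working values versus with extra precision plus dither at the output:

import numpy as np

def longest_flat_run(levels):
    # length of the longest run of identical consecutive values; wide runs read as visible bands
    steps = np.flatnonzero(np.diff(levels.astype(int)) != 0)
    edges = np.concatenate(([-1], steps, [len(levels) - 1]))
    return int(np.diff(edges).max())

width = 1920
ideal = np.linspace(100, 120, width)        # a shallow gradient, only 20 code values across the frame
src_8bit = np.round(ideal)                  # the 8-bit source, as it sits on the disc

# the scaling math itself (2x upscale by linear interpolation), done in floating point
positions = np.linspace(0, width - 1, width * 2)
scaled = np.interp(positions, np.arange(width), src_8bit)

out_8bit_only = np.round(scaled)            # 8-bit pipeline: every value snaps back to a whole code
rng = np.random.default_rng(0)
out_hi_prec = np.round(scaled + rng.uniform(-0.5, 0.5, scaled.size))   # keep precision, dither once

print("widest flat band, 8-bit pipeline:       ", longest_flat_run(out_8bit_only))
print("widest flat band, high precision+dither:", longest_flat_run(out_hi_prec))

The 8-bit-only path leaves wide flat bands; the high-precision path has the sub-code information needed to dither, which breaks those bands up even though both outputs are still 8-bit. That is the same difference the gradient images illustrate.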
If you are using more than 8 bits of internal precision, it's stupid not to pass that on to the display when you have the option (Deep Color), especially if your display has a native 10-bit panel, as many LCD and SXRD displays do.
Furthermore, while the 16-bit image dithered down to 8-bit looks fine there, that is only half of the image processing chain. As soon as it gets to your display, that image is going to have further processing applied to it (greyscale, gamma, and gamut calibration), which can exaggerate the noise added by the dither process and/or introduce banding. If you could pass the undithered 16-bit data to the display, you would avoid this.
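As a rough illustration of that last point (hypothetical numbers, with a deliberately exaggerated shadow-lifting curve; real displays use their own calibration curves and internal precision):

import numpy as np

gamma = 0.6                        # exaggerated shadow-lifting adjustment, purely for illustration
dark_codes = np.arange(20, 41)     # adjacent 8-bit code values, i.e. a smooth 1-step gradient

# If the display only has the 8-bit values to work with, the curve pushes neighbouring dark
# inputs about two codes apart: steps that were 1 code wide in the source become 2-code jumps
# (stronger visible contours), and any dither noise riding on the signal is stretched the same way.
after_gamma_8bit = np.round(((dark_codes / 255.0) ** gamma) * 255).astype(int)
print(np.diff(after_gamma_8bit))   # mostly 2s at the dark end instead of the original 1s

# If the display received the pre-dither high-precision data instead, it could apply the same
# curve in floating point and dither once at the panel, keeping the effective step below 1 code.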
Originally Posted by Smackrabbit
However nothing in the example shown would require either Deep Color, or better processing in the Blu-ray player, but could all be done at the mastering step, which would be a better place for it to occur.
MPEG video is mastered as 4:2:0 data and requires chroma upsampling to be displayed. This applies to all consumer video.
Originally Posted by Doug Blackburn
If it does, the upsampling is broken, period. If video was treated like a static test pattern, potential detail within an image would be eliminated by the upsampling examples posted several posts ago. The upsampling doesn't know if the contouring steps are image detail or encoding errors. If it removes the contouring, it could be removing detail if subtle detail exists. So upsampling alone should NEVER remove contouring if it was present in the source. Only "intelligent" processing of the image to find and identify contouring vs. actual image detail and removing the contouring (as the Sony 4K projector's control does) can remove contouring effectively in moving video images. There are MANY ways to manipulate static images into showing or not showing "steps" in a fade that have nothing to do with how you would treat video.
Doug, my example has nothing to do with Sony's Super Bit Mapping technology. It shows the difference between scaling an image with 8 bits of precision vs. 16 bits of precision, even though the source image and the final output are both 8-bit.
It demonstrates, without a shadow of a doubt, that more than 8 bits of precision is required to process an 8-bit image, even when your final output is still going to be 8-bit. If it requires more than 8 bits to process, what good reason is there not to pass that on to the display?
While I respect that you are a knowledgeable calibrator and reviewer, it is clear that you are not involved in video mastering or image processing.
Originally Posted by amt
Which means if you stay in 4:2:2 YCbCr, you get 12 bits, even with HDMI 1.1!
This is why I always set my output on Lumagen Radiance to YCbCr 4:2:2, so it can actually keep >8 bits of precision for the color and send it to the display.
Now, the question I would have is: do players like the Oppo, when configured for 4:2:2 YCbCr output (no Deep Color), make use of more than 8 bits when doing the chroma upsampling?
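Before answering the question: the 12-bit figure in that quote does check out on paper. A rough per-pixel tally, as I understand the HDMI pixel packing (the spec is the authoritative source here):

rgb_8bit    = 8 + 8 + 8     # R, G and B sent for every pixel               -> 24 bits per pixel
ycbcr422_12 = 12 + 12       # Y every pixel, plus Cb OR Cr (they alternate) -> 24 bits per pixel
print(rgb_8bit, ycbcr422_12)  # same payload, so no Deep Color signalling or extra bandwidth needed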
It is certainly possible to do that, but there are no guarantees, and virtually no sources will tell you whether or not they are doing this. Deep Color, on the other hand, does indicate the bit depth that is being passed down the video chain. I would also recommend that you scale the image directly to 4:4:4/RGB in the player, because any good display will be showing full-resolution 4:4:4/RGB images, and you want to avoid two upscaling steps (4:2:0 to 4:2:2, then 4:2:2 to 4:4:4/RGB).
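To illustrate why, here is a hedged one-dimensional sketch (Python/numpy, using a simple linear-interpolation filter and made-up Cb values; actual players use their own filter kernels, which I do not know) comparing two cascaded chroma upscales rounded back to 8-bit codes at each stage against the same upscales kept at full precision until the end:

import numpy as np

cb = np.array([100, 101, 103, 106, 110], dtype=float)   # a handful of 8-bit Cb samples

def upsample2x(samples):
    # double the sample count by linear interpolation (new samples at the midpoints)
    grid = np.arange(len(samples))
    return np.interp(np.linspace(0, len(samples) - 1, 2 * len(samples) - 1), grid, samples)

# Two-stage path, rounded back to whole 8-bit codes after each stage (4:2:0 -> 4:2:2 -> 4:4:4)
two_stage = np.round(upsample2x(np.round(upsample2x(cb))))

# Same two upscales, but the fractional values are kept until the end (a 10- or 12-bit output can carry them)
full_precision = upsample2x(upsample2x(cb))

print(two_stage[:7])
print(np.round(full_precision[:7], 2))

Each intermediate rounding throws away the fractional chroma values that the interpolation just created, flattening parts of the ramp, which is why a single scaling step at more than 8 bits of precision is preferable.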
And for the record, one of the reasons I do not recommend the Radiance is that it only processes the image in 4:2:2, throwing away resolution.