Originally Posted by Charles R
I presume you won't buy a new display if it has a higher ANSI CR... since the detail would be boosted. Oh and by the way since default isn't pass-through... as posted by an insider how do you know for a fact it's boosted? You have no way of knowing whatever setting the player or you pick is delivering the image as intended
What is the intended ANSI CR... 500:1, 600:1, 1,000:1? It's certainly not solely determined by the source.
You are not understanding contrast ratio.
ANSI CR is measured off a checkerboard pattern: the average light output of the white rectangles on the screen divided by the average light output of the black rectangles.
Contrast ratio is a display measurement, not a source measurement. I would buy the display with the highest ANSI CR I could afford (unless, in the future, displays get so bright they burn your eyes like a welder's arc).
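To put numbers on it, here's a quick Python sketch of the ANSI measurement. The luminance readings are made-up example values, not real meter data; a real measurement comes from a light meter aimed at each rectangle of a 4x4 ANSI checkerboard:

```python
# Minimal sketch: computing ANSI contrast from checkerboard measurements.
# The sixteen luminance readings (cd/m^2) are invented example values; in
# practice you take them off the 8 white and 8 black rectangles of a 4x4
# ANSI checkerboard pattern with a light meter.
white_patches = [210.0, 205.5, 208.2, 211.3, 207.8, 209.1, 206.4, 210.9]
black_patches = [0.42, 0.39, 0.44, 0.41, 0.40, 0.43, 0.38, 0.45]

ansi_cr = (sum(white_patches) / len(white_patches)) / (
    sum(black_patches) / len(black_patches)
)
print(f"ANSI contrast ratio: {ansi_cr:.0f}:1")  # roughly 500:1 with these numbers
```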
The HDMI spec doesn't send a signal as ANSI CR; it sends video as either RGB or YCbCr (YPbPr in analog terms). If you are using standard video levels (8 bits per channel from Blu-ray), code 16 maps to black and code 235 maps to white. The ANSI CR is just a measurement of how dark your display's black is versus how bright its white is. And by the way, Deep Color and x.v.Color don't change the contrast limits either; they only add finer steps between the display's limits. When displays have a larger ANSI CR, using Deep Color matters more, because it keeps the brightness steps from getting coarse.
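A little sketch of that last point, assuming standard video-range levels (16-235 at 8 bits, scaled up proportionally for Deep Color). This isn't from any particular player, it's just the arithmetic:

```python
# Minimal sketch of why Deep Color matters on high-contrast displays: higher
# bit depth doesn't move black or white, it only adds finer steps in between.
# Assumes standard video-range levels: black = 16, white = 235 at 8 bits,
# scaled up proportionally for 10- and 12-bit Deep Color.
for bits in (8, 10, 12):
    scale = 2 ** (bits - 8)          # 1x, 4x, 16x the 8-bit code values
    black, white = 16 * scale, 235 * scale
    steps = white - black            # brightness steps between black and white
    print(f"{bits}-bit: black={black}, white={white}, {steps} steps")
# 8-bit: 219 steps, 10-bit: 876 steps, 12-bit: 3504 steps. The endpoints
# (and therefore the contrast ratio) are identical in all three cases.
```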
Since all Blu-ray players tested/reviewed so far pass BTB (<16) and WTW (>235) when set up correctly, that means no manufacturer is boosting or cutting these code values (not when properly set up).
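If you want to sanity-check your own chain, here's a rough sketch. It assumes you can somehow capture the decoded 8-bit code values from a BTB/WTW test pattern; the codes list below is just a stand-in for that captured data:

```python
# Minimal sketch, assuming 8-bit video-range (16-235) code values: classify a
# frame's pixel codes to see whether below-black (BTB) and whiter-than-white
# (WTW) information survived the player's processing. The sample list is a
# stand-in for real pixel data captured from a BTB/WTW test pattern.
codes = [4, 12, 16, 100, 180, 235, 240, 250]

btb = [c for c in codes if c < 16]    # below reference black
wtw = [c for c in codes if c > 235]   # above reference white
print(f"BTB codes passed: {btb}")     # non-empty -> player isn't clipping blacks
print(f"WTW codes passed: {wtw}")     # non-empty -> player isn't clipping whites
```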
The setting that is called Detail on your player is called Sharpness on a lot of other displays and players, and it isn't boosting anything; it's applying filtering and edge enhancement (EE). Don't feel bad about using it (the Detail boost): many old-timers like to set the colors on their TVs to oversaturated, unnatural levels, and a lot of people like to run their HDTVs in the super-bright mode the showrooms use.
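For what it's worth, here's a toy 1-D unsharp-mask sketch of the kind of filtering a Detail/Sharpness control applies. It's illustrative only, not any player's actual algorithm, and the amount knob is a hypothetical stand-in for the player's Detail setting:

```python
# Minimal sketch of what a "Detail"/"Sharpness" control typically does: an
# unsharp-mask style filter that exaggerates edges. It doesn't raise the
# brightest white or lower the darkest black of a flat field, so it can't
# improve real contrast ratio. Pure Python, 1-D signal for simplicity;
# 'amount' is a hypothetical knob standing in for the player's Detail setting.
def sharpen(signal, amount=1.0):
    out = list(signal)
    for i in range(1, len(signal) - 1):
        blur = (signal[i - 1] + signal[i] + signal[i + 1]) / 3.0
        out[i] = signal[i] + amount * (signal[i] - blur)  # boost local edges
    return out

edge = [50, 50, 50, 200, 200, 200]   # a single luminance edge
print(sharpen(edge, amount=1.0))     # over/undershoot ("halos") appear at the edge
```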