The net effect on the image we see on the screen is modified more by Native On/Off in dark scenes than it is by ANSI in bright scenes, simple as that. In other words, black levels vary more between projectors than intrascene contrast in bright scenes does. Whereas there are small differences in bright-scene contrast, there are LARGE differences in dark-scene contrast, and that is why On/Off matters more.
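A quick back-of-envelope sketch of that point (my numbers, purely illustrative, not measurements of any real projector): in a dark scene the delivered contrast is roughly the scene's peak level over the projector's black floor, so it scales directly with On/Off, while bright scenes are capped near the ANSI figure, which is similar across projectors.

```python
# Illustrative only: dark-scene contrast is limited by the native black floor
# (room stray light ignored). Full white is normalized to 1.0.

def dark_scene_contrast(scene_peak, on_off):
    """Delivered contrast in a dark scene: scene peak over the black floor."""
    black_floor = 1.0 / on_off
    return scene_peak / black_floor  # equivalent to scene_peak * on_off

# Two hypothetical projectors that differ only in On/Off contrast,
# showing a dark scene whose peak is ~5% of full white:
for on_off in (3000, 30000):
    print(on_off, round(dark_scene_contrast(0.05, on_off)))
# 3000  -> 150:1 delivered
# 30000 -> 1500:1 delivered, a 10x visible swing
```

Bright scenes on both of these would land near the same ANSI-limited figure, which is exactly why the dark-scene difference dominates what you notice.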
I never said I don't see ANY difference, but it is a multi-variate problem that our eyes mistake for other attributes of DLP... It has to do with where the ANSI measurement comes from and how it equates to visible intrascene contrast on the screen. The checkerboard pattern is a terrible indicator of how the intrascene contrast ramp of a device translates to an average bright scene, whereas Native On/Off is a good indicator of how well the intrascene ramp translates in an average dark scene. ANSI does matter in bright scenes (and almost not at all in dark scenes on today's devices), but the problem is it doesn't matter ENOUGH in bright scenes anymore given the viewing conditions most people can realistically achieve (even a batcave isn't enough; instead we would need NASA-level black-absorbing materials).
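For anyone unfamiliar with where the two numbers come from, here's a minimal sketch (hypothetical luminance readings, function names are mine, not from any standard tool): ANSI contrast averages the white and black squares of a 4x4 checkerboard, while On/Off compares a full-white field to a full-black field.

```python
# Sketch of the two contrast measurements. All readings are made-up nits.

def ansi_contrast(white_squares, black_squares):
    """ANSI contrast: average luminance of the 8 white squares of a
    4x4 checkerboard divided by the average of the 8 black squares."""
    avg_white = sum(white_squares) / len(white_squares)
    avg_black = sum(black_squares) / len(black_squares)
    return avg_white / avg_black

def on_off_contrast(full_white, full_black):
    """Native On/Off contrast: full-white field over full-black field."""
    return full_white / full_black

# Hypothetical projector: checkerboard whites ~100 nits, blacks ~0.25 nits,
# but a full-black field of only 0.005 nits (iris closed, no white on screen).
whites = [101.0, 99.5, 100.2, 98.8, 100.9, 99.1, 100.4, 99.6]
blacks = [0.26, 0.24, 0.25, 0.27, 0.23, 0.25, 0.26, 0.24]

print(round(ansi_contrast(whites, blacks)))   # ~400 (ANSI ~400:1)
print(round(on_off_contrast(100.0, 0.005)))   # 20000 (On/Off 20000:1)
```

Note how the same device can post a modest checkerboard number and a huge On/Off number; the checkerboard blacks are polluted by light from the adjacent white squares, which is also why ANSI is so sensitive to the room.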
Everyone who compares ANSI talks about DLP looking better in bright scenes. When I've done the comparison (and I've done it many times), it is not ANSI contrast people are seeing most of the time (sometimes, but very rarely); rather, it is one of the following:
1) Pixel fill - DLP has worse pixel fill and this can make some scenes POP more
2) When there is excessive noise in the image, a DLP tends to maintain apparent sharpness in the noisy image better than other techs, even if on paper the sharpness is the same
3) Gamma errors, plus the placebo effect of pixel fill and differences in brightness, make people draw incorrect conclusions about ANSI
The actual levels of intrascene contrast (the supposed effect of ANSI) in bright scenes are almost impossible to compare between two projectors without using some very scientific methods of elimination.
Again, most movies are not well represented by anything close to ANSI, because by default the scenes are not bright enough with enough intrascene contrast. The times I've noticed ANSI contrast differences are mainly in blown-out scenes (like a movie scene where morning light shines through a window, or the Harry Potter scene where he temporarily goes to Heaven); the problem is there are SO few of those types of OVERLY LIT scenes in movies.
Does that mean a DLP with higher ANSI doesn't sometimes look better? Yeah, on some bright scenes DLP looks better than LCOS (but even LCD can as well on occasion).
In some bright scenes, LCOS looks better than DLP or LCD. It depends on how the scene was filmed, so it's very subjective to try to equate this to ANSI contrast.
It will depend on the source and room conditions far more than the ANSI contrast.
Edited by coderguy - 12/29/12 at 3:40pm