Originally Posted by GregCh
What he is saying is that JVC does not have lower contrast than Sony. In fact, he is saying the opposite: the JVC contrast is far superior, and that shows with normal movie content. The issue is that the patterns used to measure 1%, 2%, ... ANSI contrast are misleading, because they bring out the lateral streaking between 100% white squares, which raises the black-square measurement values. True movie content at, say, 10% APL rarely if ever has 100% white squares adjacent to 0% black squares.
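To make that argument concrete, here is a back-of-the-envelope sketch of why scatter hurts the ANSI checkerboard number. All the luminance figures below are made up for illustration (a 100-nit peak white and an assumed 0.005-nit native black floor), not measurements of any real projector:

```python
# ANSI contrast is measured with a checkerboard of 100% white and
# 0% black squares. Lateral light scatter ("streaking") from the
# white squares raises the measured luminance of the adjacent black
# squares, which drags down the reported contrast even though the
# projector's native on/off contrast is unchanged.

def ansi_contrast(white_nits, black_nits):
    """Ratio of average white-square to average black-square luminance."""
    avg_white = sum(white_nits) / len(white_nits)
    avg_black = sum(black_nits) / len(black_nits)
    return avg_white / avg_black

whites = [100.0] * 8                   # 8 white squares at 100 nits
clean_blacks = [0.005] * 8             # native black floor, no scatter
streaked_blacks = [0.025] * 8          # scatter adds 0.02 nits per square

print(ansi_contrast(whites, clean_blacks))     # 20000.0  -> 20000:1
print(ansi_contrast(whites, streaked_blacks))  # 4000.0   -> 4000:1
```

A tiny amount of scattered light (0.02 nits here) cuts the measured ratio fivefold, which is exactly why a pattern that forces full-white squares right next to full-black squares penalizes this artefact so heavily, while typical 10% APL movie frames mostly avoid that adjacency.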
These streaking defects don't just exist in JVCs, though. Almost every Sony I have seen tested yields blue tears / vertical streaks on the edges of 100% white text displayed on a 0% black background. This can easily be seen in credits, although not from a normal seating position; normally you have to get close to the screen to see it, but it is there.
Here is an accentuated photo of the problem on a Sony 885 ($25K projector).
The devil's advocate in me would say: sure, you can see the effect on all projectors, but if it measures worse on the JVC than on those other projectors, then the JVC is doing a worse job of controlling this particular artefact. And it's not as if black pixels and white pixels are somehow magical unicorn dust; they may be the (almost) extremes of the panel gamma, but the effects of slightly lighter pixels will just be moderated accordingly. If you turn the patch value down by 1%, I assume the streaking doesn't go away?
My few viewing observations, from seeing a Sony 760ES, an Epson LS10500 and a JVC X7900 in split screen, were that the X7900 had the lead in out-and-out black level in very dark scenes; that the Dynamic Iris on both the JVC and the Epson could get tripped up, but not the Sony's laser dimming (measurements later showed very little benefit from the Sony's dimming); and that of the three, the Sony had the most shadow detail. These were all set up by a renowned UK calibrator in his own showroom, and if I'd had the money available I would quite likely have gone for the Sony, if I'm honest. As it is, I'm a card-carrying X7900 owner and love it.
I think just discounting this kind of result is wrong, and with HDR content becoming much more the norm, I'm sure we'll be seeing a lot more very bright frames with very bright things next to very dark things. (I watched episode 1 of the new Star Trek series on Netflix the other day and felt like my eyeballs had been assaulted!)
It is quite possible that, in spite of worse internal optical performance with respect to streaking, the NX9 image ends up more pleasing overall because the tone mapping is that much more advanced. Or perhaps the wider colour gamut and the resulting pop win you over. Or perhaps the better native contrast trumps all, because the frames you really do notice are the lowest-ADL ones. We're not there yet in terms of our "calibration" of which test results really matter for image quality.
One last point on this: if you could control everything else and just improve the streaking, would you have a better image? The answer is demonstrably yes, since the streaking is visible to the naked eye; so reporting on, and pursuing improvement in, this aspect has to be a worthwhile enterprise, though it shouldn't be blown out of proportion.