Originally Posted by Todd68
What is the relationship between contrast ratio and measured black levels? I thought the higher the contrast ratio, the better the black levels should be.
As someone else said, manufacturer figures are meaningless lies. There are two types of real contrast ratio measurements:
1) On/off contrast. This divides the brightest white (usually measured from a smallish window on the screen) by the deepest black possible, usually done post-calibration. So 40 fL whites and 0.008 fL blacks = 5000:1 (the figures of a G20). Unless the set has some problem being calibrated to produce higher light output, this ratio comes down almost entirely to the set's black level.
2) ANSI contrast. Far more demanding: it tests the TV's ability to produce deep blacks while bright content is also on screen, using a full-screen checkerboard pattern of large white and black squares. For the G20 above, the whites might drop a little, say to 37 fL, but the blacks get considerably lighter, to maybe 0.022 fL, giving an ANSI ratio of 37/0.022 = 1681:1 (both calculations are sketched in code after this list). The Samsung plasmas I saw from last year did worse on this test (around 1000:1), the 2010 LGs a bit better (2000:1 on the PK550). Note these are all 50" models, btw; the 58"/63" ones tend to have better blacks.
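For anyone who wants to play with the numbers, here's a minimal sketch of both calculations. The luminance figures are just the G20 example values from above; `contrast_ratio` is a made-up helper name, not anything from a real measurement tool.

```python
def contrast_ratio(white_fl, black_fl):
    """Contrast ratio: white luminance divided by black luminance (both in fL)."""
    return white_fl / black_fl

# On/off contrast: small white window vs. a fully black screen (G20 example figures)
print(f"on/off: {contrast_ratio(40, 0.008):.0f}:1")   # 5000:1

# ANSI contrast: checkerboard on screen; whites drop slightly, blacks rise a lot
print(f"ANSI:   {contrast_ratio(37, 0.022):.0f}:1")   # ~1682:1
```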
Which of the above is more relevant is a little tricky. On/off only measures a completely black screen and a tiny white window, neither of which you're ever actually going to watch. ANSI, meanwhile, is a brutally extreme test of the set, perhaps too much so: very few movie scenes would replicate it. However, a TV whose on/off and ANSI ratios aren't too far apart will be good for dark scenes, because you won't see the black level shift as small bright elements come on screen (e.g. lights or campfires outdoors).
Personally I'd like to see a third contrast measurement come into use, so reviews would quote on/off, full ANSI, and some sort of half-ANSI (perhaps a test pattern where half the screen is black and the other half a black/white checkerboard).
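To make that proposal concrete, here's my guess at what generating such a pattern might look like. This is purely illustrative: the function name, resolution, and 4x4 square count are all made up, since no standard half-ANSI pattern exists.

```python
import numpy as np

def half_ansi_pattern(height=1080, width=1920, squares=4):
    """Hypothetical 'half-ANSI' test pattern: left half of the screen black,
    right half a black/white checkerboard (pixel values 0.0 and 1.0)."""
    img = np.zeros((height, width))
    half = width // 2
    sq_h = height // squares
    sq_w = half // squares
    # Fill in the white squares of the checkerboard on the right half
    for row in range(squares):
        for col in range(squares):
            if (row + col) % 2 == 0:
                img[row * sq_h:(row + 1) * sq_h,
                    half + col * sq_w:half + (col + 1) * sq_w] = 1.0
    return img

pattern = half_ansi_pattern()
print(pattern.mean())  # ~0.25: a quarter of the screen is white
```

The idea being that with only a quarter of the screen lit, this would stress the panel's black level less brutally than full ANSI but far more realistically than on/off.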