Originally Posted by vinnie97
Otherwise known as the automatic brightness limiter control circuitry. Not even on the radar of concern for me, though there is a vocal minority who are incessantly perturbed by it. The most noticeable real-world content in which it rears its head is during commercial breaks with bright solid colors on full-screen.
Actually, I would argue that it's quite noticeable in bright outdoor scenes in films, and it is even more apparent if you play games or use a computer hooked up to the display - those typically have much higher average picture levels than film.
Originally Posted by vinnie97
You will see it most frequently referenced around here as ABL for short. Anyway, I can't agree...nonapparent ABL is preferable over LCDs.
Unless they can get plasmas back to the level that CRTs were at, where you only lost ~10% brightness going from a 1% area pattern to a 100% area pattern, I would not call it "nonapparent". Most plasmas still lose a significant amount of brightness - the Kuros lost 50%, for example.
What is also important is when that brightness loss occurs. The Kuro TVs started to lose brightness at around 20-25% window area, whereas the Kuro Monitors started to lose brightness after about 50% area. So the overall ABL amount may have been the same (about 50% brightness loss), but the impact of the ABL was significantly reduced on the monitors.
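To make the TV-vs-monitor comparison concrete, here's a minimal sketch of that behavior. The piecewise-linear shape (flat up to a "knee", then a straight-line fall to the full-screen level) is my own simplifying assumption for illustration, not measured data:

```python
def abl_brightness(area, knee, full_loss, peak=1.0):
    """Modeled relative brightness for a white window covering `area`
    (0.0-1.0) of the screen. Holds at `peak` up to the ABL `knee`,
    then falls linearly to `peak * (1 - full_loss)` at 100% area.
    The linear roll-off is an assumption, not a measured curve."""
    if area <= knee:
        return peak
    fraction_past_knee = (area - knee) / (1.0 - knee)
    return peak * (1.0 - full_loss * fraction_past_knee)

# Both displays lose ~50% at full screen, but the monitor's later
# knee keeps mid-APL content at full brightness:
kuro_tv      = lambda a: abl_brightness(a, knee=0.25, full_loss=0.5)
kuro_monitor = lambda a: abl_brightness(a, knee=0.50, full_loss=0.5)

print(kuro_tv(0.5), kuro_monitor(0.5))   # TV already dimming, monitor still at 1.0
print(kuro_tv(1.0), kuro_monitor(1.0))   # both at 0.5 full screen
```

Even though both curves end at the same 50% loss, the monitor stays at full brightness across the window sizes typical film content actually occupies, which is the whole point above.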
It's really a shame that there are no review sites out there doing proper ABL testing of plasma displays, and that this will likely continue to be the case with OLEDs. I think it's an important metric when comparing displays.
A number of LCDs now have an optional software-based "ABL" and it would have been nice to see how closely that emulates the behavior of other displays.
Another factor is that with plasma displays, this seems to be a fixed adjustment. No matter where the contrast control was set on the Kuros, you always saw brightness start to drop at 25% area (TVs) or 50% area (monitors), and you always saw that ~50% brightness loss going from a 1% area pattern to a 100% area pattern.
With a CRT, as you reduced the contrast control, the ABL was relaxed so the brightness loss was reduced.
This seems like it may be the case with OLED, as one review has shown that Sony's OLED monitors have no ABL when set to ~70% contrast.
Originally Posted by rogo
That said, when you are watching in darkness or near darkness, overbright whites do something very insidious that can absolutely destroy the video quality of what you're watching for the next 10-30 minutes. If you get "hit" with a too-white/too-bright scene, your pupils will constrict -- rapidly -- to protect your eyes. Good stuff. Unfortunately, the dilation you'll need to enjoy the subsequent dark scenes that might exist will take many, many minutes. No way around the biology of this. Doubt me? Try going from a bright bathroom into your dark bedroom. See how well you can "see in the dark". Sit around for 10-15 minutes. Then try again. Amazing difference. ABL is usually deployed because of technological limitation, but a properly calibrated LCD will have its peak white limited in cinema/movie mode so you don't blow out your eyes.
What's interesting about this is that there are some people who argue for the use of ABL based on exactly this type of example: if you display a bright beach scene on the TV in a dark room, it will dim the picture and "protect" your eyes from the brightness.
The problem is that you effectively have two "irises" in action when an ABL is in effect - your eye and the ABL. If you display a bright beach scene on the TV, the plasma dims it - but human perception expects your iris to close and restrict the brightness when something is bright. That your iris can stay wide open without having to adjust to a "bright" scene gives the perception that the plasma TV is dull.
The inverse is also a problem. I follow the calibration standards to the letter, and so my television is calibrated to a peak white of 100 cd/m2. This means that white is at 100cd/m2 whether it's the full screen, or a small area.
With a plasma display, if you calibrate it to a peak white of 100cd/m2 (peak white being a 1% area) you may have an average brightness of ~75cd/m2 and it will drop all the way down to 50cd/m2 with a full white screen. This means that regular viewing is very dull.
Some people opt to calibrate the display using a 15-25% pattern rather than following the spec and using a 1% area pattern. This means that your brightness range might now be 150-75cd/m2, so now when you are viewing dark scenes, any bright lights (e.g. flashlights) are piercingly bright.
It's even worse if you were to decide that you wanted the display to never drop below 100cd/m2 (a calibrated LCD should not) because then your brightness range is 200-100cd/m2 and the average picture level is going to be above 100cd/m2 most of the time.
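The arithmetic in those last few paragraphs can be sketched in one place. Assuming the ~100:75:50 peak/average/full-screen ratio described above for a plasma with 50% ABL loss (an assumed curve, not a measurement), this hypothetical helper just rescales the whole curve depending on which point you anchor the calibration to:

```python
# Relative brightness at a 1% window, typical average picture level,
# and full white screen, using the ~100:75:50 ratio described above
# (an assumed ABL curve for a plasma with 50% full-screen loss).
PEAK, AVG, FULL = 1.0, 0.75, 0.5

def calibrated_range(anchor_ratio, target_cdm2=100.0):
    """Scale the ABL curve so the chosen anchor point (given as its
    relative brightness, e.g. FULL to pin the full-screen level)
    measures `target_cdm2`. Returns (peak, average, full) in cd/m2."""
    scale = target_cdm2 / anchor_ratio
    return (PEAK * scale, AVG * scale, FULL * scale)

print(calibrated_range(PEAK))  # per spec: 1% window at 100 -> (100, 75, 50)
print(calibrated_range(FULL))  # full screen pinned at 100 -> (200, 150, 100)
```

Anchoring at the 1% window gives the dull 100-50 cd/m2 range described above; anchoring at full screen gives the 200-100 cd/m2 range where average picture level sits above 100 cd/m2 most of the time. The intermediate-window calibrations people actually use land somewhere between the two.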