Originally posted by srgilbert
. . . On a display that is rated for 30,000 hours to half-life (half of the original brightness), there is no way you should be able to detect a difference in 200-400 hours . . . Please share some links with us; we need to know the truth!
Originally posted by trainerdave
This would be correct if the phosphor wear were linear - that is, if the brightness loss at the beginning happened at the same rate as after thousands of hours of use. However, phosphor wear (and its impact on the brightness potential of the pixel) slows in a roughly logarithmic way over time. If you plot it out, the decline curve is quite steep at the beginning and gets more and more shallow over time. In other words, if the first 30,000 hours you describe cut the brightness in half, the second 30,000 hours cut it only a little bit more, and the first 1,000 hours may account for well over half of the decline in the 30,000-hour life-to-half-brightness scenario you mention.
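For a concrete picture of what a front-loaded wear curve means, here is a minimal Python sketch. It assumes a simple exponential decay calibrated to the 30,000-hour half-life quoted above; the exponential form, the decay constant, and the sample hour marks are illustrative assumptions, not measurements of any actual panel.

```python
import math

HALF_LIFE_H = 30_000           # hours to 50% brightness, per the figure quoted above
k = math.log(2) / HALF_LIFE_H  # decay constant for an assumed exponential model

def exponential_brightness(hours):
    """Fraction of original brightness under the assumed exponential decay."""
    return math.exp(-k * hours)

def linear_brightness(hours):
    """Fraction of original brightness if wear were linear to 50% at HALF_LIFE_H."""
    return 1 - 0.5 * hours / HALF_LIFE_H

for h in (200, 400, 1_000, 5_000, 15_000, 30_000):
    print(f"{h:>6} h   exponential {exponential_brightness(h):5.1%}   "
          f"linear {linear_brightness(h):5.1%}")
```

Even this simple model loses brightness faster in the early hours than a straight line to the same half-life point would; how strongly front-loaded the real curve is depends on the phosphor chemistry and drive levels.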
Panasonic: States (not publicly) that the monitor is good for 20,000 to 30,000 hours. They also state that these plasma displays measure 50% brightness (phosphor ignition may be a better term) at 50,000 hours. Dissipation begins the moment you turn the set on. After 1,000 hours of use, a plasma monitor should measure around 94% brightness, a loss that is barely noticeable to the naked eye. At 15,000 to 20,000 hours the monitor should measure around 68% brightness, or, to say it differently, 68% of the phosphors are being ignited. How do the manufacturers know how to calculate these figures when plasma monitors have not been out that long? The manufacturers' facilities in Japan test plasma panels with a 100% white image and take meter readings as the light output falls from that point. Finding that 50% mark takes between 30,000 and 50,000 hours. What a job that would be - watching the white light.
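As a rough cross-check, the sketch below lines the checkpoints quoted above (94% at 1,000 hours, around 68% at 15,000-20,000 hours, 50% at 50,000 hours) up against the same kind of simple exponential model, this time pinned only to the 50,000-hour half-life. The exponential form and the use of the range midpoint are assumptions made purely for illustration.

```python
import math

# Brightness checkpoints quoted above: hours -> fraction of original brightness.
# 17,500 h is the midpoint of the quoted 15,000-20,000 hour range.
quoted = {1_000: 0.94, 17_500: 0.68, 50_000: 0.50}

# Simple exponential pinned only to the 50,000-hour / 50% point, for comparison.
k = math.log(2) / 50_000

for hours, fraction in quoted.items():
    modeled = math.exp(-k * hours)
    print(f"{hours:>6} h   quoted {fraction:.0%}   simple exponential {modeled:.1%}")
```

The quoted early-hour figures fall below this curve, which is at least consistent with the front-loaded wear described in the post above, though it says nothing about which model the panels actually follow.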