Originally Posted by 8mile13
A poster claims that his used 4,000-hour 500M appears to have dimmed by 20%
compared to the new 500M he bought.
In his words: ''it's such a dramatic difference that i've been able to record the difference in overall heat expenditure between the two TVs.''
''I can only guess how badly age affects all the other Plasma manufacturers.'' source
Originally Posted by Timmatron
Chad, I was just reading CNET's review of the UN55B8500, and they observed that having the 240Hz mode turned on dimmed the light output considerably. I'm wondering if your measurements of the 55LHX had anything to do with that same phenomenon. It's also been known that in the earlier versions of the 55LHX firmware, TruMotion would turn itself back on every time you turned the TV off and on, even though the picture settings show that it is off. So that could have been the case for your follow-up calibration, unless you made it a point to physically turn TruMotion on and then off again.
So I was thinking that could be one explanation for why the light output was so much lower. I'm not sure if that explains the lack of green, though.
Quote from CNET review: During our calibration we noticed that, as with the LN46A950 from last year, the 8500's LED Motion Plus setting dimmed overall light output considerably, so with it engaged (see video processing below) we had to crank the backlight control to 8 out of 10 to achieve our standard 40ftl light output.
OK, I have a bit of time to flesh this post out now.
First off, I am VERY surprised this is just now being covered. White LEDs usually use some sort of yellow phosphor over a blue or UV LED to produce the white light. Oftentimes a combination of phosphors (such as greens, reds, or oranges) is used in order to get a warmer or more level output (more common with UV LEDs). Other makers may use broader-band phosphors to achieve a similar effect (this is more common with blue LEDs). In some rare cases, special quantum dots are used as the phosphor.

Phosphors, like most other emissive materials, have a half-life. As most of you know, this means that the intensity of emission decreases over time in an exponential fashion. The point at which the intensity is 1/2 of its original value is known as the half-life. In LEDs it is more common to use the value L70, which is analogous to the half-life except the reference point is 70% of the original output (you may also see L50 used, which is the same as the half-life).
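To make the half-life/L70 relationship concrete, here's a quick sketch of my own (not from any datasheet): under a simple exponential-decay model, an L70 rating and a half-life both pin down the same decay constant, so you can convert one to the other. The 50,000h figure is just a placeholder number:

```python
import math

def decay_constant(reference_hours, reference_fraction):
    """Decay constant k for exponential lumen depreciation I(t) = exp(-k*t),
    given that output falls to reference_fraction after reference_hours."""
    return -math.log(reference_fraction) / reference_hours

# Illustrative: an LED rated L70 = 50,000h (placeholder, not a real part)
k = decay_constant(50_000, 0.70)

# The half-life (L50) implied by that same exponential decay:
half_life = math.log(2) / k
print(f"implied half-life: {half_life:,.0f} hours")  # roughly 97,000 hours
```

In other words, an L70 rating implies a half-life nearly twice as long, which is why L70 is the more conservative (and more perceptually relevant) spec to quote.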
Another aspect of lifetime which is commonly quoted is the physical lifetime. These are usually quoted as B numbers: for example, a B50 number would be the amount of time until 50% of units have failed, and B10 would be the amount of time until 10% have failed. This means you want lifetimes quoted at the lowest B numbers possible (if they quote them at all).
While manufacturers list L70 lifetimes (time to 70% of original output) of crazy numbers like 50,000 hours or whatnot, this number is usually at a low current and Tjunc. If cheaper LEDs are used, or the LEDs run hot, are over-driven, etc., you will start seeing a decrease in the output of the LEDs rather rapidly.
For example, a Philips Luxeon K2 goes from an L70 of 60,000 hours down to 10,000 when going from a Tjunc of 120°C to 150°C at 1.5A current; drop the current to 350mA and this transition occurs between 160°C and 185°C (so they could claim an L70 of 60,000 hours at a Tjunc of 160°C, but if you drive the LED at 1.5A at those same temps, your LED is now dead). Another LED, the Cree XLamp XR-E, lists a lifetime of ~25,000 hours with a Tjunc of 150°C and a current of 350mA. This is a bit under three years continuous. So in a design where these LEDs are driven near their max and not well ventilated, you may see drastic drops in a year or so. A more modern LED would be the Philips Lumileds Luxeon Rebel. The B10/L70 (where only 10% of units are allowed to fail and the rest must maintain 70% of their output) for these LEDs is around 9,000 hours at 150°C and 350mA; step it up to 700mA and the L70 is only around 6,000h; step it all the way up to 1A and you are looking at around 5,000h for the L70. Now, to be truthful, if the LEDs are heatsinked to keep the junction under 125°C, then the L70 shoots up to 60,000h.
Why is all of this important? Well, when manufacturers want to cut costs they can do a couple of things. First, they can cut down on heatsinking. If an LED is run at a low current, it produces less heat and is easier to cool with a smaller heatsink; they just need more LEDs to get an equivalent output, or they can let the LEDs get hotter, which shortens their life. Second, they can cut the number of LEDs. This means that to get an equivalent output, each LED has to be run at a higher current to make up the difference. If the LED is properly heatsinked this isn't an issue; otherwise you get faster degradation of the LED. Additionally, since brighter sells better, there is a good chance that many of these companies are really pushing the LEDs near their limits. This means they really need to manage the heat from the die properly, as LED efficiency often drops at higher currents. Likely, all of the above factors are in play.
Now, a look at best- and worst-case scenarios.
Let's say you have a model where the LEDs are running hot and are really being pushed hard to get maximum brightness in the store, such that the L70 is down at 10,000h. Assuming you watch your TV 8h a day, you have lost around 10% of the brightness in the first year, and by the end of the second year you are down 19%. Now, if this TV is on essentially 24/7 (say, in a store), by the end of the first year the TV has lost 27% of its brightness.
Now compare this against a well-designed model in which the LEDs are run such that the L70 is 60,000h. By the end of the first year, you have only lost around 1.8%, and by the end of year two the brightness is only down 3.4%. Or, if run 24/7, by the end of the first year the TV has only lost 5%.
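If you want to check those scenarios yourself, the arithmetic is just exponential decay calibrated to the L70 point (my own sketch of the calculation above; the viewing-hour figures are the same assumptions as in the text):

```python
HOURS_PER_YEAR_8H = 8 * 365    # 2,920 h/yr at 8 hours a day
HOURS_PER_YEAR_247 = 24 * 365  # 8,760 h/yr running around the clock

def loss_after(hours, l70):
    """Fractional brightness loss after `hours`, assuming exponential
    decay calibrated so output is 70% of original at t = l70."""
    return 1 - 0.70 ** (hours / l70)

for l70 in (10_000, 60_000):
    print(f"L70 = {l70:,}h:")
    print(f"  year 1 (8h/day): {loss_after(HOURS_PER_YEAR_8H, l70):.1%}")
    print(f"  year 2 (8h/day): {loss_after(2 * HOURS_PER_YEAR_8H, l70):.1%}")
    print(f"  year 1 (24/7):   {loss_after(HOURS_PER_YEAR_247, l70):.1%}")
```

The outputs land within rounding of the figures quoted above (roughly 10%/19%/27% for the hot-running set, and about 2%/3.4%/5% for the well-designed one).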
So as you can see, proper thermal management and current management for the LED is very important and can DRASTICALLY affect the lifetime of the television.
Up to this point, we have only addressed the overall output and have said nothing about chromatic maintenance. To be honest, I have seen very little on this aspect of LEDs. That being said, for LEDs with multiple phosphors, each phosphor will likely have its own distinct lifetime, so you will see a shifting of the color as well (from the decrease in emission of one phosphor relative to the others). Similarly, in a single wide-band doped phosphor, you may see certain emission changes over time due to a variety of degradation pathways. Some of these changes may only be visible with a spectrometer; others may be more apparent.
Lastly, comparing two separate units is foolish. Most LEDs are binned for output at around ±5% (if not more). Therefore he could very easily see a 10% difference between units if the LEDs were at opposite ends of that spread (and that's before we get into the precision and accuracy of the measuring instrument, etc.). So quite simply, comparing the difference between two units is silly. And this assumes all else in the TV is equal and that the TV maker didn't use different bins of LEDs, a driver with different output capabilities, etc. There are TOO many things that can go wrong when comparing separate units.
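Just to spell out the binning arithmetic (a hypothetical worst case, assuming the ±5% bin width mentioned above): two units whose LEDs came from opposite ends of the bin differ by more than 10% before any aging at all.

```python
# Hypothetical units from opposite ends of a ±5% output bin:
bright_unit = 1.05  # LEDs from the +5% end of the bin
dim_unit = 0.95     # LEDs from the -5% end of the bin

spread = bright_unit / dim_unit - 1
print(f"unit-to-unit spread: {spread:.1%}")  # about 10.5%
```

So a measured 10% gap between a new and a used set could, in the worst case, be explained entirely by binning, with zero actual degradation.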