I already said a lot of that. The original question was basically how hot this year's plasmas run, and subsequent responses raised the question of whether plasmas convert a higher percentage of their power consumption into room heat than, say, incandescent bulbs or LCD TVs.
I now believe no, there is no difference. All of them basically heat a room at the same fraction of their rated power draw: 100%.
Andy Sullivan, here's your original post about heat:
Originally Posted by andy sullivan
I need a little ammunition if possible to defend the 2012's regarding heat. Someone said that the 55st50 is rated at 400 watts. I don't really know what that means but any heat related help would be appreciated.
You might not have noticed, but the post just above yours shows a photograph of a European 50" VT50 clearly labeled as putting out 195 W.
And earlier in this thread this guy said he measured 120 W on his 50" ST50, and rogo claimed 'a 65" plasma' would come in at around 300 W after calibration.
So whether your 50-incher draws 100 or 200 W, or your 65 draws 300, we have a rough idea of how much power 2012 plasmas use: it's not 20 W and it's not 2000 W. It's going to be something like 200 W.
So what fraction of that 200 W goes to "heating up the room"? Apparently all of it.
I just ran across something in a wikipedia article that made me realize this (emphasis mine):

A light bulb, for example, might have 2% efficiency at emitting light yet still be 98% efficient at heating a room (in practice it is nearly 100% efficient at heating a room because the light energy will also be converted to heat eventually, apart from the small fraction that leaves through the windows).
So your 200W TV is basically going to heat a (windowless) room just like a 200W incandescent bulb or a 200W hot plate or a 200W LCD TV or a 200W laptop or anything else. Watts is watts.
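The "watts is watts" arithmetic can be sketched in a few lines of Python. The 3.412 BTU per watt-hour figure is the standard conversion factor; the 200 W input and the function name are just illustrative, taken from the rough estimate above:

```python
# Any 200 W device dumps the same heat into a (windowless) room,
# regardless of what it does with the energy first.

BTU_PER_WATT_HOUR = 3.412142  # standard conversion: 1 Wh = 3.412142 BTU

def heat_output(watts, hours=1.0):
    """Return (kWh, BTU) of heat delivered to the room."""
    kwh = watts * hours / 1000.0
    btu = watts * hours * BTU_PER_WATT_HOUR
    return kwh, btu

# A 5-hour movie night with any 200 W device: same room heat every time.
for device in ("plasma TV", "incandescent bulb", "hot plate", "laptop"):
    kwh, btu = heat_output(200, hours=5)
    print(f"200 W {device}: {kwh:.1f} kWh = {btu:.0f} BTU of room heat")
```

Every line of that loop prints the same 1.0 kWh (about 3412 BTU), which is the whole point: the device's "efficiency" at its actual job doesn't change how much it heats the room.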