Originally Posted by yudkib
Here's the math I'm using:
An Australian website references a state-mandated test listing the TVs' energy consumption: 809 kWh/yr for the 60F8500, 356 kWh/yr for the 60F8000.
So roughly 2.25 times the power usage for plasma versus LED, give or take. Very believable. I saw a figure of about 90-110 W for the 55" LED; my brightness is set high, so call it 110 W. The plasma would then be around 250 W, which would probably trend toward the low side for 2013.
So, 250 W - 110 W = 140 W difference. With my TV running 12 hours a day, that's 140 W x 12 h x 365 days, or about 613 kWh a year, which at $0.20/kWh works out to roughly $123 a year extra to run a plasma. No thanks.
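Running your numbers as a quick sanity check (a minimal Python sketch; the 140 W delta, 12 h/day, and $0.20/kWh rate are all your figures, not mine):

Code:
# Extra yearly cost of the plasma's higher draw (figures from the quoted post).
delta_watts = 250 - 110        # plasma draw minus LED draw, in watts
hours_per_day = 12
rate_per_kwh = 0.20            # dollars per kWh

kwh_per_year = delta_watts * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/yr -> ${kwh_per_year * rate_per_kwh:.2f}/yr")
# 613 kWh/yr -> $122.64/yr

So call it $123 a year at a $0.20 rate.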
I didn't realize the F8500 plasma was so power-hungry: 218 W average, 653 W max.
My modest F4500 plasma is Energy Star 6.0 compliant: 67 W average, 201 W max.
By comparison, the 60F8000 LED is also ES 6.0 compliant: 79 W average, 182 W max.
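To put those averages on the same footing as the Australian kWh/yr numbers, here's the same kind of sketch at the quoted 12 h/day (ballpark only; Energy Star averages come from its own test conditions, not anyone's actual viewing):

Code:
# Convert Energy Star average draw to yearly kWh at 12 h/day.
HOURS_PER_DAY = 12
for model, avg_watts in [("F8500 plasma", 218),
                         ("F4500 plasma", 67),
                         ("60F8000 LED", 79)]:
    kwh = avg_watts * HOURS_PER_DAY * 365 / 1000
    print(f"{model}: {kwh:.0f} kWh/yr")
# F8500 plasma: 955 kWh/yr
# F4500 plasma: 293 kWh/yr
# 60F8000 LED: 346 kWh/yr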
Edit: I tried to find the full power specs for the HL-T6187, and the most I can find is "230 W during operation". I think you're overestimating how much extra it will cost. I'm in NY, and my electric bill only averages around $100 a month (more in the summer), and that's with multiple computers on 24/7, one or two TVs on for at least 12 hours a day, and two refrigerators.
Edit #2: Holy crap. You're paying $0.20/kWh in NYC? Is that Manhattan? I'm "only" paying $0.09/kWh on Staten Island, which is considered part of NYC. Perhaps your math is correct.
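The rate really is the whole argument. Here's the same 140 W gap priced at the two rates in this thread (same quick sketch as above):

Code:
# Same ~613 kWh/yr gap, priced at each rate mentioned in the thread.
kwh_per_year = 140 * 12 * 365 / 1000   # ~613 kWh/yr
for label, rate in [("at $0.20/kWh", 0.20), ("at $0.09/kWh", 0.09)]:
    print(f"{label}: ${kwh_per_year * rate:.2f}/yr extra")
# at $0.20/kWh: $122.64/yr extra
# at $0.09/kWh: $55.19/yr extra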