
Question on plasma power consumption

#1 ·
Hi all,

I've got a question for folks about the impact that their plasma screens have had on the monthly power bill.

I moved into a new home in December and bought my first plasma at the same time, a Panny TH-50PH9UK. I'm also running a Marantz SR7001 AVR for my sound (a 5.1 system from OrbAudio). Since moving in, my electric bills have been twice what they were at my old place, and the plasma is the only "new" thing we didn't have at the old place. When I called the electric company, the rep suggested that the plasma is the culprit.

I don't have any way of knowing how much more the electric bill is with the plasma, as I don't know what the bills were before I moved in. So, my question to folks is: what impact did adding a plasma HT to your home have on the electric bill? I'm averaging 1000 kWh per month, almost twice what I was averaging at my old place. Thanks for any input.

Bryan
 
#29 ·

Quote:
Originally Posted by SaltiDawg


Would that be the 4th Law of Effect and Cause?

Nope. 1st law of consumer electronic devices.


Add up the total mass of the heat sinks on a typical plasma TV set, on boards like these. Heat sinks are used to dissipate heat from electronic devices to prevent them from suffering thermal overload:


Y Sustain board

Power supply

X Sustain board


Now, take Watt's law:


P = I * V


A typical plasma panel drive circuit operates at only 200-300 volts (the Vs sustain voltage needed to provide the potential to light the panel), but the current level is very high.


A typical DV set runs at about 30,000 volts (30 kV) at the CRT anode, but the current is minimal.


I've been jolted quite a few times in 32 years of working on TVs by CRT high voltage, and I can tell you, I'd rather get "bit" by that 30 kV on a CRT than accidentally get a jolt from the Vs voltage running a plasma panel. There's far more current potential there, the kind that can kill you.
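To put rough numbers on that, here's a quick sketch of Watt's law applied to both cases. The voltage and current figures are illustrative assumptions for the comparison, not measurements from any particular set:

```python
# Watt's law: P = I * V. Illustrative numbers only, not measured values.

def power_watts(volts, amps):
    return volts * amps

# Assumed plasma sustain (Vs) supply: modest voltage, high current.
plasma_w = power_watts(volts=250, amps=1.5)

# Assumed CRT anode: very high voltage, tiny beam current.
crt_anode_w = power_watts(volts=30_000, amps=0.001)

print(f"Plasma Vs example: {plasma_w:.0f} W")     # ~375 W
print(f"CRT anode example: {crt_anode_w:.0f} W")  # ~30 W
```

Same formula both times; the plasma's power (and its shock hazard) comes from current, while the CRT's comes from voltage with almost no current behind it.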


Furthermore


A typical direct-view TV set of around 36" has only half the mass of heat sink devices that a typical 42" plasma set has. Even if a 42" DV set were used as a test comparison (if one were available), the heat sink area would barely increase. It would still be far less than any 42" plasma.


But getting back on topic: a plasma set would not cause any significant increase in electrical usage unless the owner was actually watching more TV because of owning said plasma TV.
 
#30 ·

Quote:
Originally Posted by MOtvGuy


The amount of heat an electrical device emits is in direct proportion to the amount of power the device is using. Plasma TVs emit a lot of heat; hence, they use a lot of power.


That's not entirely true. It depends on how efficient the device is. If your TV were 100% efficient, it would put out light but no heat; 100% of the power would be emitted as light. The same goes for a light bulb: if it were 100% efficient (which of course nothing, with the exception of Congress, is), it wouldn't get hot at all. So if a 100-watt bulb is 75% efficient, then 75% of the power goes to producing light and the other 25%, or 25 watts, is given off as heat. I'm sure you're aware of this, MOtvGuy, but this was for the benefit of the non-techs here.
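For anyone who wants to see it spelled out, here's that split as a tiny sketch; the 100-watt bulb and 75% figure are just the example numbers from above:

```python
# Split a device's input power into light vs. heat for a given efficiency.

def light_heat_split(input_watts, efficiency):
    """Return (watts emitted as light, watts given off as heat)."""
    light = input_watts * efficiency
    return light, input_watts - light

light, heat = light_heat_split(input_watts=100, efficiency=0.75)
print(f"Light: {light:.0f} W, heat: {heat:.0f} W")  # Light: 75 W, heat: 25 W
```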
 
#31 ·

Quote:
Originally Posted by indept


That's not entirely true. It depends on how efficient the device is. If your TV were 100% efficient, it would put out light but no heat; 100% of the power would be emitted as light. The same goes for a light bulb: if it were 100% efficient (which of course nothing, with the exception of Congress, is), it wouldn't get hot at all. So if a 100-watt bulb is 75% efficient, then 75% of the power goes to producing light and the other 25%, or 25 watts, is given off as heat. I'm sure you're aware of this, MOtvGuy, but this was for the benefit of the non-techs here.

If you'll look over my post above.....


Heat sink usage and size in electronics is directly proportional to the amount of heat that must be drawn off electronic components in order to prevent thermal failure. The average plasma TV has more heat sink area than just about any other electronic component out there, other than maybe a '70s-vintage power amp.


It's also the reason why plasmas, at least the early versions, were loaded with cooling fans.


BTW, if you'd like to try a little experiment, take a short piece of copper wire and short it between the + and - posts of a typical 9-volt battery. While a piece of copper wire is about as efficient as you can get when it comes to the transfer of electrons, it still has a minute amount of internal resistance. I promise you that wire will get hot, and pretty quickly.
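For the curious, here's roughly the math behind that experiment. The resistance values are ballpark assumptions (a 9 V battery's internal resistance limits the current and soaks up most of the power), so treat this as a sketch, not a spec:

```python
# Ohm's law sketch of the shorted 9 V battery experiment.
# Resistance values are ballpark assumptions, not measurements.

V = 9.0           # battery voltage
R_internal = 1.5  # assumed internal resistance of an alkaline 9 V (ohms)
R_wire = 0.02     # assumed resistance of a short piece of thin copper wire

I = V / (R_internal + R_wire)  # current through the short
P_wire = I**2 * R_wire         # heat dissipated in the wire itself
P_batt = I**2 * R_internal     # heat dissipated inside the battery

print(f"Current: {I:.1f} A")            # ~5.9 A
print(f"Wire heat: {P_wire:.2f} W")     # ~0.70 W in a very small wire
print(f"Battery heat: {P_batt:.1f} W")  # ~52 W - the battery cooks too
```

Even a near-perfect conductor turns some power into heat the moment real current flows through its few milliohms.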
 
#32 ·

Quote:
Originally Posted by unionmade


I don't have any way of knowing how much more the electric bill is with the plasma, as I don't know what the bills were before I moved in. Bryan
http://www.pricegrabber.com/p__P3_In...a%20watt/skd=1

Best $25-$30 you will ever spend. Just monitor your TV for a month and compare the results to your electric bill. Then measure your Marantz, and then your refrigerator, as one poster suggested.


Archived: http://archive2.avsforum.com/avs-vb/...d.php?t=582016
 
#33 ·
Aging appliances like water heaters, refrigerators, or HVAC systems are the likely culprits. How's the weather stripping at your doors and windows? Does the attic access have a tight seal? Have you changed the filter for your HVAC system? Installed low-flow shower heads to reduce water consumption and the energy needed to heat the water you use? How much attic insulation do you have? Fixing or checking these things will help reduce your power bill and take the perceived burden of your higher bill off your plasma.


You've compared apples to oranges with regard to your new place versus the old.
 
#35 ·

Quote:
Originally Posted by MOtvGuy


BTW, if you'd like to try a little experiment, take a short piece of copper wire and short it between the + and - posts of a typical 9-volt battery. While a piece of copper wire is about as efficient as you can get when it comes to the transfer of electrons, it still has a minute amount of internal resistance. I promise you that wire will get hot, and pretty quickly.

Useless point. In that example the wire is a resistance (just a few milliohms), so all the dissipation will be given off as heat; a better example would be an incandescent light bulb. It takes 100 watts to give off the same amount of light, in lumens, as a 23-watt fluorescent bulb, hence the fluorescent bulb is more efficient. Your 9V example is the same as household electric heat. Electric heat is extremely efficient: most of the power consumed is given off as heat, which is what the thing was designed to do. It's expensive, though (at least in my part of the galaxy, where a kWh is about $0.14). The only part not given off as heat at the element is the loss in the wiring to the element and any energy given off as light (the coils glowing). A plasma TV is designed to give off light in the form of a picture, so the more efficient you can make the panel, the less is given off as heat.
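A quick sketch of that bulb comparison; the lumen figures (roughly 1600 lm for each bulb) are typical published values, assumed here for illustration:

```python
# Luminous efficacy: lumens of light out per watt of power in.
# Lumen figures are typical published values, used as assumptions.

def efficacy_lm_per_w(lumens, watts):
    return lumens / watts

incandescent = efficacy_lm_per_w(lumens=1600, watts=100)
fluorescent = efficacy_lm_per_w(lumens=1600, watts=23)

print(f"Incandescent: {incandescent:.0f} lm/W")  # ~16 lm/W
print(f"Fluorescent:  {fluorescent:.0f} lm/W")   # ~70 lm/W
# Same light output, ~4x the efficacy; the incandescent's extra
# ~77 W leaves the bulb as heat instead of light.
```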
 
#36 ·
I do not think it is your plasma, but how is that the electric company's fault? They're not TV experts; they provide power to you and your home. If something is consuming power you have several options, including contacting them about the issue, but if they take a guess and are wrong about one of your appliances, calling and complaining is just ridiculous.


Plasmas use electricity, but I seriously doubt it is the plasma causing the problem. I'm a big fan of reducing one's carbon footprint, and you need to consider the rest of the house. Is your attic insulated? If so, what is the rating? What was the rating at your old place? How thick are the walls: 2x4 or 2x6? What is the insulation rating in the walls? What are the windows rated at? There's an entire host of things that affect electricity consumption. In order for your plasma to have a significant effect on your bill, you would have to be consuming something around 200 kWh/mo with NO TV and then go straight to a 50-inch plasma, which, as already stated, would probably consume between 30 and 45 kWh per month depending on your usage. At that point you'd see roughly a 20% increase in power; that is significant, but as you already stated, your previous consumption was around 500 kWh.
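To make that arithmetic concrete, here's a quick sketch using the rough figures above (the 200 kWh/mo baseline and the 30-45 kWh plasma estimate are from this post; ~500 kWh/mo is the OP's stated previous usage):

```python
# Rough share of the monthly bill attributable to the plasma.
small_baseline_kwh = 200  # hypothetical no-TV household from above
plasma_kwh = 40           # mid-range of the 30-45 kWh/mo estimate
op_baseline_kwh = 500     # the OP's previous monthly usage

print(f"~{plasma_kwh / small_baseline_kwh:.0%} increase at 200 kWh/mo")  # ~20%
print(f"~{plasma_kwh / op_baseline_kwh:.0%} increase at ~500 kWh/mo")    # ~8%
```

Nowhere near enough to double a 500 kWh bill, which is the point.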


Cliffs: Look at the rest of your house.
 
#37 ·
Here's a follow-up I made to another post on power consumption; there is more above in this thread, dated 4/05/07:

Here's the scoop II


In a recap to my previous post, here are the numbers:

A total run of one week, or 52 hours of run time (Philips lists both 52 hrs of PDP time and 64 hrs of run time in their service menu, but to be conservative I'll go with the plasma display time of 52 hours), and 18.97 kWh from my kWh meter. 18.97 kWh / 52 hrs = 364.8 watts average over the last week. At my approximate rate of $0.14/kWh, that equates to $2.66 for the week (which, based on previous weeks, was a light week of TV; I was averaging 100 hours/week in the first 3 weeks I've owned my Philips 50PF9630). At this rate my TV costs $138.32/year, or $11.53/month, to run. Hot as our Phillies.... NOT

Overall, not bad considering the TV is rated at 408 W.
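Here's that math as a sketch, for anyone who wants to plug in their own meter reading and rate:

```python
# Reproduce the back-of-envelope math above from the meter reading.
kwh_week = 18.97   # measured over one week with a kWh meter
hours_week = 52    # panel (PDP) on-time for that week
rate = 0.14        # $/kWh

avg_watts = kwh_week / hours_week * 1000   # 364.8 W average draw
cost_week = round(kwh_week * rate, 2)      # $2.66 for the week

print(f"Average draw: {avg_watts:.1f} W")
print(f"Weekly cost:  ${cost_week:.2f}")
print(f"Yearly cost:  ${cost_week * 52:.2f}")       # $138.32
print(f"Monthly cost: ${cost_week * 52 / 12:.2f}")  # $11.53
```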



You may want to check for extension cords between your outside outlets and your neighbor's house.
 
#38 ·

Quote:
Originally Posted by MOtvGuy


If you'll look over my post above.....


Heat sink usage and size in electronics is directly proportional to the amount of heat that must be drawn off electronic components in order to prevent thermal failure. The average plasma TV has more heat sink area than just about any other electronic component out there, other than maybe a '70s-vintage power amp.


It's also the reason why plasmas, at least the early versions, were loaded with cooling fans.


BTW, if you'd like to try a little experiment, take a short piece of copper wire and short it between the + and - posts of a typical 9-volt battery. While a piece of copper wire is about as efficient as you can get when it comes to the transfer of electrons, it still has a minute amount of internal resistance. I promise you that wire will get hot, and pretty quickly.

I'm probably going to be sorry I chimed in, but here goes...


IMHO, heat sink area is not necessarily a direct indicator of power usage - it is simply an indicator that heat needs to be dissipated. Put a low-power device in a small box with undersized vents, and you will get heat that needs to be dissipated - a huge heat sink may not be enough. Put the same device in a large box with plenty of ventilation and a couple of fans, and the size of the required heat sink might be MUCH smaller.
 
#40 ·
My household power consumption is up by about 46%, or 184 kWh average per month for the last three months. The one thing that is different this year compared to last year is the 50-inch Philips plasma. The stated power draw for the set is 400 W. If I run it 8 hours a day for 30 days, I get 96 kWh; presumably that would be the maximum draw, and others here have suggested actual draw would be 50% of maximum. So it seems unlikely that the set is the bulk of the household power increase, but I am wondering about it. I live in the Houston, TX area and have gas heat. Yes, I know the blower is electric, but it didn't seem that cold a winter.


Further: the plasma replaced an old 20-inch CRT set, which is listed as having a 110 W power draw, so the delta due to the plasma is even smaller than I showed above.
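Here's that worst-case arithmetic in one place; 400 W is the rated maximum, and as noted the real-world draw may be closer to half that:

```python
# Worst-case monthly kWh for the plasma vs. the old CRT it replaced.
hours_per_day, days = 8, 30

plasma_kwh = 400 / 1000 * hours_per_day * days  # 96 kWh at rated max
crt_kwh = 110 / 1000 * hours_per_day * days     # 26.4 kWh for the old set

print(f"Plasma (rated max): {plasma_kwh:.1f} kWh/mo")
print(f"Old 20-inch CRT:    {crt_kwh:.1f} kWh/mo")
print(f"Worst-case delta:   {plasma_kwh - crt_kwh:.1f} kWh/mo")  # ~70 kWh
# Well short of the ~184 kWh/mo increase, even before derating the
# plasma's draw to ~50% of its rating.
```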


Here is an interesting TV power consumption comparison:

http://reviews.cnet.com/4520-6475_7-...3.html?tag=txt


Still wondering.....
 
#41 ·
My electric bill for last month was 2480 kWh. I agree with your math: 8 hrs a day x 400 W x 30 days would give you 96 kWh. That's pretty negligible as far as I'm concerned, and that's assuming you run nothing but a white screen for the entire 8 hrs a day. What I want to know is how you get away with only using 400 kWh per month. I wish I could.
 
#42 ·

Quote:
Originally Posted by retexan599


My household power consumption is up by about 46%, or 184 kWh average per month for the last three months. The one thing that is different this year compared to last year is the 50-inch Philips plasma. The stated power draw for the set is 400 W. If I run it 8 hours a day for 30 days, I get 96 kWh; presumably that would be the maximum draw, and others here have suggested actual draw would be 50% of maximum. So it seems unlikely that the set is the bulk of the household power increase, but I am wondering about it. I live in the Houston, TX area and have gas heat. Yes, I know the blower is electric, but it didn't seem that cold a winter.


Further: the plasma replaced an old 20-inch CRT set, which is listed as having a 110 W power draw, so the delta due to the plasma is even smaller than I showed above.


Here is an interesting TV power consumption comparison:

http://reviews.cnet.com/4520-6475_7-...3.html?tag=txt


Still wondering.....

Read my post above from yesterday (4/08/07). That is actual power from a kWh meter on my 50" Philips.
 
#43 ·
Just to be clear about this (from a physicist), ALL the electrical energy used by all your appliances ends up as heat in your home, except for the negligible amount that escapes out your windows as light or gets through your outside walls as sound. If you have electric heat, saving energy used by appliances in the winter does you no good at all, as it was helping to heat your home. There is no such thing as a difference in efficiency - all the electrical energy used ends up as heat one way or the other.


I've measured my Panasonic plasma TV with a watt meter. In normal use it averages under 200 watts, since I don't usually run it very bright. If I run it 30 hours per week, that would be about 300 kWh per year, a fairly trivial amount of electricity (of course the satellite tuner and the sound system add more).
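That estimate as a quick sketch (200 W and 30 hours/week are the measured figures above; the fridge number is the sticker figure mentioned below):

```python
# Annual kWh from average draw and weekly viewing hours.
avg_watts, hours_per_week = 200, 30
tv_kwh_year = avg_watts / 1000 * hours_per_week * 52

print(f"TV: ~{tv_kwh_year:.0f} kWh/yr")  # ~312 kWh, i.e. 'about 300'
print("Fridge sticker: <500 kWh/yr")     # for scale, from the next paragraph
```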


The fridge and the TV have traditionally been the biggest power users in the house aside from electric heat and hot water. All fridges have had an "average power use" rating sticker on them for decades. A modern fridge of normal sub-20-cu-ft size uses less than 500 kWh per year. More than your plasma TV, but again, not a huge amount.
 
#44 ·

Quote:
Originally Posted by Marky_Mark896


My electric bill for last month was 2480 kWh. I agree with your math: 8 hrs a day x 400 W x 30 days would give you 96 kWh. That's pretty negligible as far as I'm concerned, and that's assuming you run nothing but a white screen for the entire 8 hrs a day. What I want to know is how you get away with only using 400 kWh per month. I wish I could.

My average power usage in the winter months was around 400 kWh/month a year ago, but this year it has kicked up to the 600 kWh range. Last summer it ran about 1300 kWh/month due to the air conditioning here in the Houston area. I also have a fairly small house at 1500 sq ft, so that keeps it down some. I have also noticed that Oct-Nov-Dec 2006 was quite a bit higher than Oct-Nov-Dec 2005, but the months are comparable before that; so perhaps whatever is happening started in the fall of 2006. My investigation continues.....
 
#45 ·

Quote:
Originally Posted by amesdp


Just to be clear about this (from a physicist), ALL the electrical energy used by all your appliances ends up as heat in your home, except for the negligible amount that escapes out your windows as light or gets through your outside walls as sound. If you have electric heat, saving energy used by appliances in the winter does you no good at all, as it was helping to heat your home. There is no such thing as a difference in efficiency - all the electrical energy used ends up as heat one way or the other.

This is absolutely correct (coming from another physicist.)
 
#46 ·

Quote:
Originally Posted by retexan599


My average power usage in the winter months was around 400 kWh/month a year ago, but this year it has kicked up to the 600 kWh range. Last summer it ran about 1300 kWh/month due to the air conditioning here in the Houston area. I also have a fairly small house at 1500 sq ft, so that keeps it down some. I have also noticed that Oct-Nov-Dec 2006 was quite a bit higher than Oct-Nov-Dec 2005, but the months are comparable before that; so perhaps whatever is happening started in the fall of 2006. My investigation continues.....

As someone who works in the energy industry, keep in mind that Oct/Nov/Dec 2006 was a bit warmer, especially for you guys in Houston, than 2005.
 
#47 ·

Quote:
Originally Posted by BBigJ


This is absolutely correct (coming from another physicist.)

That makes two "physicists" that are taking a special case and generalizing. You two "physicists" are trying to take the narrow case where the appliance does no (or very little) work and generalize it to all appliances.

Take a window fan as an example or take a furnace as another. Additionally, the cost to heat one's closed system home by plasma would be considerably greater than to heat it by furnace - be it heat pump, gas or oil heat.


/s/ Humble disciple of the First Law of Thermodynamics.
 
#50 ·

Quote:
Originally Posted by SaltiDawg


That makes two "physicists" that are taking a special case and generalizing. You two "physicists" are trying to take the narrow case where the appliance does no (or very little) work and generalize it to all appliances.

Take a window fan as an example or take a furnace as another. Additionally, the cost to heat one's closed system home by plasma would be considerably greater than to heat it by furnace - be it heat pump, gas or oil heat.


/s/ Humble disciple of the First Law of Thermodynamics.

The window fan is a poor example because its sole purpose is to transport mass and energy out of the house. Therefore, no conservation laws apply because the system (house) is not closed. The point is that whether the appliance is doing mechanical work, producing light, or processing information, all of the energy that goes into the appliance is eventually converted to heat. The only reason it is cheaper to heat the house by furnace than by plasma is because fossil fuels are a cheaper source of energy than electricity. If you tried to heat your house with a bunch of electric heaters, it would cost exactly the same as heating it with a bunch of plasmas.
 
#51 ·

Quote:
Originally Posted by BBigJ


The window fan is a poor example because its sole purpose is to transport mass and energy out of the house. Therefore, no conservation laws apply because the system (house) is not closed. The point is that whether the appliance is doing mechanical work, producing light, or processing information, all of the energy that goes into the appliance is eventually converted to heat. The only reason it is cheaper to heat the house by furnace than by plasma is because fossil fuels are a cheaper source of energy than electricity. If you tried to heat your house with a bunch of electric heaters, it would cost exactly the same as heating it with a bunch of plasmas.

But the same could be said about any display, not just plasmas.
 