
·
Registered
Joined
·
3,954 Posts
Discussion Starter · #1 ·
I saw Newegg had an 1100 watt power supply as their deal of the day and thought, "that must use a lot of energy." Of course, we all know a 200 watt system isn't going to draw 1100 watts... or even 600 watts. So what's the harm, I thought, in getting a massively overrated power supply?


Here is what I came across:

Quote:
A 500 watt power supply can DELIVER 500 watts, but it will only ever use as much as the components in your PC need (and of course that depends on load and activity, whether energy-saving mechanisms like AMD's Cool'n'Quiet or Intel's SpeedStep are enabled, etc.).


That is in theory, with a 100% efficiency rating, which is impossible.


The usual efficiency rating lies around 80%, but it can vary greatly between low-quality and proper power supplies.


So with 80% efficiency, your power supply will draw as much power as your components need plus about 25% extra (since input = output / 0.8).


Another caveat: optimal efficiency is only reached at a "proper" load. If you have a 500 watt power supply but a super-low-consumption PC that only consumes 80 watts, you're not going to reach 80% efficiency and could easily draw ~120 watts at the wall (only ~65% efficiency), or worse.


Due to the ~80% efficiency, you can also not use 500 Watt out of a 500 Watt Power Supply.


Those numbers are all estimates, as PSUs vary greatly, but a rule of thumb is that you should get a PSU with at least 80% efficiency, and one that is not too big (but not too small either) for your PC.

According to this, a 70% efficient 300 watt power supply at full load would waste 90 watts. Now if we put in an 80% efficient PSU rated at 500 watts, ran it with the same draw, and its efficiency dropped to 60%, then you are wasting 120 watts. Does this sound right?
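
For reference, a quick sketch of that arithmetic (same hypothetical 300 watt / 70% / 60% numbers as above; whether the waste comes out to 90 watts or more depends on whether the 300 watts is read as the draw at the wall or as the DC load on the PSU):

Code:
def waste_from_wall_draw(wall_w, eff):
    # If 300 W is what the PSU pulls from the wall, the waste is just the
    # inefficient fraction of that draw.
    return wall_w * (1 - eff)

def waste_from_dc_load(load_w, eff):
    # If 300 W is the DC load on the PSU, the wall draw is load / efficiency
    # and the waste is the difference.
    return load_w / eff - load_w

print(waste_from_wall_draw(300, 0.70))  # 90 W   -> the figure above
print(waste_from_dc_load(300, 0.70))    # ~128.6 W
print(waste_from_wall_draw(300, 0.60))  # 120 W  -> the figure above
print(waste_from_dc_load(300, 0.60))    # 200 W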


I was always under the impression an 80% efficient system was 80% efficient across the board.
 

·
Registered
Joined
·
1,111 Posts
As for efficiency, it's not across the board; it depends on load.

From what I understand, a larger PS will draw more current even at idle than a lower power PS, so it will cost you more $$ in electricity (and probably heat).
 

·
Registered
Joined
·
3,764 Posts

Quote:
Originally Posted by bjmarchini /forum/post/16920465


According to this, a 70% efficient 300 watt power supply at full load would waste 90 watts.

Around that much, yeah. Assuming, of course, it's of decent quality (Seasonic, etc.). Cheap power supplies usually aren't capable of delivering their full advertised load.

Quote:
Originally Posted by bjmarchini /forum/post/16920465


Now if we put in an 80% efficient PSU rated at 500 watts, ran it with the same draw, and its efficiency dropped to 60%, then you are wasting 120 watts. Does this sound right?

At 300W load, your 80 Plus PSU should only use 360W, assuming the same calculation as above is used. I've forgotten what formula they use to calculate efficiency.
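
The usual definition is efficiency = DC output / AC input, so a minimal sketch of that number (assuming the 300W is the DC load; the 360W figure above is the simpler 300 x 1.2 estimate):

Code:
dc_load_w = 300
efficiency = 0.80

ac_input_w = dc_load_w / efficiency   # efficiency = output / input
print(ac_input_w)                     # 375.0 W at the wall
print(ac_input_w - dc_load_w)         # 75.0 W lost as heat in the PSU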

Quote:
Originally Posted by bjmarchini /forum/post/16920465


I was always under the impression an 80% efficient system was 80% efficient across the board.

As far as I know, to get the 80 Plus logo, power supplies are tested from 20% to 100% load and must have at least 80% efficiency at those loads. Peak efficiency varies among different power supplies, but is usually achieved at ~50% load. SilentPCReview has a pretty good explanation of this. In short, 80+% efficient from 20~100% of rated wattage. Beyond that range, anything goes.
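
As a rough illustration of how the certification check works (the 20/50/100% test points are the ones usually cited for the base 80 Plus level; the sample readings below are made up):

Code:
# Hypothetical efficiency readings at the usual 80 Plus test points
readings = {0.20: 0.82, 0.50: 0.85, 1.00: 0.81}   # fraction of rated load -> efficiency

def passes_80_plus(readings, floor=0.80):
    # True only if every test-point reading meets the 80% floor
    return all(eff >= floor for eff in readings.values())

print(passes_80_plus(readings))   # True for this made-up unit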


I use the eXtreme Power Supply Calculator to figure out what wattage power supply to get. It does build in some allowance, but it's not as exaggerated as other power supply calculators.
 

·
Registered
Joined
·
1,375 Posts
It is all explained well here: http://www.hardwaresecrets.com/article/742 Basically, the maximum efficiency (least power wasted) is at about 60 - 80% of its rated capacity. High quality PSUs will run at 84 - 85% in this range and fall off at higher or lower loads.
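
A minimal sketch of what that curve means for actual wall draw (the curve points below are made up but typical of the review data discussed in this thread):

Code:
# Hypothetical efficiency curve: (fraction of rated load, efficiency)
curve = [(0.10, 0.70), (0.20, 0.80), (0.50, 0.85), (0.80, 0.84), (1.00, 0.82)]

def efficiency_at(load_fraction):
    # Linearly interpolate efficiency between the measured points
    pts = sorted(curve)
    if load_fraction <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if load_fraction <= x1:
            return y0 + (y1 - y0) * (load_fraction - x0) / (x1 - x0)
    return pts[-1][1]

rated_w = 500
for load_w in (50, 100, 250, 400):
    eff = efficiency_at(load_w / rated_w)
    print(load_w, round(load_w / eff))   # DC load vs. approximate wall draw
# 50 -> 71, 100 -> 125, 250 -> 294, 400 -> 476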


BB
 

·
Registered
Joined
·
1,611 Posts
I don't know if the requirement covers the whole 20-100% load range; I remember seeing something like 40/60/80 as the qualifying points. Still, the best efficiency is seen at ~50%+ load. You may be able to find a chart for the PS you're looking at, but it should never dip too far below 80%. Though, this is exactly why you don't buy huge PSs when you don't need them. I only wish I could find quality ones in the 200W range.
 

·
Registered
Joined
·
1,375 Posts
It is interesting to note that those of us in 115 volt land (North America) have ~ 5% lower efficiency than our 220 volt cousins in Europe and most of the rest of the world.


BB
 

·
Registered
Joined
·
3,764 Posts

Quote:
Originally Posted by video321 /forum/post/16921012


You may be able to find a chart for the PS you're looking at, but it should never dip too far below 80%.

Beg to differ. Here are some power supply test data from SilentPCReview:
http://www.silentpcreview.com/article936-page5.html
http://www.silentpcreview.com/article925-page5.html
http://www.silentpcreview.com/article898-page4.html
http://www.silentpcreview.com/article880-page4.html
http://www.silentpcreview.com/article864-page3.html
http://www.silentpcreview.com/article813-page4.html
http://www.silentpcreview.com/article802-page4.html
http://www.silentpcreview.com/article792-page4.html
http://www.silentpcreview.com/article753-page4.html
http://www.silentpcreview.com/article751-page4.html
http://www.silentpcreview.com/article726-page4.html


At very low loads, some of these power supplies dip to ~50% efficiency. Sure, it's unlikely to go down to a 20W load, but 40~60W is highly probable with a carefully designed HTPC.
 

·
Registered
Joined
·
1,611 Posts


Give me a break here... you're referencing 600W power supplies and their efficiency at 20W.
 

·
Registered
Joined
·
1,375 Posts

Quote:
How so?

It is a bit complex, but without writing a book about the subject, it is because higher voltages require less amperage to deliver the same amount of power.

Power = Volts x Amps


The wasted power in a PSU is released as heat. Higher amperage produces more heat, so more waste. Some of the components in a PSU see double the amperage on a 115 V system and so produce twice as much heat. This waste is combated by the component designs done by the propeller-hat-wearing electrical engineering crowd, but some components are immune to design improvements, so they produce more heat on a 115 volt system.


If you measure the current at the wall plug for a PSU pulling 500 watts, you will find about 2.5 amps on the 220 V circuit and ~5 amps on the 115 V circuit.


Plug in your electric tea kettle and boil a pot of water. Your voltage is 115 V. The kettle will draw 1500 watts, so the amperage is about 13 amps (1500 / 115 ≈ 13). Feel the cord. It will be warm. Now fly to Europe and do the same experiment with one of their kettles. If it is a 1500 watt kettle, the cord will not be so warm because it will only draw about 7 amps to deliver 1500 W (220 x 7 ≈ 1500). The wires in North America get hotter because they are carrying twice the amperage.
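
A back-of-the-envelope version of that comparison (the cord resistance is just an assumed round number for illustration):

Code:
# P = V x I, so I = P / V; heat in the cord scales as I^2 * R
kettle_w = 1500
cord_resistance_ohm = 0.1   # assumed value, for illustration only

for volts in (115, 220):
    amps = kettle_w / volts
    cord_heat_w = amps ** 2 * cord_resistance_ohm
    print(volts, round(amps, 1), round(cord_heat_w, 1))
# 115 V: ~13.0 A and ~17.0 W dissipated in the cord
# 220 V: ~6.8 A and ~4.6 W -- about half the current, roughly a quarter of the I^2*R loss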


BB
 

·
Registered
Joined
·
1,375 Posts
Like video said.


Does it really matter whether a PSU is 70% or 80% efficient at 30 watts? The difference is about 3 watts of power loss, or roughly 26 kWh (~$3) per year.
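
The arithmetic behind that, for anyone who wants to plug in their own electric rate (the $0.11/kWh figure is just an assumed average):

Code:
extra_loss_w = 3             # rough difference between 70% and 80% efficiency at ~30 W
hours_per_year = 24 * 365
rate_per_kwh = 0.11          # assumed rate; varies by region

kwh_per_year = extra_loss_w * hours_per_year / 1000
print(round(kwh_per_year, 1))                  # ~26.3 kWh per year
print(round(kwh_per_year * rate_per_kwh, 2))   # ~$2.89 per year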


This review gives a good explanation of what an 80 Plus rating means for PSUs of different sizes.

"A 250W 80 Plus power supply is assured to have >80% efficiency down to 50W, compared 80W for a 400W model."


Don't get too hung up on absolute efficiency. Just buy a properly sized, quality PSU with an 80 Plus rating and enjoy your rig.


BB
 

·
Registered
Joined
·
1,165 Posts
Well, you could also argue that at no load all power supplies are 0% efficient...


Why not go to the source - http://www.80plus.org/


"Due to the ~80% efficiency, you can also not use 500 Watt out of a 500 Watt Power Supply."


This is simply not true. The wattage rating is the output wattage, not the input wattage.
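
In other words, the label is a DC output rating; a rough sketch of the difference, assuming ~80% efficiency at full load:

Code:
rated_output_w = 500             # what the PSU can deliver to the components (DC)
efficiency_at_full_load = 0.80   # assumed figure, for illustration

ac_draw_w = rated_output_w / efficiency_at_full_load
print(ac_draw_w)   # ~625 W pulled from the wall while delivering the full 500 W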


Peter
 

·
Registered
Joined
·
3,764 Posts

Quote:
Originally Posted by Bigbird999 /forum/post/16922872


Does it really matter whether a PSU is 70% or 80% efficient at 30 watts? The difference is about 3 watts of power loss, or roughly 26 kWh (~$3) per year.




Don't get too hung up on absolute efficiency. Just buy a properly sized, quality PSU with an 80 Plus rating and enjoy your rig.

True, ±5W doesn't really matter all that much, but the difficulty here is getting people to buy properly sized power supplies. On a lot of forums, people recommend a minimum of 400W for builds in a mini-tower case with just a motherboard with integrated graphics, a ≤65W TDP processor, two sticks of DDR2 RAM, one hard drive, and one optical drive, when a good quality 250W PSU is plenty. The exception to that seems to be SilentPCReview, where PicoPSUs reign king. I remember a post on a hardware forum where the OP wanted to upgrade his old Dell to a GeForce 6200TC and one of the replies said he needed to upgrade the power supply to 500W.
 

·
Registered
Joined
·
1,375 Posts

Quote:
the difficulty here is getting people to buy properly sized power supplies

I totally agree!


And it isn't helped by those so-called calculators that overestimate everything by a factor of n and then add a couple of hundred watts. I think most of this is a hangover/confusion from the gaming crowd that overclock their CPUs into volcanoes and run overclocked dual-card GPU setups that pull 200 - 300W each. Both ATI and Nvidia make blanket, cover-their-asses statements that "this card requires a minimum PSU of x00 watts."


In the end, most just get one that is bigger than they need. They don't notice the power cost and are ignorant of what all the numbers mean, and the penalty for undersizing is a rig that doesn't work, while oversizing has no easily visible effect.


BB
 

·
Registered
Joined
·
5,622 Posts
Underpowered boxes do cause weird problems, though. So unless you precisely MEASURE how much juice you're using, most people just buy a bigger PSU.


One of these days (don't hold your breath), the PSU people will provide a readout of maximum power and a load factor signal, and then you will find out whether you need a new PSU for that super-duper video board you want to add.
 

·
Registered
Joined
·
71 Posts
The other issue, especially with these smaller HTPC cases, is that none of the lower-powered PSUs are modular. I stepped up to a beefier PSU just so I could get a modular one and not have to deal with a bunch of cables screwing up the airflow in my case.
 

·
Registered
Joined
·
5,622 Posts

Quote:
Originally Posted by Cetra00 /forum/post/16924076


have to deal with a bunch of cables screwing up the airflow in my case.

Ah, reminds me of the IBM PS/2 of the early 90s. There were NO CABLES, everything snapped on, and no screwdrivers were needed to take it apart or put it together.


Didn't think too much about it at the time, but what a clean design!
 

·
Registered
Joined
·
1,611 Posts

Quote:
Originally Posted by ilovejedd /forum/post/16923300


True, ±5W doesn't really matter all that much but the difficulty here is getting people to buy properly sized power supplies.

So true.

Which is why I stated this earlier:

"Though, this is exactly why you don't buy huge PSs when you don't need them. I only wish I could find quality ones in the 200W range."


I even had to stop my cousin from buying a 750W PS for a small build.
 

·
Banned
Joined
·
1,614 Posts

Quote:
Originally Posted by Bigbird999 /forum/post/16922806


It is a bit complex, but without writing a book about the subject, it is because higher voltages require less amperage to deliver the same amount of power.

Power = Volts x Amps


The wasted power in a PSU is released as heat. Higher amperage produces more heat, so more waste. Some of the components in a PSU see double the amperage on a 115 V system and so produce twice as much heat. This waste is combated by the component designs done by the propeller-hat-wearing electrical engineering crowd, but some components are immune to design improvements, so they produce more heat on a 115 volt system.


If you measure the current at the wall plug for a PSU pulling 500 watts, you will find about 2.5 amps on the 220 V circuit and ~5 amps on the 115 V circuit.


Plug in your electric tea kettle and boil a pot of water. Your voltage is 115 V. The kettle will draw 1500 watts, so the amperage is about 13 amps (1500 / 115 ≈ 13). Feel the cord. It will be warm. Now fly to Europe and do the same experiment with one of their kettles. If it is a 1500 watt kettle, the cord will not be so warm because it will only draw about 7 amps to deliver 1500 W (220 x 7 ≈ 1500). The wires in North America get hotter because they are carrying twice the amperage.


BB


Your post would be correct if the voltages supplied were 115 and 220. However, in North America, residential households are fed with a split-phase supply: a center-tapped transformer supplies 220 to 240 V phase to phase, and 110 to 120 V phase to neutral... exactly half the phase-to-phase voltage.

BTW, power dissipation is the product of voltage and current.
 