Originally Posted by EricN
WTF is measured in "watts per year"? I thought you claimed to be an electricity reseller at one point. The arithmetic is also way off.
I am not trying to measure anything in watts per year. I am trying to determine how much it costs.
Walk me through it please.
Edit: you're right about the math. I'm in the car but will recalculate later. My decimal needs to move over, I think. 5 watts for the Seagate minus 4.4 watts for the RED is a 0.6 watt difference, not 0.06.
I'd like to see you walk me through how you would do it.
5.0 watts minus 4.4 watts is a 0.6 watt difference between the two. (Above I said 0.06 as a typo.)
0.6 watts × 24 hours is 14.4 watt-hours per day.
14.4 watt-hours per day × 365 days is 5,256 watt-hours per year.
5,256 watt-hours is 5.256 kilowatt-hours.
I pay $0.08 (8 cents) per kWh, so that is 5.256 × 0.08 = $0.42048.
Like I said above... 42 cents.
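The arithmetic above boils down to one formula: watts × hours per year ÷ 1000 × price per kWh. Here's a quick Python sketch of it (the helper name and the default $0.08/kWh rate are just mine, pulled from the numbers in this post):

```python
def annual_cost(watts, price_per_kwh=0.08):
    """Annual electricity cost (USD) of a constant load drawing `watts`."""
    kwh_per_year = watts * 24 * 365 / 1000  # watts -> kWh over a year
    return kwh_per_year * price_per_kwh

print(round(annual_cost(0.6), 2))  # the 0.6 W difference: 0.42, i.e. 42 cents
print(round(annual_cost(5.0), 2))  # a whole 5 W drive: about 3.50 per year
```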
A modern 5 watt HDD only costs about $3.50 per year in electricity, so I am not sure how much you can save by lowering it. (I mean, I don't think there is much room or opportunity for saving money.)
5 watts × 24 hours is 120 watt-hours per day × 365 days = 43,800 watt-hours per year (43.8 kilowatt-hours).
If the price of a kWh is 8 cents, then 43.8 × 0.08 = $3.50.
If one drive costs $3 a year and another $3.50 a year, it is hardly a big determining factor.
The only time it is going to matter is if you have a really efficient drive (like 4 watts) versus a really power-hungry drive (like 9 watts), in which case you might be able to save some decent money if you're running a fleet of them. Otherwise power consumption is really not a big deal; if drives are within a single watt of each other, it's not going to matter.
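To put a number on the fleet case: using the same formula as above, a 5 watt gap (9 W vs. 4 W) works out like this (the fleet size of 100 drives is just an example I picked, not from the thread):

```python
def annual_cost(watts, price_per_kwh=0.08):
    """Annual electricity cost (USD) of a constant load drawing `watts`."""
    return watts * 24 * 365 / 1000 * price_per_kwh

drives = 100  # hypothetical fleet size, for illustration only
savings = (annual_cost(9) - annual_cost(4)) * drives
print(f"${savings:.2f} per year")  # $350.40 per year for 100 drives
```

Per drive it's about $3.50 a year, so it only adds up at scale.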
Just my opinion.
Let me know if I did this wrong; admittedly I did it quickly without much attention to detail. If I am wrong, I'd love to know where and why.