There has been much discussion about this very topic, and many people offer their opinions, but nowhere lays out the facts for potential users to read and then show to anyone who may complain about the cost of leaving a machine running.
Part 1. The Basics:
Let's start by explaining the simple principles of electricity and the power consumed. As we all know, electrical circuits consume power. For the purposes of this article, one watt (W) is treated as equal to one volt-ampere (VA), which holds for a purely resistive load. Multiplying the voltage across a circuit by the current it draws gives the power consumed.
Example 1: Circuit A has a supply rated at 12 V (Vn), and measurement shows it draws some 4.5 A (In) from the supply. To determine the power rating of the circuit, simply multiply the voltage by the current, thus:
P = Vn * In = 12 * 4.5 = 54 watts, or 54 VA
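Example 1 can be checked with a few lines of Python (the variable names are my own, not from the article):

```python
# Example 1: power drawn by a 12 V circuit pulling 4.5 A.
supply_voltage = 12.0  # Vn, in volts
current_drawn = 4.5    # In, in amperes

power = supply_voltage * current_drawn  # P = Vn * In
print(power)  # 54.0 (watts, or VA at unity power factor)
```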
Now that we understand that, there is one important point to make at this stage: if you alter the voltage, you will also alter the current drawn by the circuit, but the power will remain the same at 54 VA.
Example 2: Run the same circuit from a 200 V supply instead:
P = Vn * In = 200 * 0.27 = 54 watts, or 54 VA
If you are wondering about the 0.27, that is the current now drawn by the circuit.
As I said, the power of the circuit remains constant, so by increasing the voltage
you reduce the current drawn from the supply.
Example 3: If you have any doubts about this, do the calculation in reverse:
In = P / Vn = 54 / 200 = 0.27 A
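A quick sketch of the reverse calculation in Python, assuming the power stays fixed at 54 W:

```python
power = 54.0            # watts, unchanged from the first example
supply_voltage = 200.0  # volts, the higher supply voltage

current_drawn = power / supply_voltage  # In = P / Vn
print(round(current_drawn, 2))  # 0.27 (amperes)
```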
Part 2. Cost:
Now we have the basics out of the way, let's go on to the important part: cost.
Let's assume you turn your monitor off manually when not in use, so we can concern ourselves with just the computer itself.
Please bear in mind these are rough figures with regard to the power used, given only as an example. The maths are correct, but the power used by each user's machine will depend on its hardware and setup.
Your computer has a 300 W (300 VA) PSU and runs on a 120 V supply, so at full load it will draw some 2.5 A from the supply.
Remember Examples 2 and 3 above.
300 / 120 = 2.5, where
300 is the maximum power of the PSU and
120 is your supply voltage.
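The same rearrangement in Python, a small sketch using the 300 W / 120 V figures above:

```python
psu_rating = 300.0      # W, the PSU's maximum rated output
supply_voltage = 120.0  # V, the mains supply

full_load_current = psu_rating / supply_voltage  # In = P / Vn
print(full_load_current)  # 2.5 (amperes at full load)
```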
There are things inherent in all electrical circuits that have a bearing on the actual power used, and these are called losses. They occur because electrical circuits are far from perfect, and they take many forms. The average PSU in a computer operates at about 80% efficiency due to the nature of the circuits employed. This means that for the PSU to deliver its rated 300 W (VA), it will actually draw about 375 W (375 VA) from the supply.
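The efficiency figure can be folded in like this; the 80% value is the article's assumed average, not a measured one:

```python
rated_output = 300.0  # W, the PSU's nameplate rating
efficiency = 0.80     # assumed average efficiency of a computer PSU

# To deliver its rated output, the PSU must draw more from the wall:
input_power = rated_output / efficiency
print(input_power)  # 375.0 (watts drawn from the supply)
```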
Now we have that information, we can begin to look at a more accurate running cost.
Whilst we know that your computer will rarely, if ever, run at full PSU load, we will use the full load to give the MAXIMUM cost any user should see.
Your PSU, as we have shown above, draws up to 375 W (VA) of electrical power from the supply, so the cost calculation is rather simple.
Power used at maximum (Pmax) = 375va
So now calculate the total energy used in 24 hours:
P(max) * 24 hrs = 375 * 24 = 9000 Wh, or 9 kWh
If you pay, for example, 10 cents for each unit (kWh) of electricity, then the cost of running the machine is:
9 * 10 = 90c per day
If you wish to know the monthly cost, then calculate as follows:
90 * 7 * 4.3 = 2709c, or $27.09 per calendar month.
The 4.3 in the above equation is the average number of weeks in a calendar month (7 * 4.3 ≈ 30.1 days), the multiplier required to put any daily figure on a calendar-month basis.
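The whole daily and monthly calculation in Python, using the example 10-cent rate (your own tariff will differ):

```python
input_power_w = 375.0       # maximum draw including PSU losses
rate_cents_per_kwh = 10.0   # example unit cost of electricity
weeks_per_month = 4.3       # average weeks in a calendar month

daily_energy_kwh = input_power_w * 24 / 1000  # 9 kWh per day
daily_cost_cents = daily_energy_kwh * rate_cents_per_kwh
monthly_cost_cents = daily_cost_cents * 7 * weeks_per_month

print(daily_cost_cents)                    # 90.0 (cents per day)
print(round(monthly_cost_cents / 100, 2))  # 27.09 (dollars per month)
```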
Now we all know that electricity costs vary across the nation and from supplier to
supplier. If you have a look at your last bill, you should see the unit cost of electricity
clearly indicated on there. This is the figure you need to use to calculate the
approximate cost of running a PC 24/7 for a month.
Some of the information in Part 1 at the top may seem a little irrelevant to the cost calculation; however, it is information you can use to work out all sorts of costings if you know the rating of various pieces of equipment.
Part 3. Estimates:
The following estimates are based on these assumptions: every PSU runs at 80% efficiency (about average), and each kWh of electricity costs 6 cents ($0.06).
Please note that you will need to adjust the calculation for the tariff charged by your electricity supplier. This information will be available on your electricity bill.
Assumed maximum demand (incl. losses)   kWh per calendar month   Maximum cost at $0.06/kWh
293 W                                   211.66                   $12.70
375 W                                   270.90                   $16.25
437.5 W                                 316.05                   $18.96
500 W                                   361.20                   $21.67
562.5 W                                 406.35                   $24.38
625 W                                   451.50                   $27.09
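The estimates above can be reproduced with a short loop; it assumes the same 80%-efficiency draw figures and the example 6-cent tariff:

```python
hours_per_month = 24 * 7 * 4.3  # ~722.4 hours in a calendar month
rate_dollars_per_kwh = 0.06     # example tariff: 6 cents per kWh

for demand_w in (293, 375, 437.5, 500, 562.5, 625):
    kwh = demand_w * hours_per_month / 1000  # energy per month
    cost = kwh * rate_dollars_per_kwh        # dollars per month
    print(f"{demand_w} W -> {kwh:.2f} kWh/month -> ${cost:.2f}")
```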
As you can see, the costs escalate incrementally, as you would expect. These costs do not include running a monitor; since a monitor is not on 24 hours a day, its cost is minimal, a few extra dollars a month.
The figures quoted above apply only if the machine runs at its maximum potential load at all times. In most cases a machine will probably consume only about 50% of the estimated maximum, and cost accordingly, though this will vary with hardware and usage. The maximum consumption listed above is accurate; the actual cost will vary with your supplier's price per kWh.
Hope this is of help to those whose spouses, partners or parents are concerned about the cost of allowing a machine to run 24/7.
This is a wrong calculation. If a power supply is rated at 300W, it does not mean it will consume that all the time. That is the _maximum_ power the PSU can provide. Real consumption will be much less.
This helps! Thanks for your energy put forth here.
Watt != VA http://en.wikipedia.org/wiki/Volt-ampere
Is there any extra cost incurred in powering up the computer and powering it down?
The higher power rating of a psu does not mean it will use more power, only it is capable of delivering more IF needed
You might want to factor in the cost of wear and tear on the computer. Everything is rated for only so many hours of operation.
I live in Mass and my last bill states .0728c per watt, but there is also an additional .05462 included based on fixed transition costs and whatnot.
He stated that the power supply does not consume its max power all the time. The 0.06c was only an example. Wikipedia is not a valid reference in any valid argument. And it seems most of you need to read the entire article, judging from your replies. Good article. The math is sound and it does answer some questions I have about our current office power consumption for the related devices. Thank you.
I have a device that measures power draw from anything plugged into it: it plugs into the outlet, your power cord plugs into the device, and an LCD display shows the draw. I have an older computer set up for 24/7 BOINC operations, with no monitor, peripherals, or sound, just the case plugged in and on-board cards. It has a Smithfield Pentium D 2.66 GHz dual core (95W processor); the GPU is not used for processing. The PSU is 450 watts max. The actual power draw at 100% processing is about 245 watts. That's just FYI and a guide for anyone who was shocked to see how much their 600-watt power supplies are costing them. Check out your power company's website to see if they have a calculator to help you figure out how much you are being charged; it varies widely all over the world.
Thanks for the explanation. Regardless of the W = VA question or whether a 300W power supply consumes 300 W all the time, the final results in your examples are wrong: you should divide the cost by 100 because the kWh cost is in cents. So, based on your assumptions, a 500W PSU will cost $0.2709 if 1 kWh costs 0.06c:

((500 * 100 / 80) / 1000) * 24 * 7 * 4.3 = 451.5 kWh for one month
451.5 * 0.06c = 27.09c = $0.2709 for one month

If we assume your assumptions are correct, then one simple formula will be:

Let W = maximum PSU power, with 80% efficiency.
Then cost per month in dollars = W * 0.903 * cost in cents / 100
500 * 0.903 * 0.06 / 100 = $0.2709 per month