Running a 1200 W professional server 24/7 costs about $1,500 a year

A 1200 W professional rack server running 24/7 at 80% load (960 W effective draw) costs about $1,513.73 a year at $0.18/kWh, or $126.14 a month; run flat out at its full 1200 W rating it would be closer to $1,890. That's roughly 8,410 kWh annually, similar to the electricity consumption of a small second home or an efficient electric car doing 30,000 miles a year.

Annual cost: $1,514 (at $0.18/kWh)
Monthly cost: $124 (24 h/day at 80% load)
Annual consumption: 8,410 kWh (1200 W rated)

Calculator

Power Cost Estimator

The interactive estimator takes the device's rated wattage (check PCPartPicker or BuildCores if you're not sure of your build's draw), a load factor, hours of use per day, and an electricity rate in $/kWh (World Population Review lists average prices by country). With this scenario's inputs (1200 W rated at 80% load for an effective 960 W, 24 hours a day, $0.18/kWh) it returns:

Daily cost: $4.15
Monthly cost: $124.42
Yearly cost: $1,513.73
Annual consumption: 8,410 kWh

How this is calculated
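
The numbers above are plain arithmetic: rated watts times load factor gives effective draw; multiply by hours per day and by 365, divide by 1,000 to convert watt-hours to kilowatt-hours, then multiply by the electricity rate. Here is a minimal TypeScript sketch that reproduces this scenario; the function and variable names are mine, not the calculator's internals.

// Reproduce the headline scenario: 1200 W rated, 80% load, 24 h/day, $0.18/kWh.
function estimateAnnual(ratedWatts: number, loadFactor: number, hoursPerDay: number, ratePerKwh: number) {
  const effectiveWatts = ratedWatts * loadFactor;                 // 1200 * 0.8 = 960 W
  const kwhPerYear = (effectiveWatts * hoursPerDay * 365) / 1000; // 8,409.6 kWh
  const yearly = kwhPerYear * ratePerKwh;
  const daily = yearly / 365;
  return { kwhPerYear, daily, monthly: daily * 30, yearly };      // a 30-day month reproduces the $124.42 figure above
}

const server = estimateAnnual(1200, 0.8, 24, 0.18);
console.log(Math.round(server.kwhPerYear)); // 8410
console.log(server.daily.toFixed(2));       // "4.15"
console.log(server.monthly.toFixed(2));     // "124.42"
console.log(server.yearly.toFixed(2));      // "1513.73"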

This is the cost that makes colocation pricing — typically $50-$150/month for a 1U rack unit with power included — actually reasonable. Running the same server at home means paying for the electricity directly, plus cooling (a 1200 W heat source in a small room adds noticeable HVAC load), plus a UPS that keeps it online through brief power cuts. Enterprise data centres amortise this across thousands of servers in climate-controlled buildings with bulk electricity contracts, which is why hyperscalers can rent the equivalent capacity for $70/month.
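
Putting that comparison in numbers, with $100/month taken as a purely illustrative midpoint of the colo range above: the electricity alone already costs more than the quote, before cooling or a UPS enters the picture.

// Electricity-only cost of self-hosting the 960 W effective load for a 30-day month.
const selfHostPowerOnly = (960 * 24 * 30 / 1000) * 0.18; // about $124.42, before cooling and UPS
const coloQuote = 100;                                   // illustrative 1U colo price, power included
console.log(selfHostPowerOnly.toFixed(2), selfHostPowerOnly > coloQuote); // "124.42" true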

Verdict

$1,514 a year is the number that makes colo or cloud attractive for any always-on workload heavier than a homelab. Below 500 W it usually still pays to self-host; at 1200 W the math swings hard the other way once you add cooling and a UPS.

Frequently asked questions

How much does it cost to run a gaming PC 24/7?
At 300 W continuous draw and €0.30/kWh, a gaming PC left on 24/7 costs about €788 per year. Cut that to 8 hours of gaming per day and the gaming hours alone cost roughly €260/year, plus whatever the machine draws at idle the rest of the time (see the sketch after this FAQ). Sleep mode and turning the PC off when not in use are the biggest levers.
How do I calculate kWh from watts?
Multiply wattage by hours, then divide by 1,000. A 500 W device running for 4 hours uses 500 × 4 ÷ 1,000 = 2 kWh. Multiply by your electricity rate (e.g. €0.30/kWh) to get the cost.
What uses more electricity: an RTX 4090 or a typical fridge?
On peak power the RTX 4090 wins (450 W vs roughly 100-150 W of compressor draw). On annual consumption it's the opposite: a fridge runs 24/7 at an average of 40-50 W, adding up to around 400 kWh/year, which is more than many gaming PCs that only see a few hours of use a day.
Does a 1000 W PSU use 1000 W all the time?
No. A PSU is rated for its maximum output capacity; actual draw depends on what the components need. A system with a 1000 W PSU at idle might pull 80 W from the wall, jumping to 400-600 W under gaming load.
How much does an always-on TV or server cost per year?
At 100 W continuous draw and €0.30/kWh, any device running 24/7 costs €263/year. A home NAS at 50 W costs €131/year; a 65-inch OLED TV watched 4 hours/day at 150 W costs about €66/year.
Why is my electricity bill higher than the estimator predicts?
The estimator assumes the wattage you enter is the average over runtime. Real devices spike, idle, and have standby draw. Measure with a cheap wall-socket power meter for the most accurate baseline, then use the estimator to project costs for different usage scenarios.
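
As a worked version of the gaming-PC answer and of projecting costs for different usage scenarios, here is a short sketch. The 300 W active draw and €0.30/kWh rate come from the answer above; the 50 W idle draw is an assumed figure for a typical build, not a measurement.

const RATE = 0.30; // €/kWh

// Yearly cost from an active-use block plus an idle block each day.
function yearlyCost(activeWatts: number, activeHours: number, idleWatts: number, idleHours: number): number {
  const dailyKwh = (activeWatts * activeHours + idleWatts * idleHours) / 1000;
  return dailyKwh * 365 * RATE;
}

console.log(yearlyCost(300, 24, 0, 0).toFixed(0));  // "788" (left on 24/7 at full draw)
console.log(yearlyCost(300, 8, 50, 16).toFixed(0)); // "350" (8 h gaming plus 16 h at an assumed 50 W idle)
console.log(yearlyCost(300, 8, 0, 0).toFixed(0));   // "263" (8 h gaming, powered off the rest)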