Running a 500 W gaming PC 4 h/day costs about $105 a year

A mid-range 500 W gaming PC running 4 hours a day at a realistic 80% load costs about $105.12 a year to power, at a US-average rate of $0.18/kWh. That's roughly $8.76 a month and works out to 584 kWh of annual consumption, around 5% of an average US household's total electricity use.

Annual cost: $105 (at $0.18/kWh)
Monthly cost: $9 (4 h/day at 80% load)
Annual consumption: 584 kWh (500 W rated)

Calculator

Power Cost Estimator (interactive): enter your system's rated wattage (check PCPartPicker or BuildCores if you're unsure), a load percentage, hours of use per day, and your electricity rate (World Population Review lists rates by country). With the defaults of 500 W at 80% load (400 W effective), 4 hours a day and $0.18/kWh, it reports $0.29 a day, $8.64 a month and $105.12 a year, or 584 kWh of annual consumption.

How this is calculated

Most real-world gaming sessions don't pull a PC's rated maximum wattage for long: peak draw only happens during the most demanding scenes. The 80% utilisation assumption used here captures typical AAA gaming; esports at lower resolutions may run closer to 50%. Desktop idle (web, video) is usually under 10% on modern builds thanks to Intel's P/E cores and AMD's CCD power gating, which brings the effective running cost much lower if most of your PC-on time isn't gaming.
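To reproduce the headline numbers, here is a minimal sketch of the arithmetic in TypeScript. It assumes a flat per-kWh tariff and a constant average load factor; the function name and shape are illustrative, not the calculator's actual source.

```ts
// Estimate yearly energy use and cost from rated wattage, average load factor,
// daily runtime and a flat electricity tariff.
function annualPowerCost(
  ratedWatts: number,   // rated/maximum system draw, e.g. 500
  loadFactor: number,   // average fraction of rated power actually drawn, e.g. 0.8
  hoursPerDay: number,  // hours of use per day, e.g. 4
  pricePerKwh: number   // tariff, e.g. 0.18 (USD/kWh)
): { kwhPerYear: number; costPerYear: number } {
  const effectiveWatts = ratedWatts * loadFactor;           // 500 * 0.8 = 400 W
  const kwhPerDay = (effectiveWatts * hoursPerDay) / 1000;  // 400 * 4 / 1000 = 1.6 kWh
  const kwhPerYear = kwhPerDay * 365;                       // 584 kWh
  return { kwhPerYear, costPerYear: kwhPerYear * pricePerKwh };
}

// The article's default scenario:
console.log(annualPowerCost(500, 0.8, 4, 0.18));
// -> { kwhPerYear: 584, costPerYear: 105.12 }
```

Swap the load factor down to roughly 0.5 for low-resolution esports, or under 0.1 for desktop idle, to see how much the usage mix moves the total.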

Verdict

$105 a year for a mid-range build is a small fraction of what the hardware itself cost. Electricity is rarely the deciding factor on a consumer desktop; it's the all-day servers and workstations where the difference between a 500 W-class and an 850 W-class system starts to matter financially.


Frequently asked questions

How much does it cost to run a gaming PC 24/7?
At 300 W continuous draw and €0.30/kWh, a gaming PC left on 24/7 costs about €788 per year. Cut that to 8 hours of gaming a day and the gaming portion drops to roughly €260/year, plus whatever the machine draws while idling the rest of the time. Sleep mode and turning the PC off when not in use are the biggest levers.
How do I calculate kWh from watts?
Multiply wattage by hours, then divide by 1,000. A 500 W device running for 4 hours uses 500 × 4 ÷ 1,000 = 2 kWh. Multiply by your electricity rate (e.g. €0.30/kWh) to get the cost.
What uses more electricity: an RTX 4090 or a typical fridge?
On peak power, the RTX 4090 wins (450 W versus roughly 100-150 W of compressor draw). Annual consumption is the opposite: a fridge runs 24/7 at an average of 40-50 W, adding up to around 400 kWh/year, which is more than many gaming PCs used only a few hours daily.
Does a 1000 W PSU use 1000 W all the time?
No. A PSU is rated for its maximum output capacity; actual draw depends on what the components need. A system with a 1000 W PSU at idle might pull 80 W from the wall, jumping to 400-600 W under gaming load.
How much does an always-on TV or server cost per year?
At 100 W continuous draw and €0.30/kWh, any device running 24/7 costs €263/year. A home NAS at 50 W costs €131/year; a 65-inch OLED TV watched 4 hours/day at 150 W costs about €66/year.
Why is my electricity bill higher than the estimator predicts?
The estimator assumes the wattage you enter is the average over runtime. Real devices spike, idle, and have standby draw. Measure with a cheap wall-socket power meter for the most accurate baseline, then use the estimator to project costs for different usage scenarios.
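If you do measure with a wall meter, projecting a yearly figure from the reading is the same watt-hour arithmetic, with the measured average substituted for the rated-wattage-times-load estimate. A rough sketch, again assuming a flat tariff; the example numbers are illustrative:

```ts
// Project yearly cost from a measured average wall draw instead of a rated figure.
// measuredWatts: average reading from a plug-in power meter over a typical session.
const yearlyCost = (measuredWatts: number, hoursPerDay: number, pricePerKwh: number) =>
  ((measuredWatts * hoursPerDay * 365) / 1000) * pricePerKwh;

// Example: the meter shows a 320 W average while gaming, 4 h a day at €0.30/kWh.
console.log(yearlyCost(320, 4, 0.30)); // ≈ €140 per year
```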