Running a 230 W gaming laptop 4 h/day costs about $48 a year

A 230 W gaming laptop running 4 hours a day at 80% utilisation costs about $48.36 a year at $0.18/kWh — $4.03 a month and 269 kWh annually. That's less than half what a comparable-performance 500 W gaming desktop costs to run on the same schedule, entirely because of mobile silicon's lower power budget.

Annual cost: $48 (at $0.18/kWh)
Monthly cost: $4 (4 h/day at 80% load)
Annual consumption: 269 kWh (230 W rated)
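
Those figures follow from a single line of arithmetic. A quick check in Python, using the article's stated inputs (variable names are illustrative):

```python
# Quick check of the headline figures: rated wattage derated by
# utilisation, scaled to a year. Inputs are the article's assumptions.
rated_w, utilisation = 230, 0.80
hours_per_day, rate = 4, 0.18                    # $/kWh

kwh_per_year = rated_w * utilisation * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year")            # -> 269 kWh/year
print(f"${kwh_per_year * rate:.2f}/year")        # -> $48.36/year
print(f"${kwh_per_year * rate / 12:.2f}/month")  # -> $4.03/month
```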

Calculator

Power Cost Estimator

With the defaults (230 W rated, 80% load for an effective power of 184 W, 4 h/day, $0.18/kWh), the estimator returns:

Daily cost: $0.13
Monthly cost: $3.97
Yearly cost: $48.36
Annual consumption: 269 kWh/year

Not sure of your device's wattage? Check PCPartPicker or BuildCores. For local electricity prices, check World Population Review.
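
For reference, here is a minimal sketch of the estimator's arithmetic, assuming effective power is simply rated wattage times average load, and a 30-day month for the monthly figure (which is why it reads $3.97 rather than the $4.03 you get dividing the annual cost by 12). The function name is mine:

```python
def estimate_cost(rated_w, load, hours_per_day, rate_per_kwh):
    """Project electricity cost from rated wattage and average load.

    Effective power = rated wattage x load, matching the estimator's
    184 W readout for 230 W at 80% load.
    """
    kwh_per_day = rated_w * load * hours_per_day / 1000
    return {
        "daily": kwh_per_day * rate_per_kwh,
        "monthly": kwh_per_day * 30 * rate_per_kwh,   # 30-day month
        "yearly": kwh_per_day * 365 * rate_per_kwh,
        "kwh_per_year": kwh_per_day * 365,
    }

# The article's defaults: 230 W rated, 80% load, 4 h/day, $0.18/kWh
for key, value in estimate_cost(230, 0.80, 4, 0.18).items():
    print(key, round(value, 2))
# daily 0.13, monthly 3.97, yearly 48.36, kwh_per_year 268.64
```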

How this is calculated

Laptop peak wattage is bounded by the AC adapter: a 230 W brick means the laptop can't sustain more than 230 W regardless of workload. In games, a 230 W gaming laptop typically pulls 180-210 W from the wall, a real-world ceiling that vendors tune mobile GPU power limits around. On battery, performance drops sharply (often 30-40%) because the platform self-limits to preserve runtime, so the meaningful electricity-cost figure assumes plugged-in use.
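
If you have a wall-meter reading rather than a rated wattage, the same formula brackets the cost directly. A sketch using the article's 180-210 W in-game range (the helper name is mine):

```python
def annual_cost_usd(wall_watts, hours_per_day, rate_per_kwh=0.18):
    """Annual cost for a device drawing `wall_watts` at the wall."""
    return wall_watts * hours_per_day * 365 / 1000 * rate_per_kwh

# Bracket the yearly cost with the measured in-game draw range
# (180-210 W) instead of rated wattage x utilisation.
print(f"${annual_cost_usd(180, 4):.2f} - ${annual_cost_usd(210, 4):.2f}")
# -> $47.30 - $55.19, bracketing the $48.36 headline estimate
```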

Verdict

$48 a year for mobile gaming is genuinely cheap: a full evening's session works out to pennies of electricity. Even if running cost mattered more to you than portability, the laptop would still come out ahead, since a matched-performance desktop would use about twice as much.


Frequently asked questions

How much does it cost to run a gaming PC 24/7?
At 300 W continuous draw and €0.30/kWh, a gaming PC left on 24/7 costs about €788 per year. Cut that to 8 hours per day of active gaming and the gaming itself drops to roughly €263/year, plus whatever the machine draws while idle the rest of the time. Sleep mode and turning the PC off when not in use are the biggest levers.
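The same watts × hours × rate arithmetic, sketched in Python (the ~50 W idle figure is an assumption):

```python
def yearly_eur(watts, hours_per_day, rate=0.30):
    """Annual cost in EUR for a given average draw and daily runtime."""
    return watts * hours_per_day * 365 / 1000 * rate

print(yearly_eur(300, 24))                      # always on      -> 788.4
print(yearly_eur(300, 8))                       # 8 h/day gaming -> 262.8
print(yearly_eur(300, 8) + yearly_eur(50, 16))  # + ~50 W idle (assumed) -> 350.4
```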
How do I calculate kWh from watts?
Multiply wattage by hours, then divide by 1,000. A 500 W device running for 4 hours uses 500 × 4 ÷ 1,000 = 2 kWh. Multiply by your electricity rate (e.g. €0.30/kWh) to get the cost.
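In code form (names are illustrative):

```python
def energy_cost(watts, hours, rate_per_kwh):
    """kWh = watts x hours / 1000; cost = kWh x rate."""
    kwh = watts * hours / 1000
    return kwh * rate_per_kwh

print(energy_cost(500, 4, 0.30))  # 2 kWh -> 0.6 (EUR 0.60)
```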
What uses more electricity: an RTX 4090 or a typical fridge?
On peak power, the RTX 4090 wins easily (450 W vs roughly 100-150 W of compressor draw). Annual consumption is the opposite story: a fridge runs 24/7 at an average of 40-50 W, adding up to around 400 kWh/year, more than many gaming PCs used for only a few hours daily.
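A rough comparison under the figures above; the 2 h/day of full-tilt gaming is my assumption, picked to illustrate how duty cycle dominates:

```python
gpu_kwh    = 450 * 2 * 365 / 1000   # RTX 4090 at its 450 W peak, 2 h/day
fridge_kwh = 45 * 24 * 365 / 1000   # fridge averaging ~45 W around the clock
print(gpu_kwh, fridge_kwh)          # -> 328.5 vs 394.2 kWh/year
```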
Does a 1000 W PSU use 1000 W all the time?
No. A PSU is rated for its maximum output capacity; actual draw depends on what the components need. A system with a 1000 W PSU at idle might pull 80 W from the wall, jumping to 400-600 W under gaming load.
How much does an always-on TV or server cost per year?
At 100 W continuous draw and €0.30/kWh, any device running 24/7 costs €263/year. A home NAS at 50 W costs €131/year; a 65-inch OLED TV watched 4 hours/day at 150 W costs about €66/year.
Why is my electricity bill higher than the estimator predicts?
The estimator assumes the wattage you enter is the average over runtime. Real devices spike, idle, and have standby draw. Measure with a cheap wall-socket power meter for the most accurate baseline, then use the estimator to project costs for different usage scenarios.