Running a 100 W home server 24/7 costs about $158 a year
A 100 W home server or NAS running 24/7 costs about $157.68 a year to power at $0.18/kWh, or $13.14 a month. That works out to 876 kWh annually for a machine that's doing something useful every hour of the year: file serving, Plex, backups, Home Assistant, or a dev box.
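The arithmetic behind those figures is simple enough to sketch in a few lines, using the wattage, rate, and hours stated above:

```python
# Annual electricity cost for an always-on device drawing a steady 100 W.
WATTS = 100                 # average draw in watts
RATE = 0.18                 # $ per kWh
HOURS_PER_YEAR = 24 * 365   # 8,760 hours

kwh_per_year = WATTS * HOURS_PER_YEAR / 1000    # 876 kWh
annual_cost = kwh_per_year * RATE               # $157.68
monthly_cost = annual_cost / 12                 # $13.14

print(f"{kwh_per_year:.0f} kWh/yr -> ${annual_cost:.2f}/yr (${monthly_cost:.2f}/mo)")
```

Swap in your own wattage and local rate to reproduce any scenario the estimator covers.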
How this is calculated
The big difference from a gaming PC isn't peak power; it's the "24/7" multiplier. A 100 W always-on device uses roughly as much electricity per day as an 850 W gaming PC running 4 hours a day at 80% load: 2,400 Wh versus about 2,720 Wh. That's why hobbyists obsess over efficient Mini PCs, Synology units, and Raspberry Pi-class servers: the watts matter in a way that doesn't apply to desktops that shut down at night.
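A quick sanity check of that comparison, using the same 850 W and 80% load figures from the paragraph above:

```python
# Daily energy use: always-on server vs. a gaming PC used 4 hours a day.
server_wh_per_day = 100 * 24          # steady 100 W -> 2,400 Wh/day
gaming_wh_per_day = 850 * 4 * 0.80    # 80% of 850 W for 4 h -> 2,720 Wh/day

print(f"server: {server_wh_per_day} Wh/day, gaming PC: {gaming_wh_per_day:.0f} Wh/day")
```

The always-on box lands in the same range despite drawing an eighth of the peak power.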
Verdict
$158 a year is the mental benchmark for "is self-hosting worth it?" It's below the cost of most paid cloud equivalents, but it's not free. And if your server idles at 30 W instead of drawing a steady 100 W, the bill drops to about $47 a year, less than a third of the benchmark.
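Under the same rate and hours assumptions, the idle scenario works out like this:

```python
RATE = 0.18        # $ per kWh, as above
HOURS = 24 * 365   # 8,760 hours per year

def annual_cost(watts: float) -> float:
    """Yearly cost in dollars for a constant draw of `watts`."""
    return watts * HOURS / 1000 * RATE

print(f"100 W steady: ${annual_cost(100):.2f}/yr")  # $157.68
print(f"30 W idle:    ${annual_cost(30):.2f}/yr")   # $47.30
```

Most real servers sit somewhere between those two numbers, spending most hours near idle with short bursts of load.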
More Server scenarios
Frequently asked questions
How much does it cost to run a gaming PC 24/7?
How do I calculate kWh from watts?
What uses more electricity: an RTX 4090 or a typical fridge?
Does a 1000 W PSU use 1000 W all the time?
How much does an always-on TV or server cost per year?
Why is my electricity bill higher than the estimator predicts?
Related tools
WebGPU Benchmark
Stress-test your GPU right in the browser with real WebGPU workloads and get a performance score.
RAM Latency Calculator
Convert DDR3/DDR4/DDR5 timings (CL, tRCD, tRP, tRAS) into true latency in nanoseconds.
RAID Calculator
Calculate usable capacity and fault tolerance for RAID 0, 1, 5, 6, and 10.
Data Transfer Calculator
Estimate transfer times for files over USB, WiFi, Ethernet, and more.