There’s increasing talk of gigawatt data centers. Currently the largest data center, Switch’s Citadel Campus in Nevada, uses 850 megawatts of power. OpenAI’s Stargate data center, under construction, is supposed to use 1.2 gigawatts.
Gigawatt
An average French nuclear reactor produces about a gigawatt of power. If the US were allowed to build nuclear reactors, we could simply build one reactor for every gigantic data center. Unfortunately, the Nuclear Regulatory Commission essentially prohibits the construction of profitable nuclear reactors.
An American home uses about 1,200 watts of power, so a gigawatt of electricity could power roughly 800,000 homes. So, roughly speaking, a gigawatt is a megahome.
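As a quick sanity check in Python, using the 1,200-watt average above:

    GIGAWATT = 1e9       # watts
    HOME_POWER = 1200    # watts, average US home (figure from above)

    print(f"{GIGAWATT / HOME_POWER:,.0f} homes")  # 833,333 -- roughly a megahome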
Gigawatt-year
A gigawatt is a unit of power, not energy. Energy is power multiplied by time.
A gigawatt-year is about 3 × 10¹⁶ joules, or 30 petajoules.
A SpaceX Starship launch releases about 50 terajoules of energy, so a gigawatt-year is roughly 600 Starship launches.
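The conversion is easy to verify; the only inputs are the definition of a watt and the 50 TJ launch figure above:

    SECONDS_PER_YEAR = 365.25 * 24 * 3600    # ~3.16e7 seconds

    gw_year = 1e9 * SECONDS_PER_YEAR         # ~3.16e16 J, about 30 petajoules
    launch  = 50e12                          # ~50 TJ per Starship launch

    print(gw_year / launch)                  # ~630, i.e. roughly 600 launches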
A couple of months ago I wrote about illustrating cryptographic strength in terms of the amount of energy needed to break a key, and how much water that much energy would boil. Let’s do something similar for a gigawatt-year.
It takes about 300 kilojoules of energy to boil a liter of water [1], so 30 petajoules would boil 100 billion liters of water. So a gigawatt-year of energy would be enough to boil Coniston Water, the third largest lake in England.
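We can check both the 300 kJ figure and the division. The specific heat of liquid water is about 4.18 kJ per kilogram per degree:

    SPECIFIC_HEAT = 4.18e3            # J/(kg·K) for liquid water
    DELTA_T = 100 - 20                # heating from 20 °C to 100 °C

    print(SPECIFIC_HEAT * DELTA_T)    # ~334,000 J, i.e. roughly 300 kJ per liter
    print(3e16 / 300e3)               # 1e11 liters: 100 billion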
If you could convert a kilogram of matter to energy according to E = mc², this would release 90 petajoules. So a gigawatt-year is the energy in about 300 grams of matter.
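And the mass–energy arithmetic:

    C = 3e8                            # speed of light, m/s

    mass = 3e16 / C**2                 # m = E / c², with E one gigawatt-year
    print(f"{mass * 1000:.0f} grams")  # 333 grams, roughly 300 g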
[1] In detail, “boiling” a liter of water here means raising its temperature from 20 °C to 100 °C at sea level. Actually vaporising the water would take far more energy: the latent heat of vaporisation of water is about 2.26 megajoules per kilogram, roughly seven times the energy needed just to bring it to the boiling point.