Powering AI is becoming an increasingly complex challenge. With over 10,000 data centers globally, and more than 5,000 in the United States alone, the demand for electricity is skyrocketing. U.S. data centers consumed more than 4% of the country's total electricity in 2023, and that share is expected to rise to 9% by 2030.
The sudden need for so many data centers is straining power grids, delaying the planned shutdown of some coal-fired power plants, and raising electricity prices for residential consumers. Meeting data centers' needs is also slowing the transition to clean energy, because there isn't enough renewable capacity to serve both hyperscalers and existing users.
The rapid development and deployment of powerful generative AI models also come with environmental consequences, including increased water consumption. To address these challenges, companies are turning to alternative sources of energy, such as nuclear power. For example, Microsoft has signed a deal to buy power from a reopened nuclear reactor at Three Mile Island.
Google has also ordered a fleet of small modular nuclear reactors to generate power for its data centers. Researchers are exploring ways to make data centers more energy-efficient, such as shifting computing tasks to times and places where carbon-free energy is available on the grid.
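The time-shifting idea above can be sketched in code. The function below is a minimal illustration (not any specific company's scheduler): given a hypothetical hourly forecast of grid carbon intensity, it picks the start hour whose window has the lowest average intensity for a deferrable batch job. The forecast values are invented for illustration.

```python
from typing import List


def best_start_hour(forecast: List[float], duration: int) -> int:
    """Return the start index whose `duration`-hour window has the
    lowest total forecast carbon intensity (gCO2/kWh per hour).

    `forecast` is a hypothetical hourly carbon-intensity forecast;
    real schedulers would pull this from a grid-data provider.
    """
    if duration <= 0 or duration > len(forecast):
        raise ValueError("duration must fit within the forecast horizon")
    # Sliding-window sum: start with the first window, then slide.
    best_start = 0
    window = sum(forecast[:duration])
    best_total = window
    for start in range(1, len(forecast) - duration + 1):
        window += forecast[start + duration - 1] - forecast[start - 1]
        if window < best_total:
            best_start, best_total = start, window
    return best_start


# Invented 8-hour forecast with a midday dip (e.g. abundant solar):
forecast = [450, 430, 300, 120, 110, 140, 380, 460]
print(best_start_hour(forecast, 3))  # -> 3 (the 120/110/140 window)
```

A real carbon-aware scheduler would also weigh deadlines, data locality, and the option of moving work to a different region entirely, but the core selection step is this kind of windowed minimization.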
Companies are also investing in new technologies, such as next-generation geothermal projects and fusion power plants, to provide clean energy for data centers. As the demand for AI continues to grow, finding sustainable and efficient ways to power these technologies will be essential.