The rising demand for AI data centers is straining global power grids, driving up electricity costs, and raising environmental concerns. Over the past year, the average U.S. price of electricity rose from 16.41 cents to 17.47 cents per kilowatt-hour, a 6.5% increase. That surge is largely attributed to the growing energy demands of the massive data centers built to train and run AI models.
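For reference, the 6.5% figure follows directly from the two prices quoted above (a quick back-of-the-envelope check, not a number from the source):

\[
\frac{17.47 - 16.41}{16.41} \approx 0.0646 \approx 6.5\%
\]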
Some states are feeling the pinch more than others: electricity prices jumped 36.3% in Maine, 18.4% in Connecticut, and 15.2% in Utah, increases driven in large part by rapid data center development in those areas. The demands of existing and proposed data center projects are stressing the grid and its operators, particularly in regions like "Data Center Alley" in the mid-Atlantic, where a recent capacity auction sent prices skyrocketing and further increases are projected.
Tech giants like Google and Meta are driving the demand for data centers, with plans to invest billions in new projects. Google has agreed to curb its data centers' energy usage during peak hours, yet it also plans to pour $25 billion into new data center projects. According to a report from the RAND Corporation, global power demand from AI data centers could hit 327 gigawatts by 2030, roughly a quarter of the United States' current grid capacity of 1,280 GW. That demand could push electricity prices higher still and add strain to the grid.
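Taking the RAND projection and the stated grid capacity at face value, the share works out to roughly a quarter rather than 30% (a back-of-the-envelope check using the numbers above):

\[
\frac{327\ \text{GW}}{1{,}280\ \text{GW}} \approx 0.2555 \approx 25.5\%
\]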