Artificial intelligence (AI) is increasingly part of our daily lives, but it carries a significant environmental cost. The data centers that power AI consume substantial amounts of energy and water, contributing to greenhouse gas emissions and climate change.
A simple AI prompt, such as "Tell me the capital of France," uses 23 times more energy than the same question typed into Google without its AI Overview feature. Data centers also require large amounts of water to stay cool; some consume up to 5 million gallons per day, equivalent to the daily water demand of a town of 50,000 people.
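As a rough sanity check on those figures, the sketch below works out what they imply. The article supplies the 5-million-gallon and 50,000-person numbers and the 23x multiplier; the ~100 gallons-per-person-per-day comparison and the 0.3 Wh baseline for a plain web search are assumptions added here for illustration, not figures from the article.

```python
# Back-of-the-envelope checks on the figures cited above.

# Water: a data center drawing 5 million gallons/day vs. a town of 50,000 people.
DATA_CENTER_GALLONS_PER_DAY = 5_000_000
TOWN_POPULATION = 50_000

implied_per_person = DATA_CENTER_GALLONS_PER_DAY / TOWN_POPULATION
print(f"Implied per-capita use: {implied_per_person:.0f} gallons/person/day")
# ~100 gallons/person/day, roughly in line with typical US residential use
# (assumption, not a figure from the article).

# Energy: if a plain web search uses about 0.3 Wh (an assumed baseline),
# a prompt that costs 23 times as much works out to roughly 6.9 Wh.
PLAIN_SEARCH_WH = 0.3   # assumed baseline, watt-hours per search
AI_MULTIPLIER = 23      # multiplier cited in the article

ai_prompt_wh = PLAIN_SEARCH_WH * AI_MULTIPLIER
print(f"Estimated energy per AI prompt: {ai_prompt_wh:.1f} Wh")
```

A few watt-hours per prompt sounds small, but multiplied across billions of queries per day it adds up to a meaningful share of data center demand.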
The energy used by data centers is often generated from fossil fuels, driving up greenhouse gas emissions. Furthermore, the Jevons Paradox suggests that as a technology becomes more efficient, we tend to use it more, offsetting any potential energy savings. So even as tech companies work to make data centers more efficient, the environmental impact of AI may not shrink, because rising usage can cancel out those gains.
To mitigate the climate cost of AI, it's essential to be mindful of our usage and adopt more sustainable practices. Limiting tech use, choosing energy-efficient AI tools, and opting for locally run AI models can help reduce energy consumption. Some search engines, like Ecosia, use efficient algorithms and plant trees to offset their carbon footprint.
By acknowledging AI's environmental impact and taking steps to reduce unnecessary use, we can help limit the climate cost of this technology. As AI becomes more integrated into our daily lives, it's crucial to consider the long-term consequences of how we use it and to push for a more sustainable future.