Artificial intelligence (AI) is a significant energy consumer, with demand reportedly doubling every 100 days as generative AI tools become embedded in everyday life. A recent UNESCO report argues that simple changes could slash AI's substantial energy footprint, cutting energy use by up to 90% without compromising performance.
The report finds that smaller models designed for specific tasks can match the performance of larger general-purpose systems while using far less energy: in its tests, small models used 15 times less energy for summarization, 35 times less for translation, and 50 times less for question answering. Shortening prompts and responses further reduces consumption, since a model's energy use scales with the number of tokens it processes.
The energy footprint of generative AI is already substantial: ChatGPT alone is estimated to consume around 564 MWh of electricity daily, enough to power about 18,000 American homes. The AI industry as a whole is projected to consume 85-134 TWh annually by 2027, roughly on par with Bitcoin mining. That annual footprint is equivalent to a low-income country's, and it is growing exponentially.
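The "18,000 homes" comparison can be sanity-checked with back-of-envelope arithmetic. The household figure below is an assumption on my part (U.S. EIA data puts average residential use near 30 kWh per day), not a number from the report:

```python
# Back-of-envelope check of the "18,000 American homes" comparison.
# HOME_DAILY_KWH is an assumed average (roughly in line with EIA
# residential figures), not a value taken from the UNESCO report.

CHATGPT_DAILY_MWH = 564   # daily consumption cited in the text
HOME_DAILY_KWH = 30       # assumed average US household usage

homes_powered = CHATGPT_DAILY_MWH * 1000 / HOME_DAILY_KWH
print(f"~{homes_powered:,.0f} homes")  # → ~18,800 homes
```

At ~30 kWh per home per day, 564 MWh works out to roughly 18,800 homes, consistent with the article's rounded figure.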
By implementing simple changes, such as using smaller models, shortening prompts and responses, and applying quantization (storing model weights at lower numerical precision), we can make AI more sustainable and reduce its environmental impact. These strategies can help mitigate the growing energy demands of AI and promote a more environmentally friendly approach to AI development and deployment.
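To illustrate what quantization means in practice, here is a minimal sketch of mapping 32-bit floating-point weights onto 8-bit integers with a single scale factor. This is a toy illustration of the storage-precision trade-off, not the scheme any particular framework uses; production tools (e.g. PyTorch's quantization utilities) are considerably more sophisticated:

```python
# Toy post-training quantization: floats -> signed 8-bit integers.
# Illustrative only; real frameworks use per-channel scales,
# zero points, calibration, and more.

def quantize(weights, bits=8):
    """Map floats onto signed integers of the given bit width."""
    qmax = 2 ** (bits - 1) - 1                 # 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float values from the integers."""
    return [q * scale for q in quantized]

weights = [0.12, -0.53, 0.97, -0.08]           # hypothetical weights
q, scale = quantize(weights)
restored = dequantize(q, scale)
# Each int8 value needs 4x less memory than a float32,
# at the cost of a small per-weight rounding error (at most scale/2).
```

Because smaller integer weights mean less memory traffic and cheaper arithmetic, quantized models typically consume less energy per inference, which is why the report lists the technique alongside smaller models and shorter prompts.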