A recent analysis shows that the new wave of AI models — those designed to mimic human‑like reasoning rather than just generate simple text — consume vastly more energy than older models. On average, reasoning‑enabled models used roughly 100 times more energy to respond to 1,000 written prompts than equivalent models with reasoning disabled.
In concrete terms: a simplified version of one model consumed about as much energy as a 50‑watt lightbulb running for an hour when reasoning was turned off; with reasoning on, the same test demanded orders of magnitude more energy. Many of these reasoning‑enabled systems are now becoming the norm, especially for tasks that require multi‑step logic, planning, or long‑form output.
This shift is drawing fresh scrutiny to AI’s environmental and infrastructural footprint. As energy‑hungry reasoning models proliferate, the strain on power grids and data‑center infrastructure is growing, raising concerns about rising electricity costs, increased carbon emissions, and the sustainability of large‑scale AI deployment.
At the same time, researchers and industry voices caution that this does not mean advanced AI must be rejected categorically, but rather that it should be used more thoughtfully. The call is to match the right model to the right task, avoid defaulting to heavy reasoning models when simpler ones suffice, and build better electricity‑planning and sustainability standards for AI infrastructure.