A recent study suggests that ChatGPT might not be as power-hungry as previously believed, though its energy use depends heavily on usage patterns and the AI models in operation. According to research conducted by Epoch AI, a nonprofit AI research institute, prior estimates of ChatGPT’s energy consumption were significantly exaggerated.
A widely cited claim suggested that ChatGPT required around 3 watt-hours of power per query—10 times the energy of a Google search. However, Epoch’s analysis found that the actual energy usage of OpenAI’s latest model, GPT-4o, is closer to 0.3 watt-hours per query, which is less than many household appliances consume.
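To put the two per-query figures in perspective, here is a back-of-envelope sketch. The 3 Wh and 0.3 Wh numbers come from the article; the daily query count and the LED bulb wattage are illustrative assumptions, not figures from the study.

```python
# Back-of-envelope comparison of the two per-query energy estimates.
# The query count and bulb wattage below are assumptions for illustration.

OLD_ESTIMATE_WH = 3.0    # widely cited earlier figure, watt-hours per query
EPOCH_ESTIMATE_WH = 0.3  # Epoch AI's estimate for GPT-4o, watt-hours per query

queries_per_day = 15     # hypothetical daily usage for one person

old_daily_wh = OLD_ESTIMATE_WH * queries_per_day    # 45.0 Wh/day
new_daily_wh = EPOCH_ESTIMATE_WH * queries_per_day  # 4.5 Wh/day

# A 10 W LED bulb uses 10 Wh per hour, so a day of queries under the
# newer estimate equals under half an hour of running that bulb.
led_bulb_hours = new_daily_wh / 10

print(f"Old estimate: {old_daily_wh} Wh/day, new estimate: {new_daily_wh} Wh/day")
print(f"Equivalent LED-bulb runtime: {led_bulb_hours:.2f} h")
```

Under these assumptions, even a heavy day of text queries stays well below the energy used by common household devices, which is the comparison Epoch's analyst draws below.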
Joshua You, a data analyst at Epoch, explained that the study was conducted to challenge outdated research that assumed OpenAI used older, less efficient hardware. He pointed out that while AI’s overall energy demands are growing, misconceptions about current AI power usage persist. “The energy use is not a big deal compared to normal appliances, heating or cooling your home, or driving a car,” You stated.
Despite this, concerns about AI’s environmental impact are mounting. Over 100 organizations recently published an open letter urging AI companies and regulators to ensure that expanding AI infrastructure does not deplete natural resources or increase reliance on nonrenewable energy sources.
Epoch’s analysis, while more precise than earlier estimates, still has limitations. OpenAI has not released detailed data that would allow for an exact calculation. Additionally, the study does not account for power-intensive features like image generation or long-form queries with large file attachments.
Looking ahead, AI’s energy consumption is expected to rise as models become more advanced and widespread. A report from Rand predicts that by 2030, training a single frontier AI model could require the power output of eight nuclear reactors (8 GW). AI data centers may also need nearly all of California’s 2022 power capacity (68 GW) within the next two years.
OpenAI and other major players in the AI industry are investing billions in expanding data center infrastructure to meet increasing demand. At the same time, the industry is shifting toward reasoning models, which take longer to generate responses but are more capable of handling complex tasks. Unlike GPT-4o, which delivers near-instantaneous replies, these reasoning models “think” for several seconds to minutes before responding—requiring significantly more computing power.
While OpenAI has introduced more energy-efficient models like o3-mini, efficiency gains may not be enough to counterbalance the rising energy demands of reasoning models and AI’s growing global usage.
For those concerned about AI’s environmental impact, You suggests minimizing ChatGPT usage where possible or opting for smaller, less power-intensive models like GPT-4o-mini. “You could try using smaller AI models and sparingly use them in a way that requires processing or generating a ton of data,” he advised.