OpenAI's GPT-5 consumes over eight times the power of GPT-4, researchers estimate

Ollie Chang, Taipei; Sherri Wang, DIGITIMES Asia

Credit: AFP

OpenAI's latest flagship model, GPT-5, delivers markedly stronger reasoning performance than its predecessor—but at a sharply higher environmental cost. According to a new analysis, each GPT-5 query consumes roughly 8.6 times the electricity required by GPT-4.

Power hunger jumps with advanced capabilities

Citing data from Tom's Hardware and The Guardian, the University of Rhode Island's AI Lab estimates that a single GPT-5 query uses an average of 18.35 watt-hours, compared with just 2.12 watt-hours for GPT-4. That puts GPT-5's energy draw behind only OpenAI's o3 reasoning model and China's DeepSeek R1 in the ranking of most power-hungry AI systems.
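As a quick arithmetic check, dividing the two published per-query averages reproduces the gap cited above; with these rounded figures the ratio comes out to about 8.7, in line with the roughly 8.6 times reported. The values below are simply the URI estimates quoted here, not independent measurements:

    # Per-query energy estimates cited by the URI AI Lab (watt-hours)
    gpt5_wh = 18.35
    gpt4_wh = 2.12
    ratio = gpt5_wh / gpt4_wh
    print(f"GPT-5 uses about {ratio:.1f}x the energy of GPT-4 per query")  # ~8.7x with these rounded inputs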

Shaolei Ren, a professor at the University of California, Riverside, says GPT-5's "thinking mode," which processes tasks for longer periods, can use five to ten times more power than a standard response. Its capacity to handle text along with images and video further increases the model's already substantial energy demands.

Estimates based on industry benchmarks

OpenAI has not disclosed deployment specifics. The URI team derived its estimates by multiplying average response times by hardware power consumption, drawing on configurations typical of Nvidia DGX H100 or H200 systems, Microsoft Azure data center efficiency metrics, and factors such as Power Usage Effectiveness (PUE), Water Usage Effectiveness (WUE), and Carbon Intensity Factors (CIF). The researchers cautioned that their calculations involve multiple assumptions and that actual consumption may vary.
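Because OpenAI has not published the underlying inputs, the snippet below only sketches the shape of that calculation; the response time, node power draw, and PUE values are assumptions chosen for illustration, not the URI team's actual parameters:

    # Illustrative estimate: energy per query ~= hardware power x response time x data center overhead
    # All inputs are assumed values, not OpenAI or URI figures.
    avg_response_time_s = 10.5   # assumed average seconds per GPT-5 response
    node_power_kw = 5.6          # assumed draw attributable to the query on a DGX H100-class node
    pue = 1.12                   # assumed data center Power Usage Effectiveness

    energy_wh = node_power_kw * (avg_response_time_s / 3600) * pue * 1000  # kWh -> Wh
    print(f"Estimated energy per query: {energy_wh:.2f} Wh")  # ~18.3 Wh under these assumptions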

Environmental impact scales with usage

OpenAI previously told Axios that ChatGPT handles up to 2.5 billion queries each day. If every one of those queries were processed by GPT-5, researchers estimate daily energy use could hit 45 gigawatt-hours, roughly equal to the daily output of two to three nuclear power plants and enough to supply electricity to 1.5 million US homes for a day.
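The daily total follows directly from OpenAI's reported traffic and the per-query estimate. The household comparison below assumes roughly 30 kWh of daily use per US home, a typical average rather than a figure from the article:

    queries_per_day = 2.5e9    # daily ChatGPT queries, as reported by OpenAI to Axios
    wh_per_query = 18.35       # URI's estimated GPT-5 energy per query

    daily_gwh = queries_per_day * wh_per_query / 1e9        # watt-hours -> gigawatt-hours
    print(f"Daily energy: about {daily_gwh:.1f} GWh")       # ~45.9 GWh

    home_kwh_per_day = 30      # assumed average US household daily consumption
    homes_millions = daily_gwh * 1e6 / home_kwh_per_day / 1e6
    print(f"Roughly {homes_millions:.1f} million US homes for a day")  # ~1.5 million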

Researchers warn that GPT-5's heavy power consumption for complex tasks could drive up demand at AI data centers, increasing costs as climate policies tighten in parts of the US. Environmental scholars told The Verge that AI has two things in common with Bitcoin mining: a constant push to scale up and a lack of transparency over actual energy use and carbon emissions.

Industry disputes over transparency

OpenAI CEO Sam Altman wrote in a June 2025 blog post that ChatGPT's average per-query energy use was only 0.34 watt-hours, roughly the energy an oven uses in one second or a low-energy light bulb uses over a few minutes. The claim drew skepticism from industry experts: Wired reported that critics questioned whether the figure accounted for image generation, model training, or the energy needed for server cooling, casting doubt on its reliability.
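For scale, Altman's comparison is consistent with basic arithmetic if one assumes a roughly 1.2 kW oven and a 10 W low-energy bulb; both wattages are assumptions for illustration, not figures from his post:

    claimed_wh = 0.34    # Altman's claimed average energy per ChatGPT query

    oven_kw = 1.2        # assumed oven power draw
    oven_seconds = claimed_wh / (oven_kw * 1000) * 3600    # ~1 second

    bulb_w = 10          # assumed low-energy bulb power draw
    bulb_minutes = claimed_wh / bulb_w * 60                # ~2 minutes

    print(f"Oven: {oven_seconds:.1f} s, bulb: {bulb_minutes:.1f} min")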

Article edited by Jerry Chen