The Energy Usage of Every ChatGPT Prompt


Tech giants are reintroducing nuclear energy to support their AI goals, while OpenAI is exploring the idea of data centers in outer space. With such futuristic plans in the pipeline, it's only natural to ask why major technology firms need so much power, and how much energy our everyday interactions with AI services actually consume.

Amid such questions, companies like Google have disclosed details this year about the energy usage and efficiency of their AI offerings, with OpenAI following suit shortly after. In June, CEO Sam Altman published a blog post citing the energy consumption of "the average" ChatGPT query: 0.34 watt-hours. Altman compared this to "approximately what an oven would use in just over a second, or what a high-efficiency lightbulb would consume in a few minutes."
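As a rough sanity check on those comparisons, we can run the arithmetic ourselves. The appliance wattages below are assumed typical values (a ~1,200 W oven element and a ~10 W LED bulb), not figures from Altman's post:

```python
# Sanity-check the 0.34 Wh comparison against typical appliance wattages.
PROMPT_WH = 0.34   # Altman's stated energy per "average" ChatGPT query

OVEN_W = 1200      # assumed: typical electric oven draw in watts
LED_W = 10         # assumed: typical high-efficiency LED bulb in watts

# Energy (Wh) / power (W) = time in hours; convert to convenient units.
oven_seconds = PROMPT_WH / OVEN_W * 3600
led_minutes = PROMPT_WH / LED_W * 60

print(f"Oven time: {oven_seconds:.2f} s")   # ~1 s: "just over a second"
print(f"LED time:  {led_minutes:.2f} min")  # ~2 min: "a few minutes"
```

Under these assumed wattages, the result is about one second of oven use and about two minutes of LED use, which is consistent with the phrasing in the blog post.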

So, does that mean each ChatGPT prompt truly uses 0.34 watt-hours of energy? Unfortunately, it's probably not that simple. The figure may well be accurate, but Altman provided no context or detail on how it was derived, which sharply limits what we can conclude from it. For example, we don't know how OpenAI defines an "average" ChatGPT query, given that the model performs many different tasks, such as answering general questions, writing code, and generating images, all of which demand different amounts of energy.

Why is AI energy consumption so complicated?