From ChatGPT, with its improved privacy features, which can help with nearly anything, to generative AI built directly into your phone, such as Google's AI-powered audio and video editing tools, AI is everywhere. And as AI grows more capable and reliable, it is also becoming more expensive to run.
A single ChatGPT query is currently estimated to consume about 0.34 watt-hours of energy, a figure OpenAI CEO Sam Altman compares on his blog to "about what an oven would use in just over one second." That may sound trivial on its own, but multiplied across the number of queries ChatGPT handles every day, it adds up quickly. As a result, the demand for new energy sources has grown alongside AI itself. With more companies looking to build AI data centers to power their operations, that means finding better ways to supply the required energy without overloading the existing grid.
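To see how quickly the per-query figure compounds, here is a back-of-the-envelope calculation. The 0.34 Wh/query number comes from the article; the daily query volume is a hypothetical round number chosen for illustration, not a reported statistic.

```python
# Scale the per-query energy estimate to a full day of traffic.
ENERGY_PER_QUERY_WH = 0.34        # watt-hours per query (Altman's estimate)
QUERIES_PER_DAY = 1_000_000_000   # assumed 1 billion queries/day (illustrative)

daily_wh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY
daily_mwh = daily_wh / 1_000_000  # 1 MWh = 1,000,000 Wh

print(f"Daily energy at that volume: {daily_mwh:,.0f} MWh")
```

At that assumed volume, the "oven for one second" per query becomes hundreds of megawatt-hours per day, which is why data-center operators are looking hard at their power supply.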