- By Prateek Levi
- Mon, 21 Apr 2025 12:56 PM (IST)
- Source: JND
ChatGPT: Just a few weeks ago, the Ghibli trend took over the internet, with everyone rushing to ChatGPT to have their photos transformed into Japanese-style art. At the time, OpenAI stepped in and asked people to slow down, saying the surge in demand was melting its servers. Only a few days later, Sam Altman has opened up to the public again with something even more surprising: the simple courtesies users type, like 'please' or 'thank you', are actually quite expensive for the AI giant, costing it millions of dollars in operational expenses. Surprising, right?
A curious user on X (formerly Twitter) asked Altman, “How much money has OpenAI lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models?” Altman responded, “Tens of millions of dollars well spent.” He then also added, “You never know.”
All of this sounds like fun and games until we factor in the rapidly growing energy infrastructure that large language models like ChatGPT demand, especially when public usage goes through the roof. Every interaction, no matter how small, adds to the computational load, which in turn increases energy usage and the cost tied to it.
ChatGPT's user base has been growing by the day, with average weekly users now surpassing 150 million. A report from Goldman Sachs estimates that each GPT-4 query consumes 2.9 watt-hours of electricity, nearly ten times more than a typical Google search. If that does not sound like much, consider that ChatGPT receives over a billion prompts a day, which works out to nearly 2.9 million kilowatt-hours of energy consumed daily.
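The arithmetic behind that figure is straightforward; a quick back-of-the-envelope check, using the per-query and per-day numbers cited above, looks like this:

```python
# Back-of-the-envelope check of the cited figures:
# 2.9 Wh per query (Goldman Sachs estimate) x ~1 billion prompts per day.
WH_PER_QUERY = 2.9
PROMPTS_PER_DAY = 1_000_000_000

daily_wh = WH_PER_QUERY * PROMPTS_PER_DAY  # total watt-hours per day
daily_kwh = daily_wh / 1_000               # convert Wh to kWh

print(f"{daily_kwh:,.0f} kWh per day")  # prints "2,900,000 kWh per day"
```

That is, roughly 2.9 million kilowatt-hours every single day, matching the figure in the report.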
While these chatbots bring clear advantages, they also carry a hidden cost that we usually don't see, and it is on the rise. As people rely more on AI for their day-to-day activities, the demand for energy rises with it.
It must be noted that some netizens were also quick enough to lend a helping hand to Sam Altman with possible solutions. One user proposed that OpenAI could cut down on electricity by using client-side code to respond with a simple “You’re welcome.” Another joked that ChatGPT should stop ending every response with a question to save power.
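The client-side suggestion, tongue-in-cheek as it is, would technically work: a bare "thank you" could be answered locally without ever reaching the model. A minimal sketch of the idea (the function and pattern names here are purely illustrative, not part of any OpenAI API):

```python
import re

# Matches messages that are nothing but a courtesy phrase.
COURTESY_ONLY = re.compile(r"^\s*(please|thanks|thank you|ty)[.!]*\s*$", re.IGNORECASE)

def handle_message(text: str, call_model) -> str:
    """Reply to pure courtesy messages locally; forward real prompts to the model."""
    if COURTESY_ONLY.match(text):
        return "You're welcome!"   # canned reply, zero server compute
    return call_model(text)        # everything else goes to the model as usual

# Example with a stubbed model call: only the second message would hit the server.
print(handle_message("Thank you!", lambda t: "model reply"))   # You're welcome!
print(handle_message("Explain DNS", lambda t: "model reply"))  # model reply
```

Of course, this only saves power on messages that contain nothing else; a "please" embedded in a real question still has to be processed with the rest of the prompt.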
All of this raises the question of whether the current energy infrastructure can keep up with the enormous amounts of electricity these AI giants require. As AI weaves itself into our daily lives, we must also keep in mind the rising energy costs and their impact on the environment.