OpenAI CEO Sam Altman said it is fair to compare the energy required for ChatGPT to answer a query with the energy a 20-year-old human has consumed over a lifetime to respond to the same query, suggesting that the AI chatbot could be considered more energy efficient in that way.
Speaking as a guest at Express Adda held on February 20, on the sidelines of the India-AI Impact Summit 2026, Altman said that many discussions around ChatGPT’s energy usage are unfair because they tend to focus on “how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query.”
“But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you,” he said.
In the view of the OpenAI chief, a fair comparison would be: “If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way.”
Altman was responding to a question posed by Anant Goenka, Executive Director of The Indian Express Group, who cited a previous interview with Bill Gates and asked whether it’s accurate to say a single ChatGPT query currently uses the equivalent of 1.5 iPhone battery charges.
“There’s no way it’s anything close to that much,” Altman replied. He also said that artificial general intelligence (AGI) — a hypothetical state of technological maturity in which an AI system becomes capable of performing every task that a human being can, with greater precision, efficiency and speed — feels “pretty close at this point.” “Given what I know to be a faster takeoff, I expect [artificial] superintelligence is not that far off,” Altman said.
You can watch Anant Goenka’s full interview with Sam Altman at Express Adda by clicking on the link below.
The massive build-out of AI infrastructure, including large data centres, has come under scrutiny because of the vast resources it consumes, particularly energy, which has also been linked to rising electricity prices. In January, the Trump administration and several US state governors signed a pact calling for tech companies to pay for new power plants built on the country’s PJM electricity grid, which is used to power data centres that, in turn, train and run AI models.
Altman’s remarks also come at a time when India is positioning itself as the data centre hub of the world, with investment commitments in the country exceeding US$200 billion, a majority of which is dedicated to building out AI infrastructure over the next decade or so.
AI’s water usage ‘totally fake’: Altman
In response to a question about the amount of water going into data centres housing GPU server racks that power AI models, Altman suggested that such concerns were “totally fake” because “we used to do evaporative cooling in data centres.”
“Now that we don’t do that, you see these things on the internet where, ‘Don’t use ChatGPT, it’s 17 gallons of water for each query’ or whatever. This is completely untrue, totally insane, no connection to reality,” he further said.
However, he acknowledged that concern over total energy usage was legitimate: “The energy consumption — not per query, but in total, because the world is now using so much AI, is real, and we need to move towards nuclear or wind and solar [energy] very quickly.”
Interestingly, Altman rejected the concept of space-based data centres. “Putting data centres in space with the current landscape is ridiculous. Orbital data centres are not going to matter at scale this decade due to the rough math of launch costs and how hard it is to fix a broken GPU in space,” Altman told Goenka.
A growing number of tech companies, including SpaceX, owned by Altman’s archrival Elon Musk, want to show that outer space can be a more hospitable environment for data centres than enormous multi-gigawatt terrestrial facilities, which consume millions of litres of water daily and produce substantial amounts of greenhouse gas emissions.
Altman’s clip on the energy usage of AI also went viral on social media and triggered various reactions online:
I build AI for a living. I believe in what we’re building. But this kind of rhetoric makes my work harder and more dangerous.@sama, comparing human development to model training is tone-deaf, strategically reckless.
People are losing jobs. They’re getting angry. They’re… https://t.co/DgiaPEABm1
— Muratcan Koylan (@koylanai) February 21, 2026
I don’t get it: aren’t you building a tool to make human lives easier?
What sort of an argument equates an ai model to a human being, comparing a tool to an end user? Or are you imagining a world where it’s the other way around?
For years, entrepreneurs have forged… https://t.co/Xm0RHeFgN6
— Raunaq Mangottil (@RaunaqMangottil) February 22, 2026
Sam Altman just said training AI is like training a human because “it takes 20 years of life and all the food you eat before you get smart.”
Let’s run the actual numbers he is hoping you will not run.
Training a human for 20 years costs roughly 17 megawatt-hours of food energy.… https://t.co/scaKGSS9bq pic.twitter.com/JCk9viDseI
— Shanaka Anslem Perera ⚡ (@shanaka86) February 21, 2026
Always look at what they do; not what they say
After making a case that a human being is an inefficient use of energy, a reminder that Altman chose to have a kid recently. Given that he is gay, this was not easy or natural but…
Every single tech founder, every single one of… https://t.co/JQaHEFDQNw
— Devina Mehra (@devinamehra) February 22, 2026
A human consumes about 2,000 calories per day. Over 20 years, that’s roughly 17,000 kWh of total food energy. Training GPT-4 consumed an estimated 50 GWh of electricity. That’s 3,000 humans worth of “training energy” for a single model run.
And GPT-4 is already dead. OpenAI… https://t.co/C7KojpAIqA
— Aakash Gupta (@aakashgupta) February 21, 2026
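For readers who want to check the back-of-envelope arithmetic quoted in the tweets above, a minimal sketch follows. The 2,000-calorie daily intake and the 50 GWh GPT-4 training figure are the tweet authors' estimates, not official numbers:

```python
# Back-of-envelope check of the "human training energy" comparison
# circulating online. All inputs are the tweet authors' estimates.

KCAL_TO_KWH = 1.163 / 1000  # 1 food calorie (kcal) is about 1.163 Wh

def human_training_energy_kwh(kcal_per_day: float = 2000, years: float = 20) -> float:
    """Total food energy a person consumes over `years` of growing up."""
    return kcal_per_day * 365.25 * years * KCAL_TO_KWH

human_kwh = human_training_energy_kwh()  # roughly 17,000 kWh, i.e. ~17 MWh
gpt4_training_kwh = 50e6                 # tweet's estimate: 50 GWh

print(f"Human 'training' energy: {human_kwh:,.0f} kWh")
print(f"Model run vs one human: {gpt4_training_kwh / human_kwh:,.0f}x")
```

Run as written, the ratio lands near 3,000, matching the "3,000 humans worth of training energy" claim in the tweet.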


