At The Indian Express AI Summit, Sam Altman challenged claims about AI’s environmental impact, urging a rapid shift to renewable and nuclear power to support the sector’s exponential growth while tackling transparency and efficiency challenges in data-centre energy use.
Sam Altman used a keynote at The Indian Express AI Summit in Mumbai to push back forcefully against what he called misleading claims about the environmental cost of large language models while urging the tech sector to accelerate its shift to low‑carbon power sources.
According to coverage of the speech, Altman dismissed widely circulated figures that allegedly attribute about 17 gallons of water to each ChatGPT query as “totally fake” and “completely untrue,” saying those calculations relied on obsolete data‑centre cooling practices. He argued modern facilities increasingly rely on outside‑air cooling, liquid immersion and other approaches that substantially lower water use, although he conceded that aggregate energy consumption across training and inference workloads remains a material concern.
Altman framed the problem less as a matter of per‑query metrics than of total system demand. He told the conference that a fairer comparison is with human intelligence, because biological development and education consume vast amounts of energy: “It takes like 20 years of life and all of the food you eat during that time before you get smart.” From that perspective, he suggested, AI may already be more energy‑efficient for some cognitive tasks once the upfront training cost is amortised over many billions of inferences. Critics, the reports noted, cautioned that the analogy risks oversimplifying biological and social costs into units of energy.
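The amortisation argument can be made concrete with back‑of‑envelope arithmetic. A minimal sketch follows; all figures (training energy, lifetime query count, per‑query inference energy) are illustrative assumptions chosen for clarity, not measurements of any real model.

```python
# Illustrative (not sourced) amortisation of a model's training energy.
# Every number here is a hypothetical assumption for arithmetic clarity.

def energy_per_query_wh(training_energy_mwh: float,
                        total_inferences: float,
                        inference_energy_wh: float) -> float:
    """Per-query energy once training cost is spread over all inferences."""
    amortised_training_wh = training_energy_mwh * 1_000_000 / total_inferences
    return amortised_training_wh + inference_energy_wh

# Assume 1,000 MWh to train, amortised over 100 billion lifetime queries,
# plus 0.3 Wh of direct inference energy per query.
per_query = energy_per_query_wh(1_000, 100e9, 0.3)
print(f"{per_query:.3f} Wh per query")  # prints "0.310 Wh per query"
```

Under these assumed numbers the training share adds only 0.01 Wh per query, which is why the debate tends to centre on aggregate demand rather than per‑query figures.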
The OpenAI chief did not limit his remarks to rebuttal. He urged a rapid expansion of low‑carbon capacity to decouple AI growth from fossil fuels, calling for a mix that includes large‑scale solar and nuclear. Financial Express summarised his position as advocating solar for scalable renewable supply and nuclear for dense, reliable baseload power, a view echoed in the industry analysis by Goldman Sachs Research, which projects that new nuclear capacity will be an important component of the infrastructure needed to serve surging data‑centre demand. Goldman Sachs estimates that meeting the projected growth in AI‑driven data‑centre power could require on the order of 85–90 gigawatts of additional nuclear capacity by 2030, while also stressing that gas, renewables and storage will have to play supporting roles.
Those calls reflect a broader industry recognition of the electricity implications of AI. Studies cited in reporting suggest AI‑related computation could account for several percentage points of global electricity demand within the next decade, driven by energy‑intensive model training and continuous inference workloads. Data‑centre operators and hyperscalers have improved energy performance markedly: power usage effectiveness for top facilities has fallen from averages above 2.0 a decade ago to roughly 1.2 in cutting‑edge sites, and specialised AI accelerators now deliver orders‑of‑magnitude better computation per watt compared with processors from five years ago. Nonetheless, rising absolute consumption is an expected consequence of exponential compute growth.
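The power usage effectiveness (PUE) figures above follow a simple definition: total facility energy divided by the energy delivered to IT equipment, so 1.0 would mean zero overhead. A short sketch, using the article's rough 2.0‑to‑1.2 trajectory as illustrative inputs:

```python
# PUE = total facility energy / IT equipment energy. A value of 1.0 would mean
# every watt reaches computation; cooling and power-conversion overheads push
# it higher. The kWh figures below are illustrative, not measured.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness for a given accounting period."""
    return total_facility_kwh / it_equipment_kwh

legacy = pue(2_000, 1_000)   # decade-old average: 2.0 (1 kWh overhead per IT kWh)
modern = pue(1_200, 1_000)   # cutting-edge site: ~1.2 (0.2 kWh overhead)
overhead_cut = (legacy - modern) / (legacy - 1)  # share of overhead eliminated
print(legacy, modern, f"{overhead_cut:.0%}")     # prints "2.0 1.2 80%"
```

On these assumptions, moving from 2.0 to 1.2 eliminates about 80% of the non‑IT overhead per unit of computation, which is why PUE gains do not offset growth in absolute IT load.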
Transparency remains a central hurdle to firm assessment. There are no universal reporting obligations that compel technology firms to disclose detailed energy or water use for specific models, and reporters and researchers must often rely on reverse engineering and estimates. That opacity, industry analysts warn, fuels both alarmist figures and uncertainty. Independent researchers have produced disparate estimates showing model training can consume electricity on the scale of hundreds of homes for a year, but companies rarely publish line‑level operational numbers.
The expansion of data centres also has local economic and grid implications. Utilities and municipal planners sometimes face upward pressure on retail electricity prices when major computing facilities concentrate in a region, because the infrastructure investments required for added generation and transmission are substantial and frequently capitalised into rates. Several local authorities now consider bespoke tariff structures or community benefit agreements for large data‑centre projects to manage those effects.
Researchers and sustainability experts offer a nuanced view of the trade‑offs. Dr Emma Strubell, a computer scientist focused on AI sustainability, is reported to have warned that small per‑query footprints multiply into significant totals at scale: “We must consider scale,” she said, arguing that billions of daily queries mean even efficient inference still consumes meaningful energy. Other analysts highlight offsetting benefits: AI tools can optimise logistics, manufacturing and energy systems themselves, potentially reducing emissions across sectors if deployed thoughtfully.
Industry responses are aligning with those technical and policy challenges. Technology firms are investing in algorithmic efficiency, models that achieve comparable performance with fewer computations, and in hardware tailored to AI workloads. Data‑centre developers are increasingly siting facilities near renewable generation, adopting carbon‑aware computing schedules to run intensive tasks when clean output is abundant, and pursuing lifecycle assessments that account for manufacturing and end‑of‑life impacts.
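Carbon‑aware scheduling of the kind described above reduces, in its simplest form, to deferring flexible workloads to the hours with the lowest forecast grid carbon intensity. A minimal sketch; the forecast values are hypothetical, and real deployments draw intensity data from grid‑operator or commercial feeds rather than a hard‑coded dictionary.

```python
# Minimal sketch of carbon-aware scheduling: run a deferrable batch job in the
# hour with the lowest forecast grid carbon intensity (gCO2 per kWh). The
# forecast below is hypothetical; real systems consume live grid data.

def pick_greenest_hour(forecast_g_co2_per_kwh: dict[int, float]) -> int:
    """Return the hour (0-23) with the lowest forecast carbon intensity."""
    return min(forecast_g_co2_per_kwh, key=forecast_g_co2_per_kwh.get)

# Hypothetical day-ahead forecast: midday solar output drives intensity down.
forecast = {0: 420, 6: 390, 12: 180, 18: 350}
print(pick_greenest_hour(forecast))  # prints 12: schedule the batch at noon
```

Production schedulers add constraints this sketch omits, such as job deadlines, data locality and the option of shifting work between regions rather than across hours.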
Altman’s remarks in Mumbai underline the delicate balancing act facing industrial decarbonisation professionals: reconciling rapid capability growth in AI with the need for reliable, low‑carbon power and clearer measurement. His call for “nuclear or wind and solar very quickly,” as reported, puts a premium on coordinated investment in generation, grid capacity and regulation if the sector is to scale without exacerbating emissions. Debates over appropriate metrics (how to compare machine and human intelligence, and how to capture indirect emissions savings from AI applications) will persist, but the immediate policy and engineering priorities are clearer: improve transparency, continue efficiency gains and accelerate deployment of zero‑carbon energy where data centres are built.
- https://bitcoinworld.co.in/sam-altman-ai-energy-debate/ – Please view link – unable to access data
- https://www.financialexpress.com/life/technology/openai-ceo-sam-altman-calls-for-shift-to-solar-nuclear-energy-for-future-of-ai-4149740/ – OpenAI CEO Sam Altman has urged the global tech industry to accelerate the transition to solar and nuclear energy to meet the growing power demands of AI data centres. Speaking at an event hosted by The Indian Express, Altman addressed concerns over the environmental footprint of large-scale AI operations, particularly water and electricity usage in data centres. He dismissed exaggerated claims about water consumption, stating that such figures are ‘fake’ and clarified that while individual query water usage has been overstated, the broader issue of total energy consumption remains critical. Altman emphasised the need for a swift move to low-carbon, high-output energy sources, specifically solar for scalable renewable capacity and nuclear for reliable, high-density baseload power, to decouple AI growth from fossil fuel dependency.
- https://www.goldmansachs.com/insights/articles/is-nuclear-energy-the-answer-to-ai-data-centers-power-consumption – Goldman Sachs Research explores the role of nuclear energy in meeting the power demands of AI data centres. The article highlights that nuclear power will be a key part of a suite of new energy infrastructure built to meet surging data-centre power demand driven by artificial intelligence. However, it also notes that nuclear cannot meet all of the increased data-centre power needs, with natural gas, renewables, and battery technology also playing a role. The piece discusses recent contracts signed by big tech companies for new nuclear capacity and forecasts that 85-90 gigawatts of new nuclear capacity would be needed to meet all of the data centre power demand growth expected by 2030.
- https://www.tomshardware.com/tech-industry/artificial-intelligence/ai-energy-efficiency-comparisons-unfair-bleats-sam-altman-citing-amount-of-energy-needed-to-evolve-then-train-a-human-one-takes-like-20-years-of-life-and-all-of-the-food-you-eat-during-that-time-before-you-get-smart-he-argues – OpenAI CEO Sam Altman argues that comparisons between the energy efficiency of AI and humans are ‘unfair.’ He claims that training a human to become intelligent also consumes vast amounts of energy—approximately 20 years of life and all the food consumed during that period. Altman frames human intelligence as the product of millennia of evolution and cultural development, suggesting that when factoring in the broader human learning process, AI may already be comparable in energy efficiency. Critics questioned the logic of comparing biological development to algorithmic training and warned of the risk of dehumanizing human growth by reducing it to energy inputs. However, Altman also emphasised the need for AI companies to adopt more sustainable energy practices to reduce pressure on global energy resources.
- https://www.tomshardware.com/tech-industry/sam-altman-would-like-remind-you-that-humans-use-a-lot-of-energy-too/ – OpenAI CEO Sam Altman addressed concerns about AI’s environmental impact, particularly regarding water usage, stating that such claims are ‘totally fake’ and ‘completely untrue.’ He acknowledged that earlier media reports had sensationalised per-query water figures, sometimes citing outdated or misapplied evaporative cooling statistics. Altman clarified that while individual query water usage has been overstated, the broader issue of total energy consumption remains critical. He emphasised the need for a swift move to low-carbon, high-output energy sources, specifically solar for scalable renewable capacity and nuclear for reliable, high-density baseload power, to decouple AI growth from fossil fuel dependency.
Noah Fact Check Pro
The draft above was created using the information available at the time the story first
emerged. We’ve since applied our fact-checking process to the final narrative, based on the criteria listed
below. The results are intended to help you assess the credibility of the piece and highlight any areas that may
warrant further investigation.
Freshness check
Score:
10
Notes:
The article reports on recent statements made by OpenAI CEO Sam Altman at the Indian Express AI Summit in Mumbai on February 20, 2026. The content is current and original, with no evidence of prior publication or recycling from other sources. ([indianexpress.com](https://indianexpress.com/article/technology/artificial-intelligence/ai-energy-water-usage-sam-altman-openai-express-adda-10546154/?utm_source=openai))
Quotes check
Score:
10
Notes:
Direct quotes from Sam Altman are consistent across multiple reputable sources, including The Indian Express and TechCrunch. ([indianexpress.com](https://indianexpress.com/article/technology/artificial-intelligence/ai-energy-water-usage-sam-altman-openai-express-adda-10546154/?utm_source=openai))
Source reliability
Score:
10
Notes:
The primary source, The Indian Express, is a well-established and reputable news organisation in India. Secondary sources like TechCrunch and The Economic Times are also credible. ([indianexpress.com](https://indianexpress.com/article/technology/artificial-intelligence/ai-energy-water-usage-sam-altman-openai-express-adda-10546154/?utm_source=openai))
Plausibility check
Score:
10
Notes:
Sam Altman’s statements align with known facts about AI’s energy consumption and human development. His comparison between AI and human energy usage is a reasonable perspective in the context of the discussion. ([indianexpress.com](https://indianexpress.com/article/technology/artificial-intelligence/ai-energy-water-usage-sam-altman-openai-express-adda-10546154/?utm_source=openai))
Overall assessment
Verdict (FAIL, OPEN, PASS): PASS
Confidence (LOW, MEDIUM, HIGH): HIGH
Summary:
The article is current, original, and supported by consistent and reliable sources. All claims are verifiable, and the content type is appropriate for factual reporting. No significant concerns were identified.

