OpenAI CEO Sam Altman recently addressed the environmental implications of artificial intelligence during a keynote at an AI summit in India organized by The Indian Express. During his talk, Altman dismissed claims about AI’s water consumption as “totally fake,” saying that earlier concerns stemmed from the evaporative cooling used in older data centers. He singled out the widely repeated assertion that each ChatGPT query consumes 17 gallons of water, calling it “completely untrue” and without factual basis.
While Altman acknowledged the genuine concern about AI’s overall energy consumption, particularly as its use grows worldwide, he advocated a swift transition to nuclear, wind, and solar power to mitigate those impacts. Because tech companies are not legally required to disclose their energy and water usage, independent researchers have been studying these figures, which have been linked to rising electricity prices.
Asked whether a single ChatGPT query uses as much energy as charging an iPhone battery, Altman rejected the comparison, stating, “There’s no way it’s anything close to that much.” He also criticized the framing that sets the energy cost of training an AI model against the energy a human expends answering a single question. Pointing to the years of resources and time invested in a person’s development, he argued that a fairer comparison would weigh the energy efficiency of a trained AI model against that of a trained human.
For those interested, the full interview is available online; the discussion of water and energy consumption begins at approximately 26:35.
