Google has disclosed new details about the environmental footprint of its artificial intelligence chatbot Gemini, saying each text prompt consumes only a fraction of the energy and water suggested by earlier public estimates.
According to a technical paper and accompanying blog post released by the company, a single text query on Gemini uses about 0.24 watt-hours (Wh) of energy — roughly equivalent to watching nine seconds of television. That consumption, Google says, translates to about 0.03 grams of carbon dioxide emissions. In addition, each query requires around 0.26 millilitres of water, or approximately five drops, largely used in cooling data centre equipment.
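As a back-of-envelope check, the stated equivalences hold up, assuming a roughly 100-watt television and a 0.05-millilitre drop; both are illustrative conventions, not figures from Google's paper:

```python
# Sanity-checking Google's per-prompt figures against its own comparisons.
# The 100 W TV and 0.05 mL drop size are assumed reference values.
ENERGY_WH = 0.24     # energy per Gemini text prompt (from the paper)
TV_POWER_W = 100     # assumed power draw of a typical television

tv_seconds = ENERGY_WH / TV_POWER_W * 3600
print(f"TV equivalent: {tv_seconds:.1f} s")   # about nine seconds

WATER_ML = 0.26      # water per prompt (from the paper)
DROP_ML = 0.05       # assumed volume of one drop

drops = WATER_ML / DROP_ML
print(f"Drops of water: {drops:.1f}")         # about five drops
```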
The company stressed that its measurements accounted not only for the power consumed by the chips running Gemini but also for the energy used by other IT equipment in data centres, power drawn by idle chips, and water for cooling systems. By including these factors, Google argued, its estimates provide a more accurate picture of environmental impact than many existing studies.
“Per-prompt emissions are quite small,” the blog post noted, adding that the company’s figures show energy and water usage to be “substantially lower than many public estimates.”
The announcement comes as concerns grow about the rising energy demands of advanced computing. The International Energy Agency (IEA) recently projected that electricity demand from data centres, driven in large part by AI, could more than double by 2030 to around 945 terawatt-hours annually, slightly more than Japan's entire current electricity consumption.
Comparisons between Gemini and other platforms highlight stark differences. A study by the Electric Power Research Institute estimated that a prompt issued to OpenAI’s ChatGPT consumes 2.9 Wh of energy, roughly twelve times Google’s figure for Gemini. By contrast, a traditional internet search requires about 0.3 Wh.
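The ratios implied by the figures above can be checked directly; the numbers are those cited in this article, and the comparison is only as reliable as the underlying estimates, which were produced with different methodologies:

```python
# Per-prompt energy figures cited in the article, in watt-hours.
gemini_wh = 0.24    # Google's figure for a Gemini text prompt
chatgpt_wh = 2.9    # Electric Power Research Institute estimate for ChatGPT
search_wh = 0.3     # a traditional internet search

chatgpt_vs_gemini = chatgpt_wh / gemini_wh   # roughly 12x
chatgpt_vs_search = chatgpt_wh / search_wh   # nearly 10x
print(f"ChatGPT vs Gemini: {chatgpt_vs_gemini:.1f}x")
print(f"ChatGPT vs search: {chatgpt_vs_search:.1f}x")
```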
Despite these relatively low per-query figures, Google’s overall emissions have surged in recent years. Its latest environmental report showed emissions up 51 percent since 2019, driven largely by the production and assembly of hardware needed to support AI technology. The company acknowledged that upstream supply chain activities are contributing significantly to its carbon footprint.
At the same time, Google said efficiency improvements are underway. The company claims that since August 2024, energy use and carbon emissions per Gemini prompt have fallen 33-fold and 44-fold respectively, reflecting advances in hardware and software optimization.
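Working backwards from the claimed reductions gives a rough sense of where per-prompt figures would have stood in August 2024; this is a simple multiplication from the article's numbers, not a figure Google has published:

```python
# Current per-prompt figures cited in the article.
energy_wh_now = 0.24   # watt-hours per Gemini prompt
co2_g_now = 0.03       # grams of CO2 per Gemini prompt

# Implied values before the claimed 33-fold and 44-fold improvements.
energy_wh_before = energy_wh_now * 33   # roughly 7.9 Wh
co2_g_before = co2_g_now * 44           # roughly 1.3 g
print(f"Implied Aug 2024 energy: {energy_wh_before:.1f} Wh")
print(f"Implied Aug 2024 emissions: {co2_g_before:.2f} g CO2")
```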
However, analysts note that the company’s data leaves key questions unanswered. While per-query emissions are modest, Google has not disclosed the total number of Gemini prompts processed daily. Without those figures, the full scale of the chatbot’s energy demand remains unclear.
As AI adoption accelerates worldwide, the debate over its environmental costs is intensifying. Google’s new disclosures suggest progress in efficiency but also underscore the challenge of balancing technological innovation with sustainability.
