Amid growing concerns about the environmental impact of artificial intelligence, Google says it has calculated the energy required for its Gemini AI service: Sending a single text prompt consumes as much energy as watching television for nine seconds.
The technology giant on Thursday unveiled a new methodology to measure the environmental impact of its AI models, including energy and water consumption as well as carbon emissions.
AI tools have the potential to drive economic growth by boosting productivity and unlocking other efficiencies, economists say. By one estimate from Goldman Sachs, the tech is poised to increase global GDP by 7%, or $7 trillion, over 10 years.
At the same time, scientists are flagging the outsized environmental impact of AI, which is not yet fully understood even as data centers require enormous amounts of electricity.
“In order to improve the energy efficiency of AI, a clear and comprehensive understanding of AI’s environmental footprint is important. To date, comprehensive data on the energy and environmental impact of AI inference has been limited,” Ben Gomes, Google’s senior vice president of learning and sustainability, said in a blog post Thursday.
Aside from their electricity needs, AI data centers also require “a great deal of water … to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems,” research from MIT shows. “The increasing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport.”
Some new data centers require between 100 and 1,000 megawatts of power, roughly equivalent to the electricity needed to supply 80,000 to 800,000 homes, according to an April GAO report. For now, however, no regulations require corporations to disclose how much energy or water their AI tools consume.
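As a rough sanity check on the GAO comparison, the homes-per-megawatt figure implies an average household draw of about 1.25 kilowatts, or roughly 11,000 kilowatt-hours a year. That per-home figure is an assumption inferred from the report's own numbers, not a value stated in it; the short sketch below simply redoes the conversion.

```python
# Rough check of the GAO comparison: convert data center power to an equivalent
# number of homes. The ~1.25 kW average household draw is an assumed figure
# (about 11,000 kWh per year), not one taken from the report itself.
AVG_HOME_KW = 1.25

for data_center_mw in (100, 1000):
    homes = data_center_mw * 1000 / AVG_HOME_KW  # MW -> kW, divided by per-home draw
    print(f"{data_center_mw} MW ≈ {homes:,.0f} homes")
# Prints roughly 80,000 and 800,000 homes, matching the report's range.
```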
Google said in a technical paper released Thursday by its AI energy and emissions researchers that as adoption of AI tools rises, “so does the need to understand and mitigate the environmental impact of AI serving.”
What’s the environmental impact of Gemini AI?
Google’s new paper on the environmental impact of its own AI tools aims to set a standard for measuring the energy and water consumption as well as carbon emissions of various AI models, the company said.
A typical Gemini text query uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters — or about five drops — of water.
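Google's 0.24 watt-hour figure also lines up with the "nine seconds of television" comparison, if one assumes a TV drawing on the order of 100 watts. The wattage is an assumption for illustration, not a number Google provides; the quick calculation below shows how the two figures relate.

```python
# Relate the per-prompt energy figure to the "nine seconds of TV" comparison.
# Assumes a television drawing roughly 100 watts; the wattage is an assumed
# illustrative value, not one given by Google.
PROMPT_WH = 0.24
TV_WATTS = 100

seconds_of_tv = PROMPT_WH * 3600 / TV_WATTS  # Wh -> watt-seconds, divided by TV power
print(f"{PROMPT_WH} Wh ≈ {seconds_of_tv:.1f} seconds of TV at {TV_WATTS} W")
# ≈ 8.6 seconds, consistent with "about nine seconds" of viewing.
```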
By comparison, the average ChatGPT query uses 0.34 Wh and about one-fifteenth of a teaspoon of water, Sam Altman, CEO of ChatGPT-maker OpenAI, has written.
Google also outlined the progress it has made in reducing the environmental impact of its Gemini platform. Over a recent 12-month period, the energy consumption and carbon footprint of the median Gemini text prompt fell 33-fold and 44-fold, respectively, it said.
The quality of Gemini’s responses also improved over the same time period, the company said.
Megan Cerullo is a New York-based reporter for CBS MoneyWatch covering small business, workplace, health care, consumer spending and personal finance topics. She regularly appears on CBS News 24/7 to discuss her reporting.