
Marissa Loewen first started using artificial intelligence in 2014 as a project management tool. She has autism and ADHD and said it helped immensely with organizing her thoughts.
“We try to use it conscientiously though because we do realize that there is an impact on the environment,” she said.
Her personal AI use isn't unique anymore. Now it’s a feature in smartphones, search engines, word processors and email services. Every time someone uses AI, it uses energy that is often generated by fossil fuels. That releases greenhouse gases into the atmosphere and contributes to climate change.
And it's getting harder to live without it.
The climate cost
AI is largely powered by data centers that process queries, store data and deliver results. As AI becomes ubiquitous, data centers' power demand increases, leading to grid reliability problems for people living nearby.
“Since we are trying to build data centers at a pace where we cannot integrate more renewable energy resources into the grid, most of the new data centers are being powered by fossil fuels,” said Noman Bashir, computing and climate impact fellow with MIT's Climate and Sustainability Consortium.
The data centers also generate heat, so they rely on fresh water to stay cool. Larger centers can consume up to 5 million gallons (18.9 million liters) a day, according to an article from the Environmental and Energy Study Institute. That's roughly the same as the daily water demand for a town of up to 50,000 people.
It’s difficult to imagine, because for most users the impact isn’t visible, said Sasha Luccioni, AI and climate lead at the AI company Hugging Face.
“In one of my studies, we found that generating a high-definition image uses as much energy as charging half of your phone. And people were like, ‘That can’t be right, because when I use Midjourney (a generative AI program), my phone battery doesn’t go down,’” she said.
Jon Ippolito, professor of new media at the University of Maine, said tech companies are constantly working to make chips and data centers more efficient, but that does not mean AI’s environmental impact will shrink. The reason is an economic phenomenon known as the Jevons paradox.
“The cheaper resources get, the more we tend to use them anyway,” he said. When cars replaced horses, he said, commute times didn’t shrink. We just traveled farther.
Quantifying AI's footprint
How much those programs contribute to global warming depends on a lot of factors, including how warm it is outside the data center that's processing the query, how clean the grid is and how complex the AI task is.
Information sources on AI's contributions to climate change are incomplete and contradictory, so getting exact numbers is difficult.
But Ippolito tried anyway.
He built an app that compares the environmental footprint of different digital tasks based on the limited data he could find. It estimates that a simple AI prompt, such as, “Tell me the capital of France,” uses 23 times more energy than the same question typed into Google without its AI Overview feature.
“Instead of working with existing materials, it's writing them from scratch. And that takes a lot more compute,” Luccioni said.
And that's just for a simple prompt. A complex prompt, such as, “Tell me the number of gummy bears that could fit in the Pacific Ocean,” uses 210 times more energy than the AI-free Google search. A 3-second video, according to Ippolito's app, uses 15,000 times as much energy. It's equivalent to turning on an incandescent lightbulb and leaving it on for more than a year.
AI's impact is significant, but that doesn't mean our tech footprints were carbon-free before AI entered the scene.
Watching an hour of Netflix, for example, uses more energy than a complex AI text prompt. An hour on Zoom with 10 people uses 10 times that much.
“It's not just about making people conscious of AI's impact, but also all of these digital activities we take for granted,” he said.
Limit tech, limit tech's climate impact
Ippolito said he limits his use of AI when he can. He suggests using human-captured images instead of AI-generated ones. He tells the AI to stop generating as soon as he has the answer to avoid wasting extra energy. He requests concise answers and he begins Google searches by typing “-ai” so it doesn't provide an AI overview for queries where he doesn't need it.
Loewen has adopted the same approach. She said she tries to organize her thoughts into one AI query instead of asking it a series of iterative questions. She also built her own AI that doesn’t rely on large data centers, which saves energy in the same way watching a movie you own on a DVD is far less taxing than streaming one.
“Having something local on your computer in your home allows you to also control your use of the electricity and consumption. It allows you to control your data a little bit more,” she said.
Luccioni uses Ecosia, which is a search engine that uses efficient algorithms and uses profits to plant trees to minimize the impact of each search. Its AI function can also be turned off.
ChatGPT also has a temporary chat function, so queries sent to the data center get deleted after a few weeks instead of taking up storage space.
But AI accounts for only a fraction of data centers' energy use. Ippolito estimates roughly 85% goes to data collection by sites like TikTok and Instagram, and to cryptocurrency.
His answer there: make use of screen time restrictions on your phone to limit time scrolling on social media. Less time means less personal data collected, less energy and water used, and fewer carbon emissions entering the atmosphere.
“If you can do anything that cuts a data center out of the equation, I think that's a win,” Ippolito said.