In the two and a half years since ChatGPT first rolled out to the public, it’s risen to become the fifth most visited website in the world. Interactions with it and other AI-powered tools like Google’s Gemini and Microsoft’s Copilot are increasingly embedded in our daily routines. These digital assistants—called large language models—can summarize articles, take meeting notes, create a menu for next week’s barbecue, calculate a rocket’s coefficient of drag, or transform an image of your cat into a Studio Ghibli-style animated princess.
There’s something magical—almost surreal—about all this, isn’t there?
What’s quite real, however, is that every prompt entered into one of these models has a cost attached to it. AI relies on a complex infrastructure: The applications tap sprawling data centers that demand tremendous amounts of electricity to run and large amounts of water to keep equipment cool. AI could guzzle as much as 6.6 billion cubic meters of water by 2027, which is about six times what Denmark uses in a year. The Department of Energy estimates that data centers consumed 4.4% of the U.S.’s electricity in 2023 and says that portion could hit 12% by 2028.
Those numbers apply to the whole of AI, which includes training the models and their day-to-day use, or what computer scientists call the “inference” phase. It’s at that point—when the public can start peppering the models with millions of prompts a day—that things really begin to add up. “Inference will account for more emissions over time than training,” says Maximilian Dauner, a researcher at the Technical University of Berlin who studies the environmental impact of AI systems. “That’s where the scale really hits.”
How much energy does an AI search use?
Companies keep mostly quiet about the specifics of their models’ power consumption, but third parties have been able to give us an idea of the tally. One study from the University of California, Berkeley, found that training GPT-3 consumed roughly 1.3 million kilowatt-hours of electricity and emitted around 500 metric tons of carbon dioxide. Out in the wild, though, ChatGPT gets 122 million daily visitors who feed it more than 1 billion prompts. According to the International Energy Agency, a single prompt uses about 2.9 watt-hours of electricity, which means global collective queries could use 2.9 million kilowatt-hours daily—about the same amount 100,000 average U.S. homes use in a day.
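Those figures can be sanity-checked with simple arithmetic. The sketch below assumes the article’s round numbers (1 billion prompts a day, 2.9 watt-hours per prompt) and an average U.S. household consumption of about 29 kWh per day, which is roughly the EIA’s reported average:

```python
# Back-of-the-envelope check of the figures above.
prompts_per_day = 1_000_000_000   # ~1 billion prompts per day
wh_per_prompt = 2.9               # IEA estimate, watt-hours per prompt

# Convert total watt-hours to kilowatt-hours.
total_kwh_per_day = prompts_per_day * wh_per_prompt / 1_000
print(f"{total_kwh_per_day:,.0f} kWh/day")   # 2,900,000 kWh/day

# Assumption: an average U.S. home uses ~29 kWh/day (EIA average).
kwh_per_home_per_day = 29
homes_equivalent = total_kwh_per_day / kwh_per_home_per_day
print(f"~{homes_equivalent:,.0f} homes")     # ~100,000 homes
```

The two round numbers line up: a billion prompts a day at 2.9 Wh each really is in the neighborhood of 100,000 households’ worth of daily electricity.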
For those who look at those numbers and wonder if they can opt out of AI entirely, the answer is, eh, not really. “We can’t avoid it,” says Karen Panetta, an IEEE fellow and professor of electrical engineering at Tufts University. It’s been integrated into everything from search engines to language learning apps, and in every corner of nearly every industry. It’s also embedded in programs that run in the background, like autocomplete suggestions in text messages. “Unless someone is going to disconnect from all electronic devices and the internet, they will not be able to escape it,” she says.
How you can soften AI’s impact
So, what should everyday users minding their emissions do? Understand how to use generative AI tools like ChatGPT more intentionally. “AI is coming and there’s no avoiding it,” says Panetta. “Education is the only way to learn how to mitigate the misuses and wastefulness of energy resources.”
A study Dauner recently published in Frontiers in Communication offers helpful insights for doing just that. In it, he compared 14 large language models to determine how much CO2 they produce from different prompts. He and his team asked each model 500 multiple-choice and 500 open-ended questions, tracked the accuracy of their answers, and compared that with both energy expenditure and resulting emissions. ChatGPT wasn’t part of the study, because its code isn’t publicly available, but Dauner’s findings can help color our general understanding of what separates an energy-intensive (and therefore more polluting) query from one that’s more benign.
Here’s what to keep in mind:
1. Be picky about when you use it
Before opening up an AI assistant, ask yourself: Do I really need to use it for this? Generating a funny image or typing in an obvious question that could be answered via a simple web search—say, asking how many quarts are in a liter—may seem harmless but comes with a cost. “I’m not saying don’t use AI,” Dauner says. “But fasten up and be more direct and keep the unnecessary stuff away.”
There are instances where asking an AI to do a task may actually be more efficient than doing the same job manually. A 2024 study published in Nature found that producing a page of AI-generated text emits roughly 2 grams of carbon dioxide equivalent, while the same amount of human-written text could be responsible for more than 1,000 grams. The study included electricity use and work emissions to estimate the human impact. So you could, presumably, assume that asking a bot to summarize a report could be gentler on the Earth—though the study didn’t account for the time it takes for a person to check the accuracy of a bot’s work.
2. Choose the right model when available
Models trained with fewer parameters tend to use less energy in the inference phase—though they can still answer questions or summarize information. Extremely narrow models, like customer service bots, are efficient because they’re only expected to answer a limited number of questions, Panetta notes. Those that can handle open-ended tasks like creative writing or responding to more nuanced questions require more “thinking” and therefore use more energy.
Some large-scale platforms, including ChatGPT, let users toggle between different models, which can have varying levels of energy efficiency. If you’re using the free plan, you’re using GPT-4o, which is OpenAI’s newest and most efficient version. Paid users can access others, including GPT-4o mini. That model was trained on fewer parameters. While it is more limited in scope, it can handle everyday tasks like answering basic questions or summarizing text.
There is a tradeoff between emissions and accuracy, according to Dauner’s recent study. For instance, the newest version of DeepSeek, which was trained with hundreds of billions of parameters, emitted 2,000 grams of CO2 per 1,000 prompts and was 79% accurate. Meanwhile, Qwen, a lightweight model trained on tens of billions of parameters, emitted just 27 grams but had a 40% accuracy rate. The narrower model Cogito produced 1,300 grams of CO2 per 1,000 prompts with 80% accuracy.
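One way to make that tradeoff concrete is to normalize the cost by usefulness—grams of CO2 per correct answer rather than per prompt. The sketch below assumes all three reported figures are per 1,000 prompts, and the “per correct answer” metric is an illustrative framing, not one from the study itself:

```python
# Dauner's reported figures from the text: grams of CO2 per
# 1,000 prompts, and answer accuracy (assumed per-1,000 basis).
models = {
    "DeepSeek": {"g_per_1000": 2000, "accuracy": 0.79},
    "Qwen":     {"g_per_1000": 27,   "accuracy": 0.40},
    "Cogito":   {"g_per_1000": 1300, "accuracy": 0.80},
}

for name, m in models.items():
    g_per_prompt = m["g_per_1000"] / 1000
    # Illustrative metric: emissions divided by the hit rate,
    # i.e., the average cost of getting one correct answer.
    g_per_correct = g_per_prompt / m["accuracy"]
    print(f"{name}: {g_per_prompt:.3f} g/prompt, "
          f"{g_per_correct:.3f} g per correct answer")
```

By that measure, Qwen’s answers are cheap but unreliable, while DeepSeek pays a large emissions premium for a modest accuracy edge over Cogito.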
3. Use short, specific prompts
Dauner’s study also found that emissions vary by task complexity, which can include how a question is posed. “Harder tasks generate more emissions because they keep the model running longer,” he explains. A vague or politely phrased request might also result in a more nuanced response, which keeps the model running longer and consumes more power. “It doesn’t matter if a model generates 1,000 useful words or 1,000 [meaningless] words, it would still emit the same amount of CO2,” Dauner says.
To minimize potential emissions when writing a prompt, he suggests the following:
- Avoid vague questions that could trigger the model to ramble through an elaborate answer
- Tell it to give you its response in bullet points
- Use phrases like “keep it brief” in your prompts
- Write specific prompts that are less likely to require many followups
4. Understand it’s not all on you
Going forward, Dauner and Panetta agree that while individuals have a role in addressing the footprint of AI, tech companies and governments bear the heaviest burden. Due to investments in large language models and other AI tools, the emissions of the world’s top tech companies—including Meta, Google, Microsoft, and Amazon—rose an average of 150% between 2020 and 2023, according to a report from the United Nations’ International Telecommunication Union.
The burden falls on those companies to make their models better—and on regulators to keep them honest about their footprints. “We need new power efficient technologies and smarter, more efficient algorithms to reduce training or have smarter learning techniques,” says Panetta. “Policies and standards will be essential so that there is a unified expectation to measure and evaluate AI models for energy efficiency and operation.”
Unfortunately, some recent actions signal a shift in the opposite direction. The Trump administration recently released a 28-page plan to fast-track AI infrastructure, which includes rolling back environmental protections in order to allow data centers to be built without review under the National Environmental Policy Act or the Clean Water Act. More than 90 environmental groups have come out against it.
There’s no ridding ourselves of AI, and the technologies will only become smarter, more powerful, and more deeply embedded into our daily lives. As that happens, we can't lose sight of its environmental impact so that we and these bots are working smarter, not harder.