Artificial intelligence is undeniably reshaping how we live, making breakthroughs in healthcare, education, entertainment and more. But beneath the magic, there’s a less-talked-about story unfolding: the environmental impact.
Every time you pop a question to an AI tool, however trivial your query might seem, it nudges up the overall energy consumption. That’s because powering those fast algorithms requires more than just clever software. It demands serious hardware, electricity and cooling systems.
How Much Energy Does AI Consume?
Consider this: training GPT-3, one of OpenAI’s earlier models, produced about 552 tonnes of CO₂. That’s roughly what 110 petrol cars emit in a whole year. Another estimate puts a similar model’s footprint at 283 tonnes, which is nearly 300 return flights from New York to San Francisco.
Moreover, the energy consumed during inference, when models respond to user queries, accounts for about 60 per cent of total energy usage, compared with 40 per cent for training. So every time you ask for a quick translation or weather update, you’re tapping into a serious energy network behind the scenes.
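The figures above can be sanity-checked with some back-of-envelope arithmetic. This sketch uses the article's estimates plus one assumption of my own (roughly 5 tonnes of CO₂ per petrol car per year, a commonly cited ballpark, not a measured figure):

```python
# Back-of-envelope check of the figures above. All inputs are the
# article's estimates or stated assumptions, not measured data.

TRAINING_CO2_TONNES = 552          # estimated CO2 from training GPT-3
CAR_CO2_TONNES_PER_YEAR = 5.0      # assumed annual emissions of one petrol car

cars_equivalent = TRAINING_CO2_TONNES / CAR_CO2_TONNES_PER_YEAR
print(f"Training ≈ {cars_equivalent:.0f} petrol-car-years of emissions")

# If inference is 60% of lifetime energy and training 40%, then the
# model's total lifetime energy is training / 0.4, i.e. 2.5x training:
training_share = 0.4
lifetime_multiplier = 1 / training_share
print(f"Lifetime energy ≈ {lifetime_multiplier:.1f}x the training energy")
```

The second calculation is the point worth noticing: if the 60/40 split holds, answering queries ultimately costs more energy than the headline-grabbing training run did.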
Cooling Takes a Lot More Than We Think
Servers aren’t just power-hungry; they also run hot. Data centres often rely on vast amounts of water to keep things cool. In fact, one study found that training GPT-3 alone evaporated around 700,000 litres of fresh water. Forecasts suggest global water usage by AI data centres could reach between 4.2 and 6.6 billion cubic metres by 2027, almost half the total annual water used in the UK.
Then there are proposals for data centres that could end up consuming more energy than airports. One such plan in Lincolnshire could emit five times as much CO₂ as Birmingham Airport while guzzling 3.7 billion kWh a year.
A Slow Response from Industry
Some tech giants are stepping up. Microsoft is exploring nuclear power options, and Meta has locked in long-term nuclear contracts to help balance rising demand. Yet indirect emissions from companies like Amazon, Google and Microsoft shot up by 150 per cent between 2020 and 2023.
In the UK, there’s growing concern that AI expansion might clash with climate goals, despite government enthusiasm for AI growth.
Smarter, Greener Ways Forward
The good news is that some clever approaches are already helping reduce AI’s environmental footprint:
- Energy-efficient hardware – Brands like IBM are creating specialised chips that deliver similar performance using far less power.
- Optimised computing – Techniques such as pruning models or using leaner language models can cut energy needs without significantly compromising output.
- Greener data centres – Innovations like liquid cooling, on-site solar or wind power and low-power designs are making a difference.
- Workload timing and location – When and where tasks are run can influence their environmental impact. Shifting operations to periods of abundant renewable energy can cut emissions substantially.
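The last idea, often called carbon-aware scheduling, is simple to sketch: given a forecast of grid carbon intensity (grams of CO₂ per kWh) for the coming hours, a deferrable job runs in the cleanest window. The forecast values below are invented for illustration:

```python
# A minimal carbon-aware scheduling sketch: pick the start hour whose
# window has the lowest average grid carbon intensity (gCO2/kWh).

def greenest_start_hour(forecast, job_hours):
    """Return (start_index, average_intensity) of the cleanest window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - job_hours + 1):
        avg = sum(forecast[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Hypothetical hourly intensities: a windy night, then a dirty evening peak.
forecast = [320, 280, 150, 90, 80, 110, 240, 380]
start, avg = greenest_start_hour(forecast, job_hours=3)
print(f"Run the 3-hour job starting at hour {start} ({avg:.0f} gCO2/kWh avg)")
```

In practice, providers pull real forecasts from grid operators or services such as Electricity Maps, but the core decision is just this comparison of windows.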
A bright example is BLOOM, a large language model trained on a French supercomputer powered largely by low-carbon nuclear energy. It emitted just 25 tonnes of CO₂, vastly less than GPT-3 for similar capabilities. On top of that, it’s free, works in 46 languages and doesn’t demand expensive hardware, making it a more accessible and greener option.
Politeness Comes with a Price
Even saying “please” and “thank you” to ChatGPT isn’t quite free. OpenAI’s CEO, Sam Altman, recently joked that these polite words collectively cost millions of pounds in electricity every year. And yet he added that it’s “well worth it”. After all, billions of polite prompts, tiny as they seem, do add up in energy usage.
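The "tiny costs add up" point is easy to make concrete. Both inputs below are illustrative assumptions of mine, not OpenAI figures: a billion extra polite exchanges a day, at a fraction of a watt-hour each:

```python
# Illustrative arithmetic only: how tiny per-prompt costs accumulate.
# Both constants are assumptions for the sketch, not OpenAI data.

POLITE_PROMPTS_PER_DAY = 1e9        # assumed extra "please/thank you" exchanges
ENERGY_PER_EXTRA_PROMPT_WH = 0.3    # assumed watt-hours per extra short exchange

daily_kwh = POLITE_PROMPTS_PER_DAY * ENERGY_PER_EXTRA_PROMPT_WH / 1000
yearly_gwh = daily_kwh * 365 / 1e6
print(f"≈ {daily_kwh:,.0f} kWh per day, ≈ {yearly_gwh:.0f} GWh per year")
```

Even with these modest assumptions, the total lands in the hundreds of gigawatt-hours a year, which is how a joke about "please" costing millions stops being a joke.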
Innovation vs Impact: Finding Balance
AI holds enormous promise, from curing diseases to helping model climate solutions. But its environmental footprint is growing fast. UN reports warn that indirect AI emissions from leading firms could reach over 100 million tonnes annually. Data centre energy use could double or even triple by 2028 if left unchecked.
So what’s needed?
- Transparency: AI providers should openly share stats on energy and water use.
- Policy and regulation: Incentives or mandates for green energy use and efficient cooling practices will be essential.
- Focus on efficient tech: Investment in hardware, algorithms and infrastructure innovation is vital.
- User awareness: We all have a small part to play; even minor changes, such as being conscious of our prompts, can make a difference.
Bottom Line
Every model trained and every query answered adds up. From strain on power grids to water scarcity to extra carbon emissions, AI’s environmental cost is no fiction. It’s a responsibility shared by researchers, tech firms, regulators and, yes, even users.
Even politeness matters when multiplied across millions of requests. Mindfulness in our language can be part of a broader push toward an eco-friendlier AI future.