The Environmental Impact of Data Centres
When you upload a photo to the cloud or stream a video, it feels weightless. Digital. Clean. But somewhere, a server in a massive warehouse is processing that request, and that server needs power. Lots of it.
Data centres currently consume about 1-2% of global electricity. That might not sound like much, but their carbon footprint is roughly comparable to that of the entire aviation industry. And unlike planes, which we’re trying to make more efficient, data centres are growing exponentially.
The Scale Is Wild
A single large data centre can use as much power as a small city. Google’s data centre in Hamina, Finland, pulls enough electricity to power about 50,000 homes. And Google operates dozens of these facilities globally.
Microsoft’s 2025 sustainability report showed their total electricity consumption increasing by 22% year-over-year, almost entirely driven by cloud and AI services. They’re building renewable energy capacity as fast as they can, but demand is growing faster.
The AI boom is making this worse. Training a single large language model can consume as much electricity as hundreds of American homes use in a year. Every ChatGPT query, every AI-generated image, every smart recommendation algorithm – they all require compute, and compute requires power.
Water Is the Hidden Problem
Most people focus on electricity, but water consumption is equally concerning. Data centres use water for cooling – either directly or indirectly through evaporative cooling towers.
In regions with water scarcity, this creates real tension. Google’s data centre in The Dalles, Oregon, uses millions of gallons daily from the Columbia River. Locals aren’t thrilled about tech companies competing with agriculture and residential use for limited water resources.
Some facilities are switching to air cooling, but that’s only practical in cooler climates. Try running air cooling in Singapore or Arizona and watch your electricity bill explode from the AC requirements.
The Efficiency Gains Are Real, But…
To be fair, data centres have gotten dramatically more efficient. Modern facilities achieve PUE (Power Usage Effectiveness) ratings around 1.2, meaning that for every watt delivered to the IT equipment, only about 0.2 watts goes to cooling and other overhead – roughly 17% of total facility power. Twenty years ago, a PUE of 2.0 or higher was common, meaning half the facility’s power never reached a server.
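If the ratio is easier to see in numbers, here’s a tiny back-of-the-envelope sketch – the megawatt figures are made up purely for illustration:

```python
# Power Usage Effectiveness (PUE): total facility power divided by IT equipment power.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

def overhead_share(pue_value: float) -> float:
    """Fraction of total facility power spent on cooling and other overhead."""
    return (pue_value - 1.0) / pue_value

# Illustrative numbers only: 12 MW drawn from the grid, 10 MW reaching the servers.
print(pue(12_000, 10_000))    # 1.2
print(overhead_share(1.2))    # ~0.167 -> about 17% of total power is overhead
print(overhead_share(2.0))    # 0.5   -> at PUE 2.0, half the power never reached a server
```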
Virtualization and better server utilization mean we’re getting more compute per watt than ever before. Cloud providers can achieve economies of scale that individual company server rooms never could.
But here’s the problem: efficiency gains are being completely swamped by increased demand. We’re using data centres for everything now – streaming, gaming, social media, work collaboration, AI services. The pie is growing faster than we can make it efficient.
Renewable Energy Isn’t a Magic Fix
Tech companies love announcing renewable energy commitments. Google claims to match 100% of their electricity consumption with renewable purchases. Microsoft and Amazon have similar goals.
This is good! But the details matter. “Matching” often means buying renewable energy credits from a solar farm in one location while actually consuming grid power (which might be coal) in another location. The accounting works out, but the actual carbon emissions still happen.
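A toy calculation makes the gap visible. Every figure below is an assumption chosen for round numbers, not any real company’s data:

```python
# Toy comparison of market-based vs location-based carbon accounting.
# All numbers are assumptions for illustration only.
consumption_mwh = 100_000          # assumed annual electricity use of a hypothetical data centre
local_grid_gco2_per_kwh = 450      # assumed carbon intensity of the local grid
recs_purchased_mwh = 100_000       # renewable energy credits bought somewhere else

# Market-based accounting: the credits offset consumption on paper.
market_based_tonnes = max(0, consumption_mwh - recs_purchased_mwh) * 1_000 * local_grid_gco2_per_kwh / 1_000_000

# Location-based accounting: what the local grid actually emitted to serve the load.
location_based_tonnes = consumption_mwh * 1_000 * local_grid_gco2_per_kwh / 1_000_000

print(market_based_tonnes)     # 0.0 tonnes -- "100% renewable" on paper
print(location_based_tonnes)   # 45000.0 tonnes -- what the local grid still emitted
```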
Geographic constraints are real too. You can’t build a data centre anywhere – you need fiber connectivity, reliable power grids, and often proximity to users for latency reasons. Sometimes the best location for a data centre is somewhere with terrible renewable energy availability.
What Actually Needs to Happen
Better transparency: We should know the environmental cost of services we use. What if your cloud storage provider showed energy consumption per gigabyte? Might make people think twice about storing every blurry photo from 2012.
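Here’s roughly what such a label could be computed from – the energy-intensity and grid figures below are placeholder assumptions, since real values vary a lot by provider and hardware:

```python
# Rough sketch of a per-gigabyte energy label for cloud storage.
# Both constants are assumptions for illustration; real values differ widely.
ASSUMED_KWH_PER_GB_YEAR = 0.01     # assumed annual energy to keep 1 GB stored (drives, replication, overhead)
ASSUMED_GRID_GCO2_PER_KWH = 400    # assumed grid carbon intensity

def storage_footprint(gigabytes: float) -> tuple[float, float]:
    """Return (kWh per year, kg CO2e per year) for keeping `gigabytes` stored."""
    kwh_per_year = gigabytes * ASSUMED_KWH_PER_GB_YEAR
    kg_co2e_per_year = kwh_per_year * ASSUMED_GRID_GCO2_PER_KWH / 1_000
    return kwh_per_year, kg_co2e_per_year

# 500 GB of old photos under these assumptions:
print(storage_footprint(500))   # (5.0, 2.0) -> 5 kWh and 2 kg CO2e per year
```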
Geographic optimization: Put data centres where renewable energy is abundant and cooling is easier. Iceland and Norway are popular for this reason – cold climate plus geothermal and hydro power.
Workload scheduling: Not everything needs to happen instantly. Some computing tasks could run when renewable energy is abundant and grid load is low. We’re barely scratching the surface of this.
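A minimal sketch of the idea, assuming you already have an hourly carbon-intensity forecast from somewhere (the forecast values below are invented):

```python
# Carbon-aware scheduling sketch: run a deferrable batch job in the greenest window.
# A real system would pull the forecast from a grid data provider; these values are invented.
def greenest_window(forecast_gco2_per_kwh: list[float], job_hours: int) -> int:
    """Return the start hour whose job_hours-long window has the lowest average intensity."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast_gco2_per_kwh) - job_hours + 1):
        avg = sum(forecast_gco2_per_kwh[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Invented 24-hour forecast: solar pushes intensity down around midday.
forecast = [520, 510, 500, 490, 480, 470, 430, 380, 320, 260,
            220, 200, 190, 210, 260, 330, 400, 460, 500, 520,
            530, 540, 545, 550]
print(greenest_window(forecast, job_hours=3))   # -> 11, i.e. run the job from 11:00 to 14:00
```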
Hardware lifecycle management: The carbon cost of manufacturing servers is significant. Keeping servers running longer, when practical, can sometimes be better than constantly upgrading to slightly more efficient models.
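A rough break-even calculation shows the shape of that trade-off – every number here is an assumption, not measured data:

```python
# Embodied-carbon break-even: how long must a more efficient server run before its
# manufacturing footprint is paid back by operational savings? All figures are assumptions.
embodied_kg_co2e = 1_500       # assumed manufacturing footprint of one new server
old_server_kw = 0.50           # assumed average draw of the existing server
new_server_kw = 0.45           # assumed draw of the replacement (10% more efficient)
grid_gco2_per_kwh = 400        # assumed grid carbon intensity

saved_kg_per_year = (old_server_kw - new_server_kw) * 24 * 365 * grid_gco2_per_kwh / 1_000
breakeven_years = embodied_kg_co2e / saved_kg_per_year

print(round(saved_kg_per_year, 1))   # ~175.2 kg CO2e saved per year
print(round(breakeven_years, 1))     # ~8.6 years before the upgrade pays for itself
```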
The AI Dilemma
Here’s the uncomfortable truth: AI services are incredibly resource-intensive, and we’re deploying them everywhere. Every website adding an AI chatbot, every app adding AI features, every company building custom AI models – it all adds up.
Some of this AI usage is genuinely valuable. Some of it is frankly stupid – like using AI to generate email responses that could’ve been a simple template. We need to get better at asking: is this AI application worth the environmental cost?
What You Can Actually Do
Individual actions barely move the needle compared to corporate and policy changes. But if you care:
- Choose cloud providers with genuine renewable energy commitments (and check the details)
- Delete data you don’t need – storage isn’t free from an energy perspective
- Be selective about video quality when streaming (4K uses way more data than 1080p – see the rough numbers after this list)
- Ask your employer about their data centre and cloud providers’ environmental policies
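For a sense of scale on the streaming point, here’s the bitrate arithmetic – the bitrates are ballpark assumptions, since actual values depend on the codec and the service:

```python
# Back-of-the-envelope data use per hour of streaming at typical bitrates.
# Bitrates are ballpark assumptions; actual values depend on codec and service.
def gb_per_hour(bitrate_mbps: float) -> float:
    return bitrate_mbps * 3600 / 8 / 1000   # Mbit/s -> GB per hour

print(round(gb_per_hour(5), 1))    # ~2.2 GB/hour at a typical 1080p bitrate
print(round(gb_per_hour(16), 1))   # ~7.2 GB/hour at a typical 4K bitrate
```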
The most impactful thing is probably supporting policy changes that require transparency and accountability from tech companies about environmental impact.
The Bigger Picture
We’re not going back to a world with less computing. That ship has sailed. The question is whether we can grow computing capacity while actually reducing environmental impact, not just slowing the rate of increase.
It’s technically possible. Whether we have the political will and economic incentives to do it is another question entirely.
Every time you think “the cloud,” remember it’s actually someone else’s giant warehouse full of humming servers, consuming electricity and water, 24/7. That doesn’t mean we shouldn’t use these services. But we should at least be honest about the trade-offs.