Introduction – Is AI bad for the environment?
When you ask ChatGPT a question, check Google’s AI search results, or use voice assistants on your phone, you’re tapping into massive server farms operating around the clock. Unlike traditional websites that might sit idle between visitors, AI systems must run continuously—burning electricity and consuming water 24 hours a day, 365 days a year. This constant operation is fundamental to how modern AI works.
This “inference” phase—where trained AI models answer your queries—has quietly become one of the fastest-growing energy demands on the planet. While the training of AI models (teaching them from scratch, a topic ShrinkthatFootprint has covered here) gets the media attention for its intensive power use, inference now accounts for 80-90% of AI’s total energy consumption. Training is like building a factory (expensive but one-time), while inference is like running that factory every day. As AI spreads into everything from your smartphone to your refrigerator to your car, understanding its environmental footprint matters for anyone thinking seriously about carbon emissions and climate change.
Each point represents a published AI model in the Epoch AI “Data on AI Models” dataset, plotted by publication date and total parameter count (log scale) which affects both training and inference costs.
Source: Epoch AI, “Data on AI Models,” https://epoch.ai/data/ai-models (accessed January 12, 2026). Licensed under CC BY.
Why AI Inference Demands Constant Power
To understand AI’s energy problem, you need to understand how it differs from traditional computing. Traditional data centers had highly variable power use—corporate servers often sat idle overnight and on weekends, email servers only worked hard during business hours, and backup systems rarely activated. This variability meant power consumption fluctuated dramatically throughout the day and week.
AI changes this pattern. To deliver the instant responses users expect—whether you’re asking ChatGPT a question at 3 AM or requesting directions from Google Maps—companies must keep powerful processors running at all times. AI inference servers average roughly 40% utilization, compared with traditional enterprise servers that might idle at just 10-20%. That 40% might not sound high, but it represents constant, around-the-clock operation rather than sporadic use. The result is a massive baseline load on electrical grids—imagine adding hundreds of factories that never close.
The numbers are staggering. In 2024, U.S. data centers consumed an estimated 200 terawatt-hours (TWh) of electricity—roughly equal to Thailand’s entire annual power consumption or enough to power every residential home in Argentina, the Netherlands, and the UAE combined. AI-specific operations accounted for 53-76 TWh of that total, and critically, inference now makes up 80-90% of AI’s energy use—a stark reversal from just a few years ago when training dominated AI energy budgets.
Looking ahead, the trajectory is remarkable. By 2028, projections suggest AI inference workloads could consume 165-326 TWh annually—a multi-fold increase in just four years. Even the lower end of that range approaches the total electricity consumption of countries like Argentina or Poland. The upper estimate would equal the entire power grid of Spain.
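For readers who like to check the arithmetic, the implied growth rate is easy to work out. This short Python sketch uses only the ranges quoted above; the variable names and the compound-growth framing are ours, not from any cited source:

```python
# Back-of-envelope: implied annual growth of AI inference electricity use.
# All figures are the ranges quoted in this article (TWh/year); treat them
# as rough estimates, not measurements.

ai_2024_low, ai_2024_high = 53, 76        # AI-specific use, 2024 (TWh)
proj_2028_low, proj_2028_high = 165, 326  # projected inference load, 2028 (TWh)
years = 4

def cagr(start, end, years):
    """Compound annual growth rate between two annual totals."""
    return (end / start) ** (1 / years) - 1

low_growth = cagr(ai_2024_high, proj_2028_low, years)   # conservative pairing
high_growth = cagr(ai_2024_low, proj_2028_high, years)  # aggressive pairing

print(f"Implied annual growth: {low_growth:.0%} to {high_growth:.0%}")
```

Pairing the high 2024 estimate with the low 2028 projection gives the conservative bound; the reverse pairing gives the aggressive one. Either way, the implied growth far outpaces overall electricity demand.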
Comparing AI to Heavy Industry
To contextualize these numbers: data centers consumed approximately 415 TWh globally in 2024, representing about 1.5% of worldwide electricity demand. While that percentage might sound modest, it’s growing at an explosive rate and concentrating geographically, causing serious infrastructure problems in specific regions.
The International Energy Agency projects that data center electricity demand could double by 2026, with AI driving most of that growth. To put this in manufacturing terms, we’re essentially building the electrical equivalent of hundreds of new steel mills or aluminum smelters—except these facilities run at consistently high capacity rather than ramping up and down with economic cycles.
Consider Loudoun County, Virginia—home to the world’s densest concentration of data centers, earning it the nickname “Data Center Alley.” Data centers there consume as much power as 700,000 homes, placing heavy demand on the regional electrical grid. Local utilities are struggling to build new transmission capacity fast enough: some planned expansions require 5-10 years to complete, while data centers need power connections in 18-24 months.
Ireland’s situation illustrates how quickly AI can overwhelm national infrastructure. Data centers now consume 21% of Ireland’s total electricity—more than all urban residential consumption combined and more than the entire health service, all schools, and all public transport put together. The growth happened so rapidly that authorities banned all new data centers in Dublin until 2028, citing grid instability concerns and the inability to meet climate commitments while powering data centers.
The infrastructure challenge extends beyond electricity generation. Global data center power capacity was approximately 60 gigawatts (GW) in 2023 and could triple to 200 GW by 2030, with industry analysts projecting that 70% of all data center capacity by 2030 will be dedicated to AI workloads. Building this infrastructure requires not just new data centers, but entirely new power plants, high-voltage transmission lines, and electrical substations—projects that typically require 5-15 years from planning to operation.
The Water Crisis Behind AI
Energy consumption tells only half the environmental story. The powerful processors running AI generate enormous amounts of waste heat—similar to having thousands of high-powered space heaters running continuously in a confined space. This heat must be removed constantly to prevent equipment failure, and the primary cooling method used by most data centers is water-based evaporative cooling.
The water consumption is substantial. Most data centers consume approximately 1.7 liters of water per kilowatt-hour of electricity used, primarily for cooling towers that evaporate water to dissipate heat. Research suggests that a single ChatGPT conversation can consume about 500 milliliters of water—roughly one standard water bottle per chat. While that might seem trivial for one conversation, multiply it by the billions of AI queries happening daily worldwide, and the water footprint becomes staggering.
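Those two figures can be cross-checked with simple division. The sketch below is a back-of-envelope calculation using only the numbers quoted above; note that published water estimates often also count off-site water used in electricity generation, so this is illustrative rather than definitive:

```python
# Sanity check: what energy use does the ~500 mL-per-conversation figure
# imply, if cooling water scales at ~1.7 L per kWh? Both inputs are the
# figures quoted in this article.

water_per_kwh = 1.7    # litres evaporated per kWh of electricity (cooling)
water_per_chat = 0.5   # litres per ChatGPT conversation (published estimate)

implied_kwh = water_per_chat / water_per_kwh
print(f"Implied energy per conversation: ~{implied_kwh:.2f} kWh")

# Scale the water figure to a billion conversations a day:
daily_litres = 1e9 * water_per_chat
print(f"Water for 1 billion chats/day: ~{daily_litres / 1e9:.1f} billion litres")
```

At a billion conversations a day, the half-litre-per-chat figure adds up to roughly half a billion litres of water daily.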
Google’s water consumption increased 20% in 2023, largely attributed to AI expansion and the cooling demands of their AI servers. Microsoft’s water consumption jumped 42% during the same period, coinciding with their major investments in AI infrastructure and partnerships with OpenAI.
The water crisis becomes acute in regions already facing water stress. The American Southwest—Arizona, Nevada, parts of California and Texas—hosts numerous data centers because cheap land and favorable tax policies made them attractive locations. However, these same regions face severe, persistent droughts. In Mesa, Arizona, community protests erupted when residents discovered a data center was consuming potable drinking water during severe drought conditions, competing directly with residential and agricultural needs.
This creates genuine ethical dilemmas: Should AI services take priority over residential water supplies? In drought-stricken regions, data centers can consume millions of gallons daily—water that could serve tens of thousands of households or irrigate critical farmland. Some cities are beginning to restrict data center development specifically because of water concerns, not just electricity demands.
Carbon Emissions: The Bottom Line
All this electricity must come from somewhere, and despite tech companies’ ambitious climate commitments, the reality is mixed. Approximately 40% of data center electricity still comes from fossil fuels, particularly in regions like Virginia and Texas where coal and natural gas dominate the grid. Even in regions with cleaner grids, the rapid addition of data center load can force utilities to keep older, dirtier power plants running that might otherwise be retired.
When tech companies rapidly scale AI capabilities, the new power demand often gets met by whatever electricity source is immediately available—frequently natural gas “peaker” plants that can quickly ramp up production. While companies purchase renewable energy credits or sign long-term wind/solar contracts, these often represent energy generated elsewhere at other times, not necessarily the actual electrons powering their facilities during peak AI usage.
The carbon impact appears clearly in corporate emissions reports. Microsoft’s total emissions rose 30% since 2020, despite ambitious net-zero commitments, with the company attributing much of this increase to data center expansion for AI services. Google’s emissions increased 48% over a similar period during their AI expansion, significantly undermining their climate goals.
These emissions increases aren’t unique to major tech companies. Every business deploying AI—from startups to enterprises—contributes to this growing carbon footprint. The total AI sector’s emissions are difficult to calculate precisely, but reasonable estimates suggest AI inference could be responsible for 20-50 million metric tons of CO2 annually and rising rapidly—equivalent to the emissions of several million cars driven for a year.
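The car comparison is straightforward to reproduce. The sketch below assumes the commonly cited figure of roughly 4.6 metric tons of CO2 per typical passenger car per year; that per-car figure is our assumption, not stated in this article:

```python
# Convert the AI-inference emissions range into "cars driven for a year".
# Assumes ~4.6 metric tons CO2 per average passenger car per year
# (a commonly cited US figure; an assumption, not from this article).

co2_low_t, co2_high_t = 20e6, 50e6  # metric tons CO2/year (range above)
car_tons_per_year = 4.6

cars_low = co2_low_t / car_tons_per_year
cars_high = co2_high_t / car_tons_per_year
print(f"Equivalent to roughly {cars_low / 1e6:.0f} to "
      f"{cars_high / 1e6:.0f} million cars on the road")
```

The result lands in the 4-11 million car range, consistent with “several million cars driven for a year.”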
Solutions: Can AI Become Sustainable?
Despite the sobering numbers, efficiency improvements offer genuine hope. The semiconductor industry continues advancing chip design, with newer AI processors becoming dramatically more efficient. Next-generation AI chips can reduce energy consumption per calculation by 30-50% compared to current generation hardware. Techniques like model compression, pruning unnecessary neural network connections, and algorithmic optimizations can deliver similar AI capabilities with significantly less computing power.
Several practical solutions are emerging from the industry:
Liquid cooling systems: Moving beyond traditional air conditioning, advanced liquid cooling—including direct-to-chip cooling and immersion cooling where servers are submerged in non-conductive liquids—dramatically improves efficiency. Microsoft’s immersion cooling experiments reduced overall energy consumption by 5-15% while virtually eliminating water evaporation. Though expensive to retrofit, new facilities are increasingly designed with liquid cooling from the start.
Strategic geographic location: Placing data centers in cold climates eliminates much of the cooling energy burden. Meta’s data center in Luleå, Sweden uses Arctic air for natural cooling and powers operations with 100% renewable hydroelectric energy, achieving near-zero water consumption and minimal carbon emissions. Norway, Finland, and Iceland offer similar advantages, though distance from users can increase latency.
Carbon-aware computing: Google has pioneered shifting AI workloads to times and locations with abundant clean energy. When solar power peaks midday in California, more inference workloads shift to West Coast data centers. When wind power surges overnight in Texas, intensive AI training moves to those facilities. This optimization can reduce carbon intensity by 20-40% without any hardware changes.
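The core idea of carbon-aware computing is simple: among the locations that can take a deferrable job, pick the one whose grid is cleanest right now. Here is a minimal illustrative sketch in Python; the region names and carbon-intensity values are made up, and this is not Google’s actual scheduler:

```python
# Toy carbon-aware scheduler: route a deferrable AI workload to whichever
# region currently has the cleanest grid. All values are illustrative.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    carbon_intensity: float  # grams CO2 per kWh right now (hypothetical)
    has_capacity: bool

def pick_region(regions):
    """Choose the lowest-carbon region that has spare capacity."""
    candidates = [r for r in regions if r.has_capacity]
    if not candidates:
        raise RuntimeError("no region has spare capacity")
    return min(candidates, key=lambda r: r.carbon_intensity)

regions = [
    Region("us-west (midday solar)", 120.0, True),
    Region("texas (overnight wind)", 95.0, True),
    Region("us-east (gas-heavy)", 410.0, True),
]

best = pick_region(regions)
print(f"Dispatching batch job to {best.name} at {best.carbon_intensity} gCO2/kWh")
```

Real systems add forecasting and deadlines on top of this, but the selection step is essentially this comparison, repeated continuously as grid conditions change.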
Waste heat recovery: Rather than simply venting or evaporating heat, progressive facilities capture and repurpose it. Some data centers provide waste heat to district heating systems that warm residential and commercial buildings during winter, turning an environmental liability into a community asset. Others partner with indoor agriculture operations, using waste heat for greenhouse climate control.
Regulatory pressure is accelerating adoption of these solutions. Singapore now mandates that new data centers achieve 1.3 PUE (Power Usage Effectiveness) or better—a measure of how efficiently data centers use energy, where 1.0 would be perfect efficiency. They must also demonstrate renewable energy usage and contribute power back to the national grid during peak demand periods, effectively turning data centers into grid-stabilizing assets. Ireland adopted similar bidirectional power requirements after lifting their Dublin moratorium.
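PUE itself is just a ratio: total facility energy divided by the energy that actually reaches the IT equipment. A quick sketch with illustrative numbers (ours, not Singapore’s published data) shows how the 1.3 threshold separates designs:

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# 1.0 would mean every watt goes to computing; cooling and power-conversion
# losses push the ratio up. The kWh figures below are illustrative only.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

legacy = pue(total_facility_kwh=1_800, it_equipment_kwh=1_000)  # older air-cooled site
modern = pue(total_facility_kwh=1_250, it_equipment_kwh=1_000)  # liquid-cooled design

for name, value in [("legacy", legacy), ("modern", modern)]:
    status = "meets" if value <= 1.3 else "fails"
    print(f"{name}: PUE {value:.2f} -> {status} Singapore's 1.3 mandate")
```

In this example, the older facility wastes 0.8 kWh on overhead for every kWh of computing, while the modern design wastes only 0.25 kWh.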
What This Means for Consumers
Individual AI usage might seem insignificant, but multiplied across billions of users globally, the aggregate impact is enormous. Understanding the carbon cost of different AI applications helps consumers make informed choices:
Search queries: A typical Google search uses approximately 0.3 watt-hours of energy—minimal impact. However, AI-enhanced searches that generate summaries, answer questions directly, or create custom responses can consume 2-5 times more energy per query. With billions of searches daily, this adds up quickly.
Generative AI: Creating images with DALL-E, Midjourney, or similar tools uses dramatically more resources than text generation—often 10-50 times the energy of a text-based ChatGPT conversation. Video generation is even more intensive, sometimes requiring 100-500 times the energy of text queries. A single AI-generated video can consume as much electricity as running your refrigerator for a week.
Always-on AI assistants: Features like Siri, Google Assistant, Alexa, or AI-powered features in your car that continuously listen and process commands consume energy both on your device and in backend servers. While individual usage is small, the aggregate of millions of devices maintaining constant connections to AI services is substantial.
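The per-query figures above can be combined into a rough side-by-side comparison. The sketch below applies the quoted multipliers to the ~0.3 Wh baseline search; for simplicity it treats a plain text query as that baseline, so the absolute numbers are order-of-magnitude illustrations only:

```python
# Rough per-query energy comparison, using the multipliers quoted above.
# Simplification: a plain text query is treated as the ~0.3 Wh baseline,
# so these are order-of-magnitude illustrations, not measurements.

baseline_wh = 0.3

queries = {
    "traditional search": (1, 1),
    "AI-enhanced search": (2, 5),      # 2-5x baseline
    "image generation":   (10, 50),    # 10-50x a text query
    "video generation":   (100, 500),  # 100-500x a text query
}

for name, (low, high) in queries.items():
    lo_wh, hi_wh = baseline_wh * low, baseline_wh * high
    print(f"{name:>20}: ~{lo_wh:.1f} to {hi_wh:.1f} Wh per query")
```

A single video generation at the top of this range uses around 150 Wh, hundreds of times the energy of a traditional search.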
Practical steps to reduce your personal AI carbon footprint:
1. Use AI selectively and purposefully: Don’t default to AI-powered search for simple factual queries that traditional search handles efficiently. Reserve AI for tasks where its capabilities genuinely add value—complex analysis, creative brainstorming, or nuanced questions traditional search can’t answer.
2. Choose text over media generation: When AI can meet your needs with text responses rather than images or videos, opt for text. Generate one image rather than iterating through dozens of variations. Consider whether you truly need AI-generated content or if existing resources might serve your purpose.
3. Manage always-on features: Disable AI assistants and features you don’t actively use daily. Most smartphones and smart home devices allow you to selectively enable AI features only when needed rather than maintaining constant listening/processing modes.
4. Support sustainable providers: When choosing AI services, favor companies demonstrating strong sustainability commitments with transparency about energy sources and efficiency efforts. Some providers publish detailed sustainability reports; others remain opaque. Your purchasing and usage patterns influence corporate behavior.
5. Advocate for transparency: Demand that AI providers disclose the energy and water costs of their services. Just as food products list calorie content and appliances display energy ratings, AI services should show their environmental impact, enabling informed consumer choices.
That said, it is also unfair to ask consumers to make AI-usage decisions based on guesswork: they have little visibility into the relative efficiency of different AI technologies and providers. Much of the burden of improving carbon efficiency, and of communicating it honestly, must fall on the providers.
Conclusion – Is AI bad for the environment?
AI inference represents a new category of environmental challenge—one that’s growing exponentially while remaining largely invisible to most users. Unlike industrial emissions you can see or vehicles you can hear, AI’s environmental cost is hidden in distant data centers, making it easy to ignore. Yet the impact is very real: rising electricity demand straining grids, water consumption competing with residential needs, and carbon emissions undermining climate goals.
The always-on nature of AI fundamentally differs from previous computing technologies. When you’re not watching Netflix, those servers can idle. When email is slow, it’s inconvenient but not catastrophic. But AI inference must maintain constant readiness, creating baseline electrical loads comparable to heavy manufacturing—steel mills that never shut down, refineries that run perpetually.
The path forward requires action on multiple fronts. Technologically, continued efficiency improvements in chips, cooling systems, and software algorithms can significantly reduce per-query energy costs. Geographically, strategic data center placement in cold climates with clean electricity grids offers major sustainability advantages. Operationally, carbon-aware computing that shifts workloads to match renewable energy availability can cut emissions without reducing service quality.
But technology alone isn’t sufficient. We also need to ask fundamental questions about deployment: Do we really need AI for every application, or are we deploying it because it’s novel and impressive? Should AI replace traditional computing methods that use far less energy but work adequately? How do we balance AI’s genuine benefits against its environmental costs?
For anyone committed to reducing their carbon footprint, understanding AI’s environmental cost is increasingly essential. Every ChatGPT query, every AI-generated image, every voice assistant command carries a real-world energy and water cost. These individual costs might seem trivial, but they aggregate to infrastructure-scale impacts rivaling major industries.
The good news: we’re still early enough in AI’s deployment that sustainable practices can be built into infrastructure from the start rather than retrofitted later. With informed consumer choices, smart regulation, continued technical innovation, and honest conversations about when AI truly adds value versus when it’s wasteful, we can harness AI’s benefits while minimizing planetary damage. The choice isn’t between AI or the environment—it’s about deploying AI thoughtfully, efficiently, and sustainably.