22 May 2025 | 9 min read

Artificial intelligence is rapidly transforming market intelligence, enabling analysts to process unprecedented volumes of data and extract valuable insights with remarkable efficiency. As AMPLYFI’s market intelligence platform harnesses these AI capabilities, an important discussion has emerged regarding the substantial environmental impact of these technologies.

A thorough report by Greenpeace Germany, researched by the Öko-Institut, shows that AI infrastructure consumes extensive resources – particularly energy, water, and raw materials. This carries significant implications for competitive intelligence analysts, market research managers, and strategic planning professionals who rely on AI-powered tools.

The global electricity demand for AI computing is set to increase elevenfold by 2030 compared to 2023 levels. This rise coincides with the growing size and complexity of AI models, with parameters increasing from 1.5 billion (GPT-2, 2019) to 2 trillion (Llama 4, 2025). Computational requirements for training are doubling approximately every five months, creating an unsustainable trajectory that risks undermining climate targets and straining resources across global markets.
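To put that five-month doubling into perspective, a short sketch shows how quickly it compounds over longer horizons (the doubling interval is the only figure taken from the report; the horizons are illustrative):

```python
def compute_growth(months: float, doubling_months: float = 5.0) -> float:
    """Growth factor when training compute doubles every `doubling_months`."""
    return 2 ** (months / doubling_months)

# One year of five-month doublings already multiplies compute by over 5x.
print(round(compute_growth(12), 1))   # ~5.3
print(round(compute_growth(24), 1))   # ~27.9 over two years
```

At this rate, efficiency gains in hardware and algorithms would need to compound almost as fast just to hold total consumption flat.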

For market intelligence professionals, understanding AI’s environmental impact is now a strategic necessity that influences technology adoption decisions, stakeholder reporting, and long-term planning. This analysis examines the key findings from the Greenpeace report and outlines their practical implications for intelligence teams balancing technological advancement with environmental responsibility.

Research Context

The “Environmental Impacts of Artificial Intelligence” report provides the first comprehensive overview of AI’s environmental footprint. Published in May 2025 by Greenpeace Germany and conducted by the independent Öko-Institut, the research combines findings from over 95 scientific studies on AI’s resource consumption and environmental effects.

The report gains credibility from its rigorous methodology and the involvement of external experts who reviewed the study, including specialists in energy efficiency, climate impacts, and technology sustainability. Its timing is particularly relevant for market intelligence professionals, coinciding with the widespread adoption of AI tools across competitive intelligence workflows.

Rather than calling for the elimination of AI, the report emphasises the need for responsible implementation. This measured approach makes the findings valuable for market intelligence professionals who must weigh AI’s analytical benefits against its environmental costs. The research shows how AI’s footprint affects resource availability, regulatory frameworks, and corporate sustainability goals – all factors that directly impact market dynamics that intelligence analysts must monitor.

Main Themes

The Exponential Growth of AI’s Energy Appetite

The energy consumption of AI data centres is increasing at an alarming rate. By 2030, AI-specific data centres are projected to consume 554 TWh of electricity – eleven times the roughly 50 TWh they consumed in 2023. This growth outpaces efficiency improvements and will soon account for nearly half of all data centre energy use worldwide.

For competitive intelligence analysts, this trend signals potential market disruptions. Regions with concentrated data centre operations like Virginia (US), Dublin (Ireland), and Frankfurt (Germany) are already experiencing grid constraints, with data centres consuming up to 80% of local electricity in some cities. These constraints will likely trigger regulatory responses and reshape location strategies for tech companies, creating both challenges and opportunities that intelligence teams should anticipate.

The increased energy demand from AI will also heighten competition for renewable energy resources. Tech giants including Google, Microsoft, Amazon, and Meta have committed to climate neutrality by 2030 through the EU Climate Neutral Data Centre Pact, but are increasingly turning to nuclear energy to meet their growing needs – a strategy that introduces new risk factors and stakeholder concerns.

Strategic implications for intelligence professionals: Monitor regional variations in energy policies, anticipate regulatory responses to data centre growth, and track shifts in corporate site selection criteria as energy constraints intensify.

Water Resources: The Overlooked Impact

Water consumption represents one of AI’s most significant yet underreported environmental impacts. According to the research, global data centres consumed 175 billion litres of water in 2023, with projections indicating this will more than triple to 664 billion litres by 2030. AI-specialised data centres will drive this increase, with their water consumption rising from 30 billion to 338 billion litres annually.

This substantial water footprint stems from cooling systems required to dissipate heat from increasingly power-dense AI hardware. The Water Usage Effectiveness (WUE) of AI data centres ranges from 0.32 to 0.67 litres per kWh of IT electricity consumption, with an average of 0.36 litres/kWh in 2023. As AI hardware becomes more concentrated and power-intensive, this ratio is expected to increase to 0.48 litres/kWh by 2030.
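These WUE ratios translate directly into water volumes once a facility's IT electricity consumption is known. A minimal sketch, using a hypothetical facility drawing 100 GWh of IT electricity per year at the report's 2023 average of 0.36 litres/kWh:

```python
def annual_water_use_litres(it_energy_kwh: float, wue_l_per_kwh: float) -> float:
    """Estimate annual cooling-water consumption: WUE x IT electricity use."""
    return it_energy_kwh * wue_l_per_kwh

# Hypothetical 100 GWh/year facility at the 2023 average WUE of 0.36 L/kWh:
# roughly 36 million litres of water per year.
print(annual_water_use_litres(100_000_000, 0.36))
```

The same facility at the projected 2030 ratio of 0.48 litres/kWh would consume around a third more water for identical IT load, which is why the report treats rising power density as a water problem as much as an energy one.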

The geographic distribution of this water consumption creates particular concerns, as 42% of Microsoft’s water usage and 15% of Google’s occurs in regions already experiencing high water stress. For market intelligence professionals, these patterns suggest potential supply chain vulnerabilities and operational risks that may affect technology vendors and data-intensive industries.

Strategic implications for intelligence professionals: Incorporate water security metrics into vendor assessments, anticipate regulatory constraints in water-stressed regions, and identify emerging technologies that could disrupt current cooling approaches.

Resources and E-Waste: The Material Challenge

The expansion of AI infrastructure will generate between 1.2 and 5 million tonnes of electronic waste between 2020 and 2030. This represents a significant contribution to the existing global e-waste problem, which totalled 62 million tonnes annually as of 2024.

The projected decline in global e-waste recycling rates, from 22.3% in 2022 to an anticipated 20% by 2030, further compounds the problem. Particularly concerning is the fact that only 1% of rare earth elements used in technology components are currently recovered through recycling.

While the raw materials contained in AI hardware (approximately 920 kilotons of iron/steel, 200 kilotons of non-ferrous metals, and 100 kilotons of critical, strategic, or conflict-related materials) represent a relatively small proportion of global production volumes, their concentrated use in specialised applications creates specific supply vulnerabilities that market intelligence professionals should monitor.

Strategic implications for intelligence professionals: Track emerging regulations on extended producer responsibility, monitor technology supply chains for rare earth element dependencies, and identify companies developing more resource-efficient AI hardware and circular economy solutions.

Key Statistics and Insights

  • AI model complexity has increased from 1.5 billion parameters (GPT-2, 2019) to 2 trillion parameters (Llama 4, 2025)
  • The computational effort required for AI training doubles approximately every five months
  • By 2030, AI data centres will require as much electricity as conventional data centres do today
  • In Dublin, data centres already account for nearly 80% of total electricity consumption
  • Global water consumption by data centres will more than triple from 175 billion litres in 2023 to 664 billion litres in 2030
  • Up to 5 million tonnes of additional electronic waste will be generated by data centres and AI capacities by 2030
  • The share of specialised AI hardware in data centre energy consumption will grow from 14% in 2023 to 47% by 2030

Technical Glossary

Power Usage Effectiveness (PUE): Ratio of total data centre energy consumption to IT equipment energy consumption; measures efficiency of power distribution and cooling systems. Lower values indicate greater efficiency, with 1.0 being theoretical perfection.
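As a concrete illustration of the ratio (the facility figures below are hypothetical, not from the report):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# A facility using 1.2 GWh in total to deliver 1.0 GWh to IT equipment
# has a PUE of 1.2 - i.e. 20% overhead for cooling and power distribution.
print(pue(1_200_000, 1_000_000))
```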

Water Usage Effectiveness (WUE): Measures water consumption in litres per kilowatt-hour of IT electricity consumption. Lower values indicate more water-efficient cooling systems.

Large Language Model (LLM): AI systems trained on vast text datasets to recognise, summarise, translate, predict, and generate content based on patterns in language data.

Generative AI: AI systems that can create new content, including text, images, audio, and code, based on patterns learned from training data.

Computational Parameters: Variables within an AI model that are adjusted during training; more parameters generally enable more complex learning but require significantly more computational resources.

Energy Reuse Factor (ERF): Ratio of reused thermal energy to total discharged energy from a data centre; measures efficiency of waste heat utilisation.

Critical Raw Materials (CRM): Resources essential for technology manufacturing that face supply risks due to geopolitical factors, limited sources, or extraction challenges.

Hyperscale Data Centre: Massive facilities designed to support efficient, scalable computing, typically exceeding 5,000 servers and 10,000 square feet.

AI Energy Score: Emerging standard for measuring and reporting the energy consumption of AI models during training and inference.

Small Modular Reactor (SMR): Compact nuclear power technology being explored by tech companies to meet growing energy demands for AI infrastructure.

Key Questions & Answers

How is AI contributing to increased energy demand?

AI’s computational requirements are growing exponentially, with model complexity doubling approximately every five months. By 2030, AI data centres will consume 554 TWh annually, eleven times more than in 2023, accounting for nearly half of all data centre energy use.

What regions are most affected by AI infrastructure growth?

Hyperscale data centres are concentrated in regions including Virginia (US), Dublin (Ireland), Frankfurt (Germany), and Singapore. In Dublin, data centres already consume nearly 80% of local electricity, creating grid stress and triggering regulatory responses.

How does AI impact water resources?

Data centres consumed 175 billion litres of water globally in 2023, projected to reach 664 billion litres by 2030. AI-specific facilities have higher water intensity (0.61 litres/kWh) than traditional data centres (0.32 litres/kWh) due to more intensive cooling requirements.

What strategies are tech companies using to address energy challenges?

Major companies have committed to climate neutrality by 2030 through renewable energy purchasing, but increasingly rely on nuclear power (including Small Modular Reactors) to meet growing demands – a controversial approach that raises new environmental concerns.

How can organisations assess the environmental impact of their AI usage?

Organisations should implement comprehensive monitoring of AI systems’ energy and resource consumption, compare different model architectures for efficiency, consider geographical deployment factors, and demand transparency from vendors about their environmental metrics.
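The monitoring step can start very simply. A hedged sketch of a back-of-the-envelope footprint estimate, assuming a single accelerator with a 700 W average draw, a facility PUE of 1.2, and a grid intensity of 0.4 kg CO2/kWh – all illustrative figures chosen for the example, not values from the report:

```python
def workload_energy_kwh(avg_power_w: float, utilisation: float, hours: float) -> float:
    """Rough IT-side energy estimate for an AI workload from average device draw."""
    return avg_power_w * utilisation * hours / 1000.0

def workload_emissions_kg(it_energy_kwh: float, pue: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Scale IT energy by facility PUE, then apply grid carbon intensity."""
    return it_energy_kwh * pue * grid_kg_co2_per_kwh

# Illustrative: one 700 W accelerator at 60% utilisation for a day.
energy = workload_energy_kwh(avg_power_w=700, utilisation=0.6, hours=24)
print(round(workload_emissions_kg(energy, pue=1.2, grid_kg_co2_per_kwh=0.4), 2))
```

Even a crude model like this makes the comparison points actionable: the same workload in a region with half the grid intensity, or a facility with a lower PUE, shows up immediately in the estimate.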

What regulatory trends should market intelligence professionals monitor?

Key trends include mandatory environmental reporting for data centres, restrictions on new facilities in grid-constrained regions, water usage regulations in stressed areas, and emerging standards for AI efficiency measurement and disclosure.

How might AI’s environmental impact influence market competition?

Companies with more efficient AI implementations will gain competitive advantages through lower operational costs, reduced regulatory exposure, and enhanced sustainability credentials. This will drive innovation in efficient algorithms, specialised hardware, and alternative cooling technologies.
