AI Ethics • VCASSE
AI and the Environment: Balancing Innovation with Responsibility
AI can optimize energy grids, track deforestation, and accelerate climate research. It can also burn through enormous amounts of electricity doing it. The technology's environmental promise and its environmental cost are growing at the same time, and most organizations haven't figured out how to square that.
Founder, Oasis of Change
The dual nature of AI and sustainability
AI occupies a strange position in the sustainability conversation. It can optimize power grid distribution in real time, predict crop yields to reduce agricultural runoff, and spot illegal deforestation from satellite imagery faster than any human team. The potential is real.
But the infrastructure behind it is also a real source of emissions. Large-scale AI runs on data centers pulling power around the clock, often from grids still heavy on fossil fuels. Training a single foundation model can consume electricity on par with a small town. The tool we're building to help with environmental problems is also making them worse.
Ignoring either side of this leads to bad decisions. Blanket fear of AI's energy cost means missing genuinely useful applications. Deploying it carelessly without measuring its footprint means making things worse while claiming to help.
The carbon cost of large language models
Researchers at UMass Amherst estimated in a 2019 study that training a single large NLP model can emit more than 284 tonnes of CO2 equivalent, roughly five times the lifetime emissions of an average car including manufacturing. And the compute behind frontier models has only grown since, as they scale from billions to trillions of parameters.
Training is only part of it. Once deployed, every query requires compute. A model handling millions of requests per day can burn more energy on inference than it took to train in the first place. By some estimates, a single ChatGPT-style query uses around ten times the energy of a Google search. Multiply that by hundreds of millions of daily users and the grid load gets serious.
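The training-side arithmetic is simple enough to sketch. The numbers below are illustrative assumptions, not measured values from any real run: GPU count, per-GPU power draw, data-center overhead (PUE), and grid carbon intensity all vary widely in practice.

```python
# Back-of-envelope estimate of training emissions.
# All inputs here are illustrative assumptions, not measured values.

def training_emissions_kg(
    num_gpus: int,
    gpu_power_kw: float,          # average draw per GPU, in kW
    hours: float,                 # wall-clock training time
    pue: float,                   # data-center Power Usage Effectiveness
    grid_kg_co2_per_kwh: float,   # carbon intensity of the local grid
) -> float:
    energy_kwh = num_gpus * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Hypothetical run: 512 GPUs drawing 0.4 kW each for 30 days,
# in a facility with PUE 1.2 on a 0.4 kg CO2/kWh grid.
kg = training_emissions_kg(512, 0.4, 30 * 24, pue=1.2, grid_kg_co2_per_kwh=0.4)
print(f"{kg / 1000:.0f} tonnes CO2e")
```

Even this modest hypothetical run lands around 70 tonnes; the same formula applied to frontier-scale training, with tens of thousands of GPUs over months, is how estimates reach into the hundreds or thousands of tonnes.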
Water is another cost people don't talk about much. Data centers use huge volumes of water for cooling, and AI workloads are pushing that demand up. Microsoft reported a 34% jump in water consumption in 2022, driven largely by AI research. These are measurable costs, and they deserve the same scrutiny as any other industrial process.
How AI can accelerate environmental progress
For all its costs, AI is still one of the most useful tools we have for climate work, when it's deployed with some thought. In climate modeling, machine learning lets researchers run simulations that would take conventional supercomputers weeks, producing higher-resolution projections of temperature, precipitation, and extreme weather that inform adaptation planning globally.
In supply chain optimization, AI systems can identify inefficiencies that human analysts routinely miss. Route optimization for logistics fleets, predictive maintenance for industrial equipment, and demand forecasting that reduces overproduction — these applications can cut emissions across entire sectors without requiring fundamental changes to existing infrastructure. The International Energy Agency has estimated that AI-driven efficiency gains in buildings, transport, and industry could reduce global CO2 emissions by up to 4% annually.
Smart energy management is another area where AI is already delivering measurable results. Grid operators are using machine learning to balance supply and demand in real time, integrating intermittent renewable sources like wind and solar more effectively. Google's DeepMind, for example, reduced the energy used to cool its data centers by up to 40% using reinforcement learning, an approach it has since applied across other Google facilities.
Biodiversity monitoring has been transformed by AI-powered acoustic and image recognition. Conservation organizations are using neural networks to identify species from camera trap photos, classify birdsong from remote audio sensors, and track wildlife migration patterns at a scale that was previously impossible. These tools give ecologists the data they need to respond to threats faster and allocate limited conservation funding more effectively.
The core question
"The technology that can model climate futures and optimize energy grids is also consuming resources at a serious rate. We don't get to ignore either side. The obligation is to make sure the net impact is actually positive, not just assumed to be."
Responsible AI development principles
It starts with efficiency. Not every problem needs a trillion-parameter model. Smaller, task-specific models often match or beat general-purpose systems at a fraction of the energy cost. Techniques like distillation, pruning, and quantization can often cut a model's compute requirements by 80-90% with minimal quality loss. Picking the right-sized model for the job is one of the simplest decisions an organization can make, and one of the most impactful.
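To make one of those techniques concrete, here is a toy sketch of post-training quantization: mapping 32-bit float weights to 8-bit integers plus a scale factor, which is where the roughly 4x storage and bandwidth savings come from. Real toolchains do this per layer with calibration data; the values below are made up for illustration.

```python
# Toy post-training quantization: store weights as 8-bit integers plus one
# float scale, then reconstruct. Assumes at least one nonzero weight.

def quantize(weights: list[float]) -> tuple[list[int], float]:
    scale = max(abs(w) for w in weights) / 127  # map the largest weight to +/-127
    return [round(w / scale) for w in weights], scale

def dequantize(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

weights = [0.82, -0.41, 0.05, -1.27]
q, scale = quantize(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, f"max error {max_err:.4f}")
```

Each weight now fits in one byte instead of four, and the reconstruction error stays small because neural networks tolerate this precision loss well, which is exactly why the technique is such an easy efficiency win.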
Infrastructure matters too. Running models on renewable-powered compute cuts the carbon footprint dramatically. Some cloud providers now offer carbon-aware scheduling, routing workloads to cleaner grids or shifting non-urgent training to times when renewable supply peaks. Organizations should be choosing compute partners based on verified energy sourcing, not just cost and performance.
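The scheduling logic itself is straightforward. Below is a minimal sketch of the idea, assuming you already have carbon-intensity forecasts per region; the region names and numbers are hypothetical, and a real system would pull forecasts from a grid-data provider rather than hardcode them.

```python
# Carbon-aware scheduling sketch: given forecast grid intensities
# (hypothetical values, in gCO2/kWh), pick the region and hour with the
# cleanest supply for a deferrable training job.

forecasts = {
    "region-north": {9: 120, 13: 45, 21: 95},   # hour -> gCO2/kWh
    "region-east":  {9: 310, 13: 290, 21: 260},
}

def cleanest_slot(forecasts: dict[str, dict[int, int]]) -> tuple[str, int, int]:
    region, hour = min(
        ((r, h) for r, hours in forecasts.items() for h in hours),
        key=lambda rh: forecasts[rh[0]][rh[1]],
    )
    return region, hour, forecasts[region][hour]

region, hour, intensity = cleanest_slot(forecasts)
print(f"schedule on {region} at {hour:02d}:00 ({intensity} gCO2/kWh)")
```

In this toy forecast, shifting the job to the northern region's midday solar peak cuts the job's carbon intensity by more than half versus running it immediately on the dirtier grid, with no change to the workload itself.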
Then there's reporting. Just as companies face growing expectations to disclose Scope 1, 2, and 3 emissions, AI developers should publish the energy and carbon costs of training and running their models. Without standardized measurement and honest disclosure, the industry can't make informed progress. Some initiatives are emerging to define these standards, but adoption is still patchy and mostly voluntary.
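What a disclosure might contain can be sketched as a simple structured record. The field names and numbers below are illustrative, not an established schema; the point is that the inputs are few and measurable.

```python
# Sketch of a per-training-run energy disclosure. Field names and values
# are illustrative assumptions, not an established reporting standard.

from dataclasses import dataclass, asdict
import json

@dataclass
class TrainingDisclosure:
    model_name: str
    hardware: str
    gpu_hours: float
    energy_kwh: float             # metered at the facility level, incl. cooling
    grid_kg_co2_per_kwh: float    # carbon intensity of the supplying grid

    @property
    def emissions_tonnes(self) -> float:
        return self.energy_kwh * self.grid_kg_co2_per_kwh / 1000

record = TrainingDisclosure("example-model-7b", "8x A100", 4_800, 2_100, 0.35)
report = asdict(record) | {"emissions_tonnes": round(record.emissions_tonnes, 3)}
print(json.dumps(report, indent=2))
```

A handful of fields like these, published alongside a model release, would be enough for outsiders to compare training runs on equal terms, which is precisely what today's voluntary, inconsistent disclosures make impossible.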
What VCASSE is doing
VCASSE (Vancouver Centre for AI Safety, Sustainability, and Ethics) is a subsidiary of Oasis of Change, and this overlap between AI and environmental cost is exactly what it was set up to work on.
The work covers three areas: building practical frameworks for measuring and reporting the environmental costs of AI systems, convening researchers and policymakers to establish shared norms around sustainable development (things like compute efficiency benchmarks and minimum standards for carbon disclosure), and advocating for policy that treats AI's environmental cost as seriously as its economic potential.
Because VCASSE sits within Oasis of Change, it draws on direct experience with digital sustainability. The same organization that builds lower-carbon websites through Web-Ready and tracks real environmental impact on its Impact Dashboard is the one applying that approach to AI governance. It's not theoretical.
None of this is simple, and none of it is settled. AI is developing faster than anyone's ability to regulate it or fully understand its environmental implications. The organizations that engage with these questions honestly now, with real data, will be the ones that shape how this technology gets used going forward.