Day 7 – AI’s Resource Use and the Precautionary Principle
Welcome to day 7 of the 12 Days of AI. Today we’re going to take a pause from using an AI tool to reflect on the relationship between creative experimentation and environmental ethics.
Today’s activity was written by George Barker, Organisational Development Consultant (Climate and Net Zero) at UAL.
AI and Resources
The ecological impact of artificial intelligence is a subject of much speculation within wider conversations about climate-conscious use of technology. Some argue that AI will be a powerful driver of climate action, helping to mitigate greenhouse gas emissions, accelerate adaptation strategies and speed the transition away from fossil fuels. AI’s ability to process large sets of data is already being harnessed to save lives, creating early-warning systems for extreme weather events and speeding up the mapping of safe evacuation routes in crisis scenarios.
On the other hand, discussions about the ethical implications of AI cannot easily set aside the material resources (energy, water, metals) required for its expansion. This year, the deployment of AI has driven a meteoric rise in the reported carbon emissions of Google and Microsoft, significantly weakening the integrity of their respective organisational net zero targets. In the short term, most data points to AI exacerbating – rather than solving – the climate crisis.
We should also consider broadening our understanding of AI’s impact beyond its material conditions. We know that there is no bigger threat to human security than rising, unabated greenhouse gas emissions and that we have less than a decade in which to prevent and limit runaway ecological collapse. When we think about the social and cultural capital of AI, critical questions therefore arise:
- Do the promises and debates of AI draw our time, attention, communities and capital towards or away from the realities of ecological collapse?
- Do creative AI tools place agency in our hands? Or are they instead maximising the profits of a few at the expense of wider environmental and social stability? Who stands to ultimately gain from the efficiencies promised by AI?
- Is a completely decolonial, decarbonised application of AI a realistic goal? If not, to what extent can its use be justified?
Applying a Precautionary Principle
The precautionary principle is both a subject of ongoing debate and a theoretical underpinning of international environmental law. It stresses the value of preventative action as a strategy for avoiding unnecessary environmental degradation and human harm. In short: if the impact of an innovation remains unclear, but it may cause irreversible ecological damage or loss, then the precautionary principle asks us to cautiously refrain from experimentation and exploration in that field.
Detractors of the precautionary principle criticise its ambiguity, citing the need for a less tentative, more nuanced approach to innovation – one in which we can effectively weigh risk against the potential transformative outcomes of a new technology or idea. However, it is also clear that not all uses of AI are equal. While some AI use may be oriented towards socially and ecologically just transformations, many applications are less transformative, oriented instead towards creativity for creativity’s sake.
The 12 Days of AI primarily takes an exploratory approach to the use and adoption of artificial intelligence as a tool in education. Today, we want you to flip this approach and take a precautionary stance. Let’s think about what we stand to gain or lose by applying a precautionary principle to the use of AI.
Activity
- Revisit an AI tool or task from any of the previous days.
- Spend 20 minutes on desk research. We want you to find out what you can about the resources potentially used by the tool. Follow the discussion prompts below to guide your research.
- Reflect on what you know, and what you don’t, about your chosen tool and its resource implications. Apply a precautionary principle to your understanding of the tool, reflecting on whether its application should be encouraged or limited in your practice.
Discussion
Join us in the Teams space to share your responses and reflections. If you don’t have access to the space, email us at teachingexchange@arts.ac.uk for the attention of Hannah.
After you’ve chosen your AI tool, do some research to find the answers – or find out which answers you can’t find!
- Is it possible to find out how much energy (kWh) you used in your previous experiments? If so, you should be able to estimate its carbon footprint using RenSmart’s conversion tool (a simple worked calculation is sketched after this list).
- How about the energy that was required to build the tool?
- Can you find out anything about the full-life cycle of the tool you’ve used, and the potential resources (water, metals, materials) used to build it in the first place?
- What about the organisation behind the technology?
- How could you include conversations about environmental impact with students as it relates to your subject area?
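If you do track down an energy figure, the conversion itself is straightforward arithmetic: multiply kilowatt-hours by an emission factor for the relevant electricity grid. Here is a minimal sketch of that calculation in Python, assuming an illustrative UK grid-average factor of 0.207 kg CO2e per kWh; that figure and the function name are placeholders, so use RenSmart’s tool or current official conversion factors for real estimates.

```python
# Minimal sketch: converting estimated energy use (kWh) into kg CO2e.
# ASSUMPTION: 0.207 kg CO2e/kWh is an illustrative UK grid-average factor,
# not an official figure; check RenSmart or current government conversion
# factors before relying on the result.

UK_GRID_FACTOR_KG_PER_KWH = 0.207  # assumed, illustrative emission factor


def estimate_footprint(energy_kwh: float,
                       factor: float = UK_GRID_FACTOR_KG_PER_KWH) -> float:
    """Return an estimated carbon footprint in kg CO2e."""
    return energy_kwh * factor


# Example: an experiment estimated to have used 0.5 kWh of electricity
print(f"{estimate_footprint(0.5):.2f} kg CO2e")  # roughly 0.10 kg CO2e
```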
Additional resources
If you’re an employee at UAL, you can sign up to a Carbon Literacy Training day to understand more about environmental impact assessment and measurement, and to discover strategies for minimising emissions on personal, organisational and systemic scales.
If you’re external to UAL, the Carbon Literacy Project offers numerous courses taking place across the UK and beyond.