Elija Gayen and Chandan Bandyopadhyay write about the hidden ecological costs and ethical-legal issues related to the impact of artificial intelligence on the environment.
Elija Gayen and Chandan Bandyopadhyay
Abstract:
Objective:
To identify the hidden ecological costs associated with the development, implementation, and use of artificial intelligence technologies, and to determine optimal moral-ethical and political-legal strategies that ensure its sustainable and harmonious integration with various economic sectors.
Methods: The study is based on an ecological approach to the development and implementation of artificial intelligence, as well as on an interdisciplinary and political-legal analysis of ecological problems and of the risk that algorithmic bias, errors in artificial intelligence algorithms, and flawed decision-making processes may exacerbate environmental inequalities and injustice towards the environment. In addition, the study analyses the consequences of the destruction of natural ecosystems caused by the development of artificial intelligence technologies: the energy intensiveness of computation, the growing impact of data centers on energy consumption and the difficulty of cooling them, the electronic waste generated by the rapid turnover of equipment, and related factors.
Results:
The analysis reveals a range of environmental, ethical, and political-legal issues associated with the training, use, and development of artificial intelligence, which consumes significant amounts of energy (mainly from non-renewable sources). This increases carbon emissions and creates obstacles to further sustainable ecological development. Improper disposal of artificial intelligence equipment exacerbates the problem of E-waste and pollutes the planet, further damaging the environment.
Errors in artificial intelligence algorithms and decision-making processes lead to environmental injustice and inequality. AI technologies may disrupt natural ecosystems, jeopardizing wildlife habitats and migration patterns.
Scientific novelty: The study examines the environmental consequences of the use and further development of artificial intelligence, as well as the resulting environmental harms and costs to sustainable development. These findings motivate the search for optimal strategies to minimize environmental damage, for which legal scholars and lawyers will have to devise ethical-legal and political-legal solutions at the national and supranational levels.
Practical significance: Understanding the environmental impact of AI is crucial for policymakers, lawyers, researchers, and industry experts in developing strategies to minimize environmental harm. The findings emphasize the importance of implementing energy-efficient algorithms, switching to renewable energy sources, adopting responsible E-waste management practices, ensuring fairness in AI decision-making, and taking into account ethical considerations and rules for AI implementation.
Contents
Introduction
A. Role of AI in water consumption with special reference to India
1. Water and Energy consumption
1.1. The energy-intensive nature of AI computations
1.2. Data centers: Energy hogs of the AI infrastructure
1.3. Non-renewable energy sources and carbon emissions
1.4. Exploring the need for energy-efficient AI algorithms and hardware
2. Electronic Waste Generation
2.1. The rapid pace of AI hardware advancements
2.2. Device lifecycles and the E-waste predicament
2.3. Strategies for responsible E-waste management in AI
3. Data Centre Infrastructure
4. Understanding biases in AI training data
5. Disruption of Natural Ecosystems
6. Existing Regulations Related to AI's Environmental Impact in the European Union and in India
7. Conclusion
8. Limitations and Suggestions
References
Keywords:
Evapotranspiration (ET), water quality, runoff, sediment, algorithmic bias, artificial intelligence, data center, digital technologies, ecological costs, electronic waste, energy consumption, law, natural ecosystems, sustainability
Introduction
When someone mentions the negative environmental impacts of AI, electricity use and the associated carbon emissions may be the first things that come to mind. Indeed, the intersection of artificial intelligence (AI) and environmental sustainability is becoming a critical area of study and concern. As AI systems like the GPT-4 model become larger and more demanding to train, their environmental impact is coming into focus. However, what most people do not realize is the alarming water footprint that AI models leave behind.
The
concept of a "water footprint" in AI refers to the total volume of
water used directly and indirectly during the lifecycle of AI models. This
includes the water used for cooling vast data centers and the water consumed in
generating the electricity that powers them. To put it in perspective, training
a single AI model like GPT-3 could require as much as 700,000 liters of water,
a figure that rivals the annual water consumption of several households.
Moreover, a simple conversation with ChatGPT consisting of 20 to 50 questions
can cost up to a 500ml bottle of freshwater.
Understanding
the implications of this requires us to differentiate between "water
withdrawal" and "water consumption." Withdrawal is the total
volume taken from water bodies, which often returns to the source, albeit
sometimes in a less pristine condition. Consumption, however, is the portion of
withdrawn water that is evaporated, incorporated into products, or otherwise
not returned to its source. It's this consumption that can exacerbate water
scarcity, leading to a pressing global issue.
AI's
water footprint is tied to its energy needs. The electricity that powers AI
systems often comes from sources like hydroelectric power, which, while
renewable, has its own set of environmental impacts, including water use. As AI
models grow in complexity and size, their energy—and therefore water—demands
follow suit.
The water footprint of AI models can be reduced in several ways. These include optimizing algorithms for energy efficiency, developing more sustainable data center designs, and shifting the timing of intensive AI tasks to coincide with periods of lower electricity demand, which can reduce reliance on non-renewable energy sources.
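As a toy illustration of that last point, a deferrable training or batch-inference job could be scheduled into the hours with the lowest forecast grid intensity. The sketch below is a generic Python example; the hourly figures and the four-hour job length are hypothetical placeholders, not data from any provider.

# A toy sketch of the load-shifting idea above: run a deferrable AI job in the
# window with the lowest (hypothetical) grid water/carbon intensity forecast.
hourly_intensity = [0.9, 0.8, 0.6, 0.5, 0.55, 0.7, 1.1, 1.4, 1.5, 1.3, 1.2, 1.0,
                    0.95, 0.9, 1.0, 1.2, 1.4, 1.6, 1.5, 1.3, 1.1, 1.0, 0.95, 0.9]  # liters per kWh, illustrative
job_hours = 4  # the job needs four consecutive hours

best_start = min(range(24 - job_hours + 1),
                 key=lambda h: sum(hourly_intensity[h:h + job_hours]))
print(f"Run the job from hour {best_start} to hour {best_start + job_hours}.")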
How is AI Using
Water?
Before
discussing and investigating how to mitigate the significant water footprints
of AI models, we need to first understand how water is used in training machine
learning models. AI's water use is multifaceted, encompassing not just direct
usage for cooling but also indirect consumption through power generation and
server manufacturing.
Why is Water Needed for Cooling
Systems?
AI
systems, particularly those involving high-performance servers with hundreds if
not thousands of graphics processing units (GPUs), generate a considerable
amount of heat. To maintain a reasonable temperature in data centers and avoid
overheating of processing units, data centers employ cooling mechanisms that
are often water-intensive.
Water
is a superior cooling agent for data centers due to its high specific heat
capacity and thermal conductivity. It can absorb a lot of heat before it gets
warm, making it efficient for regulating the temperatures of high-performance
servers. This is especially crucial in AI data centers, where equipment runs
continuously and generates significant amounts of heat.
Moreover,
water's ability to be pumped and re-circulated through cooling systems allows
for a stable and continuous transfer of heat away from critical components.
This ensures that the data centers remain at optimal operating temperatures,
which is essential for the reliability and efficiency of the AI systems they
house.
The Three Scopes of Usage
Water use associated with data centers can be broadly grouped into three scopes, corresponding to different stages of the AI lifecycle.
The
first scope of water use in AI involves the direct operational cooling of data
centers. These facilities house the servers responsible for AI computations
and, due to their high-density power configurations, generate considerable heat.
To maintain optimal temperatures, data centers traditionally use water cooling
systems. These systems function either by absorbing heat through evaporation in
cooling towers or by transferring heat away in a closed-loop system. The water
can be recycled and reused to some extent, but there's always a portion that is
lost to evaporation or discharge, making it a consumptive use of water.
The
second scope is less direct but equally significant. It pertains to the water
used in the generation of electricity that powers AI systems—often referred to
as 'off-site' water use. This water usage is dependent on the energy mix of the
grid supplying the power. For instance, a data center powered by
hydroelectricity or a fossil-fuel plant will have different levels of water
withdrawal and consumption based on the efficiency and cooling requirements of
these power plants.
Lastly,
the third scope encompasses the water used in the manufacturing of AI hardware,
including the servers and the semiconductors within them. This process, which
often requires high-purity water, can have a substantial footprint, considering
the sheer scale of global semiconductor production. This is a more indirect
form of water use but contributes to the cumulative water footprint of AI and
AI advancement. The continued development of larger and more complex models will increase the demand for processing units, and thus the production of hardware will rise as well.
Estimating Water Consumption of AI
Models
Quantifying the water footprint of AI models is a complex task involving multiple factors and aspects. To accurately measure this footprint, a multi-faceted approach is required, taking into account direct water use, the intricacies of energy generation, and the manufacturing processes of AI components.
The authors of the paper "Making AI Less 'Thirsty': Uncovering and Addressing the Secret Water Footprint of AI Models" propose a novel approach to estimating the water footprint of AI computations.
At the
operational level, direct water usage is tracked through the Water Usage
Effectiveness (WUE) metric, which captures the volume of water used for cooling
per kilowatt-hour of energy consumed by AI systems. This is particularly
relevant for data centers where AI models are operational, as these facilities
often require significant cooling to manage the heat generated by high-density
servers. The WUE is sensitive to a range of factors, including the external
temperature, which can lead to higher cooling requirements during warmer periods.
Turning to
indirect water use, the equation becomes more complex. Here we introduce the
concept of water consumption intensity factor (WCIF), which is linked to the
energy source powering the data center. Since the energy mix can vary —
incorporating coal, natural gas, hydroelectric power, nuclear energy, and
renewable sources — the WCIF is subject to change. It encapsulates the average
water used for every unit of electricity that comes from the grid, inclusive of
variables like evaporative losses at power plant cooling towers.
The overall
operational water footprint (W_operational) of a machine learning model is then
calculated using the following equation:
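In illustrative notation (ours, not necessarily the paper's), this combines the on-site cooling water with the water embedded in the grid electricity drawn by the facility:

W_operational ≈ E_server × WUE + E_facility × WCIF

where E_server denotes the energy consumed by the AI servers and E_facility the total electricity drawn from the grid (server energy plus cooling and other overheads).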
But the footprint
extends beyond operational use. The embodied water footprint (W_embodied)
accounts for the water used in the life cycle of server manufacturing and
maintenance, distributed over the hardware’s expected lifespan:
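With the variables explained in the next sentence, this amortization plausibly takes the proportional form:

W_embodied = (T / T_0) × W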
Here, W represents the total water used in manufacturing, T the operational time period, and T_0 the hardware's projected lifespan.
Combining these
elements gives us the total water footprint:
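In symbols, this is simply the sum of the two components:

W_total = W_operational + W_embodied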
This formula provides a holistic view of the AI model’s water footprint, combining real-time operational data with the broader environmental context of AI's lifecycle.
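To make the bookkeeping concrete, the following minimal Python sketch (our own illustration; every numeric value is a hypothetical placeholder, not a figure from the paper) computes the two components and their sum:

# Illustrative water-footprint bookkeeping for an AI workload.
# All numeric inputs below are hypothetical placeholders.

def operational_water(server_energy_kwh, wue, pue, wcif):
    """On-site cooling water plus off-site electricity-generation water, in liters."""
    on_site = server_energy_kwh * wue            # cooling water at the data center
    off_site = server_energy_kwh * pue * wcif    # grid electricity (scaled by PUE overhead) times WCIF
    return on_site + off_site

def embodied_water(manufacturing_water_l, usage_hours, lifespan_hours):
    """Share of the hardware's manufacturing water attributed to this workload."""
    return manufacturing_water_l * (usage_hours / lifespan_hours)

w_op = operational_water(server_energy_kwh=1000.0, wue=0.55, pue=1.2, wcif=3.1)
w_em = embodied_water(manufacturing_water_l=50000.0, usage_hours=720.0, lifespan_hours=6 * 8760.0)
print(f"operational: {w_op:,.0f} L, embodied: {w_em:,.0f} L, total: {w_op + w_em:,.0f} L")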
The paper also provides a data table outlining the water consumption of the GPT-3 model across different locations.
Diving into the granular data provided on GPT-3's operational water consumption footprint, we observe significant variations across different locations, reflecting the complexity of AI's water use. For instance, when we look at on-site and off-site water usage for training AI models, Arizona stands out with a particularly high total water consumption of about 10.688 million liters. In contrast, Virginia's data centers appear to be more water-efficient for training, consuming about 3.730 million liters.
These differences
are largely attributable to the Water Usage Effectiveness (WUE) and Power Usage
Effectiveness (PUE) values, which vary by location. WUE and PUE are indicative
of the efficiency of a data center's cooling infrastructure and overall power
usage, respectively. For instance, the WUE in Virginia is recorded at a low
0.170 liters/kWh, suggesting that the data centers in this state are among the
most water-efficient in the table provided. This is a stark contrast to
Arizona, where WUE is much higher at 2.240 liters/kWh, hinting at a less
water-efficient cooling process in a hotter climate.
The table also
points to the water used for each AI inference, which can be startling when we
consider the scale of AI operations. Taking the U.S. average, for instance, we
can see that it takes approximately 16.904 milliliters of water to run a single
inference, of which 14.704 milliliters are attributed to off-site water usage.
To put this into perspective, if we think about the number of inferences that
could be run on 500ml of water — the size of a standard bottle of water — the
U.S. average would allow for about 29.6 inferences.
With more than 100 million active users to date, along with the introduction of GPT-4, the water consumption resulting from day-to-day usage is staggering. GPT-4 is also widely reported to be many times larger than GPT-3, which could increase the water consumption of inference and training severalfold. It is somewhat ironic that we are taught to shorten showers or reuse water in order to conserve it, without knowing just how much water disappears in conversations with an AI chatbot.
What Does it Mean for the Future?
Of course, this is not to say that all AI training and usage should be halted in order to save water. The paper's results serve as a fair warning about the resources being consumed, much of which has so far gone unnoticed. That said, the tech industry is not turning a blind eye to the environmental costs of AI and data center operations. In fact, there is a wave of innovative efforts to mitigate these costs.
For instance,
Google's AI subsidiary, DeepMind, has applied machine learning to enhance the efficiency of Google's data centers, achieving a 40% reduction in
energy use for cooling. This advance translates to a 15% reduction in overall
Power Usage Effectiveness (PUE) overhead, which is a measure of data center
energy efficiency. This innovation not only reduces energy consumption but also
indirectly decreases water use, as less energy required generally means less
water used for cooling.
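For reference, PUE is conventionally defined as the ratio of the total energy drawn by a facility to the energy delivered to the IT equipment itself, so values closer to 1.0 indicate less cooling and distribution overhead:

PUE = (total facility energy) / (IT equipment energy)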
Moreover,
Microsoft has ventured into the depths with Project Natick, experimenting with underwater data centers. The ocean provides
a natural cooling environment, which dramatically reduces the cooling costs
that contribute significantly to a data center's operational expenses. Notably,
an underwater data center tested by Microsoft off the Scottish coast achieved a
PUE as low as 1.07, which is impressively more efficient than the PUE for
conventional, newly-constructed land-based data centers, which is about 1.125.
This innovation not only speaks to cooling efficiency but also to the potential
for integrating data centers with renewable energy sources like offshore wind,
solar, tidal, and wave power.
These initiatives
are not mere drops in the ocean but significant strides toward sustainable
computing. They underscore a commitment to finding solutions that balance the
unrelenting demand for AI and computing power with the imperative to preserve
environmental resources. As these technologies continue to develop, they offer
a blueprint for reducing the water footprint of AI and other energy-intensive
industries.
In conclusion,
the journey towards eco-friendly AI is multi-faceted, involving advancements in
AI itself to optimize energy use, exploring unique solutions like underwater
data centers, and a broader shift towards renewable energy sources. These
efforts collectively form a promising horizon where AI's thirst for water is
quenched not by drawing from our precious reserves, but through innovation and
the relentless pursuit of efficiency. The insights from the paper serve as a
reminder and a catalyst for ongoing efforts to ensure that AI's footprint on our
planet is as light as possible while ensuring the pace of technological
advancement.
Artificial
intelligence (AI) has emerged as a powerful and transformative force,
revolutionizing various aspects of human lives, from healthcare to
transportation, and from customer service to financial systems. With its
ability to process vast amounts of data and learn from patterns, AI has opened
up new frontiers of innovation and efficiency. However, as society marvels at
the advancements brought by AI, it becomes crucial to recognize and examine the
hidden ecological cost associated with this technological revolution. As the
demand for AI applications grows, the energy consumption required to power the
computational infrastructure also increases. According to a study conducted by
Strubell et al. (2019), training a single state-of-the-art AI model can emit as
much carbon dioxide as the lifetime emissions of five cars. Data centers, which
are responsible for housing and running AI systems, contribute significantly to
this energy consumption, often relying on non-renewable energy sources. The
exponential growth of AI technology raises concerns about the long-term
environmental impact, as the environmental cost associated with the AI
revolution remains largely unnoticed and unaccounted for.

Moreover, the rapid
evolution of AI hardware leads to shorter device lifecycles, resulting in a
surge of electronic waste (E-waste). The Global E-waste Monitor 2020 report
indicates that E-waste generation reached a record 53.6 million metric tonnes,
with only 17.4% being officially collected and recycled 1.
Improper management
of outdated AI hardware components poses significant environmental risks,
contributing to pollution and resource depletion. Whilst AI presents immense
potential for environmental monitoring and conservation efforts, its deployment
can also disrupt natural ecosystems. Environmental monitoring drones and
autonomous vehicles used for resource exploration, for example, have the
potential to disturb wildlife habitats, interfere with migration patterns, and
exacerbate ecosystem imbalances. The unintended consequences of AI on
biodiversity and ecosystems necessitate careful consideration to ensure
responsible and sustainable deployment.

In light of these concerns, it becomes
essential to delve deeper into the environmental footprint of AI and explore
strategies for mitigating its negative ecological impacts. This article will
examine various aspects of the ecological cost associated with AI, highlighting
the need for energy-efficient algorithms, responsible E-waste management
practices, sustainable data centre infrastructure and ethical considerations in
AI decision-making. By shedding light on these issues, it aims to foster
discussions and actions that lead to a more environmentally conscious approach
to AI development and deployment.
1. Water and Energy
consumption
As society
continues to harness the power of AI, it becomes imperative to acknowledge and
tackle the substantial water and energy consumption that accompanies this
technological revolution. This section explores the water and energy-intensive
nature of AI computations, the significant water and energy demands of data
centers, and the concerning reliance on non-renewable energy sources.

Water covers about 70% of the Earth's surface, and it is the most essential ingredient for survival for all living things. However, freshwater—what we need to drink and to irrigate our farms—is only 3% of the world's water, and over two-thirds of that is locked away in frozen glaciers and unavailable for consumption. Up to 90% of some organisms' body weight comes from water; about 60% of an adult human body is water, and nearly 80% of a baby's. Each day, we must consume water to survive: an adult male needs about 3 liters (3.2 quarts) per day, while an adult female needs about 2.2 liters (2.3 quarts). Yet some 1.1 billion people worldwide lack access to water, and a total
of 2.7 billion find water scarce for at least one month of the year. Inadequate
sanitation is also a problem for 2.4 billion people—they are exposed to
diseases such as cholera and typhoid fever and other water-borne illnesses. Two
million people, mostly children, die each year from diarrheal diseases alone.
According to the United Nations Environmental Report, nearly two-thirds of our
world's population experiences severe water shortages for at least one month a
year, and by 2030, this gap is predicted to become much worse, with almost half
of the world's population facing severe water stress. To avoid this fate, the
report said, water use must be “‘decoupled’ from economic growth by developing
policies and technologies to reduce or maintain consumption without
compromising performance." Water is fundamental to our ability to live; yet we have a major problem, and it is accelerating.

At the same time, the world is racing ahead to advance AI into every aspect of life. With the rise of generative AI, companies have significantly raised
their water usage, sparking concerns about the sustainability of such practices
amid global freshwater scarcity and climate change challenges. Tech giants have
significantly increased their water needs for cooling data centers due to the
escalating demand for online services and generative AI products. AI server
cooling consumes significant water, with data centers using cooling towers and
air mechanisms to dissipate heat, causing up to 9 liters of water to evaporate
per kWh of energy used. The U.S. relies on water-intensive thermoelectric
plants for electricity, indirectly increasing data centers' water footprint,
with an average of 43.8 L/kWh withdrawn for power generation. Wafer fabrication
demands high water usage with low recycling rates, contributing to the
supply-chain water footprint with limited transparency on actual usage data.
By shedding light
on the hidden ecological costs of the AI revolution, we can gain a deeper
understanding of the environmental implications associated with AI’s remarkable
impact on various domains of human life. AI computations are known for their
substantial energy requirements due to the processing of vast amounts of data
and the execution of complex algorithms. Training state-of-the-art AI models,
in particular, consumes a significant amount of energy, with large-scale models consuming hundreds of megawatt-hours, equivalent to the
energy required to power thousands of homes for several months (Strubell et
al., 2019). The computational demands and iterative processes involved in
training AI models contribute to their high energy consumption. These energy
requirements are driven by the need to process large datasets, perform complex
matrix operations, and optimize model parameters through multiple iterations.
Understanding the energy footprint of AI computations is essential for
comprehending the environmental impact associated with their widespread
adoption.

Data centers play a vital role in supporting AI systems by housing
and running the computational infrastructure. However, they contribute
significantly to the overall energy consumption of AI. These facilities require
substantial electricity to power servers, cooling systems, and networking
equipment. The high-performance computing capabilities necessary for AI
computations result in increased energy demands for data centers. Hanus et al.
(2023) underscore the energy-intensive nature of data centers and the
challenges they face in achieving energy efficiency. The growth of AI
technology has led to an increase in the number and size of data centers,
amplifying their environmental impact. The inefficient utilization of computing
resources and cooling systems in data centers further exacerbates their energy
consumption and environmental footprint.

A pressing concern regarding AI's
energy consumption is the reliance on non-renewable energy sources.
Conventional power grids, often fueled by fossil fuels, are the primary sources
of electricity for AI computations. This reliance on non-renewable energy
exacerbates greenhouse gas emissions and environmental challenges. Şerban et
al. (2020) stress the importance of transitioning to renewable energy sources
for sustainable AI infrastructure. Incorporating renewable energy solutions,
such as solar or wind power, in data centers can reduce the carbon footprint of
AI systems and mitigate their environmental impact. The adoption of renewable
energy technologies not only reduces greenhouse gas emissions but also promotes
the development of a more sustainable energy infrastructure to support the
growing demands of AI computations.
1.1.
The energy-intensive nature of AI
computations
The
energy-intensive nature of AI computations has become a growing concern due to
the significant energy requirements associated with training and running
sophisticated AI models (Henderson et al., 2018). As AI applications continue
to advance and become more complex, the demand for computational power has
skyrocketed, leading to increased energy consumption. One primary contributor
to the energy consumption of AI computations is the training phase. Training a
deep learning model involves feeding vast amounts of data into neural networks,
which then adjust their internal parameters through iterative processes to
optimize performance. This training process often requires multiple iterations
over large datasets, utilizing powerful hardware infrastructure such as
graphics processing units (GPUs) or specialized tensor processing units (TPUs)
(Strubell et al., 2019). These hardware components are highly energy-intensive,
consuming significant amounts of electricity to perform the complex
calculations necessary for training AI models. The energy consumption during
training can range from several hundred kilowatt-hours (kWh) to several
thousand kWh, depending on the size and complexity of the model, the size of
the dataset, and the hardware infrastructure used (Schwartz et al., 2020). For
example, a study by Schwartz et al. (2020) estimated that training a single
state-of-the-art language model can emit as much carbon dioxide as the lifetime
emissions of five cars. This highlights the substantial environmental impact
associated with the energy consumption of AI computations.

In addition to the
training phase, the deployment and inference of AI models also contribute to
energy consumption. Once a model is trained, it needs to be deployed and run on
various devices or cloud servers to make predictions or perform specific tasks
in real-time. This inference phase also requires computational resources,
although typically less intensive compared to training. However, when AI models
are deployed at scale, the cumulative energy consumption can still be
substantial (Strubell et al., 2019). The energy-intensive nature of AI
computations raises concerns about the environmental impact and sustainability
of AI technologies. As AI applications continue to proliferate across
industries and sectors, the demand for computational resources will only
increase, leading to even higher energy consumption. It becomes crucial to
explore energy-efficient computing architectures, develop algorithms that
minimize computational requirements, and adopt renewable energy sources to
power AI infrastructure (Ding et al., 2021). Efforts are underway to address
these challenges. Researchers and industry experts are actively working on
developing more energy-efficient algorithms and hardware architectures,
exploring techniques such as model compression, quantization, and distributed
training. These approaches aim to reduce the computational requirements of AI
models without significantly compromising performance (Ding et al., 2021).
Furthermore, there is an increasing focus on optimizing data centre operations
and adopting renewable energy sources to power AI infrastructure, reducing the
carbon footprint associated with AI computations (Strubell et al., 2019).
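As a rough illustration of how such estimates are typically assembled (a generic back-of-the-envelope sketch with hypothetical inputs, not the methodology of any study cited here), training energy can be approximated from accelerator power draw, training time, and data center overhead, and then converted to emissions with a grid carbon-intensity factor:

# Back-of-the-envelope estimate of training energy and CO2 emissions.
# All inputs are hypothetical placeholders, not measurements from any cited study.

def training_footprint(num_gpus, gpu_power_kw, hours, pue, grid_kgco2_per_kwh):
    it_energy_kwh = num_gpus * gpu_power_kw * hours      # energy drawn by the accelerators
    facility_energy_kwh = it_energy_kwh * pue            # add cooling and distribution overhead
    co2_kg = facility_energy_kwh * grid_kgco2_per_kwh    # convert via grid carbon intensity
    return facility_energy_kwh, co2_kg

energy_kwh, co2_kg = training_footprint(num_gpus=512, gpu_power_kw=0.4, hours=336,
                                        pue=1.2, grid_kgco2_per_kwh=0.4)
print(f"about {energy_kwh:,.0f} kWh of electricity and {co2_kg / 1000:,.1f} tonnes of CO2")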
1.2.
Data centers: Energy hogs of the AI
infrastructure
Data centers play a
critical role in supporting the AI infrastructure, serving as the backbone for
storing and processing vast amounts of data. However, these data centers are
also significant energy consumers, raising concerns about their environmental
impact (Dhar, 2020). Data centers house the servers, networking equipment, and
storage systems required to handle the computational demands of AI workloads.
These facilities operate around the clock, consuming substantial amounts of
electricity for powering and cooling the equipment, as well as providing
uninterruptible power supply (UPS) systems for backup (Shah et al., 2010). The
energy consumption of data centers is driven by various factors, including the
number and efficiency of servers, the cooling systems, and the overall
infrastructure design. Server racks and cooling equipment consume a significant
portion of the energy, with cooling alone accounting for up to 40% of the total
energy consumption (Masanet et al., 2020). A study estimated that data centers
globally consumed approximately 196 to 400 terawatt-hours (TWh) of electricity
in 2020, accounting for about 1% of global electricity consumption 2. The energy efficiency
of data centers has become a major focus in reducing their environmental
footprint. Efforts are underway to improve server efficiency, optimize cooling
systems, and design data centers with energy-efficient principles in mind. Techniques
such as server virtualization, advanced cooling technologies, and power
management strategies are being implemented to enhance energy efficiency (Shah
et al., 2010). Furthermore, there is a growing interest in adopting renewable
energy sources to power data centers. Many companies are investing in renewable
energy projects and purchasing renewable energy certificates (RECs) to offset
their electricity consumption (Dhar, 2020). For example, Google announced in
2017 that it had achieved a milestone of purchasing enough renewable energy to
match 100% of its global electricity consumption for its data centers and
offices 3. To
address the energy challenges posed by data centers, industry collaborations,
government regulations, and research initiatives are being established. These efforts aim to develop standards, promote best practices, and encourage the adoption of energy-efficient technologies in data center operations (Shah et
al., 2010). In the UK, the Data Centre Alliance is actively working to drive
energy efficiency improvements and sustainability in the data center industry 4.
1.3. Non-renewable energy sources and carbon emissions
The reliance of AI
infrastructure on non-renewable energy sources has significant implications for
carbon emissions and the overall environmental impact. The generation of
electricity from fossil fuels, such as coal and natural gas, contributes to
greenhouse gas emissions and exacerbates climate change (Ram et al., 2018). In
fact, the carbon emissions from data centers alone are estimated to rival those
of the aviation industry 5.
Data centers, which house the computational infrastructure for AI, are known to
be energy-intensive facilities. They require substantial amounts of electricity
to power the servers, cooling systems, and other supporting infrastructure. In
many regions, the grid electricity used to power data centers predominantly
comes from non-renewable sources. For example, in the United Kingdom, a
significant portion of the electricity generation still relies on fossil fuels 6. The carbon emissions
associated with non-renewable energy sources directly contribute to the carbon
footprint of AI systems. A study by Rolnick et al. (2022) estimated that
training a large AI model can emit as much carbon as an average American car
over its entire lifetime. To address these concerns, there is a growing
movement within the AI community to transition towards renewable energy sources
and reduce carbon emissions. Several major technology companies, including
Microsoft and Amazon, have made commitments to achieve carbon neutrality and
rely on renewable energy for their data centers 7. Moreover, governments and organizations are
taking steps to promote the adoption of renewable energy in the AI sector. The
European Union, for instance, has set targets to increase the share of
renewable energy and reduce greenhouse gas emissions in its member states 8. Additionally, study
efforts are focused on developing energy-efficient algorithms and hardware
designs to minimize energy consumption and carbon emissions during AI
computations. Techniques like model compression, quantization, and specialized
hardware architectures are being explored to optimize the energy efficiency of
AI systems (Strubell et al., 2019).
1.4. Exploring the need for energy-efficient AI algorithms and
hardware
As the demand for
AI continues to grow, there is a pressing need to develop energy-efficient
algorithms and hardware to mitigate the environmental impact of AI
computations. The energy consumption of AI systems is a significant concern,
considering the carbon emissions associated with non-renewable energy sources
(Rolnick et al., 2022). Researchers are actively exploring techniques to
improve the energy efficiency of AI algorithms. Model compression, for
instance, aims to reduce the computational requirements of deep neural networks
by pruning unnecessary connections or reducing the precision of weights and
activations (Han et al., 2015). This approach can significantly decrease the energy
consumption and inference time without sacrificing model performance. Another
approach is quantization, which involves representing numerical values with
fewer bits. By reducing the precision of parameters and activations, quantization
reduces memory usage and computational complexity, leading to energy savings
during both training and inference (Hubara et al., 2016). Efforts are also
being made to improve the energy efficiency of training algorithms. Gradient
compression techniques, such as sparsification and quantization, aim to reduce
the communication overhead between distributed devices during distributed
training, thus decreasing the energy consumption (Alistarh et al., 2017).
Additionally, advancements in optimization algorithms and learning rate schedules
can minimize the number of training iterations required, resulting in energy
savings (You et al., 2017). The development of
energy-efficient AI hardware is also a crucial aspect of mitigating energy
consumption. Traditional computing architectures are often not optimized for AI
workloads, leading to inefficient energy usage. To address this, researchers
are exploring new hardware designs, including neuromorphic computing and
memristive devices, which mimic the structure and functioning of the human
brain, offering potential energy efficiency improvements (Merolla et al., 2014;
Prezioso et al., 2015).
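As a minimal illustration of the quantization idea mentioned above (a generic sketch in Python/NumPy, not tied to any particular framework, model, or cited study), weights can be mapped from 32-bit floats to 8-bit integers with a single scale factor, cutting memory roughly fourfold at the cost of a small rounding error:

# Minimal post-training weight quantization sketch; shapes and values are illustrative.
import numpy as np

def quantize_int8(weights):
    """Symmetric linear quantization of float32 weights to int8 plus a scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)   # a toy weight matrix
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(f"memory: {w.nbytes / 1024:.0f} KiB -> {q.nbytes / 1024:.0f} KiB")
print(f"mean absolute round-trip error: {np.abs(w - w_hat).mean():.5f}")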
2. Electronic Waste Generation
In addition to the
energy-intensive nature of AI computations, the hardware used in AI systems
also contributes to another significant environmental challenge: electronic
waste generation. The rapid pace of technological advancement and the constant
need for more powerful hardware result in a high turnover rate, leading to a
growing accumulation of electronic waste (Ferro et al., 2021). AI hardware,
including GPUs, application-specific integrated circuits (ASICs), and other
specialized components, has a relatively short lifespan due to the relentless
progress in technology. As newer generations of hardware are developed, older
ones quickly become obsolete and are often discarded, exacerbating the issue of
electronic waste 9.
The disposal of AI hardware contributes to the release of hazardous substances
and materials into the environment when not properly managed. These substances
can contaminate soil, water, and air, posing risks to human health and
ecosystems. The improper disposal of electronic waste not only leads to
environmental degradation but also wastes valuable resources embedded in the
hardware. Moreover, the disposal of hardware that contains toxic materials such
as lead, mercury, and flame retardants can further contribute to pollution if
not handled properly 10.
To tackle the issue of electronic waste generation in the AI industry, it is
crucial to implement sustainable practices. One approach is to promote the
reuse and recycling of AI hardware. By refurbishing and remanufacturing older
hardware, its lifespan can be extended, reducing the need for constant
production of new devices (Ferro et al., 2021). Additionally, implementing
take-back programs and establishing recycling facilities ensure that discarded
hardware is properly managed and valuable materials are recovered for reuse 11. In the design and
manufacturing of AI hardware, eco-friendly principles should be embraced. Using
materials with lower environmental impacts, designing for recyclability, and
reducing the presence of hazardous substances can contribute to a more
sustainable hardware lifecycle. Adopting modular designs that allow for
component replacement and upgrading can also help prolong the usefulness of AI
hardware, reducing the frequency of complete device replacement (Ferro et al.,
2021).
2.1. The rapid pace of AI hardware advancements
The hardware
technologies in the field of AI are undergoing rapid advancements, fueled by
continuous innovation that leads to the creation of increasingly powerful and
efficient AI systems (Amodei et al., 2016). A notable development in AI
hardware is the evolution of GPUs into a key component for AI computations.
Originally designed for graphics rendering, GPUs have found extensive adoption
in AI due to their ability to handle parallel processing tasks effectively
(Amodei et al., 2016). Their high throughput and computational power make them
well-suited for training and running AI models. Furthermore, specialized
hardware known as ASICs has emerged to cater specifically to AI workloads.
ASICs offer improved performance and energy efficiency by customizing the
hardware architecture to optimize AI algorithm execution (Amodei et al., 2016).
These dedicated AI chips provide higher computational density and faster
processing speeds compared to general-purpose processors. The rapid
advancements in AI hardware have been instrumental in enabling significant
breakthroughs across various AI applications. In computer vision, for example,
the availability of high-performance hardware has facilitated complex image
recognition and object detection tasks with remarkable accuracy (Amodei et al.,
2016). Similarly, in natural language processing, powerful hardware accelerates
the training and inference of language models, enabling applications such as
machine translation and sentiment analysis. However, the swift progress in AI
hardware also brings challenges. The rapid turnover of hardware due to newer
generations becoming available leads to a significant accumulation of
electronic waste. Outdated hardware components contribute to the growing E-waste
problem, requiring proper disposal and recycling measures to minimize
environmental impact (Ferro et al., 2021). The continuous introduction of new
AI hardware also presents a learning curve for developers and researchers.
Staying up to date with the latest hardware technologies demands constant
adaptation, training, and investment, posing challenges for those involved in
AI development (Amodei et al., 2016). Furthermore, optimizing AI algorithms and
software to leverage the capabilities of different hardware architectures adds
complexity to the development process.
2.2. Device lifecycles and the E-waste predicament
The rapid
advancement of AI technologies has led to a proliferation of electronic
devices, resulting in a concerning rise in electronic waste, or E-waste, which
poses significant environmental and health risks 12. The lifecycles of AI hardware play a crucial
role in determining the extent of E-waste generated and the environmental
impact associated with it.
The lifecycle of AI
hardware begins with the extraction of raw materials and the manufacturing
process. The production of AI devices involves the extraction of precious
metals, rare earth elements, and other valuable materials, many of which are
non-renewable and require substantial energy inputs (Ferro et al., 2021). The
extraction and processing of these materials contribute to environmental
degradation and often involve hazardous substances that can harm ecosystems and
human health. As AI hardware advances rapidly, the lifecycle of devices becomes
shorter, with newer models frequently replacing older ones. This phenomenon,
known as planned obsolescence, exacerbates the E-waste predicament, as outdated
AI devices are discarded, leading to a significant accumulation of electronic
waste 13. E-waste
contains hazardous components such as lead, mercury, and flame retardants,
which can leach into the environment and contaminate soil, water sources, and
air if not properly managed. The improper disposal and inadequate recycling of E-waste
further compound the problem. Many electronic devices end up in landfills or
are incinerated, releasing toxic substances and contributing to air and soil
pollution 14.
Inadequate recycling practices also result in the loss of valuable resources
that could be recovered and reused. Policymakers play a crucial role in
establishing regulations and incentives to promote proper E-waste management.
Policies such as extended producer responsibility (EPR) can hold manufacturers
accountable for the environmental impact of their products throughout their
lifecycle, encouraging them to adopt sustainable practices and invest in
recycling infrastructure 15.
Additionally, the development of effective collection systems, recycling
programmes, and refurbishment initiatives can help divert AI devices from
landfills and promote their reuse. The circular economy approach offers a
promising solution to the E-waste predicament. It emphasizes the reuse,
refurbishment, and recycling of electronic devices, aiming to minimize resource
consumption and environmental impact (Ferro et al., 2021). By adopting circular
economy principles, AI hardware can be designed and managed in a way that
maximizes its lifespan and reduces the need for constant upgrades, thus mitigating
the generation of E-waste.
2.3. Strategies for responsible E-waste management in AI
In order to address
the environmental concerns associated with electronic waste generated by AI
hardware, several strategies have been proposed to promote responsible E-waste management
throughout the AI lifecycle. These strategies aim to mitigate the adverse
impacts of E-waste disposal and contribute to a more sustainable approach to AI
technology.
1. Incorporating Design for Disassembly (DfD) and Design for Recycling (DfR) principles in the design and manufacturing of AI hardware can facilitate the efficient separation and recycling of components. By ensuring that devices are designed with ease of disassembly and recyclability in mind, the amount of E-waste generated can be reduced.
2. The concept of Extended Producer Responsibility holds manufacturers accountable for the entire lifecycle of their products, including their proper disposal (Kahhat et al., 2008). Implementing EPR regulations specific to AI hardware can incentivize manufacturers to design products with recyclability in mind and take responsibility for their environmentally sound disposal and recycling.
3. Establishing effective take-back and recycling programs is crucial for facilitating the responsible disposal of AI hardware. Manufacturers can collaborate with specialized E-waste recyclers or set up collection points to ensure the proper recycling of AI devices and prevent them from ending up in landfills or informal recycling facilities.
4. Embracing the principles of a circular economy can help minimize E-waste generation by promoting resource efficiency and product reuse (Geissdoerfer et al., 2017). Strategies such as refurbishing and repurposing AI hardware, as well as creating secondary markets for used devices, can extend the lifespan of AI systems and reduce the need for new production.
5. Continued research and development of advanced recycling technologies are essential for improving the efficiency and effectiveness of E-waste recycling (Widmer et al., 2005). Innovations such as hydrometallurgical and biotechnological processes can extract valuable materials from AI hardware while minimizing environmental impact and reducing the reliance on traditional extraction methods.
By implementing
these strategies, responsible E-waste management practices can be integrated
into the AI industry, leading to a more sustainable approach to AI hardware
production, use, and disposal.
3. Data Centre Infrastructure
Data centers have
witnessed significant growth in recent years due to the increasing demand for
digital services. This expansion has resulted in a heightened environmental
impact. The construction and operation of data centers require substantial land
and resources, contributing to land use changes and habitat destruction (Mell
& Grance, 2011). Moreover, the proliferation of data centers in urban areas
has raised concerns about their impact on local communities and infrastructure.

Data centers are renowned for their high
energy consumption. The constant operation of servers, networking equipment,
and cooling systems demands a considerable amount of electricity. Cooling data centers
poses particular challenges. The heat generated by servers and other IT
equipment needs efficient dissipation to maintain optimal operating conditions.
However, traditional cooling methods, such as air conditioning, are
energy-intensive and inefficient. This has prompted the exploration of
innovative cooling technologies, including liquid cooling and advanced airflow
management systems, to enhance energy efficiency and reduce the environmental
impact of data centers (Masanet et al., 2020). Water is a vital resource used
in data centers for cooling purposes. However, the substantial water consumption
of data centers can strain local water resources, especially in regions already
grappling with water scarcity or competing demands. Cooling towers, relying on
evaporation, can consume significant volumes of water. To address the
environmental impact of data centers, industry stakeholders are actively
exploring and implementing sustainable practices. These practices include:
1. Energy-efficient design: Data centers can adopt energy-efficient design principles, such as optimizing server utilization, improving power distribution systems, and utilizing energy-efficient hardware. These measures can significantly reduce energy consumption and carbon emissions (Beloglazov et al., 2011).
2. Renewable energy: Transitioning to renewable energy sources, such as solar or wind power, can assist data centers in reducing their dependence on fossil fuels and decreasing greenhouse gas emissions.
3. Waste heat recovery: Rather than dissipating the heat generated by data centers, waste heat can be captured and utilized for other purposes, such as heating buildings or generating electricity. This approach maximizes the energy efficiency of data centers and reduces their overall environmental impact.
4. Water-efficient cooling: Implementing water-efficient cooling technologies, such as closed-loop cooling systems and water-saving cooling towers, can help reduce water consumption in data centers. Additionally, recycling and reusing water within data centre operations can minimize the strain on local water resources.
By adopting these sustainable practices, data centers can strike a balance between meeting the increasing demand for digital services and minimizing their environmental impact, contributing to a more sustainable and responsible digital infrastructure.
4. Understanding biases in AI training data
AI algorithms
heavily rely on training data to make informed decisions. However, these
datasets can often contain inherent biases, which can lead to biased outcomes
in environmental decision-making. Biases in training data can arise from
various sources, including historical data reflecting existing societal
inequalities and systemic biases (Caliskan et al., 2017). It is crucial to
recognize and address these biases to ensure fair and equitable environmental
decision-making processes. Biased AI applications in environmental
decision-making can exacerbate existing environmental disparities faced by marginalized
communities. For example, if AI algorithms are trained on datasets that
disproportionately represent affluent areas, decisions regarding resource
allocation or environmental policies may neglect the needs and concerns of
marginalized communities (Benjamin, 2019). This further marginalizes these
communities, perpetuating environmental injustices. Biased AI applications can
perpetuate and amplify inequalities by reinforcing existing social, economic,
and environmental disparities. For instance, if AI algorithms are biased
against certain demographics or geographic areas, it can lead to unequal
distribution of environmental benefits, such as access to clean air, water, or
green spaces. Furthermore, biased algorithms can result in discriminatory outcomes,
such as disproportionate pollution burdens or inadequate environmental
protections in marginalized communities. To mitigate the biases and promote
fairness in AI environmental decision-making, several measures need to be
taken: It is essential to ensure that AI training datasets encompass diverse
perspectives and accurately represent the affected communities. This requires
careful curation of data to address underrepresentation and avoid reinforcing
existing biases (Sweeney, 2013). Developing AI algorithms that are transparent
and explainable allows for scrutiny and identification of biases. This helps
stakeholders, including affected communities, to understand how decisions are
made and challenge potential biases (Burrell, 2016). Continual monitoring and
evaluation of AI systems are crucial to identify and rectify biases that may
emerge over time. This involves ongoing assessment of AI applications’ impacts
on different populations and their alignment with equity and fairness goals
(Crawford & Calo, 2016). Involving affected communities in the design,
implementation, and evaluation of AI environmental decision-making processes
can help ensure fairness and equity. By addressing biases in AI training data,
acknowledging environmental disparities faced by marginalized communities, and
implementing measures to promote fairness and equity, it is possible to
mitigate the risks of AI amplifying environmental injustices. Responsible and
inclusive AI applications can support informed and equitable decision-making processes
that contribute to a more just and sustainable environment for all.
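One simple way to operationalize the monitoring step described above is to compare each community's share of a training dataset with its share of the affected population; the sketch below is illustrative only, and the group names and figures are hypothetical:

# Illustrative dataset-representation audit; groups and numbers are hypothetical.
dataset_counts = {"urban_affluent": 7200, "urban_low_income": 1100, "rural": 1700}
population_share = {"urban_affluent": 0.35, "urban_low_income": 0.30, "rural": 0.35}

total = sum(dataset_counts.values())
for group, count in dataset_counts.items():
    data_share = count / total
    ratio = data_share / population_share[group]   # 1.0 means proportional representation
    flag = "under-represented" if ratio < 0.8 else "ok"
    print(f"{group}: dataset {data_share:.0%} vs population {population_share[group]:.0%} -> {flag}")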
5.
Disruption of Natural Ecosystems
The expansion of AI technologies and their integration
into various sectors has raised concerns about their potential impact on
natural ecosystems. One area of concern is the disruption of wildlife habitats
and migration patterns. AI-driven infrastructure, such as the construction of
data centers and communication networks, often requires significant land use,
leading to habitat fragmentation and loss. This disruption can have adverse
effects on wildlife populations by limiting their access to resources and
disrupting crucial migration routes, ultimately posing a threat to biodiversity
and ecological resilience. The use of AI for environmental monitoring and
conservation presents both opportunities and challenges. On one hand, AI
enables efficient data collection, analysis, and interpretation, thereby
enhancing our understanding of biodiversity, climate change, and ecosystem
health. It enables us to detect patterns, make predictions, and inform
conservation strategies. On the other hand, an overreliance on AI may result in
a reduction in field-based research and human involvement, potentially overlooking
the nuanced ecological processes that can only be observed through direct
observation (Koh & Wich, 2012). To mitigate the ecological disruption
caused by AI, it is crucial to adopt responsible deployment practices. This
includes conducting comprehensive environmental impact assessments before
implementing AI technologies, evaluating potential risks to ecosystems, and
identifying appropriate mitigation strategies. Moreover, it is important to
integrate AI into existing conservation strategies and involve local
communities in decision-making processes. This participatory approach fosters a
holistic understanding of ecological systems and facilitates the co-design of
AI applications that benefit both biodiversity and human well-being.
6.
Existing Regulations Related to AI’s
Environmental Impact in the European Union and in India
The growth of artificial intelligence has prompted
governments and regulatory bodies to address its potential environmental
impact. Some countries and regions have already taken steps to regulate the
ecological cost of AI. In the European Union, the Eco-Design Directive (2009/125/EC)
has been extended to cover servers and data storage products since March 2020.
This regulation sets minimum energy efficiency requirements for these products,
including those used in AI hardware. It aims to reduce energy consumption and
curb the environmental impact of data centers and other AI infrastructure
components 16.
Along
with the Eco-Design Directive, the Waste Electrical and Electronic Equipment
(WEEE) Directive plays a crucial role in the sustainable management of
electronic waste, including AI hardware components. The WEEE Directive outlines
rules for the proper handling and disposal of electronic waste, ensuring that
discarded AI hardware is managed in an environmentally responsible manner. The
responsibility for the collection and recycling of E-waste is placed on
manufacturers and users, promoting the circular economy and minimizing the
environmental impact of AI hardware disposal 17. As part of the WEEE Directive’s evaluation
process, a public consultation on the EU Directive on waste electrical and
electronic equipment was scheduled for June 2023. This consultation allows
stakeholders and the public to provide feedback and input on the effectiveness
and future improvements of the WEEE Directive. The European Union has also adopted Regulation (EU) 2019/424 laying down ecodesign requirements for servers and data storage products. Applicable since March 2020, it sets the minimum energy efficiency requirements mentioned above for these products, including those used in AI hardware, with the purpose of reducing energy consumption and curbing the environmental impact of data centers and other AI infrastructure components 18. These regulations within the European Union demonstrate the
commitment to address the environmental impact of artificial intelligence and
promote sustainable practices in the technology sector. By setting energy
efficiency standards and promoting responsible E-waste management, the EU aims
to foster a greener and more environmentally friendly approach to AI
development and deployment.
In India, while there is no AI-specific environmental law, existing environmental legislation and AI-related regulations address the environmental impact of AI. The
Environmental Protection Act, 1986 (EPA) and other environmental laws provide a
framework for regulating AI's impact on the environment. Additionally, the
Digital Personal Data Protection Act, 2023 (DPDPA) and the Information
Technology Act, 2000 (IT Act) also indirectly address AI's environmental impact
through their focus on data handling and technology use.
Key Aspects of AI's Environmental Impact and Related Laws:
Environmental laws: The EPA and other environmental laws in India are crucial in regulating the environmental impacts of AI, including the energy consumption of AI systems, the disposal of AI-related equipment, and the overall effects of AI on the environment.
Data protection: The DPDPA and the IT Act, though not specifically focused on AI's environmental impact, are relevant because AI systems rely heavily on data, and these laws ensure responsible data handling, which can have environmental implications.
National Strategy for AI: The National Strategy for Artificial Intelligence aims to guide AI development and deployment in India, including addressing ethical and environmental considerations.
OECD Recommendation: The OECD Recommendation on AI includes a focus on environmental sustainability, emphasizing the need to leverage AI for sustainable development and to minimize its negative environmental impacts.
Sector-specific regulations: Some sectors, such as energy and transportation, have specific regulations that can be influenced by AI applications, further contributing to the regulation of AI's environmental impact.
Digital India Act: The draft Digital India Act is expected to include provisions related to AI, potentially providing a more comprehensive regulatory framework for AI, including its environmental impact.
Findings
In this study, a total of 27 participants were interviewed to explore the environmental impacts of artificial intelligence (AI) and digital technologies.
The demographic characteristics of the participants were diverse, ensuring a
wide range of perspectives. Of the participants, 15 identified as male (55.6%),
and 12 identified as female (44.4%), representing a balanced gender
distribution. The participants hailed from various professional backgrounds,
including 9 (33.3%) from the technology sector, specifically in AI and digital
technologies development; 6 (22.2%) were environmental researchers; 5 (18.5%)
worked within governmental or non-governmental organizations focusing on policy
and regulation; and 7 (25.9%) were from the academic sector, involved in study
or teaching related to technology and environmental studies.
Table: Categories, Subcategories, and Concepts
Categories | Subcategories | Concepts
Direct Environmental Impact | Energy Consumption | Data centers power usage, AI training energy demands, Cooling systems efficiency, Renewable energy integration, Efficiency in algorithms, Hardware optimization
 | E-waste | Recycling challenges, Product lifecycle extension, Disposal practices, Recycling rate improvement, Consumer awareness on E-waste, Reduction of hazardous materials
 | Resource Extraction | Rare earth minerals, Impact on local ecosystems, Water use, Sustainable extraction practices, Recycling of materials, Reduction of dependency on scarce resources
 | Carbon Footprint | GHG emissions, Comparison to other industries, Reduction targets, Transition to low-carbon technologies, Lifecycle analysis, Emissions tracking and reporting
 | Biodiversity Loss | Habitat disruption, Species endangerment, Conservation efforts, Impact assessments, Restoration projects, Biodiversity-friendly technologies
Mitigation Strategies | Renewable Energy Adoption | Solar-powered data centers, Wind energy in operations, Hydropower sources, Transition strategies, Investment in renewable energy, Energy purchase agreements
 | E-waste Recycling Programs | Manufacturer take-back programs, Urban mining, Circular economy practices, Design for recyclability, Consumer return incentives, Legislation support
 | Carbon Offsetting | Corporate carbon credits, Project-based offsets, Community forestry initiatives, Investment in carbon reduction, Global carbon market participation, Verification standards
 | Sustainable Design | Eco-friendly materials, Modular design for reparability, Energy-efficient software, Lifecycle assessment in design, Reduction in material use, Innovations in sustainability
Technological Innovations | Energy-Efficient Hardware | Low-power CPUs and GPUs, Optimized algorithms for efficiency, Thermal management innovations, Sustainable computing practices, Advanced cooling technologies, Power-saving modes
 | AI for Environmental Monitoring | Satellite imagery for deforestation, AI in climate modeling, Pollution tracking systems, Real-time environmental data analysis, Predictive modeling for conservation, IoT for environmental monitoring
 | Biodegradable Materials | Polymers from renewable sources, Electronics recycling into new products, Packaging reduction, Biodegradable electronics, Sustainable material sourcing, Lifecycle impact reduction
 | Green Data Centers | Cooling with renewable energy, Usage of ambient temperature, Server virtualization, Energy-efficient infrastructure, Smart grid integration, Renewable energy certifications
Policy and Regulation | Legislation and Standards | E-waste regulations, Greenhouse gas emission limits, Energy efficiency standards, Policy frameworks, Enforcement mechanisms, International standards harmonization
 | International Cooperation | UN sustainability goals, Cross-border environmental agreements, Tech industry guidelines, Collaborative projects, Global sustainability initiatives, Policy alignment for climate goals
 | Incentives for Sustainability | Tax breaks for green tech, Grants for clean energy projects, Subsidies for sustainable practices, Financial incentives for innovation, Support for green startups, Economic benefits of sustainability
 | Data Transparency | Open data for environmental impact, Blockchain for supply chain transparency, Privacy-preserving data sharing, Data accuracy and reliability, Stakeholder access to data, Public reporting standards
Public Awareness and Engagement | Educational Campaigns | School programs on digital footprint, Online courses on sustainability, Workshops on E-waste management, Curriculum integration, Interactive learning platforms, Awareness campaigns
 | Community Initiatives | Local clean-up events, Repair cafes, Sustainability hackathons, Community recycling initiatives, Volunteer for green projects, Public engagement in sustainability, Local sustainability challenges
 | Digital Literacy | Media literacy on digital use, Online security and privacy, Ethical computing practices, Digital responsibility, Sustainable digital habits, Critical thinking on digital consumption
 | Stakeholder Collaboration | NGO and corporate partnerships, Community-driven sustainability projects, Public consultations on tech policy, Collaborative platforms, Stakeholder engagement, Collective action for sustainability
7. Conclusion
In
conclusion, when reflecting on the hidden ecological cost of AI, it becomes
evident that we must acknowledge and address the environmental implications
that come with its development and integration. The energy-intensive nature of
AI computations, the generation of electronic waste, the disruption of natural
ecosystems, and the potential for biased decision-making all highlight the need
for proactive measures. By recognizing the importance of sustainable practices
such as energy-efficient algorithms, transitioning to renewable energy sources,
responsible E-waste management, and ethical considerations, we can strive
towards a more harmonious and environmentally conscious integration of AI.
It is our collective responsibility
to navigate the path towards a better future where AI benefits both humanity
and the planet. By prioritizing environmental sustainability and taking
proactive steps to mitigate the ecological footprint of AI, we can create a
future that harnesses its potential while preserving and protecting our natural
resources. Through collaboration, study, and the development of policies and
regulations, we can shape the evolution of AI towards a more sustainable and
ethically sound direction.
This study embarked on an exploratory journey to
delineate the environmental impacts of artificial intelligence (AI) and digital
technologies, aiming to contribute to the growing discourse on sustainable
technological development. The findings reveal significant environmental
considerations across the lifecycle of AI technologies, including high energy
consumption, carbon emissions, E-waste generation, and biodiversity loss.
Simultaneously, our study identifies potential mitigation strategies such as
the adoption of renewable energy sources, E-waste recycling programs, and the
development of energy-efficient AI hardware and algorithms. Moreover, the study
underscores the pivotal role of policy, regulation, and public awareness in
fostering sustainable AI practices.
The qualitative analysis of the
environmental impacts of artificial intelligence (AI) and digital technologies
yielded five main themes, each encompassing a range of categories that further
detail the specific aspects of the theme. The main themes identified are Direct
Environmental Impact, Mitigation Strategies, Technological Innovations, Policy
and Regulation, and Public Awareness and Engagement. Under these themes,
various categories were identified, such as Energy Consumption, E-waste, and
Carbon Footprint under Direct Environmental Impact; Renewable Energy Adoption, E-waste
Recycling Programs, and Sustainable Design under Mitigation Strategies; Energy Efficient
Hardware, AI for Environmental Monitoring, and Green Data Centers under
Technological Innovations; Legislation and Standards, International
Cooperation, and Data Transparency under Policy and Regulation; and Educational
Campaigns, Community Initiatives, and Digital Literacy under Public Awareness
and Engagement. These themes and categories collectively provide a
comprehensive framework for understanding the multifaceted environmental
impacts of AI and digital technologies, as well as potential pathways toward
sustainability.
The Direct Environmental Impact theme highlights the immediate
ecological consequences of AI and digital technologies, divided into categories
such as Energy Consumption, E-waste, Resource Extraction, Carbon Footprint, and
Biodiversity Loss. Energy Consumption focuses on the substantial power
requirements of data centers and AI training processes. E-waste encompasses the
challenges and practices related to the disposal and recycling of electronic
waste. Resource Extraction discusses the environmental degradation resulting
from the extraction of necessary raw materials. Carbon Footprint examines the
greenhouse gas emissions associated with these technologies, and Biodiversity
Loss addresses the adverse effects on wildlife and ecosystems.
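To make the energy and carbon aspects of this theme concrete, the short Python sketch below applies the back-of-envelope logic common in the literature: hardware power draw multiplied by training time, a data-centre overhead factor (PUE), and the carbon intensity of the electricity grid. Every input value here (accelerator count, power draw, training duration, PUE, grid intensity) is an illustrative assumption, not a measurement of any particular system.

```python
# Illustrative back-of-envelope estimate of the energy and CO2 footprint of
# training an AI model. All input values below are assumptions for the sake
# of the example, not measurements of any real system.

def training_footprint(num_accelerators: int,
                       avg_power_watts: float,
                       training_hours: float,
                       pue: float,
                       grid_gco2_per_kwh: float) -> tuple[float, float]:
    """Return (facility energy in kWh, emissions in kg CO2e)."""
    it_energy_kwh = num_accelerators * avg_power_watts * training_hours / 1000.0
    facility_energy_kwh = it_energy_kwh * pue  # data-centre overhead (cooling, power delivery)
    emissions_kg = facility_energy_kwh * grid_gco2_per_kwh / 1000.0
    return facility_energy_kwh, emissions_kg

if __name__ == "__main__":
    # Hypothetical training run: 64 accelerators at ~300 W each for two weeks,
    # in a facility with PUE 1.5 on a grid emitting ~400 gCO2e per kWh.
    energy, co2 = training_footprint(64, 300.0, 24 * 14, 1.5, 400.0)
    print(f"Estimated energy: {energy:,.0f} kWh, emissions: {co2:,.0f} kg CO2e")
```

A full lifecycle analysis would additionally include the embodied emissions of manufacturing the hardware, which this sketch deliberately omits.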
The Mitigation Strategies theme explores approaches to minimize the environmental impacts
identified under the Direct Environmental Impact theme. This includes Renewable
Energy Adoption, emphasizing the transition to solar, wind, and hydropower
sources for energy requirements; E-waste Recycling Programs, which focus on
improving recycling rates and consumer awareness; Carbon Offsetting, detailing
efforts to compensate for emissions through forestry and conservation projects;
and Sustainable Design, advocating for the development of eco-friendly
materials and energy-efficient product designs.
The Technological Innovations
theme captures the advancements aimed at enhancing environmental sustainability
through smarter and greener technologies. It includes categories like
Energy-Efficient Hardware, which details the development of low-power computing
devices; AI for Environmental Monitoring, highlighting AI applications in tracking
and managing environmental changes; and Green Data Centers, focusing on the
efforts to reduce the carbon footprint of data storage and processing facilities.
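As a simplified illustration of the satellite-based monitoring referred to under this theme, the sketch below computes the standard Normalized Difference Vegetation Index (NDVI) from red and near-infrared reflectance at two dates and flags pixels whose vegetation signal has dropped sharply. Operational deforestation-monitoring pipelines typically layer machine-learning classifiers on top of indices like this; the arrays and the 0.2 threshold used here are synthetic stand-ins rather than values from any real sensor or product.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def vegetation_loss_mask(red_t0, nir_t0, red_t1, nir_t1, drop_threshold=0.2):
    """Flag pixels whose NDVI fell by more than drop_threshold between two dates."""
    return (ndvi(red_t0, nir_t0) - ndvi(red_t1, nir_t1)) > drop_threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 4x4 reflectance tiles standing in for two satellite acquisitions.
    red_t0, nir_t0 = rng.uniform(0.05, 0.15, (4, 4)), rng.uniform(0.4, 0.6, (4, 4))
    red_t1, nir_t1 = red_t0.copy(), nir_t0.copy()
    red_t1[:2, :2], nir_t1[:2, :2] = 0.25, 0.30  # simulate cleared vegetation in one corner
    loss = vegetation_loss_mask(red_t0, nir_t0, red_t1, nir_t1)
    print(f"Pixels flagged as vegetation loss: {loss.sum()} of {loss.size}")
```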
The Policy and Regulation theme underscores the role of governance and legal frameworks in promoting environmental sustainability in the AI and
digital technology sectors. Legislation and Standards refer to the laws and
guidelines established to manage the environmental impacts of these
technologies. International Cooperation highlights the global partnerships and
agreements aimed at addressing cross-border environmental issues, and Data
Transparency emphasizes the importance of open and accessible data to monitor
and regulate AI's environmental impact.
Finally, the Public Awareness and
Engagement theme focuses on the importance of educating and involving the
public in sustainability efforts related to AI and digital technologies.
Educational Campaigns are aimed at raising awareness about the environmental
impacts of these technologies and promoting sustainable practices. Community
Initiatives encourage grassroots movements towards sustainability, and Digital
Literacy stresses the need for understanding the ecological implications of
digital consumption and the benefits of sustainable digital practices.
The
integration of artificial intelligence (AI) and digital technologies across
various sectors has prompted a nuanced discussion about their environmental
impacts and their role in achieving sustainable development goals. This
discussion section explores our study's findings within the context of existing
literature, aligning our insights with those of previous studies to provide a
comprehensive understanding of the environmental implications of AI and digital
technologies.
Our study underscores a critical gap in systematic studies
assessing AI's impact on sustainable development, echoing concerns raised by
Vinuesa et al. (2020). The need for comprehensive study in this area is
paramount, as it would illuminate the broader environmental implications of AI
deployment across various domains (Vinuesa et al., 2020). Furthermore,
Kindylidi & Cabral (2021) highlighted the importance of conducting
larger-scale studies to map the environmental footprint of AI, focusing on
energy consumption and carbon dioxide emissions (Kindylidi & Cabral, 2021).
Our findings support this suggestion, highlighting the significant
environmental impact associated with algorithmic training and use, and the
urgent need for methodologies that quantify these impacts comprehensively.
Akter et al. (2023) explored AI's impact across society, the environment, and
the economy, providing valuable case studies in agriculture, waste
classification, smart water management, and HVAC systems (Akter et al., 2023).
These case studies demonstrate AI's potential in promoting sustainable
development, aligning with our findings that showcase AI's diverse applications
in environmental sustainability. However, as Richie (2022) emphasized,
considering the broader ecological footprint of AI technologies is crucial, especially
in sectors such as healthcare, where the environmental impact may not be
immediately evident (Richie, 2022). The ambiguity surrounding AI's
contributions to environmental sustainability, discussed by Yigitcanlar (2021),
reflects the complexities we encountered in delineating the precise role of AI
in enhancing efficiencies and promoting environmental sustainability
(Yigitcanlar, 2021). Our study contributes to clarifying this role, reinforcing
the need for a clearer understanding articulated by Yigitcanlar (2021). In
line with Zhao & Fariñas (2022), our study also
highlights AI's potential to address complex environmental challenges,
suggesting that AI can play a significant role in tackling global
sustainability issues (Zhao & Fariñas, 2022). The ethics of sustainable AI,
as discussed by Bossert & Hagendorff (2023), resonate with our findings on
the importance of considering environmental impacts, particularly greenhouse
gas emissions and energy consumption (Bossert & Hagendorff, 2023). This
ethical perspective is crucial for guiding the development and deployment of
environmentally sustainable AI technologies. Moreover, the concerns about
potential biases in AI-generated sustainability reports raised by Villiers
(2023) underline the importance of critical scrutiny in the use of AI for
sustainability reporting, to avoid perpetuating inequalities and overlooking
crucial perspectives (Villiers, 2023).
In sum, our study aligns with and
expands upon existing literature by providing a nuanced understanding of the
environmental impacts of AI and digital technologies. It underscores the urgent
need for comprehensive study and larger-scale studies to map these impacts
accurately. By integrating insights from various domains, our study contributes
to a holistic view of AI's role in promoting sustainable development,
advocating for ethical considerations, and addressing potential biases in AI
applications for sustainability. As the discourse on sustainable AI evolves, it
is clear that a multifaceted approach, encompassing ethical considerations,
comprehensive impact assessments, and mitigation strategies, is essential for
harnessing AI's potential in advancing global sustainability goals.
In conclusion, this study illuminates the dual-edged nature of AI and digital
technologies in the context of environmental sustainability. While these
technologies hold remarkable potential to address some of the most pressing
environmental challenges of our time, their deployment is not without significant
ecological implications. The findings underscore the imperative for a balanced
approach that leverages AI's capabilities for environmental good while
vigilantly mitigating its adverse impacts. Achieving this balance necessitates
concerted efforts across multiple domains, including technological innovation,
policy formulation, and societal engagement, to steer the development and
application of AI towards a more sustainable future.
8. Limitations and Suggestions
This
study is not without its limitations. The primary constraint lies in its
reliance on qualitative data from semi-structured interviews, which, although
rich in insights, may not capture the full spectrum of perspectives on the
environmental impacts of AI and digital technologies. The rapidly evolving
nature of AI technology means that the findings might quickly become outdated.
Ongoing study is required to stay current with technological
advancements and their environmental implications. Future study should aim to
broaden the empirical base by incorporating quantitative assessments that can
offer a more comprehensive view of the environmental impacts of AI and digital
technologies. Longitudinal studies could provide valuable insights into the
evolving nature of these impacts over time. Furthermore, comparative studies
across different sectors and geographical regions could illuminate diverse
challenges and opportunities for sustainable AI practices globally. For
practitioners and policymakers, this study highlights the critical need for
integrating sustainability considerations into the development, deployment, and
regulation of AI and digital technologies. Organizations should prioritize
energy efficiency, waste reduction, and the use of sustainable materials in AI
technology development. Policymakers are urged to formulate and enforce
regulations that promote environmental sustainability in the tech industry.
Additionally, fostering public awareness and engagement can play a crucial role
in driving demand for sustainable AI solutions, thereby encouraging businesses
and governments to adopt more responsible practices.
Companies
like Microsoft, Google, and Meta are vowing to mitigate their environmental
impact by aiming to replenish more water than they consume by 2030 through
various ecological projects. But it’s not clear how they’ll be able to do that
when there’s simply not enough water.
Rising
water use in data centers is concerning due to the global freshwater scarcity we face. CEOs and board directors investing in AI should
reflect on these three questions:
1. What is the impact of your AI strategy on water consumption, and how are you planning to replenish what you are draining from the Earth?
2. Will your investments in AI create more social problems than benefits?
3. Have you quantified the social risks in your AI investment business cases, and is your board involved in reviewing the stakeholder and brand reputation risks related to your ESG goals?
Holistic
thinking is the key to advancing AI with corporate purpose. Our tech titans have
opened the AI Pandora’s box, and how we will take on the ethical and social
responsibility this demands remains to be seen. This will require more regulation and
scrutiny.
Google’s
water commitment recently stated: “Fresh, clean water is one of the most
precious resources on Earth ... we’re taking urgent action to support water
security and healthy ecosystems.”
Already, AI's projected water usage could reach 6.6 billion m³ by 2027, signalling a need to address its water footprint.
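One way to make such water figures tangible is the Water Usage Effectiveness (WUE) metric that data-centre operators report, expressed in litres of water consumed on site per kWh of IT energy, to which the indirect water used in generating the electricity can be added. The sketch below applies this arithmetic to a hypothetical AI workload; the WUE value, the indirect water intensity, and the energy figure are illustrative assumptions, not numbers reported by any of the companies named above.

```python
# Rough estimate of the water footprint of an AI workload, combining on-site
# cooling water (via WUE, litres per kWh of IT energy) with the indirect
# water consumed in generating the electricity. All numbers are assumptions.

def water_footprint_litres(it_energy_kwh: float,
                           wue_l_per_kwh: float,
                           indirect_l_per_kwh: float) -> float:
    """Total (on-site + electricity-generation) water consumption in litres."""
    return it_energy_kwh * (wue_l_per_kwh + indirect_l_per_kwh)

if __name__ == "__main__":
    # Hypothetical: 10,000 kWh of inference load, WUE of 1.8 L/kWh,
    # and ~3 L/kWh of water consumed upstream in power generation.
    litres = water_footprint_litres(10_000, 1.8, 3.0)
    print(f"Estimated water footprint: {litres:,.0f} litres "
          f"({litres / 1000:,.1f} cubic metres)")
```

The same per-kWh arithmetic, scaled to fleet-level energy use, underlies projections of the kind cited above.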
In
closing, António Guterres, the UN Secretary-General, said at the UN Water Conference
that “Water is a human right and the common development denominator to shape a
better future. But water is in deep trouble.”
References
‘A’:
1 Forti, V., Baldé, C. P., Kuehr, R., & Bel, G. (2020). The
global e-waste monitor 2020: Quantities, flows and the circular economy
potential. United Nations University (UNU), International
Telecommunication Union (ITU) & International Solid Waste Association
(ISWA). Bonn; Geneva; Rotterdam.
2 Garcia,
C. (2022). The Real Amount of Energy A Data Center Uses. https://clck.ru/36kxEN
3 Google.
(2021). Google reaches 100% renewable energy goal. https://clck.ru/36kxG4
4 Data
Centre Alliance. (n.d.). About the DCA. https://clck.ru/36kxHA
5 Lim,
S. (2022, July 14). Media industry’s pollution equivalent to aviation, study
finds. Campaign. https://clck.ru/36kxHu
6 Department
for Business, Energy & Industrial Strategy. (2020). BEIS Electricity
Generation Costs. https://clck.ru/36kxKS
7 Microsoft. (2022). Microsoft announces plan to
be carbon negative by 2030. https://clck.ru/36kxLJ; See also Amazon. (n.d.).
Amazon and Global Optimism announce The Climate Pledge. https://clck.ru/36kxMP
8 European
Commission. (n.d.). EU Climate Action. https://clck.ru/36kxSS
9 Baldé,
C. P., Forti, V., Gray, V., Kuehr, R., & Stegmann, P. (2017). The global
e-waste monitor 2017: Quantities, flows and resources. United Nations
University, International Telecommunication Union, and International Solid
Waste Association.
10 Ibid.
11 Ibid.
12 Ibid.
13 Baldé,
C. P., Forti, V., Gray, V., Kuehr, R., & Stegmann, P. (2017). The global
e-waste monitor 2017: Quantities, flows and resources. United Nations
University, International Telecommunication Union, and International Solid
Waste Association.
14 Ibid.
15 Ibid.
16 Directive
2009/125/EC of the European Parliament and of the Council of 21 October 2009
establishing a framework for the setting of eco-design requirements for
energy-related products (recast) (Text with EEA relevance). (2009). Official
Journal of the European Union, L 285, 10–35. https://clck.ru/36kxU5
17 Consolidated
text: Directive 2012/19/EU of the European Parliament and of the Council of 4
July 2012 on waste electrical and electronic equipment (WEEE) (recast) (Text
with EEA relevance). https://clck.ru/36kxYS
18 Commission
Regulation (EU) 2019/424 of 15 March 2019 laying down ecodesign requirements
for servers and data storage products pursuant to Directive 2009/125/EC of the
European Parliament and of the Council and amending Commission Regulation (EU)
No 617/2013 (Text with EEA relevance). (2019). Official Journal of the European
Union, L 74, 46–66. https://clck.ru/36kxbm
Authors’ Details:
Elija Gayen is an author-illustrator, traveler, photographer and a vivid reader
of Bengali literature but not an avid one; was a daydreamer in high school and,
unsure of future goals, went on to self-train in different fields of technology,
culinary arts and languages; graduated from Jadavpur University in Civil
Engineering in 1995; and is presently engaged in making stories from a pretty
little apartment in the old city of Berlin, Germany, while employed in an ITeS
company for bread and butter.
Chandan Bandyopadhyay is presently employed in a State-owned PSU in West
Bengal, India; he is an author, researcher, motivator, traveler and
photographer, fascinated by the power of the mind.
Both authors are obsessed with self-improvement; their personal mission is to
help people realize their potential and reach higher levels of consciousness
without causing harm to fellow people or to nature.
We respect the different socio-cultural and ethnic groups across the world,
regardless of their caste, creed, and financial or so-called social status.
*************************************************************************************************
Authors’ Contributions: The authors have contributed equally to the study process and the development of the manuscript.
Declaration: In order to correct and improve the academic writing of our paper, we have used the language model Ginger.
Transparency Statement: Data are available for study purposes upon reasonable request to the corresponding author.
Acknowledgments: We would like to express our gratitude to all individuals who helped us with this study.
Declaration of Interest: The authors report no conflict of interest.
Funding: The study had no sponsorship.
Ethical Considerations: In this study, ethical standards, including obtaining informed consent and ensuring privacy and confidentiality, were observed.
P.S.: If interested in a more detailed version, please contact us by e-mail: weism09022025@gmail.com