THE ALGORITHMIC REVOLUTION

How Much Does AI Really Pollute?

by Francesco D'Isa

It is increasingly common to hear people criticizing the use of generative artificial intelligence “because it pollutes”; others, however, argue that these technologies will drive the scientific advances necessary to pull us out of the climate crisis. The prevailing opinion is therefore split between the uncritical enthusiasm of technology advocates, who often gloss over the material costs of progress, and the catastrophism of critics, who predict a global ecological collapse fueled by datacenters.
Recently, for example, a member of the well-known collective writing group Wu Ming (WM1) stated that AI represents “one of the most energy-intensive, resource-inequitable, and ecocidal industrial models ever to exist under capitalism.” Who is right? How much does AI really pollute?

Although, on current data, WM1’s claim is easily dismissed as glaring hyperbole unsupported by the facts, the environmental impact of the AI industry is a real issue.

I have previously addressed this topic in these pages, but to provide a more up-to-date overview I turned to the recent report by the International Energy Agency entitled Energy and AI. This extensive document presents a complex picture that cannot be grasped by simply skimming the website summary. Let’s start with the present: data centers account for roughly 1.5% of global electricity consumption. Since generative AI currently represents about 15% of data-center consumption, its present share of global electricity use is relatively low, around 0.225%, comparable to the electricity consumed by video gamers in the United States in 2019. A similar argument applies to water consumption, which I will address later.
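
To make the arithmetic explicit, here is a quick back-of-envelope check of that 0.225% figure; it is a minimal sketch using only the two percentages just cited, not separate IEA line items.

```python
# Back-of-envelope check of generative AI's share of global electricity use.
# The 1.5% and 15% shares come from the figures cited above (illustrative, not
# official IEA line items).
datacenter_share_of_global = 0.015   # data centers: ~1.5% of global electricity
genai_share_of_datacenters = 0.15    # generative AI: ~15% of data-center consumption

genai_share_of_global = datacenter_share_of_global * genai_share_of_datacenters
print(f"Generative AI ~ {genai_share_of_global:.3%} of global electricity")  # prints 0.225%
```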

Although this figure may appear modest compared to other industrial sectors, global statistics risk obscuring the urgency of local issues. The main problem, in fact, lies in the pressure exerted on specific territories and on their electrical and water grids—a problem to which global readings do not do justice. Moreover, energy consumption for AI is predictably set to rise along with its global use—but by how much?

Predicting this precisely is extremely complex, much like estimating the consumption of a fleet when you don’t know the number of ships, their engine power, or their cruising speed. The variables involved are so numerous that the resulting projections can differ by as much as sevenfold between scenarios (according to the report itself); these forecasts hinge largely on the unknowns of algorithmic efficiency and the speed at which the technology integrates into the economy.
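
To see how such wide spreads arise, here is a minimal sketch in which a few uncertain multipliers are compounded into low and high scenarios; the individual factors are illustrative assumptions of mine, not figures from the report.

```python
# Illustrative only: three uncertain multipliers (workload growth, residual energy
# need after efficiency gains, adoption speed) compounded under low and high
# assumptions. Invented numbers, chosen to show how modest per-factor uncertainty
# compounds into a several-fold gap between scenarios.
low = {"workload_growth": 2.0, "residual_energy": 0.5, "adoption": 1.0}
high = {"workload_growth": 3.5, "residual_energy": 1.0, "adoption": 2.0}

def projected_consumption(factors, baseline=1.0):
    # Consumption scales with workload and adoption, damped by efficiency gains.
    return baseline * factors["workload_growth"] * factors["residual_energy"] * factors["adoption"]

lo, hi = projected_consumption(low), projected_consumption(high)
print(f"low {lo:.1f}x, high {hi:.1f}x, spread {hi / lo:.1f}x")  # spread ~7x
```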

The IEA report outlines a middle-ground scenario in which energy consumption will double by 2030, and then tend toward a phase of stabilization over the following five years. This suggests a gradual transformation of AI into a mature industry, subject to processes similar to those that affected other productive sectors in the past.

One of the sector’s major challenges lies in the profound temporal mismatch between the development of computing and that of energy distribution infrastructure. While a data processing center can be made operational in just two years, upgrading the electrical infrastructure needed to support it often requires more than a decade of planning and construction. This chronological mismatch generates logistical tensions that put the stability of local grids at risk.

In this context, national strategies diverge significantly in approach and foresight. Among the major AI powers, the United States has recently leaned toward aggressive deregulation, encouraging the installation of these facilities near urban centers (areas wholly unsuited to bearing the load of what has become a heavy industry, with potentially devastating consequences for the balance of city services). China, on the other hand, appears to be pursuing a more structured path, attentive to territorial impact, through planning that shifts major computing hubs toward western regions, where energy availability is greater and the burden on resident populations is lower. This geographical prudence is accompanied by a constant search for algorithmic efficiency and greater openness in code, a sign of a strategy aimed at technological leadership through system sustainability and infrastructure resilience.

Technological evolution is also redefining the hierarchies of global dependence, shifting the focus of strategic concerns from fossil fuels to the availability of critical minerals. If the stability of the last century was largely dictated by control over oil reserves, today’s competition revolves around access to elements like gallium, indispensable for high-performance semiconductors. The refining of this resource is currently dominated by China, which holds a near-total monopoly. This imbalance introduces a particularly complex geopolitical variable, as it grants a single power the ability to regulate, through control of raw materials, the pace of innovation of its competitors.

The progressive increase in computational efficiency, which allows relatively small models to be deployed on local systems, represents an important achievement for digital sovereignty and the protection of privacy; having tools that do not depend on large centralized computing hubs guarantees a freedom of action and a level of data protection that would otherwise be unattainable. This efficiency, however, may not solve the energy problem, because it introduces a counterintuitive dynamic known as the Jevons paradox: technical improvements in the use of a resource lower its marginal cost, but ultimately encourage an increase in overall demand that often cancels out the initial energy savings. In the field of artificial intelligence, reducing the power required for individual computations risks translating into such pervasive use that it burdens, rather than lightens, the global energy balance. This fragmentation of computation would nevertheless shift the energy footprint toward the periphery of the network, distributing consumption across a multitude of personal devices and easing the strain on local grids.
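
A minimal numerical sketch of this rebound dynamic, with invented figures chosen only to show how a fourfold efficiency gain can still leave total consumption higher:

```python
# Illustrative Jevons-style rebound: energy per query falls, but total usage rises.
# All numbers are invented for the example, not measurements.
energy_per_query_before = 1.0   # arbitrary energy units per query
queries_before = 1_000          # baseline number of queries

energy_per_query_after = 0.25   # assumed 4x efficiency improvement
queries_after = 6_000           # assumed 6x more usage once queries are cheaper

total_before = energy_per_query_before * queries_before   # 1000 units
total_after = energy_per_query_after * queries_after      # 1500 units
print(f"Total energy: {total_before:.0f} -> {total_after:.0f} units")  # rises despite efficiency
```

In this toy case the per-query cost drops by 75%, yet total consumption grows by half, because demand grows faster than efficiency.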

Beyond its direct energy footprint, the report highlights AI’s potential to reduce emissions in other industries. The IEA estimates that effective integration of these systems could cut global CO₂ emissions by 1.4 gigatons by 2035 (a figure five times greater than the worst-case scenario for AI’s own environmental impact). These savings stem from algorithms’ ability to optimize the management of electrical grids and to accelerate scientific research toward more efficient battery materials. This would more than offset the ecological impact of these technologies, but it would not be the panacea for the environmental crisis that techno-optimists dream of. Moreover, the implementation of these measures is neither immediate nor guaranteed, and without legal constraints it is unlikely to be adopted by major energy companies.
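
To make the offset explicit, the sketch below works out the net balance implied by those two figures; note that the worst-case AI footprint is simply back-derived from the “five times” ratio above, not quoted directly from the report.

```python
# Net CO2 balance implied by the two figures above (gigatons, horizon 2035).
# The worst-case AI footprint is derived from the "five times greater" ratio,
# not taken directly from the report.
avoided_emissions_gt = 1.4                               # potential savings cited by the IEA
worst_case_ai_footprint_gt = avoided_emissions_gt / 5    # ~0.28 Gt implied worst case
net_benefit_gt = avoided_emissions_gt - worst_case_ai_footprint_gt
print(f"Worst-case AI footprint ~ {worst_case_ai_footprint_gt:.2f} Gt; net avoided ~ {net_benefit_gt:.2f} Gt")
```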

Also relevant is the notion of so-called “consumption disengagement,” a perspective that invites us to measure not only how much artificial intelligence consumes, but above all which emissions it allows us to avoid by rendering less efficient processes obsolete. Assessing the digital footprint without considering the savings induced in the analog domain would be like examining a balance sheet without the revenue line: we need to recognize that the adoption of algorithmic processes can replace activities whose environmental cost is higher. This form of indirect efficiency represents the technology’s most valuable contribution, provided that development remains anchored to goals of genuine sustainability.

The takeaway I offer is to avoid the polarization dominating today’s political debate. Some proposals for a total moratorium on new data centers, while stemming from understandable environmental concern, are anachronistic and lacking in pragmatism; besides being unfeasible for geopolitical reasons, they risk slowing the development of very useful tools. Calls for individual boycotts are equally sterile, because they foster a climate of user-blaming that does not touch the structural nature of the problem. With AI use on the rise, the only outcome of this stigma is an increase in the number of people who use AI in secret (currently 54%). Even more dangerous are development projects based on unchecked deregulation, ready to sacrifice environmental safeguards in the name of muscular technological primacy and exacerbated geopolitical competition.

A more balanced proposal would prioritize the health of local energy grids and require integration with renewable sources, in order to minimize climate impact without relinquishing technological sovereignty. To do so, it would be necessary to promote a cautious rollout, focused on managing logistical friction and on transparency of consumption data through rigorous territorial planning. It may not satisfy media alarmists or tech cheerleaders, but it could genuinely benefit the public.

Francesco D’Isa