As generative AI aims to revolutionize our daily lives, will it be a positive force for sustainability, or will it contribute to worldwide environmental challenges?
ChatGPT, generative AI, and “big data” have dominated headlines in recent months, intriguing us with their seemingly endless possibilities. These powerful models are rapidly becoming part of everyday life and reshaping how we work, with ChatGPT gaining 100 million users in less than two months. Yet amid our enthusiasm for these transformative technologies, one crucial aspect remains largely unexamined: the intricate link between AI and sustainability, and the environment in particular.
Artificial intelligence has brought about a new era of innovation and efficiency
Capable of human-like feats, whether solving complex mathematical problems, analyzing vast datasets, or even creating art, AI has upended our view of productivity and of what the future might look like. It has been adopted across most domains, even in some courts of law: in China, citizens can consult AI terminals that provide legal documents and advice. Yet despite these benefits to certain sectors of society, some of AI’s hidden environmental costs can no longer be ignored.
Although AI has positively impacted industries worldwide, training large language models such as those behind ChatGPT has also revealed significant negative externalities. Feeding extraordinarily large datasets into models for training requires immense amounts of energy. While some regions draw on renewable sources such as hydropower, most electricity worldwide is still generated from fossil fuels. As energy demand grows rapidly with the volume of training data, the question emerges of whether the energy cost of such training is worth it.
To put this into perspective, researchers at MIT estimate that a single data center can consume as much electricity as 50,000 homes.
Electricity consumption varies, however, according to a multitude of factors, many of which depend on the data center’s location. Cooling alone, for instance, can account for up to 40% of a center’s energy use. A site with a cool climate, a strong network connection, and, most importantly, readily available renewable energy is therefore critical to reducing AI’s environmental footprint. Data centers in Iceland and Sweden are suitable candidates, thanks to their cool climates and commitment to renewable energy.
The environmental impact does not end at high electricity use
Data centers also use large amounts of water to cool their systems, and training a single AI model can emit more than five times the lifetime carbon emissions of an average American car. Yet while progress and innovation often come at a cost, AI’s broad capabilities could also transform sectors for the better, notably agriculture. There, AI automates tasks and enables better monitoring and management of resources. One example is vertical farming, a newer, more efficient method of agriculture in which AI and machine learning play a key role in optimizing resource use.
To put the optimization of this trillion-dollar industry into perspective: if well implemented, AI could reduce greenhouse gas emissions by up to four percent.
With AI unleashing its power on our daily lives, making sense of digitalization and climate change can be tricky, and weighing benefits against costs becomes ever more necessary. Ultimately, the question is not whether AI and sustainability can coexist; ensuring that they work together is a necessary goal for our future and that of our planet.