As AI becomes more of a day-to-day tool for many in the modern workforce, and within the events industry, it is frequently heralded as having the potential to accelerate our society's move to a more sustainable world. But what is the reality behind these claims?
It might come as a surprise that 54% of Americans don’t know what more than half of the ingredients in the processed foods they're eating actually are… so perhaps it’s less of a surprise that we don’t think too much about how artificial intelligence is actually produced. It can be easy to switch off when we don’t quite understand how something works or the terminology being used, so we’ll keep this blog easy to read; in layman's terms, if you will.
Let’s start digging with ChatGPT, a tool that most of us are likely to have used by now. Did you know that a single answer from the AI can require, on average, around ten times the electricity of a Google search? Google alone relies on huge data centres: buildings that house the computer systems and servers responsible for storing and transferring data across the world, essentially the backbone of the “world wide web”. Keeping these computer systems online requires electricity, of course, much of which is still generated from fossil fuels. But let’s go a level deeper.
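If you like numbers, here is a rough back-of-the-envelope sketch of that “ten times” claim. The per-query figures below (0.3 watt-hours for a search, 2.9 watt-hours for a ChatGPT response) are commonly cited estimates rather than official measurements, so treat the output as illustrative only.

```python
# Rough back-of-the-envelope comparison (illustrative estimates, not official figures).
GOOGLE_SEARCH_WH = 0.3   # commonly cited estimate for one Google search, in watt-hours
CHATGPT_QUERY_WH = 2.9   # commonly cited estimate for one ChatGPT response, in watt-hours

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"One AI response is roughly {ratio:.0f}x the electricity of one search")  # ~10x

# Scaled up: 10 million queries a day for a year (a hypothetical volume)
queries_per_day = 10_000_000
annual_kwh = CHATGPT_QUERY_WH * queries_per_day * 365 / 1000
print(f"Roughly {annual_kwh:,.0f} kWh per year at that volume")
```

The exact figures are debated, but the gap between the two per-query numbers is what drives the “ten times” comparison.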
If you’ve noticed your laptop or computer getting hot whilst you’re working on it, think about the heat that comes from a whole building full of computer systems. To stop the systems from overheating and the servers from crashing, air conditioning is a key component of data centres, and that also requires electricity to function.
The New York Times has previously reported that a single large data centre, such as one of Google’s many, can use as much electricity as a medium-sized town. On a global scale, the International Energy Agency (IEA) estimated that data centres accounted for around 1% of global electricity consumption in 2022, and expects this to roughly double by 2026, driven largely by the growth of cryptocurrency and artificial intelligence.
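To put that trajectory into perspective, here is a quick sketch of what “doubling by 2026” implies if the growth were spread evenly, which is an assumption for illustration rather than anything the IEA publishes.

```python
# What "doubling between 2022 and 2026" implies year on year (simple compound-growth arithmetic).
start_year, end_year = 2022, 2026
growth_factor = 2 ** (1 / (end_year - start_year))  # annual multiplier for a 4-year doubling
print(f"Implied growth: ~{(growth_factor - 1) * 100:.0f}% per year")  # ~19% per year

share = 1.0  # starting share of global electricity use, in percent (per the estimate above)
for year in range(start_year, end_year + 1):
    print(year, f"{share:.2f}%")
    share *= growth_factor
```

In other words, even a “small” share of global electricity becomes significant very quickly at that rate of growth.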
So, why is artificial intelligence more power-hungry? As intelligent as it may seem, AI is, at its core, a predictive modelling system that has been ‘trained’ on huge amounts of data in order to identify patterns and predict the best response. The computer systems responsible for processing this data need to be running constantly for the AI to work, and so it’s both the immense amount of data and the constant processing that account for its power needs.
Think about it in terms of books. If you read one book on gardening, let's say, it might take you a day’s worth of energy, and you’ll come away with a basic understanding of that author's perspective on gardening and the topics covered in that book. If you want to understand everything there is to know about gardening, and be able to answer any question posed to you, you need to read all the books on gardening, and that, comparatively, would take you many, many days’ worth of energy. AI, in order to function, has to read all the books, using up all of that energy. Or compare it with Google again. When we search something on Google, we’re not instantly told the answer; we’re responsible for reading through as many sources and websites as necessary until we find the right answer for ourselves. AI reads through all those sources and articles for us, providing a bite-sized evaluation in a fraction of the time.
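To make “training a predictive model” a little more concrete, here is a deliberately tiny sketch. This toy model learns a pattern from just four data points; it is not how ChatGPT is built, but the loop is the same basic idea, repeated across billions of parameters and vast amounts of text, which is what makes real training so power-hungry.

```python
# A toy "predictive model": the same idea as a large AI model, shrunk to two numbers.
# Training = repeatedly adjusting parameters until the predictions fit the data.

data = [(1, 3), (2, 5), (3, 7), (4, 9)]  # tiny "dataset": inputs and desired outputs
w, b = 0.0, 0.0                          # the model's two learnable parameters

for step in range(2000):                 # the training loop: the energy-hungry part
    for x, y in data:
        pred = w * x + b                 # predict
        error = pred - y                 # compare with the truth
        w -= 0.01 * error * x            # nudge the parameters to do better next time
        b -= 0.01 * error

print(f"Learned pattern: y = {w:.2f}x + {b:.2f}")   # roughly 2x + 1, hidden in the data
print(f"Prediction for x = 10: {w * 10 + b:.1f}")
```

Running this on a laptop takes a fraction of a second; doing the equivalent over the whole internet’s worth of text is what fills those data centres.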
This dilemma was acknowledged by the UN Environment Programme (UNEP) in September, when it published an article bluntly titled “AI Has an Environmental Problem”. There, it looked beyond the fossil fuels depleted by power consumption, also noting the electronic waste produced and the large amounts of water required.
As much as AI is no doubt an environmental problem right now, there is still the potential that it can save itself, so to speak. A great example of this remedy in action has been showcased at none other than a Google data centre, of course. Here, Google’s AI company DeepMind used machine learning to analyse operations and managed to reduce the energy used by the cooling systems by a staggering 40%. Similar success stories have been seen in agriculture, where AI has helped farmers to make data-driven decisions that can reduce waste and increase yields.
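It is worth noting that the 40% figure refers to the cooling systems, not the whole facility. As a hedged illustration, if we assume cooling accounts for roughly 40% of a data centre's total energy use (an assumption for this sketch; actual shares vary by facility), the overall saving works out to something like this:

```python
# What a 40% cut in cooling energy could mean for a whole data centre.
# The cooling share below is an assumption for illustration; actual shares vary by facility.
cooling_share = 0.40      # assumed fraction of total data-centre energy used for cooling
cooling_reduction = 0.40  # the reduction reported for the cooling systems

total_saving = cooling_share * cooling_reduction
print(f"Overall energy saving: ~{total_saving * 100:.0f}% of the facility's total use")  # ~16%
```

Still a substantial saving at data-centre scale, but smaller than the headline number might suggest.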
It seems as if we’re between a rock and a hard place when it comes to the sustainability of AI. Whilst much of global energy production still relies on fossil fuels, AI’s intense use of energy will mean an increase in CO2 emissions. Yet, in the same breath, the work behind those emissions could unlock knowledge that helps to “expedite and scale sustainability efforts”, as hypothesised by the World Economic Forum.
As UNEP’s Chief Digital Officer put it: “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.” And currently, it’s simply too soon to know.
About the author:
Olivia Salvage is a content writer and sustainable event manager. It didn’t take long for Olivia to see the wasteful nature of events, in everything from food to printed collateral, and she began educating herself on sustainable solutions and practices to implement within her projects. Having since completed sustainability certifications, Olivia now considers sustainability a true passion and continues to research ways to bring positive change to her roles.