I’ll be honest: until recently, my knowledge of AI was limited to dystopian films in which robots are indistinguishable from humans and take over the world. Far removed from my everyday life, in other words. But artificial intelligence (AI) is everywhere: it sits in mundane everyday devices, companies from Netflix to Ikea work with it, and the healthcare sector applies it too. So I keep wondering: what is the climate impact of all this intelligence? How much power does it take to build a single AI model, and can that be done more sustainably? How do you even calculate all that?
AI is not a single “thing” you can point to; it is a process involving all kinds of things. I learned this from a visual essay by Kate Crawford, a writer and computer scientist at New York University. Using an elaborate diagram of the Amazon Echo, a smart home speaker, she shows that it is far more than just a device on your bookshelf. Behind it lies a whole infrastructure of production processes, materials and data traffic.
If you give the speaker a command, it sends your message to a data center. The speaker is therefore not just a speaker but also a sensor that constantly sends recorded voice commands, which are analyzed so that the system becomes a little smarter each time. Calculating the energy cost of all the processes behind artificial intelligence is therefore not so simple.
8,300 households or one supercomputer
Let me give it a try. The energy appetite of AI models comes mainly from “training” the network, says Rob van Nieuwpoort, professor by special appointment in efficient computing at the University of Amsterdam. Want a model that recognizes faces? Then you show it a huge dataset of images, so the computer can learn what is a face and what is not. “Training such a network takes a lot of computing power,” says Van Nieuwpoort. And that training is rarely a one-off: commonly used models, such as speech recognition, are updated regularly.
To train an AI network, one supercomputer uses as much energy per hour as 8,300 households do in that same hour
You need supercomputers to train large AI models, Van Nieuwpoort continues. “Something like that uses 8.3 megawatts, much of which goes to the powerful processors that are also used for artificial intelligence.” To train an AI network, one supercomputer uses as much energy per hour as 8,300 households do in that same hour.
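To see where that comparison comes from, here is a minimal back-of-the-envelope sketch of the arithmetic behind the quoted 8.3 megawatts. The assumption that one household draws roughly one kilowatt-hour in that hour is mine, implied by the comparison rather than stated in the article:

```python
# Back-of-the-envelope check of the "8,300 households" comparison.
# Assumes one household draws about 1 kWh in that same hour (an assumption
# implied by the comparison, not a figure quoted in the article).

supercomputer_power_mw = 8.3          # megawatts, as quoted by Van Nieuwpoort
hours = 1

energy_kwh = supercomputer_power_mw * 1_000 * hours   # 1 MW = 1,000 kW
household_kwh_per_hour = 1.0                          # assumed household draw

households_equivalent = energy_kwh / household_kwh_per_hour
print(f"One hour of supercomputing is about {energy_kwh:,.0f} kWh, "
      f"or roughly {households_equivalent:,.0f} households for that hour")
```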
Researchers at the University of Massachusetts in the United States calculated in 2019 how much greenhouse gas is emitted when training common large language models, the models with which a computer learns to master human language. That turns out to be quite a burden: training one large language model costs around 284,000 kilograms of CO2, equivalent to the annual emissions of about 85 petrol cars in the Netherlands. That is because computers have to run at full power for days, sometimes months, to churn through all the data.
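To make that comparison concrete, the implied figure per car is easy to derive from the article’s own numbers (the per-car value below follows from the comparison; it is not stated in the study itself):

```python
# Unpacking the comparison quoted from the University of Massachusetts study.
# The per-car figure is implied by the article's numbers, not stated in the study.

training_emissions_kg = 284_000   # kg of CO2 for training one large language model
cars_equivalent = 85              # petrol cars in the Netherlands, per the article

co2_per_car_per_year = training_emissions_kg / cars_equivalent
print(f"Implied emissions per car: about {co2_per_car_per_year:,.0f} kg of CO2 per year")
```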
Energy-saving algorithms
Let’s take a look at those data centers. There are already around 200 of them in the Netherlands, and that number keeps growing. Almost all of our internet traffic passes through data centers. Google estimates that a single search uses as much energy as burning a 60-watt light bulb for 17 seconds. I hardly dare to think how many light bulbs our daily online behaviour adds up to. Servers in data centers get hot and have to be cooled constantly, which consumes a lot of energy. Worldwide, data centers are responsible for 2 percent of total carbon dioxide emissions, according to a report by the Global e-Sustainability Initiative. That is as much as the entire aviation sector.
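Google’s light-bulb comparison is easy to put into numbers: 60 watts for 17 seconds is about 1,000 joules, or roughly 0.3 watt-hours per search. The sketch below scales that up to a day of searching; the 100 searches per day is my own illustrative assumption, not a figure from the article:

```python
# Rough energy estimate per search, based on the 60 W bulb burning for 17 seconds.

bulb_watts = 60
seconds_per_search = 17

joules_per_search = bulb_watts * seconds_per_search   # about 1,020 J
wh_per_search = joules_per_search / 3_600             # about 0.28 Wh

searches_per_day = 100   # illustrative assumption, not from the article
daily_wh = searches_per_day * wh_per_search
print(f"About {wh_per_search:.2f} Wh per search, "
      f"or {daily_wh:.0f} Wh for {searches_per_day} searches a day")
```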
Worldwide, data centers are responsible for 2 percent of total carbon dioxide emissions
Fortunately, there is a “but”: more and more scientists are devising ways to make algorithms more energy efficient. “We take our inspiration from the brain,” says Sander Bohté, an artificial intelligence researcher at Centrum Wiskunde & Informatica. “With image recognition on the street, the images are sent to data centers, and that costs a lot of energy.” Bohté worked out how an artificial network, for example a model that recognizes movement, can be made a thousand times more energy efficient. “You want to process the data locally, in the camera itself, just as it happens in our heads.”
Moreover, some AI models are being used to tackle climate change itself. In a podcast by the Dutch Alliance for Artificial Intelligence (an association of companies and researchers who want to accelerate the development of artificial intelligence in the Netherlands), I heard that there are AI models that help us generate green energy more effectively. Wind turbines on large wind farms cast “wind shadows” on one another and so get in each other’s way, which reduces the energy yield. With the help of artificial intelligence, researchers can quickly work out the best placement of wind turbines, so that as little green energy as possible is lost. So explains podcast guest Mathijs de Weerdt, head of the Algorithmics group at TU Delft. I decided to get in touch with him.
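To give a feel for the kind of placement problem De Weerdt describes, here is a small toy sketch of my own: it greedily places a handful of turbines on a grid while penalising positions that sit in the wind shadow of an already-placed turbine. It is purely illustrative and is not the researchers’ actual method:

```python
# Toy wind-farm layout: greedily place turbines on a grid so that as few as
# possible stand in each other's wind shadow. The wind is assumed to blow
# along the x-axis. Purely illustrative; real AI-based layout optimisation
# is far more sophisticated.

GRID = 5          # 5 x 5 grid of candidate positions
TURBINES = 4      # number of turbines to place

def yield_at(pos, placed):
    """Relative yield at pos: 1.0 in free wind, halved for each upwind turbine."""
    x, y = pos
    shadowing = sum(1 for (px, py) in placed if py == y and px < x)
    return 1.0 * (0.5 ** shadowing)

placed = []
candidates = [(x, y) for x in range(GRID) for y in range(GRID)]

for _ in range(TURBINES):
    # Pick the free position that yields the most given the turbines placed so far.
    best = max((p for p in candidates if p not in placed),
               key=lambda p: yield_at(p, placed))
    placed.append(best)

total = sum(yield_at(p, [q for q in placed if q != p]) for p in placed)
print("Chosen positions:", placed)
print(f"Total relative yield: {total:.2f} (maximum possible: {TURBINES})")
```

In this toy run the greedy choice ends up lining the turbines up perpendicular to the wind, which is exactly the intuition behind avoiding wind shadows.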
Green AI is not always applied correctly
De Weerdt explains that awareness of “green AI” and energy efficiency is growing, not least because of rising costs. But does a green AI model save enough to compensate for the energy the model itself consumes? “That balance is almost always positive,” he says. “Once we have trained a network properly, we try to use it effectively for as long as possible.”
Smart guys mainly use AI to make gadgets and money
The truth, however, is that we are not only dealing with green AI that fights climate change. Most AI models are developed to improve the products and services of the Facebooks and Amazons of this world. “Those smart guys use AI to make gadgets and money,” says Chris Julien, who focuses on the ethical issues surrounding artificial intelligence at the Waag research institute. “There are great models for cancer research, or models that help us understand the climate better. Then I say: go for it!”
Julien continues: “But AI is now mostly used for run-of-the-mill applications, in cars or refrigerators. In many cases I find that a pretty pointless use. All those systems also require computers to keep running, and the added value has far less to do with improving the world.” Julien views the promises that are so often made, and then forgotten, in the tech world with a healthy dose of scepticism. “The paradox of AI is that the energy the systems themselves need is often left out of the overall picture.”
IT Dictatorship
De Weerdt also warns of pitfalls: “We have to keep paying attention to who controls AI methods and who remains accountable for them.” Facebook, Amazon and Google may at some point be the only ones with the money to train high-quality systems. We would then have to rely on the goodwill of those companies to build and deploy large, energy-efficient AI models. Scientists warn that AI research is expensive and that there are financial limits to what public institutions can investigate.
According to De Weerdt, it is important that research data and results in the field of artificial intelligence are published. Whereas university research is by definition “open science”, that is often not the case at large companies. “In a worst-case scenario, we end up with some kind of IT dictatorship within ten years.”
So green AI can benefit the climate in the long run, but we must remain wary of unnecessary use, because one way or another, AI causes a lot of emissions. Should I be worried about robots taking over from humanity? Fortunately not, says De Weerdt: “I don’t expect that to happen in the next 100 years.”
This article originally appeared in OneWorld in December 2020.