The number of AI applications is growing rapidly – and with it, energy consumption. What solutions are there?
What is this about? Artificial intelligence (AI) is meant to help where humans are too slow, work is too tedious, or a machine can simply work more precisely. But when using AI, we often forget that training and running AI systems requires huge amounts of energy, because computing power always requires electricity. The more complex the calculations, the more power servers and computers need. And AI calculations are always complex – after all, the user expects a specific, detailed and perhaps even personal answer to a specific question.
Why so much electricity? Every user query – in ChatGPT, for example – consumes a great deal of energy, because each one triggers large-scale computing operations across dozens of servers. And before the system can deliver the smartest possible answers, it first has to be trained. “To train a language model, it has to perform calculations on thousands of billions of words,” says Guido Berger, digital editor at SRF. And: “The smarter the answer, the more computing power is required and thus the more energy is consumed.” For image or video applications, power consumption is even higher.
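To make the scale a bit more concrete, here is a rough back-of-the-envelope sketch in Python. All numbers (model size, training tokens, GPU throughput, power draw) are illustrative assumptions, not figures from the article; the “6 × parameters × tokens” rule is only a commonly used approximation for training compute.

```python
# Back-of-the-envelope sketch (all numbers are assumptions, not from the
# article): estimate the electricity needed to train a large language
# model, using the rough rule
#   training FLOPs ≈ 6 × parameters × training tokens.

PARAMS = 70e9          # assumed model size: 70 billion parameters
TOKENS = 2e12          # assumed training data: ~2 trillion tokens
GPU_FLOPS = 300e12     # assumed sustained throughput per GPU: 300 TFLOP/s
GPU_POWER_KW = 0.7     # assumed power draw per GPU incl. overhead: 0.7 kW

train_flops = 6 * PARAMS * TOKENS                # total floating-point operations
gpu_seconds = train_flops / GPU_FLOPS            # GPU-seconds of compute needed
energy_kwh = gpu_seconds / 3600 * GPU_POWER_KW   # electricity in kilowatt-hours

print(f"Training compute: {train_flops:.1e} FLOPs")
print(f"Electricity:      {energy_kwh:,.0f} kWh")
```

With these assumed numbers, the estimate lands in the hundreds of thousands of kilowatt-hours for a single training run – which is before any user has asked a single question.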
Who pays for this electricity? “AI providers are currently burning through their investment dollars in their own data centers,” Guido Berger points out. It is still an open question whether providers will one day be able to pass electricity costs on to users – for example by charging for each query. “It’s also unclear whether AI providers will ever earn anything at all,” Berger says. All these uncertainties make it difficult to predict the number of future queries, and thus the energy consumption of future AI. One thing seems obvious: the more expensive it becomes for users, the fewer requests AI services will receive.
Reduce energy consumption? Because AI providers have to pay for computing power and electricity themselves, they have a strong interest in reducing both. Accordingly, a lot of research is going into efficiency. One approach could be smaller AI language models, which need less training data, less computing power and therefore much less electricity. The catch: so far, smaller models are not yet good enough for the applications they are meant to handle. Berger also considers it unrealistic that electricity consumption will rise as dramatically because of artificial intelligence as some experts predict. His reasoning: “Nobody could pay for that.”
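As a minimal illustration of why smaller models are attractive, here is a hedged Python sketch comparing the per-answer compute of a small and a large model. The model sizes and the “2 × parameters × generated tokens” rule of thumb are assumptions for illustration, not figures from the article.

```python
# Minimal sketch (assumed numbers, not from the article): compare the
# per-query compute of a small and a large model, using the rough rule
#   inference FLOPs ≈ 2 × parameters × tokens generated.

def query_flops(params: float, tokens_out: int = 500) -> float:
    """Approximate floating-point operations for generating one answer."""
    return 2 * params * tokens_out

small = query_flops(7e9)    # assumed 7-billion-parameter model
large = query_flops(70e9)   # assumed 70-billion-parameter model

print(f"Small model: {small:.1e} FLOPs per query")
print(f"Large model: {large:.1e} FLOPs per query")
print(f"The large model needs roughly {large / small:.0f}x the compute "
      "(and thus roughly that much more electricity) per answer.")
```

Under these assumptions, a model one-tenth the size needs roughly one-tenth the compute per answer – which is exactly why efficiency research focuses on making small models good enough.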