You will be just as irritated as I am when you learn how much energy a few seconds of AI video costs
- AI chatbots and videos use an enormous amount of energy and water
- A five-second AI video uses as much energy as running a microwave for an hour or more
- Data center energy consumption has doubled since 2017, and AI will account for about half of it by 2028
It only takes a few minutes in a microwave to explode a potato you forgot to pierce, but for an AI model to make a five-second video of a potato exploding takes as much energy as running that microwave for more than an hour, enough for more than a dozen potato explosions.
A new report from MIT Technology Review lays out just how energy-hungry AI models are. A basic chatbot answer can use as little as 114 joules or as much as 6,700 joules, somewhere between a fraction of a second and eight seconds in a standard microwave, but it is when things go multimodal that the energy climbs to an hour-plus of microwave time, or 3.4 million joules.
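If you want to sanity-check those microwave comparisons yourself, the arithmetic is simple. Here is a quick sketch, with the caveat that the roughly 800-watt microwave rating is my own assumption, not a figure from the report:

```python
# Back-of-the-envelope check of the microwave comparisons in the report.
# Assumption: a standard microwave drawing about 800 watts (my own estimate,
# not a figure from MIT Technology Review). 1 watt = 1 joule per second.

MICROWAVE_WATTS = 800

def microwave_seconds(joules: float) -> float:
    """How long the assumed 800 W microwave would run on this much energy."""
    return joules / MICROWAVE_WATTS

print(f"Simple chatbot answer (114 J):    {microwave_seconds(114):.1f} s")              # ~0.1 s
print(f"Complex chatbot answer (6,700 J): {microwave_seconds(6_700):.1f} s")            # ~8.4 s
print(f"Five-second AI video (3.4 MJ):    {microwave_seconds(3_400_000) / 60:.0f} min") # ~71 min
```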
It is not a new revelation that AI is energy-intensive, but MIT's work spells out the math in concrete terms. The researchers sketched what a typical session with an AI chatbot might look like: you ask 15 questions, request 10 AI-generated images, and throw in requests for three different five-second videos.
You can watch a realistic fantasy film scene that appears to have been shot in your back garden a minute after asking for it, but you will not notice the enormous amount of electricity you demanded to produce it. That session adds up to about 2.9 kilowatt-hours, or three and a half hours of microwave time.
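The same back-of-the-envelope math works for the full example session, again assuming a roughly 800-watt microwave (my assumption, not the report's):

```python
# Rough arithmetic behind the "three and a half hours of microwave" claim.
# Assumption: a roughly 800-watt microwave (my own estimate, not the report's).

MICROWAVE_WATTS = 800
SESSION_KWH = 2.9               # MIT Technology Review's example session
JOULES_PER_KWH = 3_600_000      # 1 kWh = 3.6 million joules

session_joules = SESSION_KWH * JOULES_PER_KWH              # ~10.4 million joules
microwave_hours = session_joules / MICROWAVE_WATTS / 3600  # ~3.6 hours

print(f"Session energy: {session_joules / 1e6:.1f} MJ")
print(f"Equivalent microwave time: about {microwave_hours:.1f} hours")
```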
What is striking about AI's costs is how painless they feel from the user's perspective. You do not budget AI prompts the way we all rationed our SMS messages 20 years ago.
Rethinking AI energy
Of course, you are not mining Bitcoin, and your video at least has some real-world value, but that is a very low bar to clear when it comes to ethical energy consumption. The growth in data centers' energy requirements is also happening at a ridiculous pace.
Data center energy consumption had plateaued before the recent AI explosion, thanks to efficiency gains. However, the energy used by data centers has doubled since 2017, and according to the report about half of it will go to AI by 2028.
This is not a guilt trip, by the way. I can claim professional necessity for part of my AI use, but I have used it for all kinds of recreational fun and to help with personal tasks. I would write an apology to the people who work at the data centers, but I would need AI to translate it into the languages spoken at some data center locations. And I do not want to sound heated, or at least not as heated as those servers. Some of the largest data centers use millions of liters of water every day just to stay cool.
The developers behind the AI infrastructure understand what is happening. Some are trying to find cleaner energy options. Microsoft wants to close deals with nuclear power stations. AI may or may not become an integral part of our future, but I would like that future not to be full of extension cords and boiling rivers.
On an individual level, your use or avoidance of AI will not make much difference, but encouraging better energy solutions from data center owners might. The most optimistic outcome is the development of more energy-efficient chips, better cooling systems and greener energy sources. And perhaps the carbon footprint of AI should be discussed like that of any other energy-hungry infrastructure, such as transport or food systems. If we are willing to debate the sustainability of almond milk, we can certainly spare a thought for the 3.4 million joules needed to make a five-second video of a dancing cartoon character.
As tools such as ChatGPT, Gemini and Claude become smarter, faster and more embedded in our lives, the pressure on energy infrastructure will only grow. If that growth happens without planning, we will be left cooling a supercomputer with a paper fan while chewing on a raw potato.