Jevons’ Paradox: AI Could Use As Much Electricity As Entire Countries


Artificial intelligence (AI) brings numerous benefits, but its adoption might come with a significant energy cost, as highlighted by a recent study. Generative AI models, like OpenAI’s ChatGPT, consume large amounts of energy during training and operational use. While global efforts are underway to improve AI’s energy efficiency, the increased efficiency might inadvertently boost demand due to Jevons’ Paradox. Given current projections, AI’s electricity consumption could rival that of entire nations by 2027. The researchers stress the importance of mindful AI application due to its energy-intensive nature.

Artificial intelligence (AI) offers the potential to enhance coding speed for programmers, improve safety for drivers, and expedite everyday tasks. However, in a commentary recently published in the journal Joule, the founder of Digiconomist argues that the technology, if widely adopted, could have a large energy footprint that may one day exceed the power demands of some countries.

“Looking at the growing demand for AI services, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” says author Alex de Vries, a Ph.D. candidate at Vrije Universiteit Amsterdam.

Since 2022, generative AI, which can produce text, images, or other data, has undergone rapid growth, including OpenAI’s ChatGPT. Training these AI tools requires feeding the models a large amount of data, a process that is energy-intensive. Hugging Face, an AI-developing company based in New York, reported that its multilingual text-generating AI tool consumed about 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year.
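The "40 homes" comparison can be sanity-checked with a quick calculation. Note the household figure below is an assumption, not from the article: U.S. EIA data puts average American household electricity use at roughly 10.7 MWh per year.

```python
# Sanity check of the scale cited in the article: 433 MWh of training
# energy vs. the annual electricity use of an average American home.
# Assumption (not from the article): an average US home uses roughly
# 10.7 MWh of electricity per year (EIA ballpark figure).

TRAINING_ENERGY_MWH = 433   # reported training consumption
HOME_ANNUAL_MWH = 10.7      # assumed average US household use per year

homes_powered_for_a_year = TRAINING_ENERGY_MWH / HOME_ANNUAL_MWH
print(f"{homes_powered_for_a_year:.0f} homes")  # ≈ 40, matching the article
```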

And AI’s energy footprint does not end with training. De Vries’s analysis shows that when the tool is put to work, every text or image it generates in response to a prompt also uses a significant amount of computing power, and thus energy. For example, running ChatGPT could cost 564 MWh of electricity a day.

While companies around the world are working on improving the efficiency of AI hardware and software to make the tool less energy-intensive, de Vries says that an increase in a machine’s efficiency often increases demand for it. The net effect of the technological advancement can then be an increase, not a decrease, in resource use, a phenomenon known as Jevons’ Paradox.

“The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” de Vries says.

Google, for example, has been incorporating generative AI in the company’s email service and is testing out powering its search engine with AI. The company currently processes up to 9 billion searches a day. Based on these figures, de Vries estimates that if every Google search used AI, it would need about 29.2 TWh of power a year, which is equivalent to the annual electricity consumption of Ireland.

This extreme scenario is unlikely to happen in the short term because of the high costs associated with additional AI servers and bottlenecks in the AI server supply chain, de Vries says. However, the production of AI servers is projected to grow rapidly in the near future. By 2027, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually based on the projection of AI server production.

The amount is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina, and Sweden. Moreover, improvements in AI efficiency could also enable developers to repurpose some computer processing chips for AI use, which could further increase AI-related electricity consumption.
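To put the projected 85–134 TWh range in context, it can be expressed as a share of worldwide electricity use. The global figure below is an assumption, not from the article: total world electricity consumption is on the order of 25,000 TWh per year.

```python
# Projected 2027 AI electricity consumption as a share of global use.
# Assumption (not from the article): world electricity consumption is
# roughly 25,000 TWh per year (IEA ballpark figure).

LOW_TWH, HIGH_TWH = 85, 134   # article's projected range for 2027
GLOBAL_TWH = 25_000           # assumed annual global electricity use

low_share = LOW_TWH / GLOBAL_TWH * 100
high_share = HIGH_TWH / GLOBAL_TWH * 100
print(f"{low_share:.1f}%–{high_share:.1f}% of global electricity")  # ≈ 0.3%–0.5%
```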

“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy-intensive, so we don’t want to put it in all kinds of things where we don’t actually need it,” de Vries says.

Reference: “The growing energy footprint of artificial intelligence” by Alex de Vries, 10 October 2023, Joule.
DOI: 10.1016/j.joule.2023.09.004

3 Comments on "Jevons’ Paradox: AI Could Use As Much Electricity As Entire Countries"

  1. Where are all the activists, politicians, and scientists at on this?

    They should be throwing a fit about how much AI is contributing to climate change due to increased and enormous energy usage.

    Of course that doesn’t fit the narrative. They only care about taking your ICE cars, reducing your energy usage, making you eat what they say, etc….

    It’s not about climate change, it’s about control.

  2. And almost as much as Bitcoin!

  3. This article is really misleading. AI and ML are not the same! ML requires power when crunching datasets, yes. This is the training phase, power hungry. Then the AI service is deployed, and it comparatively does not consume anything! Else you wouldn’t have AI in embedded devices… Such as your car, your watch, or all the IoT devices… Think about it for a second. Your Tesla car wouldn’t be able to run AI if it was consuming that much power. It really shows that this Alex de Vries, a Ph.D. candidate at a School of Business and Economics (and not computing!!) does not get it. Really bothered by the conflation you try to induce. Programmer here.
