For all its potential upside, artificial intelligence (AI) may have a substantial downside: powering it. A study published in the journal Joule by the founder of Digiconomist demonstrates that if AI is widely adopted, its large energy footprint could exceed the power demands of some countries.
Training generative AI tools requires feeding models enormous amounts of data, an energy-intensive process. According to the AI company Hugging Face, its multilingual text-generating tool consumed approximately 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year. Multiply that by the number of startups now harnessing the power of AI and the scale of the impact becomes obvious. Nor does the demand end with training: each time an AI tool generates text or an image, it consumes significant computing power. Running ChatGPT alone could cost 564 MWh of electricity a day.
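As a back-of-envelope check on the training comparison (a sketch: the per-household figure below is an assumption, not from the article, based on the commonly cited average U.S. residential use of roughly 10.8 MWh per year):

```python
# Sanity-check the article's training-energy comparison.
# Assumption (not from the article): an average U.S. home uses
# roughly 10.8 MWh of electricity per year.
training_mwh = 433            # reported training consumption of the model
home_mwh_per_year = 10.8      # assumed average annual U.S. household use

homes_powered_for_a_year = training_mwh / home_mwh_per_year
print(round(homes_powered_for_a_year))  # -> 40
```

Under that assumption, the arithmetic matches the article's "40 average American homes for a year."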
Will efficiencies improve? Of course. However, increased efficiency often increases demand, so technological advances can lead to a net increase in resource use, a phenomenon known as Jevons paradox. Google already uses generative AI in its email service and plans to integrate it into its search engine. Google currently processes up to 9 billion searches a day; if every one of those searches used AI, it would require approximately 29.2 TWh of electricity a year, equivalent to the annual electricity consumption of Ireland.
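The article's two Google figures imply a per-search energy cost, which a quick calculation can make explicit (a sketch derived purely from the numbers above, not a figure stated in the article):

```python
# Implied energy per AI-powered search, from the article's own figures.
searches_per_day = 9e9        # up to 9 billion daily Google searches
annual_twh = 29.2             # projected annual consumption if every search used AI

searches_per_year = searches_per_day * 365
wh_per_search = annual_twh * 1e12 / searches_per_year  # TWh -> Wh
print(round(wh_per_search, 1))  # -> 8.9
```

That works out to roughly 8.9 Wh per AI-assisted search, which is many times the energy of a conventional web search.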
This won’t happen overnight. By 2027, however, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually, based on projections of AI server production, comparable to the annual electricity consumption of the Netherlands, Argentina, or Sweden.