Artificial intelligence demands ever more energy. According to an analysis by a data scientist cited by the New York Times, it could soon consume as much electricity as a country the size of Argentina or Sweden. In this context, what does the future hold for AI?

Since their launch, chatbots like ChatGPT and Google Bard have been immensely popular. Millions of people use them daily for work, as personal assistants, or as alternatives to web search engines.

Similarly, image generators such as MidJourney and DALL-E 3 are widely employed for artistic creation, design, illustration, propaganda, or simply for entertainment by creating memes.

Generative artificial intelligence is now ubiquitous, with more and more tools emerging and the number of users constantly on the rise.

But what are the limits?

Despite shocking revelations in recent months, many users remain unaware of the astronomical amounts of water and electricity this technology consumes.

To function, ChatGPT and other AIs rely on vast data centers that are true energy sinks.


Just for its training, GPT-3 required an estimated 1,287 megawatt-hours of electricity, roughly the annual consumption of 120 American households. Moreover, researchers estimate that a conversation of a few dozen questions with the chatbot consumes the equivalent of a half-liter bottle of water!
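To see where a household comparison of that kind comes from, here is a back-of-envelope sketch. The widely reported figure for GPT-3's training is 1,287 megawatt-hours; the ~10.7 MWh/year average for a U.S. home is an assumption on our part, in line with commonly cited averages, not a number from the article:

```python
# Back-of-envelope check of the household comparison.
# Assumption: an average U.S. household uses about 10.7 MWh of electricity
# per year (a commonly cited average, not a figure from the article).
TRAINING_MWH = 1_287          # widely reported estimate for GPT-3's training run
HOUSEHOLD_MWH_PER_YEAR = 10.7

households = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"~{households:.0f} households powered for one year")  # ~120 households
```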

According to recent studies, the energy needs of AI continue to increase explosively, far beyond comprehension.

AI Will Soon Consume 0.5% of Global Electricity

According to a recent analysis by Dutch data scientist Alex de Vries, published in the journal Joule, AI server farms could consume between 85 and 134 terawatt-hours per year by 2027.

That is comparable to the annual electricity consumption of Argentina, the Netherlands, or Sweden. Put differently, AI alone would account for roughly 0.5% of all electricity used on the planet.
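As a rough sanity check on that 0.5% figure, assuming a world electricity consumption on the order of 25,000 TWh per year (an approximate figure of our own, not taken from the article):

```python
# Rough sanity check: what share of global electricity is 85-134 TWh per year?
# Assumption: world electricity consumption is on the order of 25,000 TWh/year
# (an approximate figure, not taken from the article).
GLOBAL_TWH = 25_000
LOW_TWH, HIGH_TWH = 85, 134

print(f"{LOW_TWH / GLOBAL_TWH:.2%} to {HIGH_TWH / GLOBAL_TWH:.2%} of global electricity")
# 0.34% to 0.54% -- consistent with the ~0.5% headline figure
```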

Additionally, another technology has reached a similar level of energy consumption in recent years: cryptocurrency.

Although OpenAI and other AI companies keep their consumption figures secret, de Vries bases his estimate on sales of the Nvidia A100 GPU, a graphics card that powers roughly 95% of the industry's AI hardware. As he notes, “every Nvidia server is an energy-hungry beast.”


Should We Reduce Our Use of AI?

In light of this colossal carbon footprint, experts believe it is urgent to reconsider the massive investments in artificial intelligence, or perhaps even its usage…

Roberto Verdecchia of the University of Florence, interviewed by the NYT, put it this way: “perhaps we should ideally slow down the application of the solutions that we have.”

He suggests, for instance, not creating “a new model just to enhance its speed and accuracy,” and calls on us to “take a deep breath and assess how much we are burning in terms of environmental resources.”

This appeal echoes the open letter signed in March 2023 by over 1,000 experts, most notably Elon Musk, calling for a pause of at least six months in AI development. Sadly, that request went unheeded.


Can California Pave the Way for Greener AI?

California-based companies, which represent a significant share of the AI sector, may soon face strong headwinds.

This past weekend, Governor Gavin Newsom signed two new climate laws that will require companies like Google and OpenAI to disclose their carbon emissions starting in 2026.

This new obligation affects over 10,000 companies and could put an end to the culture of secrecy currently prevailing in Silicon Valley.

Nevertheless, despite the watchful eyes of regulators, the industry still essentially regulates itself. The tech giants and startups will most likely keep consuming immense amounts of energy in the name of the AI race.
