The energy requirement of artificial intelligence technology is one of the most commonly discussed topics in tech circles. AI consumes energy on the scale of an entire country’s needs, and it carries a heavy carbon footprint as well.
OpenAI’s ChatGPT, currently the most famous AI product, reportedly consumes enough electricity each day to power 23,000 mid-sized US homes while churning out answers to 195 million user queries. It’s like powering a small city.
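As a back-of-envelope check on those figures, the implied energy per query works out to a few watt-hours. Note the household usage figure below is an assumption for this sketch (roughly the EIA's commonly cited US average of about 29 kWh per home per day), not a number from the report itself:

```python
# Back-of-envelope: implied energy per ChatGPT query.
# ASSUMPTION: an average US home uses ~29 kWh of electricity per day.
HOMES = 23_000              # homes reportedly powered per day
KWH_PER_HOME_PER_DAY = 29   # assumed average household usage
QUERIES_PER_DAY = 195_000_000

total_kwh_per_day = HOMES * KWH_PER_HOME_PER_DAY   # ~667,000 kWh/day
wh_per_query = total_kwh_per_day * 1000 / QUERIES_PER_DAY

print(f"Total daily consumption: {total_kwh_per_day:,} kWh")
print(f"Energy per query: {wh_per_query:.1f} Wh")  # roughly 3-4 Wh
```

That per-query figure lands in the same ballpark as widely circulated estimates of a few watt-hours per ChatGPT request, which is why the "small city" comparison holds up.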
It’s usually our small habits that press the gas pedal on AI’s energy consumption, alongside the development happening at tech giants and startups. Let’s take a look.
We Often Don’t Sense AI’s Energy Consumption
In this digital age, nearly every task we perform draws on computing power in some way. Much of what happens in plain sight is connected to a computer embedded somewhere out of view.
For example, when we swipe our card at a store’s checkout counter, a long chain of transactions runs over a large network that we never see.
When we pass through a toll check, the system bills us, but we don’t see it. The same applies when we navigate through maps; while we only see our phone, there is a data center somewhere that stores and processes the data to guide us on our device.
Also read: Japan Estimates a Steep Spike in Energy Needs Due to AI and Data Centers
Many of these computational transactions that occur when we do anything online or on a device are carried out through a process called inference, which consumes a lot of energy. While companies like OpenAI, Meta, and Alphabet don’t disclose their actual energy consumption figures, Dr. Sasha Luccioni of Hugging Face has pointed to the high costs associated with both AI training and inference. Referring to AI training, she wrote in a tweet:
“We blew $100 million on compute just to give your chatbot a sexy voice!”
On inference, she noted:
“Every time you search in your pictures, you use as much energy as your entire city block!” Sasha Luccioni.
That gives you an idea of what these energy-guzzling AI systems are drawing from the grid. Luccioni has researched AI for a long time, and she says that switching from a non-generative AI approach to generative AI can require up to 40 times more energy for the same task.
What Energy Figures Are We Looking At?
As AI adoption increases, global data center energy consumption is projected to nearly double by 2026, exceeding 1,000 terawatt-hours (TWh).
In January, the International Energy Agency (IEA) released its forecast for the world’s energy consumption over the next two years. For the first time, the projections included the electricity used by data centers, cryptocurrencies, and artificial intelligence.
“Data centers’ total electricity consumption could reach more than 1000 TWh in 2026. This demand is roughly equivalent to the electricity consumption of Japan.” Source: IEA.
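To see what "almost doubling" implies, here is a rough sketch of the annual growth rate. The 2022 baseline of roughly 460 TWh is an assumption taken from widely reported IEA figures, not from the quote above:

```python
# Implied annual growth rate if global data center electricity demand
# rises from ~460 TWh (ASSUMED 2022 IEA estimate) to ~1,000 TWh by 2026.
baseline_twh = 460    # assumed 2022 baseline
projected_twh = 1000  # 2026 projection cited in the IEA report
years = 4

growth = (projected_twh / baseline_twh) ** (1 / years) - 1
print(f"Implied annual growth: {growth:.1%}")  # about 21% per year
```

A sustained growth rate above 20% per year is extraordinary for electricity demand, which in most economies grows by only a few percent annually.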
Looking at the figures, it is unclear how we will achieve a sustainable energy future as technology adoption rises across all sectors. AI already accounts for a large share of data center workloads, from small edge inference servers to large AI training clusters.
Carbon Emissions Are Getting Higher
To put the figures above in perspective, that level of energy consumption would carry a carbon footprint roughly equivalent to 80 million gasoline vehicles. Many additional factors add up quickly, too.
Take, for example, a GPU (Graphics Processing Unit). The more intensely it is used, the more heat it emits. Now imagine a data center with thousands of these running in stacks and the amount of heat they produce. Different approaches have been adopted to cool them down; most of the time it is water cooling, which increases both water consumption and the energy needed to circulate that water.
“The training and the very dense workloads—they’ve got to sit in a more efficient environment. And if they don’t, we’re going to put such enormous strain on the power grids throughout not just Europe but the planet that we’re going to have real problems coming up over the next five to 10 years.” Dominic Ward, CEO of Verne Global.
As data centers grow in number and existing ones expand and adapt to AI capabilities, more heat will be produced, affecting our planet’s ecosystem. Another concern is that data centers are now expanding globally into places where coal and natural gas are the primary fuels for energy generation.
Also read: Wall Street Is Hunting AI Players Beyond Nvidia and Semiconductors
Because of the intermittency and other limitations of solar and wind energy, it is much easier to lean on these traditional sources, and companies are flocking to them in the AI gold rush. Experts estimate that the billions of devices connected to the internet will produce 3.5% of global carbon dioxide emissions by next year. So we, the users, also have to ask ourselves: do we really need to talk to our washing machines and get AI-generated replies?
Tech giants bear far greater responsibility, but everyone has a role to play.
Cryptopolitan reporting by Aamir Sheikh