The relationship between AI and climate is more complicated than one might think, largely because of the energy requirements of AI models. Large language models (LLMs), the back-end systems behind chatbots like ChatGPT, are far more power-hungry and consume far more computing power than their search-engine counterparts.
Environmental costs and energy consumption
Operating these models comes at an environmental cost: they require large amounts of electricity and water. The carbon footprint of that electricity is substantial and hard to pin down, because it depends on the mix of generation sources feeding the grids that power the data centers where these models run.
Sam Altman, CEO of OpenAI and one of the most prominent names in the AI industry, has said that AI could bring a plethora of benefits for humans, such as cures for cancer and solutions to climate issues. Microsoft CEO Satya Nadella expressed similar views in his letter to shareholders last year, calling climate change the defining issue of our generation and saying that AI
“can be a powerful accelerant in addressing the climate crisis.”
Source: TheStreet.
Jensen Huang, CEO of Nvidia, thinks along similar lines; like Nadella, he believes artificial intelligence will lead to breakthroughs in climate research and science. Despite these optimistic views, the reality on the ground is quite different, at least with respect to climate impact.
Llama 3, Meta’s latest model, is said to have emitted 2,290 tons of carbon dioxide during training alone. Compare that to an average gas-powered car, which emits one ton of carbon dioxide for every 2,500 miles driven, according to the US Environmental Protection Agency (EPA).
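To put that comparison in concrete terms, here is a quick back-of-the-envelope calculation using the figures above. The ~11,500 miles of typical annual driving is an added assumption (roughly the figure the EPA uses in its equivalency estimates), not a number from this story.

```python
# Rough comparison based on the figures cited above.
llama3_training_tons = 2290        # reported CO2 from Llama 3 training
miles_per_ton = 2500               # EPA: ~1 ton of CO2 per 2,500 miles driven

equivalent_miles = llama3_training_tons * miles_per_ton
print(f"Equivalent driving: {equivalent_miles:,} miles")   # 5,725,000 miles

# Assumption: a typical car is driven about 11,500 miles per year.
miles_per_car_per_year = 11_500
cars_for_a_year = equivalent_miles / miles_per_car_per_year
print(f"Roughly {cars_for_a_year:.0f} cars driven for a full year")  # ~498
```

In other words, by these numbers, training Llama 3 was on the order of five hundred cars' worth of driving for an entire year.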
AI’s sustainability promises
Google has set a target of reaching net-zero carbon emissions by 2030, which may be partially achieved through carbon offsets. Google’s carbon emissions were 10.2 million tons in 2022. For perspective, consider Finland: the entire country, with a population of 5.5 million people, emitted 45.8 million tons of carbon dioxide that same year.
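For a rough sense of scale, dividing the two figures above gives the share directly:

```python
# Google's 2022 carbon emissions next to Finland's national total,
# using only the two figures cited above.
google_2022_tons = 10_200_000    # 10.2 million tons of CO2
finland_2022_tons = 45_800_000   # 45.8 million tons of CO2

share = google_2022_tons / finland_2022_tons
print(f"Google's 2022 emissions were about {share:.0%} of Finland's total")  # ~22%
```

That is, a single company’s footprint amounted to roughly a fifth of the emissions of an entire industrialized country.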
The other important climate-impact indicator is water: in the same year, Google consumed 5.6 billion gallons, a year-over-year increase of 20%, most of it used by its data centers. Google has said it will replenish 120% of the water it consumes by 2030, but according to one report it replenished only 6% that year.
Microsoft’s vice president for energy, Bobby Hollis, has expressed similar views, saying the company will continue to invest in renewable energy and other efforts to meet its green goals. AI’s massive energy usage is no secret within the industry: back in January, Sam Altman’s widely reported remarks pointed out that AI will soon need energy breakthroughs.
According to AI and sustainability researcher Sasha Luccioni, LLMs are a trend in which everyone is throwing things at the wall to see what sticks, and she expects more computing and more energy consumption as a result. But some experts argue that we have to weigh the cost-benefit ratio of these large models: the climate impact of training and operating a very large model can be justified if that model delivers a much larger positive environmental impact.
The original story can be seen here.