Energy-hungry AI could pose a challenge for data centre ESG

Sustainability experts have warned of a crunch ahead for the booming data centre industry, as increasing energy usage amid demand for new artificial intelligence-powered technologies crosses paths with a hotter, drier climate.

Data centres are emerging as an infrastructure asset class, as AI powers a boom in demand for data and investment managers and superannuation funds increase their stakes. Late last year, US asset manager DigitalBridge and Melbourne-based IFM Investors acquired data centre leader Switch for $US11 billion ($16.83 billion).

NEXTDC says its data centres are carbon neutral. Credit: Aran Anderson

Meanwhile, the International Energy Agency estimates that software-related activities currently account for about 5 per cent of global greenhouse gas emissions, a share that could rise to 14 per cent by 2040. Data centres and transmission networks specifically account for around 1 per cent, but some estimates have that figure climbing past 3 per cent within a few years.

Most major Australian data centre providers are keen to tout their energy-efficiency credentials. ASX-listed NEXTDC says its operations are entirely carbon neutral via offsets, and it offers clients a service that offsets their own IT usage as well. Vocus plans to be net zero by 2025 and Equinix “climate neutral” by 2030, and all have long-term goals of powering their data centres purely from renewable sources.

But a recent report from the University of Technology Sydney’s Institute for Sustainable Futures suggested the data centre industry was “exposed to significant ESG (environmental, social, and corporate governance) risks that have largely escaped our collective attention”, including the increasing need for cooling combined with huge demands for data.

“This all comes at the same time as we’re seeing a shift in our weather patterns. We’re heading for days of peak heat events,” said researcher Gordon Noble, who led the UTS study.

“So the challenge is, we have 45-degree days in the western suburbs of Sydney and Melbourne, where data centres will need to increase the demand for energy to ensure that they’re delivering their services. At a time when households will also be wanting to make sure that they’ve got cool homes.”

The study, which was commissioned by data centre operator Pure Storage, also surveyed experts in charge of sustainability at their organisations, of whom only 5 per cent said they were getting detailed sustainability information from their data centre provider.

As record heatwaves affect parts of Europe, recent figures have shown data centres in Ireland consume 18 per cent of the country’s electricity, around the same as its homes. Ireland is the European home of several tech giants, but some of the nation’s politicians have said the power-hungry data centres put pressure on the national grid, increase electricity prices for everyone and will make it impossible to hit emissions targets.

“From an Australian perspective, data centres need to be on the sustainability agenda,” Noble said.

“It’s probably fair to say other issues have been in the limelight. But we’ve got increasing demand for data, which is only going to exacerbate because of the investments that we’re seeing in new technology like AI. We need to understand where we’re located, particularly in the context of El Nino.”

RMIT University school of computing dean Professor Karin Verspoor said AI – like blockchain technology and cryptocurrency mining before it – was getting a lot of attention from developers and investors, but there was not enough discussion of the exponentially increasing amounts of energy it used.

Some researchers have calculated that training a single medium-sized generative AI model could produce the equivalent of about 626,000 pounds of CO2 emissions, around what five American cars emit throughout their lifetimes, including their manufacture.

“These are huge models, and they’re only getting bigger, and there’s more of them. Massive quantities of data are involved in training,” Verspoor said, adding that this was on top of the ongoing energy cost once users start hitting data centres constantly to use the generative AI product.

“And it’s not just energy actually, it’s also water because water is used often to cool the data centres. So, there are these sorts of secondary climate impacts.”

While Verspoor agreed data centre providers could help mitigate the impacts with more energy-efficient technologies and offsets, she said the developers and consumers of AI products also had to take some responsibility.

For example, the training itself doesn’t need to happen geographically close to users, so it could be moved to cooler climates, or to places where water usage is less of a concern. It could even be rotated between data centres around the world to run wherever the sun is down. But other problems are harder to solve.

Verspoor said that when it came to online products where users frequently had to connect to data centres, operators were incentivised to build close to their users for quick response times to queries.

“Sometimes that means you’re going to have data centres in places where it doesn’t make sense from a climate perspective,” she said.

AI company Hugging Face has run experiments in lower-carbon AI development using nuclear energy, but still found its development of a large language model produced around 50 metric tonnes of carbon dioxide emissions, or the equivalent of an individual taking 60 flights between London and New York. It estimated that OpenAI, when developing its last-generation ChatGPT model, may have produced 500 metric tonnes.

Verspoor said a potential upside was that AI was enabling great strides in climate science and energy efficiency, including large-scale modelling and accurate translations that can help share knowledge globally, so the technology could end up being a net positive.

In the meantime, she suggested a somewhat simpler solution: being discerning about when to use high-cost generative AI.

“Not every problem needs to be tackled with generative AI. If you contrast a GPT-enhanced query with a normal search query, like in Google, the cost of the query is something like five to 10 times higher,” she said.

“So maybe some queries should just use the traditional search? Thinking about hybrid approaches and more effective use of the technology will help.”
