
AI Increases Data Center Energy Consumption – Straining the Grid and Slowing Sustainability Efforts


The rise of artificial intelligence has dramatically increased the energy consumption, and with it the carbon emissions, of large tech companies.

The spectacular success of large language models like ChatGPT has helped fuel this growth in energy demand. At 2.9 watt-hours per request, a ChatGPT query requires about 10 times as much electricity as a traditional Google query, according to the Electric Power Research Institute (EPRI), a nonprofit research institute. Emerging AI capabilities such as audio and video generation are likely to add to this demand.
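To put those figures in perspective, here is a minimal back-of-envelope sketch in Python. The 2.9 Wh per query and the roughly 10x ratio come from the EPRI estimate above; the daily query volume is a hypothetical assumption for illustration only.

```python
# Back-of-envelope comparison using the figures cited above.
# 2.9 Wh per AI query and the ~10x ratio are the EPRI estimates;
# the daily query volume is a hypothetical assumption.

AI_QUERY_WH = 2.9                        # watt-hours per AI query (EPRI estimate)
TRADITIONAL_QUERY_WH = AI_QUERY_WH / 10  # ~10x less than an AI query
QUERIES_PER_DAY = 1_000_000_000          # hypothetical: one billion queries a day

def daily_energy_mwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in megawatt-hours for a given query load."""
    return wh_per_query * queries / 1_000_000  # Wh -> MWh

print(f"AI queries:          {daily_energy_mwh(AI_QUERY_WH, QUERIES_PER_DAY):,.0f} MWh/day")
print(f"Traditional queries: {daily_energy_mwh(TRADITIONAL_QUERY_WH, QUERIES_PER_DAY):,.0f} MWh/day")
```

At a billion queries a day, the assumed AI workload would draw roughly 2,900 MWh daily, versus about 290 MWh for the same volume of traditional queries.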

AI's energy needs are changing the calculations of energy companies, which are now exploring previously untenable options, such as restarting a nuclear reactor at Three Mile Island, the power plant made infamous by its 1979 accident.

Data centers have been growing steadily for decades, but the scale of growth in the still-young era of large language models is extraordinary. AI requires far more computing and data storage than data centers expanding at their pre-AI rate could provide.

AI and the grid

AI is adding new pressure to an electrical grid that, in many places, is already running at capacity or prone to stability issues. There is also a significant lag between the growth of computing and the growth of the grid: a data center takes one to two years to build, while adding new power capacity to the grid takes more than four years.

As a recent report from the Electric Power Research Institute points out, just 15 states are home to 80% of U.S. data centers. In some states, such as Virginia, home to "Data Center Alley," data centers account for more than 25% of electricity consumption. Similar patterns of clustered data center growth are occurring in other parts of the world; Ireland, for example, has become a data center nation.

In addition to the need to generate more energy to support this growth, almost all countries have carbon reduction targets. This means they are striving to integrate more renewable energy sources into the grid. Renewable energy sources such as wind and solar are intermittent: the wind doesn't always blow and the sun doesn't always shine. The lack of cheap, green and scalable energy storage means the grid has an even bigger problem matching supply and demand.

Additional challenges to data center growth include the increasing use of water cooling for efficiency, which taxes limited freshwater resources. As a result, some communities are resisting new data center investments.

Better technology

There are several ways the industry is addressing this energy crisis. First, computing hardware has become significantly more energy efficient over the years in terms of operations performed per watt consumed. Power usage effectiveness (PUE), the ratio of a data center's total energy consumption to the energy used for computing alone, has dropped to an average of 1.5, and even to an impressive 1.2 in high-end facilities. New data centers cool more efficiently, using water cooling and outside air when it is cool enough.
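As a rough illustration of what those numbers mean, here is a minimal sketch of the PUE calculation. The sample energy figures are invented to reproduce the 1.5 and 1.2 values cited above, not measurements from any real facility.

```python
# Minimal sketch of power usage effectiveness (PUE): total facility
# energy divided by the energy used for computing. The sample numbers
# below are illustrative, not data from any real data center.

def pue(it_energy_kwh: float, overhead_energy_kwh: float) -> float:
    """PUE = (IT energy + cooling/infrastructure overhead) / IT energy."""
    return (it_energy_kwh + overhead_energy_kwh) / it_energy_kwh

print(pue(it_energy_kwh=1000, overhead_energy_kwh=500))  # 1.5, the reported average
print(pue(it_energy_kwh=1000, overhead_energy_kwh=200))  # 1.2, a high-end facility
```

A PUE of 1.0 would mean every watt goes to computing; the gap above 1.0 is the overhead that better cooling designs try to shrink.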

Unfortunately, efficiency alone will not solve the sustainability problem. In fact, the Jevons paradox suggests that efficiency gains can increase total energy consumption in the long run: as computing becomes cheaper, people use more of it. Furthermore, gains in hardware efficiency have slowed considerably as the industry approaches the limits of chip-technology scaling.

To continue improving efficiency, researchers are designing specialized hardware such as accelerators, new integration technologies such as 3D chips, and new chip cooling techniques.

Likewise, researchers are increasingly studying and developing data center cooling technologies. The Electric Power Research Institute report endorses new cooling methods, such as air-assisted liquid cooling and immersion cooling. While liquid cooling has already found its way into data centers, only a few new data centers have implemented the still-in-development immersion cooling.

Flexible future

A new way to build AI data centers is flexible computing. The main idea is to do more computing when electricity is cheaper, more abundant, and greener, and less when it is more expensive, scarcer, and more polluting.

Data center operators can transform their facilities into a flexible load on the grid. Academia and industry have provided early examples of data center demand response, where data centers regulate their power based on the needs of the grid. For example, they can schedule certain computing tasks for off-peak hours.
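Here is a toy sketch of that idea, assuming a hypothetical electricity price feed and threshold; a real system would act on signals from a utility or grid operator rather than the made-up values used here.

```python
# Toy sketch of data center demand response: defer deferrable batch
# jobs when the grid signal (here, a price in cents/kWh) is above a
# threshold. The price values and threshold are assumptions chosen
# purely for illustration.

from dataclasses import dataclass

@dataclass
class BatchJob:
    name: str
    deferrable: bool  # e.g., ML training can wait; serving traffic cannot

PRICE_THRESHOLD_CENTS = 12.0  # assumed cutoff for "expensive" power

def jobs_to_run(jobs: list[BatchJob], price_cents_per_kwh: float) -> list[BatchJob]:
    """Run everything when power is cheap; defer deferrable jobs when it isn't."""
    if price_cents_per_kwh <= PRICE_THRESHOLD_CENTS:
        return jobs
    return [job for job in jobs if not job.deferrable]

jobs = [BatchJob("model-training", deferrable=True),
        BatchJob("user-queries", deferrable=False)]
print([j.name for j in jobs_to_run(jobs, price_cents_per_kwh=18.0)])  # peak: queries only
print([j.name for j in jobs_to_run(jobs, price_cents_per_kwh=6.0)])   # off-peak: both
```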

Implementing broader and larger scale flexibility in power consumption requires innovation in hardware, software, and grid-data center coordination. Especially for AI, there is significant scope to develop new strategies to balance data center computational loads and thus energy consumption. For example, data centers can scale back accuracy to reduce workloads when training AI models.
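One way to picture that accuracy-for-energy trade-off is a training loop that stops refining a model once an energy budget is spent, accepting a slightly less accurate result. This is a conceptual sketch only; the energy-per-step figure and the stand-in training loop are assumptions, not any production method.

```python
# Conceptual sketch of trading accuracy for energy during training:
# stop once an energy budget is exhausted rather than training to
# completion. The per-step energy cost and the toy loss update are
# invented for illustration.

ENERGY_BUDGET_KWH = 50.0
KWH_PER_STEP = 0.5  # assumed average energy cost of one training step

def train_within_budget(max_steps: int) -> tuple[int, float]:
    """Run training steps until done or the energy budget runs out."""
    energy_used = 0.0
    loss = 1.0
    for step in range(max_steps):
        if energy_used + KWH_PER_STEP > ENERGY_BUDGET_KWH:
            break  # accept current accuracy rather than draw more power
        loss *= 0.97  # stand-in for one optimization step
        energy_used += KWH_PER_STEP
    return step, loss

steps, loss = train_within_budget(max_steps=500)
print(f"stopped after {steps} steps, final loss {loss:.3f}")
```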

To realize this vision, better modeling and forecasting are needed. Data centers can work to better understand and predict their loads and operating conditions; it is equally important to predict grid load and growth.
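As a toy illustration of what load forecasting involves, the sketch below predicts the next hour's data center load as a moving average of recent readings. The load series is made up, and real forecasting would use far richer models and grid-side data.

```python
# Minimal load-forecasting sketch: predict the next hour's data center
# load as a moving average of recent hourly readings. The series below
# is hypothetical.

def forecast_next_hour(load_mw: list[float], window: int = 4) -> float:
    """Naive forecast: average of the last `window` hourly readings."""
    recent = load_mw[-window:]
    return sum(recent) / len(recent)

hourly_load_mw = [82.0, 85.5, 90.1, 94.3, 97.8]  # hypothetical readings
print(f"forecast: {forecast_next_hour(hourly_load_mw):.1f} MW")
```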

The Electric Power Research Institute's Load Forecasting initiative includes activities to assist with grid planning and operations. Comprehensive monitoring and intelligent analytics, potentially based on AI, for both data centers and the grid are essential for accurate forecasting.

On the edge

The U.S. is at a critical juncture with the explosive growth of AI. Integrating hundreds of megawatts of electricity demand into already overloaded grids is incredibly difficult. It may be time to rethink how the industry builds data centers.

One option is to sustainably build more edge data centers, smaller and widely distributed facilities that bring computing closer to local communities. Edge data centers can also reliably add computing power to densely populated urban areas without further burdening the grid. While these smaller centers currently account for 10% of data centers in the U.S., analysts predict that the market for smaller edge data centers will grow by more than 20% over the next five years.

In addition to transforming data centers into flexible and controllable loads, innovation in the edge data center sector can make AI's energy use far more sustainable.

This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by Ayse Coskun, Boston University.

Ayse K. Coskun has recently received research funding from the National Science Foundation, the Department of Energy, IBM Research, the Boston University Red Hat Collaboratory, and the Research Council of Norway. None of this recent funding is directly related to this article.
