
Editorial: Generative AI is an energy hog and that could hurt progress fighting climate change

Artificial intelligence is one of the world’s fastest-growing technologies, with Google, Microsoft, Meta, Apple and others rolling out generative AI models. It’s also one of the most energy-intensive, raising alarm that AI data centers’ thirst for electricity will increase planet-warming emissions and strain the electrical grid.

Processing a result with Google’s AI takes 10 times as much electricity as a regular Google search, according to one analysis. Data centers, the large buildings filled with computer servers, already accounted for about 4% of U.S. electricity use in 2022, and their consumption is expected to hit 6% by 2026, an increase driven in part by the boom in AI use.

And there is a push to build more in California and across the country. Pacific Gas & Electric revealed in June that it had received 26 applications for new data centers that would draw a combined 3.5 gigawatts, Times reporter Melody Petersen reported. That is enough power to supply nearly 5 million homes.

Meeting that demand will strain the nation’s aging electrical grid and, because about 60% of U.S. electricity still comes from fossil fuels, increase planet-warming carbon emissions. Indeed, Google’s carbon emissions have risen nearly 50% since 2019, an increase the company attributed to its data centers’ energy consumption and supply chain emissions. Microsoft reported a nearly 30% rise in carbon emissions since 2020, driven by the construction of data centers. The International Energy Agency estimates that in 2021, Amazon, Microsoft, Google and Meta collectively used 72 terawatt-hours of electricity, more than double the amount they used in 2017, and that figure is expected to keep rising.

Data centers run about 100,000 servers on average and often need to be located near power plants. There are concerns that these facilities could strain local power supplies and cause rolling blackouts. California is particularly vulnerable: The state ranks 49th out of 50 in the margin of extra electricity available beyond what homes and businesses need at peak hours, the cushion that helps avoid blackouts. Beyond threatening the power grid, servers generate heat, and data centers use large amounts of water to cool them.

Given these environmental impacts, communities are increasingly resisting plans to expand data centers. Lawmakers in Washington, Virginia, Georgia and other states have pushed for studies of data centers’ energy use and their effects on grid reliability, ProPublica reported. That’s a good start. We have to make sure this rapidly expanding technology doesn’t undermine climate goals.

Jesse Dodge, a senior research scientist at the Allen Institute for AI, said that before the recent wave of consumer AI, artificial intelligence models were mostly used by researchers for academic purposes and were designed with efficiency and sustainability in mind. The large tech companies now developing AI for the marketplace prioritize bigger models that use more energy.

Tech companies are not required to disclose how they trained their newer models, but revealing that information could help reduce energy demands. When a company updates a model, it often trains the new version from scratch rather than building on a model it has already completed. Releasing completed models publicly could cut down on that duplicated work across the industry, reducing wasted energy and water, Dodge said.

Shaolei Ren, an assistant professor of electrical and computer engineering at UC Riverside, has been researching how Big Tech can build new AI models responsibly, and he believes these companies are more than capable of operating sustainably.

“Theoretically, they could physically be at carbon zero by routing the workloads around the world. There are data centers all over the world. Since California has solar energy they can put the workload [during the day] here and at night they can move the computing to Europe. They could do this, but they do not because there is a lot of risk,” Ren said.

Big Tech has the resources to curb its energy demands, but so far it has chosen not to. For example, there is no way for users to opt out of Google’s AI-generated search results, forcing them to get the energy-intensive answers even if they don’t want them. That has to change. AI is only going to grow, and the companies behind the boom have a responsibility to ensure their technology doesn’t slow progress in fighting climate change.
