A new study warns that artificial intelligence technology could cause a significant increase in electricity consumption.
The paper, published in the journal Joule, details the potential future energy demand of AI systems, noting that generative AI relies on powerful servers and that wider usage could drive up demand for electricity.
The authors cite tech giant Google as an example, noting that AI accounted for only 10-15% of the company’s total electricity consumption in 2021.
But as AI technology continues to develop, Google’s energy consumption could begin to reach the scale of a small country.
“The worst-case scenario suggests that Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase over its historical AI-related energy consumption,” the authors write.
They cautioned that such an example “assumes large-scale adoption of AI using current hardware and software, which is unlikely to happen quickly.”
Christopher Alexander, director of analytics at Pioneer Development Group, told Fox News Digital that the requirements will be similar to the birth of Bitcoin mining, arguing that developers will need to be creative in how they use resources.
“AI is very similar to Bitcoin mining. In both cases, processing power is used at very high intensity to solve problems. You can’t reduce power consumption, but you can mitigate it,” Alexander said. “For example, alternative energy, such as natural gas from oil drilling that is burned off rather than used, constitutes a major untapped energy source, as does biogas from landfills.”
Alexander likened the solution to the days when “kerosene was developed from waste,” saying it is another opportunity to develop cheap energy from flare gas and landfills, fueling the future while making the most of resources that would otherwise become pollutants.
Phil Siegel, founder of the Center for Advanced Preparedness and Threat Response Simulation (CAPTRS), told Fox News Digital that similar concerns accompany any growing technology, although he argued that improvements would likely help make energy consumption more efficient.
“Multiplayer games, social media and cryptocurrencies have all gone through these phases. At first, the technologies tend to be inefficient because the chips and algorithms are not optimized,” Siegel said.
“People extrapolate these inefficiencies to a larger scale. The bad news is that power consumption does rise somewhat. The good news is that as new uses evolve, chips get better, algorithms improve, technology becomes more creative, and ultimately the amount of energy consumed falls well below panic levels.”
Although the paper acknowledges that some of its scenarios are extreme and unlikely, it says it is important to temper “overly optimistic and overly pessimistic expectations” for the future, noting that “it is probably too optimistic to expect that improvements in hardware and software efficiency will fully offset any long-term changes in AI-related electricity consumption.”
“These advances may trigger a rebound effect whereby increased efficiency leads to increased demand for AI, increasing rather than reducing total resource use,” the paper concludes.
“The enthusiasm for AI in 2022 and 2023 could be part of such a rebound effect, and this enthusiasm has put the AI server supply chain on track to make a more significant contribution to global data center power consumption in the coming years.”