AI Power

Is AI's Demand for Energy Really 'Insatiable'? (arstechnica.com)

Bloomberg and The Washington Post "claim AI power usage is dire," writes Slashdot reader NoWayNoShapeNoForm. But Ars Technica "begs to disagree with those speculations."

From Ars Technica's article: The high-profile pieces lean heavily on recent projections from Goldman Sachs and the International Energy Agency (IEA) to cast AI's "insatiable" demand for energy as an almost apocalyptic threat to our power infrastructure. The Post piece even cites anonymous "some [people]" in reporting that "some worry whether there will be enough electricity to meet [the power demands] from any source." Digging into the best available numbers and projections, though, it's hard to see AI's current and near-future environmental impact in such a dire light... While the headline focus of both Bloomberg and The Washington Post's recent pieces is on artificial intelligence, the actual numbers and projections cited in both pieces overwhelmingly focus on the energy used by Internet "data centers" as a whole...

Bloomberg asks one source directly "why data centers were suddenly sucking up so much power" and gets back a blunt answer: "It's AI... It's 10 to 15 times the amount of electricity." Unfortunately for Bloomberg, that quote is followed almost immediately by a chart that heavily undercuts the AI alarmism. That chart shows worldwide data center energy usage growing at a remarkably steady pace from about 100 TWh in 2012 to around 350 TWh in 2024. The vast majority of that energy usage growth came before 2022, when the launch of tools like Dall-E and ChatGPT largely set off the industry's current mania for generative AI. If you squint at Bloomberg's graph, you can almost see the growth in energy usage slowing down a bit since that momentous year for generative AI.
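As a rough check on that "remarkably steady pace" (a back-of-the-envelope figure, not one from either article): growth from ~100 TWh in 2012 to ~350 TWh in 2024 works out to (350/100)^(1/12) - 1 ≈ 0.11, or about 11% compound growth per year across the whole period, AI mania included.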

Ars Technica first cites Dutch researcher Alex de Vries's estimate that in a few years the AI sector could use between 85 and 134 TWh of electricity per year. But another study estimated in 2018 that PC gaming already accounted for 75 TWh of electricity use per year, while "the IEA estimates crypto mining ate up 110 TWh of electricity in 2022." More to the point, de Vries' AI energy estimates are only a small fraction of the 620 to 1,050 TWh that data centers as a whole are projected to use by 2026, according to the IEA's recent report. The vast majority of all that data center power will still be going to more mundane Internet infrastructure that we all take for granted (and which is not nearly as sexy a headline bogeyman as "AI").
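Taking those figures at face value (a rough ratio, not one computed in the article): de Vries's 85 to 134 TWh per year is about 85/1,050 ≈ 8% to 134/620 ≈ 22% of the IEA's projected 2026 data center total, so even the high end leaves AI well under a quarter of data center energy use.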
The future is also hard to predict, the article concludes. "If customers don't respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs..."

Is AI's Demand for Energy Really 'Insatiable'?

  • ...especially about the future
    and most of them have a political agenda

    • by jonadab ( 583620 )
      It depends how specific you want to be.

      I can predict with some confidence that the Fundamental Economic Problem will continue to be relevant for some time to come. I can't be sure what precise role LLMs and such will play, but I'm pretty sure there will continue to be a rising demand for energy, globally.
  • by jonsmirl ( 114798 ) on Sunday June 30, 2024 @12:43PM (#64590145) Homepage

    Build data centers in Iceland, plenty of non-polluting geothermal energy available. Use the cold ocean for cooling.

    • Build data centers in Iceland, plenty of non-polluting geothermal energy available. Use the cold ocean for cooling.

      ... and the volcanoes can embed them in molten rock?

    • by Rei ( 128717 )

      Most of our electricity is hydroelectric, which is hardly harmless.

      • If someone were to build a billion-dollar data center, they could surely pay for new geothermal wells too.

    • Build data centers in Iceland, plenty of non-polluting geothermal energy available. Use the cold ocean for cooling.

      Not even that. Most of the big generative AI use cases are for work, meaning they're heaviest during daylight hours. Build dedicated solar farms and the energy supply goes up with the usage.

    • Do you have any idea how small Iceland is? Even some central African countries generate more geothermal power than Iceland.
    • by AmiMoJo ( 196126 )

      Ocean cooling isn't so easy though. The problem is a lot of stuff lives there. Even if you just have a sealed cooling loop that extends out into the ocean, very quickly it will get covered in stuff living on it and enjoying the heat.

      It's not impossible; UV light can be effective, but it's also not a trivial problem to solve. A lot of nuclear plants struggle with it because they need so much cooling water they can't use a sealed loop, they have to take water in, and keep fish out. They use speakers to try to

  • by bubblyceiling ( 7940768 ) on Sunday June 30, 2024 @12:53PM (#64590159)
    The real story is at the end

    "If customers don't respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs..."
    • Is the popular conception of the application/presentation layers sustainable...? That is the question. Can we continue at this pace?
    • This is what I've been waiting to see: whether the demand actually comes in to fund all this energy usage. I feel like AI is mostly answering 1+1-type questions while burning a barrel of oil, while natural intelligence just needs a glass of milk and a cookie.

      It was all fine while interest rates were nothing and money was just floating around and projects just needed a 1-2% return over 5-10 years to turn a profit. Now rates are up and projects need to prove their +4% returns within a year or two.

      It's like t

  • by rsilvergun ( 571051 ) on Sunday June 30, 2024 @12:58PM (#64590173)
    Once none of us have jobs anymore it'll be satiated.

    And no, CEO or head of Goldman Sachs / McKinsey aren't jobs. Those are positions in the ruling class. So those don't get replaced any more than you could replace the king, at least not without the threat of violence. It's 2024 though, so I think we can get by with just a threat.
    • Once none of us have jobs anymore it'll be satiated.

      That's one explanation of how this ends. I'd phrase it more like: AI will stop grabbing up more energy once energy scarcity forces prices too high for people to bother adding more AI to the energy demand.

      And no, CEO or head of Goldman Sachs / McKinsey aren't jobs. Those are positions in the ruling class. So those don't get replaced any more than you could replace the king, at least not without the threat of violence. It's 2024 though, so I think we can get by with just a threat.

      While it may not be optimal to see people make such large amounts of money with what I guess people call "passive income", I would like to hear alternatives that don't leave people in a worse situation. Oh, and we have seen people removed from a ruling class before without violence. We see royal famil

      • You are fucking stupid. All absolute rulers need absolute power, and that requires a peasant/slave/morlock class. Without someone to dominate the entitled have no way of proving they are superior, so their high position means nothing to them. Read 1984 or read about Stalin, Mao, the Kim family in North Korea, Idi Amin, Pol Pot, American slavery, or damn near anything about colonialism, or religious empires in any era of history.

        You are such a dumb-ass I seriously wonder how you are able to breathe on your

  • by thegarbz ( 1787294 ) on Sunday June 30, 2024 @01:21PM (#64590211)

    The current models and training sets are very quickly reaching economic limits. OpenAI is struggling to make the money it needs to keep training models. Microsoft and Google are desperate to shove this down our throats. Largely these models and approaches have already reached a peak: the cost of building these models has already outgrown the return on investment, and companies have already started taking notice.

    AI won't end though; I just predict that smaller, more targeted models will become the norm, rather than the "hoover up the internet and spit it back in our faces in search results" approach we are currently taking (and failing to monetise). And these smaller models are likely to consume far less energy.

    • Even throwing more processing power at something eventually reaches the limits of the design. The current design of AI is to, well, throw massive computing power at it. For this design we are at peak stupid and hopefully peak waste.

      The next step will be efficient design, then when that proves to be no better in actual ability but just cheaper, a new algorithm design will finally be considered.
      The current models and training sets are very quickly reaching economic limits. OpenAI is struggling to make the money it needs to keep training models. Microsoft and Google are desperate to shove this down our throats. Largely these models and approaches have already reached a peak: the cost of building these models has already outgrown the return on investment, and companies have already started taking notice.

      AI won't end though; I just predict that smaller, more targeted models will become the norm, rather than the "hoover up the internet and spit it back in our faces in search results" approach we are currently taking (and failing to monetise). And these smaller models are likely to consume far less energy.

      Is your next prediction that printing presses are doomed since people don't like the uniform letters and the market for books isn't big enough to justify the expense anyway?

      Just because you don't like a tech doesn't mean it's infeasible, particularly when your prediction doesn't take into account the fact that computational costs always rapidly decline.

      • Is your next prediction that printing presses are doomed since people don't like the uniform letters and the market for books isn't big enough to justify the expense anyway?

        No, why would you say that? If you want to compare AI to the printing press, know that the printing press figuratively (and literally) ended up printing money for the people who used it and the people who manufactured it. AI is doing the opposite. The people with the biggest models are losing the most money. Unlike the printing press, it's very difficult to monetise general LLMs.

        Just because you don't like a tech doesn't mean it's infeasible

        Where did I say I don't like it? I do like it. I use it all the time. I even pay for it (I bet you you don't). But there's a differen

        • Also, AI will start doing personal advertising online. The argument will be that AI can pay for its own electricity consumption...
        • Is your next prediction that printing presses are doomed since people don't like the uniform letters and the market for books isn't big enough to justify the expense anyway?

          No, why would you say that? If you want to compare AI to the printing press, know that the printing press figuratively (and literally) ended up printing money for the people who used it and the people who manufactured it. AI is doing the opposite. The people with the biggest models are losing the most money. Unlike the printing press, it's very difficult to monetise general LLMs.

          I was making a reference to Luddites.

          Either way, we know chips get cheaper, energy gets cheaper, the algorithms get more efficient, etc. I don't see any reason to expect that training an LLM will remain as expensive, except that more complex, more capable models make the extra expense worth it.

          Just because you don't like a tech doesn't mean it's infeasible

          Where did I say I don't like it? I do like it. I use it all the time. I even pay for it (I bet you you don't).

          Perhaps I was wrong as to your motives, but you're also wrong as I do pay for it. Both as a standard chatbot but also as a module integrated into a program I've written.

          But there's a difference between solving specific problems and throwing computation at something general-purpose that you struggle to sell to anyone. There's also a difference between something going away and something simply not growing in consumption. That's what I was saying: the computational complexity of AI won't keep growing. AI will start solving far more specific problems, and will do so with smaller, easier-to-train models rather than the "suck up the internet and throw it in a datacentre" models we're currently generating at insane cost.

          Are more specialized models comi

    • by CAIMLAS ( 41445 )

      What a silly prediction.

      AI has already effectively ended the careers of many motion graphics/animation professionals. If you've made a living doing media, chances are you're being replaced in the next several years.

      That's a lot of revenue in AI licensing that's available. Sure, it'll cut a big part of the cost out for the producers, but it's something - and it'll be close-to-perpetual revenue.

      Yeah, the ability to create new models is increasingly showing diminishing returns and increasing costs. That doesn't

  • No (Score:5, Interesting)

    by GameboyRMH ( 1153867 ) <gameboyrmh.gmail@com> on Sunday June 30, 2024 @01:34PM (#64590229) Journal

    There are lots of reasons to expect that AI's energy usage will fall soon, without even getting into the strong possibility of an AI bubble burst. New MatMul-free models, acceleration by NPUs on the client side, acceleration by Cerebras chips on the server side, in the future possibly even better acceleration by analog chips on the client side. We should still be much more worried about proof-of-work blockchains than AI in terms of energy usage.

    • >> lots of reasons to expect that AI's energy usage will fall soon

      I think you are correct. Upcoming hardware implementations will focus on performance improvements and reduction in power requirements similar to what has been happening all along in semiconductor development.

      Nvidia recently announced the Blackwell chip, which will ship later this year.
      "up to a 30x performance increase compared to the same number of NVIDIA H100 Tensor Core GPUs for LLM inference workloads"
      "reduces cost and energy consumpt

      • Um, I note that the human brain, taken as the performance to beat, runs on a piddly 25 watts, give or take a little. And we see Nvidia talking about a 25x reduction in energy consumption, from triple-digit TWh/year, while the human brain runs at low triple-digit kWh/year. Methinks there is a mother lode of improvement here somewhere. Currently the AI industry is running on absurd and is doubling or, er, "tenning" down every time I hear them talk. Seems to me a 25x improvement on a 9th order absurdity is j
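        (Checking that figure, for what it's worth: 25 W × 8,760 h/yr ≈ 219 kWh/yr, which is indeed low triple-digit kWh, versus hundreds of TWh/yr for data centers: a gap of about nine orders of magnitude, matching the "9th order" above.)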

    • by spth ( 5126797 )
      I see the efficiency improvements. But I fully expect Jevons paradox to be able to deal with that.
      • by Rei ( 128717 )

        Jevons paradox requires that demand be insatiable and that there's a lack of desire for substitution with cheaper alternatives. E.g. that not only is AI massively commercially successful, but that everyone constantly wants to use the best possible model on the market rather than models that are much cheaper but "good enough".

        I find this premise beyond questionable.

    • I don't think reduced power consumption will actually dent the demand for more generation in the near term, it'll simply allow more intensive training and faster model iteration.

      The tech companies are going to every utility in the US and basically saying "if you can guarantee us power ASAP we'll sign a massive long-term datacenter contract." Power is the bottleneck and also the cheaper part of the equation; the cost of siting a data center and filling it with chips is the bulk of the cost. And it just so h

    • You mention that AI inference is to be offloaded to NPUs running at the client. Shouldn't that be seen as moving energy consumption from the datacenter/LLM maker to the customer?

      To extend this even further: the customer is now to pay for a service from their AI/LLM provider, to pay for the extra bandwidth consumption on their internet connection, to pay for new hardware that comes with NPUs to "help" the "poor" AI/LLM maker with their running costs, and possibly in the service contract from the AI/

    • There are lots of reasons to expect that AI's energy usage will fall soon, without even getting into the strong possibility of an AI bubble burst. New MatMul-free models, acceleration by NPUs on the client side, acceleration by Cerebras chips on the server side, in the future possibly even better acceleration by analog chips on the client side. We should still be much more worried about proof-of-work blockchains than AI in terms of energy usage.

      The bubble may burst in the sense that some folks expect large segments of the workforce to be replaceable by LLMs, and I don't anticipate that.

      But software dev as a use case is here and it's no bubble. And LLMs are definitely finding their way into a few other use cases like gaming as well.

      As for proof-of-work blockchains the only ones I'm aware of are cryptocurrency. Not a fan of their energy usage for sure, but I feel like they're already fairly close to their peak energy usage.

    • Maybe we'll get AI to optimize pizza delivery to where the pizza arrives 30 minutes BEFORE it is ordered. Then governments learn of the technology and attach nuclear bombs to pizzas to have them sent into the past.
      https://www.youtube.com/watch?... [youtube.com]

  • I was wondering if we could look at this from a completely different perspective. Are the world's chip producers suddenly building more chips? Do AI chips really take a lot more power to run than the same amount of silicon in other chips? I would think that for a given process (e.g., "4nm" chips), the power consumption would be pretty consistent per mm² of chip space, and that production at the fabs would be scaling up on a slow linear increase, but with a shift from old processes to new in the mix as well.

    • Are the world's chip producers suddenly building more chips? Do AI chips really take a lot more power to run than the same amount of silicon in other chips?

      That's a question, but I don't think it's the most important one. The two biggest questions are whether AI requires more processing than the other things we are already doing, and whether efficiency is improving faster or slower than the rate at which we're adding more processing.

      It seems to me like most things we do a lot of require a lot less power than this, but it also seems like there are a lot of efficiency improvements coming out for this purpose, so I don't know either what the a

    • by crow ( 16139 )

      One possibility is that AI training datacenters are running the chips at max capacity, while most chips are only being used at a small fraction of their capability, and hence using much less power. Most desktop and phone CPUs are probably 99% idle over their lifetimes.
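      A minimal sketch of that utilization gap (hypothetical wattage and utilization numbers, assuming annual energy ≈ TDP × utilization × hours and ignoring idle draw):

      def annual_energy_kwh(tdp_watts: float, utilization: float, hours: float = 8760) -> float:
          """Rough annual active-compute energy: average draw scaled by utilization."""
          return tdp_watts * utilization * hours / 1000

      # Hypothetical comparison: a training accelerator pinned near full load
      # versus a desktop CPU that sits idle ~99% of its lifetime.
      print(annual_energy_kwh(700, 0.95))  # ~5,800 kWh/yr for a ~700 W accelerator
      print(annual_energy_kwh(65, 0.01))   # ~6 kWh/yr of active compute for a 65 W CPU

      Real chips draw non-trivial power at idle, so this only illustrates the active-compute gap, not total wall-socket energy.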

    • by erice ( 13380 )

      Do AI chips really take a lot more power to run than the same amount of silicon in other chips?

      Energy per area is generally only important for cooling. What really matters is energy to complete a task. The silicon area can vary.

      AI is computationally expensive. If you ran it on conventional processors you would need a lot more of them (i.e., more silicon area) in order to get acceptable run times. That means more energy. As it is, more specialized silicon, mostly in the form of GPUs, is used. It is more efficient in terms of compute/area but still not that good, and AI uses more silicon area in G

  • As far as I remember, the power consumed by the human brain is somewhere between 5 and 20 watts. There are about 8×10^9 people, so all together the power consumption of all human brains is on the order of 10^11 W. Before blindly upscaling AI systems to the TW region, some work to improve their energy efficiency is needed. I wonder if this could be solved by AI itself...
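    Worked out, as a back-of-the-envelope check (assuming ~12.5 W per brain as a midpoint): 8×10^9 × 12.5 W = 10^11 W, i.e. about 100 GW. Over a year, that's 10^11 W × 8,760 h ≈ 876 TWh, roughly 2.5 times the ~350 TWh that data centers currently use.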
  • The hype around AI is understated. It's already in widespread use with astounding results. It's absurd to compare it to the Metaverse and Crypto+NFTs, which haven't come close to reaching this stage. Many of the current business plans may fail, but the technology is a huge success.
  • I am sure someone looked at the amount of power AI was using, saw it was a large percent of total server power and thought AI was the problem.

    What really happened is people started calling everything AI - Oh, our phone tree? AI. Our Search Engine? Totally AI powered. The brakes in our car? Definitely AI. Our new full home thermostat - Yeah, that's AI.

    I will admit that perhaps some of the new tech actually has AI, but it is merely a slight modification of technology that could have been built without AI,

  • Mining algorithms were designed to consume as much computing power as they are given, and basically throw all that power away. AI at least does work, and the increase in power translates into an increase in capabilities.

    I agree with others that specialised hardware will help make AI more commercially viable. Lower power is required for that. Still, over the next couple of years we'll likely see power usage for AI keep growing.

    • by CAIMLAS ( 41445 )

      Crypto still has a smaller energy footprint / is less energy-intensive than traditional banking and finance. But that's harder to quantify specifically outside the systemic whole...

  • One very important point is that the nightmare AI energy scenario is completely predicated on the runaway success of AI, particularly generative AI. If AI isn't hugely successful in terms of producing huge profits and garnering widespread usage among the general populace, then the nightmare scenario has no chance whatsoever of materializing.

    This is something of a paradox for AI skeptics. On the one hand, they prophesy that AI will lead to nothing practical. Yet some of these same skeptics decr

  • Cloud computing seems a good suspect for the 250 TWh rise since 2012.

    One could have imagined it would have replaced less-efficient on-premise installations, but obviously it created new uses.

  • - Bitcoin mining estimates ranged from 67 TWh to 240 TWh in 2023. That's right, the high end is nearly double the EV usage (see the next item).
    - The global EV fleet consumed about 130 TWh of electricity in 2023, which accounted for about 0.5% of total final electricity consumption worldwide, at a global fleet share of only around 2.2% for electric vehicles. Scaled to 100% against 31,000 TWh of global generation, that would mean roughly 19% of global generation capacity would go to cars. (Side note: coal is the greatest source of power worldwide; that's some great m
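    Taking those numbers at face value, the linear scaling works out as: 130 TWh / 0.022 ≈ 5,900 TWh for a fully electric fleet, and 5,900 / 31,000 ≈ 19% of global generation.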
