AI Earth Power

Will AI Be a Disaster for the Climate? (theguardian.com) 100

"What would you like OpenAI to build/fix in 2024?" the company's CEO asked on X this weekend.

But "Amid all the hysteria about ChatGPT and co, one thing is being missed," argues the Observer — "how energy-intensive the technology is." The current moral panic also means that a really important question is missing from public discourse: what would a world suffused with this technology do to the planet? Which is worrying because its environmental impact will, at best, be significant and, at worst, could be really problematic.

How come? Basically, because AI requires staggering amounts of computing power. And since computers require electricity, and the necessary GPUs (graphics processing units) run very hot (and therefore need cooling), the technology consumes electricity at a colossal rate. Which, in turn, means CO2 emissions on a large scale — about which the industry is extraordinarily coy, while simultaneously boasting about using offsets and other wheezes to mime carbon neutrality.

The implication is stark: the realisation of the industry's dream of "AI everywhere" (as Google's boss once put it) would bring about a world dependent on a technology that is not only flaky but also has a formidable — and growing — environmental footprint. Shouldn't we be paying more attention to this?

Thanks to long-time Slashdot reader mspohr for sharing the article.

Comments Filter:
  • And if the cheapest source of energy causes a climate disaster, then all productive activity causes climate disaster.
    • Define productive.

      Add to that definition the notion of negative externalities, that are today not taken into consideration at all (despite being still there), and you might find different results about what is and is not productive/adding value.

      • Reminds me of tobacco companies when they started advertising filter cigarettes as a healthier alternative to non-filter cigarettes. Their sales plummeted because their own campaign reminded people that the winning move was not to play. If you were choosing the lesser of two evils, you wouldn't choose evil at all.
        I'm afraid that once we start picking and choosing, we will all quickly realize that not much of what modern society produces brings a net positive.

        • Cigarettes are so powerfully addictive, and there are so many influences and influencers leading people to take up smoking even in the context of warnings from the Surgeon General about the health risks, that I find it really hard to believe the introduction of filters on cigarettes hurt sales.

          This seems like a Slashdot hypothesis rather than anything carefully thought out.

          • Re:Really? (Score:4, Insightful)

            by Nrrqshrr ( 1879148 ) on Monday December 25, 2023 @09:21AM (#64104667)

            Boy do I have a story for you, then. Entire books were written on this subject as it's the textbook example of an ad campaign that entirely transformed a product's image but, in two words: Marlboro Man.
            In fact, that was the moment the "influencers leading to people taking up smoking" side of the tobacco companies truly solidified itself, as they realized that, instead of selling cigarettes as a healthier alternative to rolled tobacco, they would sell them to teenagers as the Masculine Ideal they should aim for. Hell, even the red and white shape on the packaging was designed to remind people of military medals.

            The history of the creation of Marlboro Man is so interesting that I would advise anyone to read up on it.

            • Boy do I have a story for you, then. Entire books were written on this subject as it's the textbook example of an ad campaign that entirely transformed a product's image but, in two words: Marlboro Man. In fact, that was the moment the "influencers leading to people taking up smoking" side of the tobacco companies truly solidified itself, as they realized that, instead of selling cigarettes as a healthier alternative to rolled tobacco, they would sell them to teenagers as the Masculine Ideal they should aim for. Hell, even the red and white shape on the packaging was designed to remind people of military medals.

              The history of the creation of Marlboro Man is so interesting that I would advise anyone to read up on it.

              I agree.

              In an interesting irony, today a whole lot of young women have taken up smoking/vaping. Is it part of the ongoing process of masculinization of human females? Anyhow, veering OT here, I just thought there could be a connection after reading your post.

      • by mspohr ( 589790 ) on Monday December 25, 2023 @08:37AM (#64104627)

        There is still a significant question of whether or not AI is productive.
        Does it improve society?
        Does it just make money for rich people at the expense of the environment and social structures?

        • by sonlas ( 10282912 ) on Monday December 25, 2023 @11:19AM (#64104803)

          Define "improve society".

          Slavery improved society. At least the part of it that was calling itself "society" and didn't give a shit about other human beings.
          If society includes everyone, everywhere, then it might be worth remembering that climate change is going to make a lot of places unsuitable for human life for at least 180 days per year (as in, the wet bulb temperature would not allow people to work outside without serious health risks). About 2 billion people will be in that situation by 2050 in a +2C scenario.

          I think you nailed it with your last question: "Does it just make money for rich people at the expense of the environment and social structures".

          As long as you realize that most people writing on Slashdot, including you and me, are part of the rich people in question.

        • Does it improve society?

          AI is a tool. It has no impact on society. What impacts society is the way the tool is used. The way I use it is undeniably a benefit. AI-powered models for enhancing images are f---ing magic for anyone with a camera, and the tools for photo editing that use generative AI are also an absolute godsend.

          Now if you, on the other hand, use it to perpetuate racial stereotypes, cheat on exams or job interviews, or make a Biden deepfake on behalf of the Trump campaign, that's on you. In that case it wouldn't be AI ru

        • by PCM2 ( 4486 )

          I just wonder: Is it really the most effective way to solve the problems it's being used for? Or is it just a clever trick that allows companies to spin up lots of hype, but is really just an incredibly over-engineered solution to most computing tasks, like using a diesel engine to drive in wood screws?

      • Re: (Score:2, Interesting)

        by NFN_NLN ( 633283 )

        > Define productive.

        Ability to advance a civilization. Burning cow shit in a hut to heat the bugs you collected doesn't advance a civilization. Harnessing power for ships and off world bases requires growing energy needs. In fact, there is a known measurement scale already:

        The Kardashev scale is a method of measuring a civilization's level of technological advancement based on the amount of energy it is capable of using.

        https://en.wikipedia.org/wiki/... [wikipedia.org]
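        As a rough illustration of that scale (a sketch only: the formula is Carl Sagan's commonly cited continuous interpolation, and the ~2e13 W figure for humanity's current power use is a ballpark assumption, neither taken from this thread):

          import math

          def kardashev(power_watts: float) -> float:
              # Sagan's continuous interpolation: K = (log10(P) - 6) / 10, with P in watts.
              return (math.log10(power_watts) - 6) / 10

          # Ballpark figure for humanity's total power use today (assumption).
          print(round(kardashev(2e13), 2))   # ~0.73 -- well short of Type I (10^16 W)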

        • You are mixing up quite a few things, and trying to justify it by name-dropping the Kardashev scale.

          Ability to advance a civilization.

          What does advancing a civilization mean? Is it only about technological advancement? Is it ok to "advance the civilization" if it has direct negative impacts (as in famines, deaths, population displacement) on 20% of the population making up that "civilization"?

          Burning cow shit in a hut to heat the bugs you collected doesn't advance a civilization.

          It did advance civilization when that was the only way to produce energy. If you take a step back and actually look at it, you would realize that

  • Can't be worse than Trump being elected.

  • NOCs use most cycles processing the lower OSI-layer data. Yet most power consumption is for web and application services.

  • by cjonslashdot ( 904508 ) on Monday December 25, 2023 @08:36AM (#64104623)
    AI is not inherently energy intensive. Neuromorphic chips use thousands of times less energy than GPUs. But AI companies continue with GPUs because programmers are familiar with them, and shifting the chip architecture would be disruptive - they won't do it unless the price of energy goes up, or they need to focus on mobile apps, where energy use is a major factor.
    • by Tony Isaac ( 1301187 ) on Monday December 25, 2023 @09:38AM (#64104687) Homepage

      While you might be right, artificially raising the price of energy would have many side effects, such as lowering the standard of living for the poor among us. Many low-income people already struggle to buy enough gas to go to work to put food on the table. In many places, there is no public transit to get them to where they need to go. Many of these same people struggle to keep the lights on, or their homes heated, as well. If we want a real solution to a warming climate, it has to work for both the well-off and the poor.

      • Yes absolutely. I was not proposing raising energy prices! I was only pointing out why AI companies are using so much energy.
    • Came here to say that more specialized hardware is needed to fix this - NPUs on the server/client side and something like Cerebras chips for training. Since there's nothing inherently anti-efficient about the technology, this should result in a decrease in energy usage, unlike ASICs in cryptomining.

  • Power has a cost, represented in the per kWh price paid to utilities to provide said power. The only thing that'll happen with the expansion of AI is that AI will bid for the same power that's available to everyone else, either

    a). raising prices or
    b). provoking an increase in production.

    Apparently the article's author is more concerned about option b). There are real limits to how much available power supplies can be expanded. There will still be competition for that power. Is it realistic to conclude t

    • I wouldn't be too concerned about what the article's author thinks. He was still writing articles about NFTs a couple of months ago as if they were still a thing. He knows next to nothing about tech; he simply regurgitates whatever the latest buzz is, and because he doesn't understand it he's always at least a couple of months behind. If you read many of his articles you quickly come to realise that his tech knowledge is very shallow. He's read about how AI uses a lot of power so he's written an article ab

  • by Rosco P. Coltrane ( 209368 ) on Monday December 25, 2023 @08:42AM (#64104635)

    While the GPUs are busy doing AI stuff, at least they're not mining stupid bitcoins. Whether or not you like AI, at least it's not utterly pointless.

  • by balaam's ass ( 678743 ) on Monday December 25, 2023 @09:06AM (#64104655) Journal

    These sorts of posts almost never offer comparisons with other areas of industry, manufacturing, etc.

    TFA says,
    > "estimated the carbon footprint of training a single early large language model (LLM) such as GPT-2 at about 300,000kg of CO2 emissions – the equivalent of 125 round-trip flights between New York and Beijing. "

    125 flights doesn't sound that bad for a massive undertaking like training ChatGPT.
    How does this compare with what a typical factory emits in a year?
    How does the power consumption compare with, say, an auto manufacturing plant?
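    To put TFA's own numbers side by side (the per-flight figure just divides the two quoted values; the ~4.6 tonnes of CO2 per average passenger car per year used for comparison is a commonly cited EPA estimate and is my assumption, not something from the article):

      # Figures quoted in TFA.
      training_co2_kg = 300_000      # estimated CO2 for training an early LLM such as GPT-2
      round_trips = 125              # New York-Beijing round trips given as the equivalent

      print(training_co2_kg / round_trips)    # 2400.0 kg of CO2 per round trip

      # Outside comparison (assumption, not from TFA): ~4.6 tonnes of CO2 per average
      # passenger car per year, so the training run is roughly this many car-years:
      print(round(training_co2_kg / 4_600))   # ~65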

    • the equivalent of 125 round-trip flights between New York and Beijing

      What I want to know is, how many bulldozers is that, stacked one on top of the other?

  • by joe_frisch ( 1366229 ) on Monday December 25, 2023 @09:14AM (#64104663)
    In the long term AI will likely provide a great reduction in mankind's greenhouse gas production if we ask it to. It can also eliminate disease, poverty, inequality, war, and all unhappiness as well if that is what we tell it we want.

    Forever
    • Long term AI will solve the climate problem by solving the human problem... by solving over-population. It'll also give people more meaning so instead of doing pointless work to earn money they can work united against a common foe to survive.

    • by gweihir ( 88907 )

      Can it tell me conclusively whether it is cereal first or milk first though and prove its answer? Or will this be forever one of the deep mysteries?

  • by ET3D ( 1169851 ) on Monday December 25, 2023 @09:53AM (#64104699)

    First of all, AI stands to reduce costs of some things. For example Google showed that it can predict the weather with a lot less computing power than current modelling. It's likely that inference will end up replacing other heavy computing tasks, for example rendering.

    Inference will be seriously optimised, in both hardware and software, once it gets to the stage of being ubiquitous. The cost of electricity will have to be passed down to the consumer, so AI companies have every incentive to lower their cost as much as possible.

    This applies to a lesser extent to training. Obviously training will also become more efficient with future hardware, but model sizes are likely to grow too. However, when we get to a stage where we have good enough models for ubiquitous consumer use, i.e., a product phase rather than a research phase, I think it's reasonable to expect a more gradual release of new models, which will compensate for the high training cost.

    • Expensive modeling is needed. Weather models powered by big government investments are beneficial beyond simply providing all the forecasting others make profits from. Tons of research is carried out on these systems as we try to better model and understand. If you simply feed in enough data and have an AI find a better pattern match to approximate the existing data, you'll lack the scientific understanding that is applied to broader domains and new situations that do not fit the existing training pattern.

      T

      • by gweihir ( 88907 )

        This is like giving children smart calculators instead of teaching them math.

        Indeed. And thereby creating a whole lot of basically uneducated people.

    • by gweihir ( 88907 )

      Well, if it predicts the weather cheaper but misses or invents the occasional hurricane, then it is unsuitable for the task. It also does not look like it can make rendering any less computing-intensive, unless you do not mind the occasional six-fingered individual or faces all looking uncannily similar.

  • Initially, using massive clusters of GPUs, AI is harmful, but most of this is in centralized large data centers. However, neuromorphic chips should bring the energy use and costs down to a fraction of what they are now. The main hurdle is that only a few major tech giants currently have them and they are not sharing.

    Memristors might be an approach to neuromorphics that brings them mainstream for all to have in our own local devices. Obviously, big tech doesn't want this and that is the primary hu

  • No, but climatism is likely to be a disaster for AI.

  • I lived through the idiocy of the 1970s, when we were told to bundle up and drive slower to conserve energy.

    We need to come up with solutions to make even more energy available. That's about 80% of a sci-fi future. Teaching people to need and want less as a solution is a nasty trick. All it does is put off the need for growth by a few years, assuming it lops off a few percent, set against a few percent growth per year due to more people, and more clever things to do with energy.

    Best to get to it.

  • "staggering amounts of computing power"

    "consumes electricity at a colossal rate"

    "means CO2 emissions on a large scale"

    "formidable — and growing — environmental footprint"

    All of this sounds ominous. It's interesting that the only slightly quantitative data is a mention of a Gartner chart of the hype cycle for AI, a chart that contains no units because it's mainly a thought exercise.

    AI energy usage might be significant relative to other uses, but this article is complete fluff.

    • by Budenny ( 888916 )

      "....this article is complete fluff...."

      Yes, spot on. Not a single number in it. It's the Guardian. The Guardian is in a perpetual state of moral panic over one imaginary threat after another, while persistently ignoring real threats and really bad things that are actually going on. Right now it's in the grip of global warming hysteria, so everything possible has to be reported with some angle to do with that. It's kind of hilarious that it should call out other people for having a moral panic over AI. P

  • In the fine article there was mention of the Gartner hype cycle, and I find that more fascinating than any commentary on AI.
    https://en.wikipedia.org/wiki/... [wikipedia.org]

    Nuclear power was introduced to the world in the 1950s; that was the technology trigger. Through the 1960s and 1970s came the rise of expectations, such as "too cheap to meter". From this came many predictions of a nuclear-powered world in popular culture, in TV shows like Thunderbirds and Star Trek. Then the shine started to come off of nuclear power i

    • by catprog ( 849688 )

      If nuclear power costs double the cost of solar (I am talking actual delivered power; nuclear has the problem of generating when it is not needed, and solar needs to be stored), what would be better for the environment: spending the money on nuclear or solar?

  • This tech cannot deliver on most of its promises. So it will scale down soon.

  • I'm going to get an AI 3d TV NFT
  • The majority of AI's current power use is concentrated in mega data centres. This is virtually ideal, as these can be powered however the company setting them up sees fit. Heck, you can even explore options such as using waste heat for district heating. It can be made as dirty or as clean as we want; the underlying technology has no impact on climate.

  • Between the political extremists that appear to have little to no education, we now have to endure HORRIBLY STUPID 'journalists'. It really is hard to call them journalists when they are as stupid as the anti-nukes, pro-BLM, pro-KKK types out there.
  • I have said that the transformer approach to AI has some serious flaws.
    For example, the computation may involve trillions of parameters defining relations between words. But the human brain does just fine without doing that. Let's say there are roughly 50,000 words in a language. Then first-level direct connections between words would be about 50K x 50K, or 2.5 x 10^9. Of course not all words are connected, so it would be a somewhat sparse matrix. Now, the human brain couples its word object storage with r
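    A quick sanity check on that arithmetic (a sketch only: the 50,000-word vocabulary is the parent's assumption, and the 1% pair density used to illustrate sparsity is mine):

      vocab = 50_000

      # Dense first-level word-to-word connection matrix.
      dense_pairs = vocab * vocab
      print(f"{dense_pairs:.2e}")              # 2.50e+09 entries, i.e. 2.5 x 10^9

      # If only ~1% of word pairs are actually related (illustrative guess),
      # a sparse representation stores two orders of magnitude fewer connections.
      sparse_pairs = int(dense_pairs * 0.01)
      print(f"{sparse_pairs:.2e}")             # 2.50e+07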

"Facts are stupid things." -- President Ronald Reagan (a blooper from his speeach at the '88 GOP convention)

Working...