• Nemo's public admirer@lemmy.sdf.org · 2 days ago

      How energy intensive?
      Like, compared to random cat videos on YouTube and all?

      And would the addition of renewable energy be able to handle it?

      • LwL@lemmy.world · 1 day ago

        Using it, not all that energy intensive (one LLM query is roughly the same as 3 pre-AI-bullshit Google searches, IIRC). Training it, very energy intensive.
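
        For a sense of scale, a quick sketch using the oft-cited ~0.3 Wh per Google search (an old Google figure, so both numbers here are rough assumptions):

        ```python
        # Back-of-envelope: one LLM query at ~3x the energy of a Google search.
        # 0.3 Wh/search is Google's oft-cited 2009 figure - an assumption here.
        WH_PER_GOOGLE_SEARCH = 0.3

        searches_per_llm_query = 3
        wh_per_llm_query = searches_per_llm_query * WH_PER_GOOGLE_SEARCH
        print(f"~{wh_per_llm_query:.1f} Wh per LLM query")  # ~0.9 Wh
        ```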

        Yes, it would, but we haven’t even covered our existing demand with renewables, so it ain’t helping.

        • Prunebutt@slrpnk.net · 1 day ago

          Using it, not all that energy intensive

          According to this article, this is not considered true anymore:

          As conversations with experts and AI companies made clear, inference, not training, represents an increasing majority of AI’s energy demands and will continue to do so in the near future. It’s now estimated that 80–90% of computing power for AI is used for inference.

          I think there’s a reason why OpenAI, Microsoft, Google and Facebook hold their energy consumption and water usage numbers so close to their chests.

          • Tenderizer78@lemmy.ml · 11 hours ago

            This article is dubious. When it comes to training, it leans on sensationalist and unsupported estimates. Notice the following quote:

            OpenAI and President Donald Trump announced the Stargate initiative, which aims to spend $500 billion—more than the Apollo space program—to build as many as 10 data centers (each of which could require five gigawatts, more than the total power demand from the state of New Hampshire).

            I am DEEPLY sceptical of those figures. Like, what data center uses FIVE BLOODY GIGAWATTS? DO YOU KNOW HOW MUCH FIVE GIGAWATTS IS? DO YOU KNOW HOW MUCH THAT’D COST?
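
            For scale, a rough back-of-envelope sketch (the 90% utilisation and $50/MWh wholesale price are illustrative assumptions, not figures from the article):

            ```python
            # Back-of-envelope: what a 5 GW data center would draw in a year,
            # and what the electricity alone would cost. Utilisation and price
            # are assumptions, not sourced figures.
            power_gw = 5
            utilisation = 0.9        # assumed average fraction of full draw
            usd_per_mwh = 50         # assumed wholesale electricity price

            hours_per_year = 8760
            energy_twh = power_gw * utilisation * hours_per_year / 1000  # GWh -> TWh
            cost_busd = energy_twh * 1_000_000 * usd_per_mwh / 1e9       # TWh -> MWh -> $B

            print(f"~{energy_twh:.0f} TWh/yr")           # ~39 TWh, ~1% of US generation
            print(f"~${cost_busd:.1f}B/yr electricity")  # ~$2.0B, for power alone
            ```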

            The use of loose analogies is also concerning: comparing it to San Francisco, or New Hampshire, or household electricity consumption.

            America produces about 4,000 TWh of electricity a year. This report says “22% of household consumption in 2028”, which, if I commit the faux pas of mixing datasets, gets me roughly 7% of US power consumption. A lot, but not apocalyptic, and merely a projection of future power consumption. It’s also less than the 50 GW for the 10 data centers alone in the line I quoted above.
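
            Roughly how that mixing works out, assuming about 1,500 TWh/yr of US residential consumption (my own ballpark, not a figure from the report):

            ```python
            # Reconstructing the ~7% by mixing the report's household projection
            # with total US generation. The 1,500 TWh residential figure is an
            # assumption, not from the report.
            us_generation_twh = 4000      # total US electricity generation per year
            us_household_twh = 1500       # assumed US residential consumption per year

            ai_twh = 0.22 * us_household_twh       # the report's 2028 projection
            share = ai_twh / us_generation_twh
            print(f"~{ai_twh:.0f} TWh -> ~{share:.0%} of US generation")  # ~330 TWh -> ~8%
            ```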

            It’s right in that the core problem is that we don’t know, and so I can’t fault it for assuming the worst, but even then there are limits.

            As for the usage, the document you linked puts generating an image with Stable Diffusion at 400 watt-seconds, or as much as my computer consumes in 8 seconds at idle. I’m gonna stop reading this article because I’m tired and this isn’t worth it.
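
            That comparison works out if the machine idles at about 50 W (an assumption implied by the 8-second figure):

            ```python
            # 400 watt-seconds is 400 J; at an assumed ~50 W idle draw:
            image_energy_j = 400
            idle_power_w = 50     # assumed idle power of the commenter's PC
            print(f"{image_energy_j / idle_power_w:.0f} s of idle")  # 8 s
            ```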

            I’m not pro-AI. I don’t like how it makes it so easy to fill the internet with slop. I don’t like how it discourages the people who use it from any and all critical thought. I’ve used AI twice, to reword my assignment questions in college because no amount of googling made the phrasing make sense. All I want is for the fearmongering about AI power consumption to stop, not just because it’s inaccurate, but also because it encourages investment in gas-fired power generation to “prepare for the AI boom”.

            • queermunist she/her@lemmy.ml · 11 hours ago

              America produces about 4,000 TWh of electricity a year. This report says “22% of household consumption in 2028”, which, if I commit the faux pas of mixing datasets, gets me roughly 7% of US power consumption.

              7% is a fucking lot though?? That’s an immense amount of power going towards slop instead of making our lives better or growing the economy or actually being productive.

              It’s like we just decided to start burning our limited reserves of natural gas for fun.

              • Tenderizer78@lemmy.ml · 11 hours ago

                Yes, it is a lot. But again, not apocalyptic. And it’s noteworthy how the article tries to frame it hyperbolically as “22% of US household consumption”.

          • LwL@lemmy.world · 1 day ago

            Oh damn. Very good article btw.

            According to numbers floating around online, that would mean one Llama query is around as expensive as 10 Google searches. And it’s likely that those costs will increase further.
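
            With the same oft-cited ~0.3 Wh per Google search as above (a rough assumption), that puts one query at about 3 Wh:

            ```python
            # Same 0.3 Wh/search assumption as above, now with the 10x multiplier.
            WH_PER_GOOGLE_SEARCH = 0.3    # Google's old 2009 figure - an assumption
            wh_per_query = 10 * WH_PER_GOOGLE_SEARCH
            print(f"~{wh_per_query:.0f} Wh per query")  # ~3 Wh, vs ~0.9 Wh at 3 searches
            ```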

            It still seems like the biggest factor here is the scale of adoption. Unfortunately the total energy costs of AI might even scale exponentially, since the more complex the queries get, the better the responses will likely be, and that will further drive adoption.

            This pace is so clearly unsustainable it’s horrifying, and while it was obvious to some degree, it seems it’s worse than I thought.