• starelfsc2@sh.itjust.works
    19 hours ago

    90% of people do not use offline models, especially for AI coding and video generation. The offline models are undeniably worse and slower. These AI companies didn’t magic billions out of thin air; most people are using the massive data farms. Also, people generally aren’t gaming maxed out 14 hours a day, whereas with AI they might use it all day at work.

    • ClamDrinker@lemmy.world
      16 hours ago

      The existence of offline models highlights a nuance that some people deny even exists, though, which causes people to talk past one another. I wish it were more widely acknowledged, as it would make some conversations around AI easier.

      • starelfsc2@sh.itjust.works
        15 hours ago

        So I did some more research, and evidently if you’re going to use AI at all and use it often, running it offline probably increases your energy usage (unless you’re on renewables), since the data centers generally run cards specifically designed for AI. I think it might just be a case of everyone needing to use it significantly less. It’s like if 4K gaming were something the average Joe was doing: if everyone did that 10 hours a day, we would have a big problem.

        It’s kinda like saying it’s not immoral to go for a pleasure drive, but if you’re driving around 10 hours a day, that’s probably not good and you should cut it back as much as you can.
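
        To put rough numbers on the 10-hours-a-day point (every figure below is an assumption for illustration, not a measurement), a quick back-of-envelope sketch in Python:

        ```python
        # Back-of-envelope energy estimate. All figures are assumed
        # illustrations, not measurements.
        GPU_DRAW_W = 300       # assumed draw of a consumer GPU under load (watts)
        HOURS_PER_DAY = 10     # the heavy "joyride" usage pattern
        DAYS_PER_MONTH = 30

        heavy_kwh = GPU_DRAW_W * HOURS_PER_DAY * DAYS_PER_MONTH / 1000
        light_kwh = GPU_DRAW_W * 0.5 * DAYS_PER_MONTH / 1000  # 30 min/day

        print(f"heavy use: ~{heavy_kwh:.0f} kWh/month")   # ~90 kWh/month
        print(f"light use: ~{light_kwh:.1f} kWh/month")   # ~4.5 kWh/month
        ```

        At those assumed numbers, the heavy pattern uses about twenty times the energy of occasional use, which is the whole point of the analogy.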

        • ClamDrinker@lemmy.world
          11 hours ago

          That’s pretty interesting, and I totally agree with your last part. One counterpoint, though: local models are often more efficient, and there’s very little checking you can do on how much your query actually costs in the cloud. At home you can monitor your GPU usage and your power bill, and that information creates a sense of responsibility if you overuse it, like the number of gas-station stops a 10-hour joyride would require. But yeah, at the end of the day, using it as little as possible is a good habit.
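
          As a minimal sketch of what that monitoring can look like (assuming an NVIDIA card and the pynvml bindings; the one-minute sampling window is arbitrary):

          ```python
          # Sample GPU power draw while a local model runs.
          # Assumes an NVIDIA GPU and pynvml (pip install nvidia-ml-py).
          import time
          import pynvml

          pynvml.nvmlInit()
          handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

          samples = []
          for _ in range(60):  # one sample per second for a minute
              samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000)  # mW -> W
              time.sleep(1)

          avg_w = sum(samples) / len(samples)
          print(f"average draw: {avg_w:.0f} W")
          print(f"energy this minute: {avg_w / 60 / 1000:.4f} kWh")

          pynvml.nvmlShutdown()
          ```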

      • bbboi@feddit.uk
        14 hours ago

        Does the physical location of the hardware really matter?

        A single individual driving doesn’t contribute much to pollution, but collectively drivers do.

        Shifting the hardware from a data center to your house is great for privacy, and I wholeheartedly agree with that, but unless demand goes down, the same or perhaps even more energy will be consumed. It’ll just be consumed elsewhere.

        Using AI to make shitposts probably isn’t the best use of energy, but then again almost nothing we do is, because nobody wants to live in a society with no hobbies.

        • ClamDrinker@lemmy.world
          11 hours ago

          It does not, but that wasn’t my point. My point was that not all forms of AI usage are the same, the same way someone driving an EV they charge with solar power isn’t the same as someone driving a 1969 gas guzzler (or something equivalent). Local usage more often than not means efficient models, low energy consumption, and little difference from other computer tasks like gaming or video editing. But when the conversation is around AI, there is always the implicit expectation that it’s the worst of the worst.