• crunchy@lemmy.dbzer0.com · 57 · 15 hours ago

    That’s got to be it. Cloud compute is expensive when you’re not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we’ll see will probably be specialized agents running small models locally.

      • And009@lemmynsfw.com · 1 · 6 minutes ago

        I’m somewhat tech savvy — how do I run an LLM locally? Any suggestions? And how do I know my local data is safe?
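
One common route is Ollama (ollama.com), a free tool that downloads model weights once and then runs inference entirely on your own hardware. A minimal sketch, assuming Ollama is installed and `llama3.2` is used as an example of a small model (any model from its library works the same way):

```shell
# Download a small model's weights to local disk (one-time, needs internet)
ollama pull llama3.2

# Run a prompt against it — inference happens on your machine
ollama run llama3.2 "Explain what a local LLM is in one sentence."
```

As for data safety: after the initial weight download, inference needs no network connection, so a simple check is to disconnect from the internet and confirm `ollama run` still answers — your prompts never leave the machine.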