• baatliwala@lemmy.world
      2 days ago (edited)

      Local LLMs, probably even ones you can host on phones. But they won’t be as powerful, of course.

          • XLE@piefed.social
            2 days ago

            As long as you don’t care if the summaries and analyses are wrong!

        • felsiq@piefed.zip
          2 days ago

          Home Assistant is the big one imo; voice control for a private smart home is useful and low-stakes, so hallucinations won’t be the end of the world.

        • baatliwala@lemmy.world
          2 days ago (edited)

          In addition to what the others said, some apps allow you to link to an LLM model for additional features.

          For example, Immich has prebuilt models you can choose from depending on how powerful your PC is, which give you facial recognition and powerful natural-language search capabilities for your library. So if they think this model is good, they can make a new prebuilt one using it as a base. Software like Microsoft Teams uses an ML model for better background blurring on video calls, so maybe an open source equivalent can make use of it.
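          To give a rough idea of how that kind of natural-language photo search works: a CLIP-style model embeds both images and text queries into the same vector space, and search is just nearest-neighbour lookup by cosine similarity. Here's a toy sketch with made-up vectors standing in for real model outputs (filenames, dimensions, and values are all invented for illustration):

          ```python
          import numpy as np

          # Pretend these came from running a CLIP-style image encoder over a
          # photo library. Real embeddings would have hundreds of dimensions;
          # these 3-d vectors are purely illustrative.
          image_embeddings = {
              "beach.jpg":    np.array([0.9, 0.1, 0.0]),
              "mountain.jpg": np.array([0.1, 0.9, 0.1]),
              "cat.jpg":      np.array([0.0, 0.2, 0.9]),
          }

          def cosine(a, b):
              # Cosine similarity: 1.0 means same direction, 0 means unrelated.
              return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

          def search(query_embedding, top_k=2):
              # Rank every image by similarity to the query vector.
              scored = sorted(
                  image_embeddings.items(),
                  key=lambda kv: cosine(query_embedding, kv[1]),
                  reverse=True,
              )
              return [name for name, _ in scored[:top_k]]

          # Pretend this vector came from embedding the text "sunny day at the sea".
          query = np.array([0.8, 0.2, 0.1])
          print(search(query))  # beach.jpg ranks first
          ```

          Swapping in a better base model just means re-running the image encoder over the library; the search logic itself doesn't change.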

          You can also use it for other stuff, like image generation.