• Warl0k3@lemmy.world · 6 hours ago

    How often do you check the summaries? Real question; I’ve used similar tools, and their accuracy to what they’re citing has been hilariously bad. It’d be cool if there were a tool out there bucking the trend.

    • MaggiWuerze@feddit.org · 1 hour ago

      Yeah, we were checking whether school in our district was canceled due to icy conditions. Google’s model claimed that a county-wide school cancellation was in effect and cited a source. I opened it, was led to our official county page, and the very first sentence was a firm no.

      It managed to summarize a short, simple text into its exact opposite.

    • truthfultemporarily@feddit.org · 2 hours ago

      Depends on how important it is. Looking for a hint for a puzzle game: never. Trying to find out actually important info: always.

      They make it easy, though, because every statement is followed by numbered annotations, and you can just mouse over them to read the cited text.

      You can choose different models, and they differ in quality. The default one can be a bit hit and miss.

    • Deebster@infosec.pub · 5 hours ago

      I also sometimes use the Kagi summaries and it’s definitely been wrong before. One time I asked what the term was for something in badminton and it came up with a different badminton term. When I looked at the cited source, it was a multiple choice quiz with the wrong term being the first answer.

      It’s reliable enough that I still use it, although more often to quickly identify which search results are worth reading.

    • AmbitiousProcess (they/them)@piefed.social · 5 hours ago

      I can’t speak for the original poster, but I also use Kagi, and I sometimes use the AI assistant, mostly for quick, simple questions to save time when I know most articles on the topic are gonna have a lot of filler. It’s been reliable for more complex questions too. (I’d just rather not rely on it too heavily, since I know the cognitive-debt effects of LLMs are quite real.)

      It’s almost always quite accurate. Kagi’s search indexing is miles ahead of any other search I’ve tried (Google, Bing, DuckDuckGo, Ecosia, StartPage, Qwant, SearXNG), so the AI naturally pulls better sources than the others as a result of the underlying index. There’s a reason I pay Kagi 10 bucks a month for search results I could otherwise get on DuckDuckGo. It’s just that good.

      I will say, though, that on more complex questions about very specific topics, such as a particular obscure programming library or a statistic you’d only find in some obscurely named government PDF, it does tend to get things wrong. In my experience it doesn’t exactly hallucinate: if you check the sources, the information is there… it just doesn’t answer the question. (E.g., if you ask about a very obscure stat and it pulls up Reddit, it might accidentally take a number from a comment about something entirely different from the stat you were looking for.)

      In my experience, DuckDuckGo’s assistant did this far more often, even on better-known topics. Same with Google’s Gemini summaries.

      To be fair, though, I think if you use LLMs sparingly, with intention, and with a sense of how well known the topic you’re searching for is, you can avoid most hallucinations.

    • hayvan@piefed.world · 5 hours ago

      I use Perplexity for my searches, and it really depends on how much I care about the subject. Heard a name and don’t know who they are? An LLM summary is good enough to get an idea. Doing research or looking up technical info? I open the cited sources.