• jaybone@lemmy.zip · 9 hours ago · +87 / -10

    Regular search engines did that 20 years ago, without blowing out the power grid.

    • SlimePirate@lemmy.dbzer0.com · 5 hours ago · +4

      This is a bad-faith argument. Search engines are notoriously bad at finding rare, specialized information and usually return empty results for overly specific requests. Moreover, you need the exact keywords, while LLMs use embeddings to match similar meanings.
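
      A minimal sketch of that keyword-vs-embedding difference, assuming the sentence-transformers package and its all-MiniLM-L6-v2 model (both chosen here purely for illustration, not anything the comment prescribes):

      ```python
      # Toy comparison: literal keyword match vs. embedding similarity.
      # Assumes: pip install sentence-transformers (illustrative choice only).
      from sentence_transformers import SentenceTransformer, util

      docs = [
          "Restore a deleted branch in git using the reflog",
          "How to recover lost commits after a hard reset",
          "Best pizza toppings ranked by region",
      ]
      query = "I accidentally wiped my git history, can I get it back?"

      # Keyword search: only returns documents that share literal words with the query.
      query_words = set(query.lower().split())
      keyword_hits = [d for d in docs if query_words & set(d.lower().split())]

      # Embedding search: ranks by semantic similarity, so "recover lost commits"
      # can score high even though it shares almost no words with the query.
      model = SentenceTransformer("all-MiniLM-L6-v2")
      scores = util.cos_sim(model.encode([query]), model.encode(docs))[0]
      ranked = sorted(zip(docs, scores.tolist()), key=lambda pair: -pair[1])

      print("keyword hits:", keyword_hits)
      for doc, score in ranked:
          print(f"{score:.2f}  {doc}")
      ```

      On this toy corpus the keyword pass only matches the document containing the literal word "git", while the embedding ranking should also place the "recover lost commits" document well above the unrelated one.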

    • Chozo@fedia.io · 5 hours ago · +5

      Search engines haven’t worked reliably for several years now; the top results for almost any search are social media pages that you can’t even read without an account. The Internet is broken.

    • Black616Angel@discuss.tchncs.de · 7 hours ago · +10 / -3

      No, they didn’t, and they still don’t really do that.

      There are too many things (nowadays?) where you literally have to post a question on Reddit, Stack Overflow, Lemmy, or the like and explain your situation in minute detail, because what you find through search engines only covers the standard case, which just so happens not to work for you for some odd reason.

      Believe me when I say that: I always try search engines first, second, and third before even thinking of using some BS-spitting AI, but it really did help me with two very specific problems in the last month.

      • phutatorius@lemmy.zip · 5 hours ago · +4

        what you find through search engines only covers the standard case, which just so happens not to work for you for some odd reason

        Usually because the highest-rated solution is half-assed bullshit proposed by an overconfident newbie (or an LLM regurgitating it). I mainly use Stack Overflow as a way to become pissed off enough that I’ll go solve the problem myself, like I should have done in the first place. Indignation As A Service.

        • Black616Angel@discuss.tchncs.de · 4 hours ago · +1

          This is also partly true.
          Today I was searching for several things regarding jinja2 and kept getting recommended a site that no longer exists, as the top result, mind you.

      • RememberTheApollo_@lemmy.world · 7 hours ago · +12

        And LLMs aren’t gamed? Like Grok constantly being tweaked so it won’t say anything inconvenient about Musk? Or ChatGPT citing absurd Reddit posts deliberately written by users to make AI responses wrong?

        AI is built from the ground up to do what they want, and it’s no better than those crappy info-scraper sites like wearethewindoezproz dot com that scrape basic info off every other site and offer it as a solution to your problem with [SOLVED] in the result title. “Did you turn it off and on again?”

      • kadu@scribe.disroot.org · 7 hours ago (edited) · +7 / -1

        The “people learned how to game it” part is called SEO, and you’re right, they did.

        Guess what, there’s GEO to game the results of LLMs. It works just as well, is harder to spot, and traditional SEO platforms like Ahrefs and SEMRush are already training users on how to do it.

        So congrats, the argument that using LLMs for search is a good solution because people learned how to game search engines makes no sense.

    • Grimy@lemmy.world · 8 hours ago · +3 / -15

      And now we have something better. I’m all for a better grid running on renewables though, which is the actual problem.