• floofloof@lemmy.ca · 18 hours ago

    It’s time to recalibrate my gradient on the big picture.

    I guess this is AI-insider wit, but I’m so glad not to think or speak like these people do.

    • brucethemoose@lemmy.world · 2 hours ago

      It’s language for people who advertise they know something about “AI,” but couldn’t implement it if their life depended on it.

      TBH I’ve never heard that one, but it sounds like they’re trying to use “gradient descent” in a sentence.

    • panda_abyss@lemmy.ca · 18 hours ago

      It is. Gradient descent is the algorithm you use to find optimal model parameters.

      The algorithm computes a gradient (roughly, which direction nearby parameter values get better), takes a step in that direction to improve the parameters, and repeats in a loop.
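
      A minimal sketch of that loop in plain Python, on a made-up one-parameter
      quadratic loss (the loss function, starting point, and learning rate are all
      illustrative, not anything from the article):

          # Toy gradient descent: minimize loss(w) = (w - 3)^2 for one parameter w.
          def loss(w):
              return (w - 3.0) ** 2

          def gradient(w):
              # Derivative of (w - 3)^2 with respect to w.
              return 2.0 * (w - 3.0)

          w = 0.0              # initial parameter guess (arbitrary)
          learning_rate = 0.1  # step size (arbitrary)

          for step in range(50):
              g = gradient(w)            # which way is "uphill" for the loss
              w = w - learning_rate * g  # step the opposite way to reduce the loss

          print(w, loss(w))  # w ends up close to 3, where the loss is smallest

      Real training does the same thing, just with millions of parameters and a loss
      computed from data instead of a toy quadratic.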

    • givesomefucks@lemmy.world · 16 hours ago

      Someone from a brand new account posted a bunch of gibberish like that today about having the keys to the octo-dimension mother universe…

      And I immediately thought it was an Elon AI, because that’s how they think humans actually talk

      • kautau@lemmy.world · 14 hours ago

        Probably someone who booted up openclaw and gave it a lemmy account, and then it likely gave away their financial information on moltbook.