• lugal@lemmy.dbzer0.com · 12 hours ago

    It was a bit more than that. The AI was expressing fear of death and stuff but nothing that wasn’t in the training data.

    • Schadrach@lemmy.sdf.org · 3 hours ago

      They tend to do that and go on existential rants after a session runs too long. Figuring out how to stop them from crashing out into existential dread has been an actual engineering problem they’ve needed to solve.

    • [deleted]@piefed.world · 10 hours ago

      Plus it was responding to prompts that would lead it toward that part of the training data, because chatbots don’t produce output without being prompted.