• 8oow3291d@feddit.dk
    2 hours ago

    LLMs don’t have any intentions.

    Eh. The output from LLMs is usually pretty goal-oriented, so it arguably has intentions.

    The LLM isn’t designed to deceive, though, so in that sense it’s correct that its outputs aren’t lies.

    • deliriousdreams@fedia.io
      41 minutes ago

      The people who program, run and upkeep the LLM have intentions. The LLM is not a sapient or sentient entity.