• [deleted]@piefed.world · 1 day ago

    Hallucination requires perception. LLMs are just statistical models and do not perceive anything.

    It was a cute name early on; now it's used to deflect blame when the output is just plain wrong.