• nightlily@leminal.space

    I don’t need to try. You aren’t learning facts by interrogating an LLM. If it doesn’t have the information, it will make up a result. If it does have the information, it will still make up a result. Even that is personifying it too much, because the transformer has no concept of what “making something up” means. It takes an input and gives an output, no matter what.
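
    To make the last point concrete, here is a minimal toy sketch (random stand-in weights, not any real LLM): the final step of a decoder maps a hidden state to logits over the vocabulary and softmaxes them into a probability distribution. Nothing in that step checks whether the model “knows” anything; a next token comes out for any input whatsoever.

    ```python
    import numpy as np

    # Toy stand-ins for a decoder's last hidden state and output projection.
    rng = np.random.default_rng(0)
    vocab_size = 50_000
    hidden = rng.standard_normal(768)                  # last hidden state
    unembed = rng.standard_normal((768, vocab_size))   # output projection

    # Logits -> softmax -> probability distribution over the whole vocabulary.
    logits = hidden @ unembed
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # A token is always sampled, regardless of what the prompt was or whether
    # any "fact" backs it up.
    next_token = rng.choice(vocab_size, p=probs)
    print(next_token)
    ```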