• paul@lemmy.org
    5 hours ago

This was predicted early on with LLMs: the information ecosystem would eventually enter a feedback loop where AIs feed off other AIs' hallucinations, and they all go downhill fast.