• Sandbar_Trekker@lemmy.today
    1 hour ago

    Technically, you can get the same answer twice from an LLM, but only when you control the full setup. The model itself just outputs a probability distribution over the next token; the randomness comes from the sampling step, which draws from that distribution using a random seed. If you run the model locally, you can fix the seed (or use greedy/temperature-0 decoding) so that a given question always produces the same answer.
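
    A minimal sketch of the idea, using a toy next-token distribution rather than a real model: the same seed reproduces the exact same sequence of draws, which is what "forcing the seed" buys you.

    ```python
    import random

    def sample_tokens(probs, vocab, n, seed):
        """Toy stand-in for an LLM's sampling step: draw n tokens
        from a fixed next-token distribution using a seeded RNG."""
        rng = random.Random(seed)  # fixing the seed fixes every draw
        return [rng.choices(vocab, weights=probs)[0] for _ in range(n)]

    vocab = ["the", "cat", "sat", "on", "mat"]
    probs = [0.35, 0.25, 0.2, 0.1, 0.1]

    a = sample_tokens(probs, vocab, 8, seed=42)
    b = sample_tokens(probs, vocab, 8, seed=42)  # same seed, same input
    print(a == b)  # identical sequences every run
    ```

    A real local runner exposes the same knob (e.g. a `--seed` flag or a `seed` parameter), though GPU floating-point nondeterminism can still occasionally break exact reproducibility.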