• Junkasaurus@lemmy.world · 20 hours ago

    What are you saying, precisely? It’s well known that LLM output is non-deterministic in practice (Ilya Sutskever has said as much). Are you saying the way an LLM goes about selecting tokens is deterministic?

    • partofthevoice@lemmy.zip · 18 hours ago

      I think you’re right about that, but it’s artificial nondeterminism in the sense that it relies on several algorithmic factors and, more subtly, on device differences. The system itself is a complex yet deterministic function.
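To illustrate the "algorithmic factors and device differences" point: floating-point addition is not associative, so a fully deterministic program can produce different results when the order of operations changes, as it can across parallel reductions on different hardware. A minimal Python sketch:

```python
# Floating-point addition is not associative, so the same "deterministic"
# computation can yield different results when the summation order changes --
# one reason parallel GPU reductions show run-to-run variation.
a, b, c = 0.1, 0.2, 0.3

left_to_right = (a + b) + c   # one reduction order
right_to_left = a + (b + c)   # mathematically identical, different order

print(left_to_right == right_to_left)  # False: 0.6000000000000001 vs 0.6
```

No randomness is involved; only the evaluation order differs, yet the outputs disagree.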

      • Junkasaurus@lemmy.world · 11 hours ago

        I can largely agree with that, but I still contend you’re conflating a few things to make that argument. Fundamentally, an LLM makes predictions based on probability (ignoring temperature), and probability does not equal certainty.
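The "probability, not certainty" point can be sketched with a toy softmax over hypothetical next-token scores (the vocabulary and logits here are made up for illustration):

```python
import math
import random

def softmax(logits):
    """Convert raw model scores into a probability distribution."""
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits for a tiny three-word vocabulary.
vocab = ["cat", "dog", "car"]
logits = [2.0, 1.0, 0.1]
probs = softmax(logits)

# The model's "prediction" is a distribution, not a single certain answer:
# sampling can return any token whose probability is nonzero.
token = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, probs)), "->", token)
```

Even the most likely token here has probability well below 1, which is the sense in which prediction by probability is not certainty.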

        • partofthevoice@lemmy.zip · 8 hours ago

          I would argue that’s empirically true but not fundamentally true. Actually, I’d argue that my point is the fundamental truth here. Computers still cannot generate random output. They simulate the process with pseudo-random generators, and the result is not truly random; it’s just good enough to fool us at the surface level.
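A minimal demonstration of that claim: Python's pseudo-random generator is a deterministic function of its seed, so re-seeding with the same value replays the exact same "random" sequence:

```python
import random

# A pseudo-random generator is a deterministic function of its seed:
# seeding twice with the same value yields identical "random" output.
random.seed(1234)
first_run = [random.random() for _ in range(5)]

random.seed(1234)
second_run = [random.random() for _ in range(5)]

print(first_run == second_run)  # True: the "randomness" replays exactly
```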

    • Onno (VK6FLAB)@lemmy.radio · 3 hours ago (edited)

      They are deterministic, just complex enough that the output is hard to determine in practice.

      The Assumed Intelligence systems I’m familiar with have a “random” element, but it’s unclear where that source of randomness comes from. Is it a computational random source, or something like the lava lamp wall at Cloudflare, which is significantly more random, potentially actually random?
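Both kinds of source exist side by side in a standard library. A sketch in Python contrasting a seedable computational generator with `secrets`, which draws on the OS entropy pool (`os.urandom` under the hood, fed by hardware events and, on some machines, a hardware RNG):

```python
import random
import secrets

# Computational source: deterministic and fully reproducible from the seed.
a = random.Random(42).random()
b = random.Random(42).random()
print(a == b)  # True: same seed, same stream

# OS entropy source: there is no seed to replay, so independent
# draws are unrelated and cannot be reproduced on demand.
t1 = secrets.token_hex(8)
t2 = secrets.token_hex(8)
print(t1, t2)
```

Which of these (if either) feeds a given hosted LLM's sampler is generally not documented, which is the commenter's point.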

      • Junkasaurus@lemmy.world · 11 hours ago

        It’s primarily temperature. That being said, there is still a chance that an LLM will output unexpected values even at low temperatures.
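A sketch of how temperature works, with made-up logits: the sampler divides the logits by the temperature before the softmax, so low temperatures sharpen the distribution but never zero out the tail, which is why unexpected tokens can still appear:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then normalize; lower T sharpens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 2.0, 1.0]  # hypothetical next-token scores

dist = {t: softmax_with_temperature(logits, t) for t in (1.0, 0.5, 0.1)}
for t, probs in dist.items():
    # Even at T=0.1 the unlikely tokens keep a nonzero probability,
    # so "unexpected" outputs remain possible -- just much rarer.
    print(f"T={t}: {[round(p, 6) for p in probs]}")
```

At T=0.1 the top token dominates, but the other tokens' probabilities are small rather than zero, matching the observation that low temperature reduces, not eliminates, surprises.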