• 0 Posts
  • 562 Comments
Joined 3 years ago
Cake day: June 30th, 2023

  • The embedding layer, post-tokenization, is not just a probability machine the way you’re suggesting. You can argue that it is probabilistic with inferred sentiment, but too many people think it works like the text prediction on your phone, and that is just factually inaccurate.

    Verify output, of course, but saying “it doesn’t understand anything” and “probability machine” is a borderline erroneous short sell. At the level of tokens it “understands” relationships, and those relationships are not probabilistic, though they are fundamentally approximations learned from a training corpus.
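    To make that concrete: an embedding lookup is deterministic indexing into a learned table, not sampling. Here is a toy numpy sketch (made-up weights, not any real model’s parameters) showing that the same token ids always map to the same vectors, and that the “relationships” live in the geometry of those vectors:

```python
import numpy as np

# Toy sketch with made-up weights -- not any real model's parameters.
rng = np.random.default_rng(0)
vocab_size, dim = 8, 4
embedding = rng.normal(size=(vocab_size, dim))  # would be learned in training

def embed(token_ids):
    """Deterministic lookup: token id -> row of the embedding matrix."""
    return embedding[token_ids]

def cosine(a, b):
    """Similarity between two token vectors; token 'relationships' live here."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

ids = np.array([3, 5])
v1, v2 = embed(ids)
# Same ids, same vectors, every time -- no probability distribution involved.
assert np.array_equal(embed(ids), embed(ids))
print(round(cosine(v1, v2), 3))
```

    The stochastic part of an LLM is the sampling over the output distribution at generation time; the lookup above, like the model’s forward pass generally, is deterministic arithmetic.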

  • There is a data point missing here.

    Run the same study, but give some participants an LLM, some no LLM, and some a type-A subject-matter expert for reference. It may also matter whether that person is a friend, a coworker, or a random passerby, but I would be willing to bet money that the same effect is present to a lesser (but still statistically significant) degree.

    Maybe a future study can be refined further to build some scaffolding for more effective teaching/learning “on the job” or in general.


  • dream_weasel@sh.itjust.works to Lemmy Shitpost@lemmy.world · “type shit”
    edited 18 days ago

    As a follow-on: is your username supposed to be “Holmes,” but you decided to wing it on the spelling test?

    I’m also circumcised and find getting bent out of shape over it 18 years later to be… an unusual response.

    Edit: Hey Lemmy Weiner police! Be sure to bitch out your parents if you haven’t, it will definitely be helpful in some way!