• mhague@lemmy.world
    1 day ago

    How much do you know about transformers?

    Have you ever programmed an interpreter for interactive fiction / MUDs, before all this AI crap? It’s a great example of what even super tiny models can accomplish. NLP interfaces are genuinely useful to people.
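
    To make the point concrete: classic IF/MUD interpreters did this with no ML at all, just a verb table and synonym matching. A minimal sketch (all names here, like `VERBS` and `parse_command`, are made up for illustration):

    ```python
    # Tiny rule-based command parser of the kind IF/MUD engines use.
    # Hypothetical illustration -- not from any particular engine.

    VERBS = {
        "take": {"take", "get", "grab", "pick"},
        "look": {"look", "examine", "inspect", "l"},
        "go":   {"go", "walk", "move", "head"},
    }
    STOPWORDS = {"the", "a", "an", "at", "up", "to"}

    def parse_command(text: str) -> tuple[str, str] | None:
        """Map free-form input to a (verb, object) pair, or None if unknown."""
        words = [w for w in text.lower().split() if w not in STOPWORDS]
        if not words:
            return None
        for canonical, synonyms in VERBS.items():
            if words[0] in synonyms:
                return canonical, " ".join(words[1:])
        return None

    print(parse_command("pick up the rusty lantern"))  # ('take', 'rusty lantern')
    print(parse_command("examine lantern"))            # ('look', 'lantern')
    ```

    A small language model sits in the same niche, just with far better coverage of phrasings this table can’t anticipate.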

    Also consider that Firefox or Electron apps require more RAM and CPU, and waste more energy, than small language models. A Gemma SLM can translate things into English using less energy than it takes to open a modern browser. And I know that because I’m literally watching the resources get used.

    • itkovian@lemmy.world
      1 day ago

      I am not implying that transformer-based models have to be huge to be useful. I am only talking about LLMs. I am questioning the purported goal of LLMs, i.e., replacing humans in as many creative fields as possible, in the context of its cost, both environmental and social.