Have you ever programmed an interpreter for interactive fiction / MUDs, before all this AI crap? It’s a great example of what even super tiny models can accomplish. NLP interfaces are a useful thing for people.
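For anyone who hasn't written one: the "parser" in classic IF and MUDs is often just a tiny hand-rolled verb/noun reducer. A minimal sketch (the verb table and filler words are hypothetical examples, not from any particular engine):

```python
# Minimal interactive-fiction style command parser: map free text
# to a (verb, object) pair using a small synonym table.
VERBS = {
    "take": "take", "get": "take", "grab": "take",
    "look": "look", "examine": "look",
    "go": "go",
}
FILLER = {"the", "a", "an", "at"}

def parse(command: str):
    """Reduce a player's free-text command to (canonical_verb, object)."""
    words = [w for w in command.lower().split() if w not in FILLER]
    if not words:
        return None
    verb = VERBS.get(words[0])
    if verb is None:
        return None  # unknown verb -> "I don't understand that."
    obj = " ".join(words[1:]) or None
    return (verb, obj)

print(parse("grab the rusty lantern"))  # ('take', 'rusty lantern')
print(parse("examine lamp"))            # ('look', 'lamp')
```

No model at all, just a dict lookup — which is the point: a surprisingly usable natural-language interface can be built with almost no compute.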
Also consider that Firefox or Electron apps require more RAM and CPU, and waste more energy, than small language models. A Gemma SLM can translate text into English using less energy than it takes to open a modern browser. And I know that because I’m literally watching the resources get used.
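You can do this comparison yourself on Linux by reading each process's resident memory from `/proc`. A hedged sketch — the PIDs are placeholders you'd get from e.g. `pgrep firefox` or from your local model server:

```python
# Read a process's resident set size (VmRSS) from /proc on Linux,
# so you can compare e.g. a browser's memory against an SLM runtime.
import os

def rss_kib(pid: int) -> int:
    """Return resident set size in KiB for the given PID (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # kernel reports this in kB
    raise ValueError(f"no VmRSS entry for pid {pid}")

# Demo on the current process; substitute real PIDs to compare apps.
print(f"this process: {rss_kib(os.getpid())} KiB resident")
```

Energy is harder to read directly, but RAM and CPU time from `/proc` (or plain `top`) already make the browser-vs-SLM gap visible.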
I am not implying that transformer-based models have to be huge to be useful. I am only talking about LLMs. I am questioning the purported goal of LLMs, i.e., to replace humans in as many creative fields as possible, in the context of its cost, both environmental and social.
How much do you know about transformers?