I wouldn’t even recommend using LLMs in place of search engines, since they make stuff up. If one provides sources, you can check those, but you’d have to be rigorous enough to check every detail, which just isn’t realistic. People are lazy.
The best way I’ve heard them described is “bullshit machines”, and I don’t say that because I think they’re stupid, but because they “bullshit” as opposed to lying or telling the truth. When you’re bullshitting, the truth is irrelevant as long as it sounds good. That’s exactly how LLMs work.
So if there’s a problem that can be solved by bullshitting, that’s where an LLM might be the right tool for the job.