Greg Clarke

  • 37 Posts
  • 160 Comments
Joined 2 years ago
Cake day: November 9th, 2022

  • But I don’t think it’s the best option if you consider everyone involved.

    Can you expand on this? Do you mean from an environmental perspective because of the resource usage, a social perspective because of job losses, and/or other groups being disadvantaged because of limited access to these tools?


  • It is the best option for certain use cases. OpenAI, Anthropic, etc. sell tokens, so they have a clear incentive to promote LLM reasoning as an everything solution. LLM reasoning is normally an inefficient use of processor cycles for most use cases. However, because LLM reasoning is so flexible, even though it’s inefficient from a cycle perspective, it is still the best option in many cases because the current alternatives are even more inefficient (from a cycle or human-time perspective).

    Identifying typos in a project update is a task that LLMs can efficiently solve.


  • What are you hosting and who are your users? Do you receive any legitimate traffic from AWS or other cloud provider IP addresses? There will always be edge cases like people hosting VPN exit nodes on a VPS etc., but if it’s a tiny portion of your legitimate traffic, I would consider blocking all incoming traffic from cloud providers and then whitelisting any that make sense, like search engine crawlers, if necessary.
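
    As a rough sketch of how you could automate that: AWS publishes its IP ranges as a JSON document (ip-ranges.json), and you can turn each CIDR prefix into a firewall drop rule. The snippet below uses an inline sample in that format rather than fetching the live file, and the nftables rule template is just one way to phrase it; adapt to whatever firewall you actually run.

    ```python
    import ipaddress
    import json

    # Illustrative fragment in the shape of AWS's published ip-ranges.json;
    # the real document has thousands of entries and is fetched from
    # https://ip-ranges.amazonaws.com/ip-ranges.json. This selection is a
    # placeholder for the sketch.
    sample = json.loads("""
    {
      "prefixes": [
        {"ip_prefix": "3.5.140.0/22", "service": "AMAZON", "region": "ap-northeast-2"},
        {"ip_prefix": "52.93.178.234/32", "service": "AMAZON", "region": "us-west-1"}
      ]
    }
    """)

    def to_nft_rules(doc):
        """Turn each published CIDR prefix into an nftables drop rule string."""
        rules = []
        for entry in doc["prefixes"]:
            net = ipaddress.ip_network(entry["ip_prefix"])  # validates the CIDR
            rules.append(f"add rule inet filter input ip saddr {net} drop")
        return rules

    for rule in to_nft_rules(sample):
        print(rule)
    ```

    You would run something like this from cron, diff against the previous rule set, and keep a whitelist (search engine crawler ranges, known-good VPS users) that is applied before the drop rules.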