• daniskarma@lemmy.dbzer0.com · 1 hour ago

    I am skeptical about the level of protection Anubis really provides.

    In the end it is an automated test, meaning that any machine could easily solve it.

    Most “attackers” won’t bother solving it because they don’t really care, but if they wanted to, they could. It’s a sort of protection by obscurity.
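
    For reference, here is a minimal sketch of what such a solver could look like, assuming a SHA-256 leading-zero-nibbles proof of work along the lines of what Anubis issues (the challenge string and exact format here are made up for illustration):

    ```python
    import hashlib

    def solve(challenge: str, difficulty: int) -> int:
        """Brute-force a nonce so that sha256(challenge + nonce)
        starts with `difficulty` zero hex digits."""
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce
            nonce += 1

    # A single CPU core does millions of SHA-256 hashes per second, so a
    # difficulty-4 challenge (~16**4 = 65,536 expected attempts) falls in a
    # fraction of a second, no browser needed.
    print(solve("example-challenge-string", 4))
    ```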

    The more Anubis is used, the more we see attacks that actually come equipped with a way to solve the challenges. That’s when Anubis raises the challenge difficulty and the battle begins: how far can Anubis raise the difficulty while normal users can still browse, versus how much cost is the attacker willing to eat.
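
    The asymmetry in that battle is easy to put in numbers. With a leading-zero scheme like the sketch above, each extra difficulty level multiplies the expected work by 16, for the visitor’s browser and for the attacker alike (the hash rates below are rough assumptions, not measurements):

    ```python
    # Expected attempts for d leading zero hex digits: 16**d
    browser_rate = 1e5  # assumed hashes/s for JS in a modest browser
    solver_rate = 1e9   # assumed hashes/s for a tuned native/GPU solver

    for d in range(3, 7):
        attempts = 16 ** d
        print(f"difficulty {d}: visitor ~{attempts / browser_rate:.1f}s, "
              f"attacker ~{attempts / solver_rate:.4f}s per challenge")
    ```

    Under these assumptions, by the time the attacker even notices the cost, ordinary visitors are already waiting minutes per page.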

    Given that these attackers tend to have high budgets, I’m not that certain about its actual ability to fend off a targeted DDoS.

    As for crawling for big data, I think it does nothing here. Companies willing to scrape large amounts of data, for AI training or other purposes, have massive budgets, and the electricity cost of solving the JavaScript challenges is nothing in comparison. They also don’t need to deny the service, so they could spread the scraping out to keep the challenge difficulty low, reducing the cost even more.
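
    A back-of-envelope estimate makes the point; all the figures below are assumptions picked for illustration, not measurements:

    ```python
    # Hypothetical numbers: difficulty 4, one native core at 1e7 hashes/s,
    # 10 W per core, $0.15 per kWh of electricity.
    pages = 1_000_000
    attempts = 16 ** 4      # expected hashes per challenge
    hash_rate = 1e7         # hashes/s on one native core
    core_power_w = 10
    price_per_kwh = 0.15

    cpu_seconds = pages * attempts / hash_rate
    kwh = cpu_seconds * core_power_w / 3600 / 1000
    print(f"{cpu_seconds / 3600:.1f} core-hours, ~${kwh * price_per_kwh:.4f} in electricity")
    ```

    And if a solved challenge can be reused across many requests via a cookie, the per-page cost drops even further.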

    Once again, I believe the positive results we currently see in practice come simply from the fact that most scrapers and DDoS attackers are attacking blindly and don’t equip themselves for Anubis. Protection by obscurity. I don’t think a well-equipped attacker would have much trouble getting past it, especially for scraping or other kinds of bot attacks that can tolerate being slowed down.

    • softwarist@programming.dev · 36 minutes ago

      You’re right, although my understanding is that there are a lot of poorly implemented scrapers for AI services unintentionally DDoSing websites with requests, so Anubis is more of a mitigation against those.