Published on 2026-02-19
A practical, risk-based approach: protect high-value actions, use allow/step-up/block, add rate limits, and measure what actually changes.
“Protect website from bots” sounds like a site-wide problem. In reality, bots target a handful of high-value actions where money, data, or compute costs pile up: signup, login, password reset, checkout, and key APIs.
OWASP calls these patterns automated threats to web applications—automation abusing normal functionality rather than a single bug (OWASP Automated Threats). That framing is useful because it stops you wasting time defending pages no attacker cares about.
Some automation is helpful (search crawlers, uptime monitors, integrations). Even Cloudflare’s definition of bot management focuses on blocking unwanted or malicious bots while allowing useful bots (Cloudflare: bot management).
So your job is not to declare war on every script. Your job is to protect the flows that create value and make abuse expensive, noisy, and measurable.
The best bot defences aren’t one clever trick. They’re a simple system: score the risk of each request, decide (allow, step up, or block), and enforce per endpoint.
Score-based tools made this model popular (for example, reCAPTCHA v3 returns a score and expects your backend to act on it) (Google reCAPTCHA v3 docs). Vendor aside, the architecture is the point.
If you only implement one policy, make it this: allow low-risk traffic untouched, step up verification when the risk is uncertain, and block only on high-confidence abuse signals.
This avoids “should we add a CAPTCHA everywhere?” debates, and it’s debuggable in production.
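The allow/step-up/block policy can be sketched as a single function. This is a minimal illustration, not a vendor API; the thresholds and the name `decide` are assumptions you would tune for your own traffic.

```python
# Minimal allow / step-up / block policy driven by a risk score in [0, 1].
# Thresholds are illustrative, not a recommendation.

def decide(risk_score: float, *, step_up_at: float = 0.3, block_at: float = 0.8) -> str:
    """Map a risk score to one of three actions."""
    if risk_score >= block_at:
        return "block"    # high-confidence abuse: reject the request
    if risk_score >= step_up_at:
        return "step_up"  # uncertain: require a challenge or verification
    return "allow"        # low risk: let the request through untouched

# A borderline score triggers a challenge instead of a hard block.
print(decide(0.5))  # step_up
```

Because the decision is a plain value, it is easy to log in production and replay in tests, which is what makes this architecture debuggable.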
Bots don’t browse. They hammer endpoints.
Treat /login and /password-reset as separate surfaces with different thresholds. NIST’s Digital Identity Guidelines explicitly require rate limiting to effectively limit failed authentication attempts (NIST SP 800-63B).
Protect POST /signup with risk-based decisions. Imagine your SaaS free trial is being farmed:
A practical rollout:
Roll it out on POST /signup first. Success looks like “fake accounts down, conversion stable”, not “we blocked 10 million requests”.
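A safe first step is shadow mode: compute the decision and log it, but do not enforce until you have validated false positives against real signups. The handler name, thresholds, and logger below are all illustrative assumptions.

```python
import logging

logger = logging.getLogger("signup_risk")

def handle_signup(risk_score: float, enforce: bool = False) -> str:
    """Decide on a signup request; in shadow mode, observe without acting."""
    decision = (
        "block" if risk_score >= 0.8
        else "step_up" if risk_score >= 0.3
        else "allow"
    )
    # Log every decision so you can compare would-be blocks against
    # known-good conversions before flipping enforcement on.
    logger.info("signup risk=%.2f decision=%s enforce=%s",
                risk_score, decision, enforce)
    if not enforce:
        return "allow"  # shadow mode: observe only
    return decision
```

Flipping `enforce=True` is then a one-line change backed by data rather than a leap of faith.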
Rate limiting is how you turn “infinite attempts” into “finite cost”. When you do throttle, use standard semantics—HTTP 429 Too Many Requests literally means the client sent too many requests in a given timeframe (MDN on 429).
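A standards-friendly throttled response pairs the 429 status with a Retry-After header so well-behaved clients know when to come back. This framework-agnostic sketch just assembles the pieces; the function name is illustrative.

```python
def throttled_response(retry_after_seconds: int) -> tuple[int, dict, bytes]:
    """Build a 429 response with standard Retry-After semantics."""
    headers = {
        "Retry-After": str(retry_after_seconds),
        "Content-Type": "text/plain; charset=utf-8",
    }
    body = b"Too Many Requests: slow down and retry later.\n"
    return 429, headers, body

status, headers, _ = throttled_response(30)
```

Returning 429 rather than a generic 403 also keeps your logs honest: you can tell throttling apart from blocking when you review what changed.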
A few practical tips:
Bots concentrate where your value concentrates. Configure by endpoint and action.
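Per-endpoint configuration can be as simple as a policy table. Every endpoint, threshold, and field name below is illustrative; the point is that high-value actions get tighter settings than everything else.

```python
# Illustrative per-endpoint policy table: thresholds differ because the
# value (and abuse cost) of each action differs.
POLICIES = {
    "POST /login":          {"rate_limit_per_min": 5,  "step_up_at": 0.3, "block_at": 0.8},
    "POST /password-reset": {"rate_limit_per_min": 3,  "step_up_at": 0.2, "block_at": 0.7},
    "POST /signup":         {"rate_limit_per_min": 10, "step_up_at": 0.4, "block_at": 0.9},
    "POST /checkout":       {"rate_limit_per_min": 20, "step_up_at": 0.5, "block_at": 0.9},
}

def policy_for(endpoint: str) -> dict:
    # Fall back to a permissive default for pages no attacker cares about.
    return POLICIES.get(
        endpoint,
        {"rate_limit_per_min": 120, "step_up_at": 0.9, "block_at": 0.99},
    )
```

Keeping the table in one place makes the “should we add a CAPTCHA everywhere?” debate concrete: it becomes a diff to a config, reviewed per endpoint.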
Track decision rates (allow, step-up, block) per endpoint, challenge pass rates, and downstream outcomes such as conversion and fake-account creation.
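A minimal metrics sketch might look like the following. The class and metric names are hypothetical; the idea is to count decisions and downstream outcomes side by side so you can see what actually changed.

```python
from collections import Counter

class BotMetrics:
    """Count risk decisions and outcomes so they can be compared later."""

    def __init__(self):
        self.decisions = Counter()  # (endpoint, decision) pairs
        self.outcomes = Counter()   # e.g. challenges passed, signups converted

    def record_decision(self, endpoint: str, decision: str) -> None:
        self.decisions[(endpoint, decision)] += 1

    def record_outcome(self, name: str) -> None:
        self.outcomes[name] += 1

    def step_up_pass_rate(self, endpoint: str) -> float:
        # Of the challenges shown on this endpoint, how many were passed?
        shown = self.decisions[(endpoint, "step_up")]
        passed = self.outcomes[f"{endpoint}:challenge_passed"]
        return passed / shown if shown else 0.0
```

A falling pass rate on a high-value endpoint is an early signal that you are challenging real users, not bots.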
Decide what happens when verification can’t run (timeouts, flaky networks, script blockers). Your system should degrade predictably.
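Predictable degradation means choosing a fallback action per endpoint ahead of time, rather than failing randomly when the verification call times out. This wrapper is a sketch under that assumption; the function names are illustrative.

```python
from typing import Callable

def verify_with_fallback(verify_fn: Callable[[], str], *,
                         fail_action: str = "allow") -> str:
    """Run a verification check, falling back to a preset action on failure."""
    try:
        return verify_fn()
    except (TimeoutError, ConnectionError):
        # Fail open ("allow") on low-value pages; fail toward a challenge
        # ("step_up") on high-value ones like checkout.
        return fail_action

def flaky_verifier() -> str:
    # Stand-in for a verification call that times out.
    raise TimeoutError("verification service unreachable")
```

The key design choice is that the fallback is explicit and per-surface, so an outage in the verification path never silently becomes a site-wide lockout.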
Humans Only helps you protect your website from bots with fast, privacy-first verification (typically under 2 seconds), easy drop-in integration, and real-time analytics.
If you want a bot defence that product owners can tune and developers can ship without a six-week detour, Humans Only is built for it: Stop Bots, Welcome Humans.