Published on 2026-02-19
A practical checklist for product owners and developers: minimise data, keep transfers under control, and verify humans with risk-based step-ups.
“GDPR-friendly CAPTCHA” usually means human verification that stops bots without creating a tracking problem.
In practice, that means treating the verification step itself as data processing, and paying attention to what it actually collects.
Under GDPR, “personal data” is broadly defined and includes online identifiers where someone is identifiable (see GDPR Recital 26: gdpr-info.eu/recitals/no-26). So even “just security telemetry” can become personal data, depending on what’s collected.
Most teams add CAPTCHA for one reason: security. But the implementation often involves third-party scripts, network calls, and telemetry that may count as personal data.
That’s where the GDPR questions start showing up in sprint planning: Do we need consent? Are there international transfers? What goes in the privacy notice? If you’re operating in the UK/EU, it’s also worth remembering that the ICO treats cookies and “similar technologies” (including some fingerprinting-like techniques) as part of the same family of issues (ICO guidance).
If you want a CAPTCHA that’s actually GDPR-friendly (not just marketed that way), evaluate it like any other data-processing component.
A good default is: collect only what you need to decide “human vs bot” for this endpoint.
Concrete example: protecting POST /signup should not require persistent identifiers that follow users across unrelated sites. Your product goal is “stop automated registrations”, not “learn everything about this browser”.
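To make that concrete, here is a minimal sketch of a "human vs bot" decision for a signup endpoint that uses only request-scoped signals. The signal names and thresholds are illustrative assumptions, not a specific product's API; the point is that nothing here persists beyond the request or follows the browser elsewhere.

```python
# Hypothetical sketch: decide "human vs bot" for POST /signup using only
# per-request signals -- no persistent identifiers, no cross-site state.
from dataclasses import dataclass

@dataclass
class SignupSignals:
    # Every field is request-scoped; nothing survives the request.
    challenge_passed: bool    # did the client solve the verification challenge?
    form_fill_seconds: float  # humans rarely complete a signup form in under 1s
    honeypot_filled: bool     # hidden field that only bots tend to fill in

def is_likely_human(s: SignupSignals) -> bool:
    """Return True when the request looks human, using minimal data."""
    if s.honeypot_filled:
        return False
    if s.form_fill_seconds < 1.0:
        return False
    return s.challenge_passed

# A plausible human submission vs. an instant, honeypot-touching bot.
print(is_likely_human(SignupSignals(True, 8.2, False)))  # True
print(is_likely_human(SignupSignals(True, 0.2, True)))   # False
```

Because the decision is scoped to one endpoint and one request, there is simply less personal data to account for in your notice and your transfer analysis.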
You should be able to write a plain-English paragraph in your privacy notice that answers what is collected, why, who processes it, and where it goes.
If you can’t get straight answers from the vendor docs, you’ll struggle to be transparent.
If your CAPTCHA sends data to a third country, you may trigger restricted transfer obligations. The UK ICO has specific guidance on international transfers and the mechanisms you may need (IDTA, addendum, transfer risk assessments): ICO international transfers.
This is not “legal theatre”. It’s operational: you need to know where data goes, and what safeguards exist.
Whether you need consent depends on what’s happening technically (e.g. device storage access) and your legal basis.
If your approach behaves like a tracker, you can quickly drift into consent territory. The EDPB has also clarified how strict "freely given" consent needs to be, including concerns around "cookie wall"-style gating (see the EDPB's Guidelines 05/2020 on consent).
For product owners and developers, the simplest “GDPR-friendly CAPTCHA” operating model is risk-based verification:
Concrete example:
- For POST /login, you step up when velocity spikes, credentials are being tried in bulk, or the client looks automated.
- For POST /contact, you're more relaxed but still block obvious spam waves.
This approach reduces the need for blanket tracking because you're making endpoint-specific security decisions.
Green flags: vendor docs that plainly state what is collected and where it goes, minimal per-request data, and no persistent identifiers.
Red flags: tracking-style telemetry, identifiers that follow users across unrelated sites, and vague or missing answers on international transfers.
Humans Only is built for teams looking for a GDPR-friendly CAPTCHA approach: fast verification (typically under 2 seconds), privacy-first (zero tracking), and an experience designed to feel natural for real users.
For product owners, you get a measurable gate you can roll out per funnel step. For developers, you get a drop-in integration and real-time analytics so you can tune policies based on outcomes, not guesswork.
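A drop-in integration typically boils down to one server-side call: forward the client's verification token to the verifier and gate the request on the answer. The URL, parameter names, and response shape below are placeholders, not the Humans Only API; check the vendor documentation for the real contract.

```python
# Generic server-side token-verification pattern. Endpoint URL, field
# names, and response shape are placeholders -- NOT a real vendor API.
import json
import urllib.request

VERIFY_URL = "https://verifier.example/api/verify"  # placeholder endpoint

def verify_token(token: str, secret: str) -> bool:
    """POST the client-supplied token to the verifier; treat errors as 'bot'."""
    body = json.dumps({"token": token, "secret": secret}).encode()
    req = urllib.request.Request(
        VERIFY_URL, data=body, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=3) as resp:
            return json.load(resp).get("success") is True
    except OSError:
        # Fail closed: an unreachable verifier should not wave bots through.
        return False
```

Failing closed is a deliberate choice here: if the verifier is down, suspicious requests are challenged or rejected rather than silently admitted.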
If you’re trying to meet GDPR expectations and stop automated abuse, the goal is simple: minimise data, stay transparent, and step up only when the risk warrants it.
Stop Bots, Welcome Humans.