Apr 4, 2026

Cloudflare Web Bot Auth — The New Standard for AI Agent Verification

Key Takeaways

  1. Web Bot Auth is a Cloudflare-proposed protocol undergoing IETF standardization that verifies AI agent identity through cryptographic HTTP signatures
  2. Adopted as the authentication foundation for Visa TAP and Mastercard Agent Pay, it is becoming the "front door" infrastructure for agentic commerce
  3. AWS WAF, Vercel, Shopify, and Akamai have already implemented support, driving adoption across the web platform ecosystem

The Structural Challenge Cloudflare Web Bot Auth Solves

In March 2026, Cloudflare CEO Matthew Prince predicted that bot traffic would exceed human traffic by 2027. AI bot requests on Cloudflare's network already exceed 10 billion per week.

Within this massive volume of traffic, legitimate shopping agents, malicious scrapers, and AI training crawlers coexist. The problem is that traditional web standards offer almost no means to distinguish between them.

Robots.txt is merely a "request" mechanism born in 1994. A Duke University study in 2025 found that several categories of AI-related crawlers never request robots.txt files at all. In 2026, OpenAI removed language stating that its ChatGPT-User crawler would comply with robots.txt, taking the position that user-initiated actions through AI agents are not subject to robots.txt restrictions.

The implication is clear: trust based on self-declaration no longer works. As agentic commerce expands, cryptographic verification of bot identity has become essential. Web Bot Auth, proposed by Cloudflare in May 2025, is a protocol that directly addresses this structural problem.

Why the robots.txt Era Is Ending

Why is robots.txt insufficient? Three structural limitations stand out.

First, lack of enforceability. Robots.txt is a voluntary "please" — it has no technical mechanism to prevent access. Well-intentioned crawlers comply, but malicious bots ignore it. In the AI agent context, the boundary between "well-intentioned" and "malicious" itself has become blurred.

Second, crude identification. The only identifier robots.txt can work with is the User-Agent string, which is trivially spoofable. There is no way to confirm whether a request claiming to be "Googlebot" actually originated from Google using robots.txt alone.

Third, limited expressiveness. Robots.txt can only express "allow or deny." But when considering agentic commerce security, granular control is needed — such as "allow browsing but restrict purchasing to authenticated agents only."

IP address verification also has limitations as an alternative. In cloud environments and browser proxy services, IPs change frequently, making list maintenance impractical. This is the backdrop to Cloudflare's call to "forget IPs".

Web Bot Auth Technical Architecture

Web Bot Auth is built on the already-standardized RFC 9421 (HTTP Message Signatures). Agents attach cryptographic signatures to HTTP requests, and websites verify those signatures using public keys. This simple mechanism enables a fundamental shift in bot authentication.

When an AI agent sends a request to a website, it attaches three HTTP headers. The Signature-Agent header points to the domain where the agent's public keys are published (e.g., operator.openai.com). The Signature-Input header carries the signature's validity window, the key ID (a JSON Web Key Thumbprint), and a tag indicating the purpose. The Signature header holds the actual Ed25519 signature.

Signature-Agent: operator.openai.com
Signature-Input: sig=("@authority" "signature-agent");created=1700000000;expires=1700011111;keyid="ba3e64==";tag="web-bot-auth"
Signature: sig=abc==

Website-side verification follows three steps: retrieve the public key from /.well-known/http-message-signatures-directory at the domain specified in the Signature-Agent header, verify the signature with that key, and confirm the timestamp validity. Since signatures are bound to the target domain (@authority), they cannot be reused on different sites.
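The verification flow above can be sketched in a few lines. This is an illustrative outline only: the header parsing is simplified (a production verifier would use an RFC 8941 structured-field parser), and the actual Ed25519 signature check — which would call a cryptography library against the fetched public key — is out of scope here.

```python
import re
import time

WELL_KNOWN_PATH = "/.well-known/http-message-signatures-directory"

def parse_signature_input(header):
    """Extract the parameters of a Signature-Input header (simplified)."""
    params = {}
    for key, value in re.findall(r'(\w+)=("[^"]*"|\d+)', header):
        params[key] = value.strip('"')
    return params

def key_directory_url(signature_agent):
    """Step 1: where to fetch the agent's public keys."""
    return f"https://{signature_agent}{WELL_KNOWN_PATH}"

def timestamps_valid(params, now=None):
    """Step 3: the signature must fall inside its validity window."""
    now = time.time() if now is None else now
    return float(params["created"]) <= now < float(params["expires"])

header = ('sig=("@authority" "signature-agent");created=1700000000;'
          'expires=1700011111;keyid="ba3e64==";tag="web-bot-auth"')
params = parse_signature_input(header)
print(key_directory_url("operator.openai.com"))
print(params["tag"])                              # web-bot-auth
print(timestamps_valid(params, now=1700000500))   # True
```

Because "@authority" is among the signed components, the target domain is baked into the signature itself, which is what prevents cross-site replay.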

The elegance of this design lies in operating on existing HTTP infrastructure. No new protocol stacks or TLS extensions are required — implementation amounts to adding middleware to a web server. Cloudflare's technical blog provides working implementations in TypeScript, Go (Caddy plugin), and an npm package.

Public Key Directory and Registry

In February 2026, Cloudflare announced an open registry format in collaboration with Amazon Bedrock AgentCore, standardizing how agent public keys are discovered.

The registry is a simple text file listing URLs pointing to each agent's key directory. Website operators can host these registries on GitHub or Cloudflare R2, managing them similarly to IP blocklists or robots.txt files. Each entry is a Signature Agent Card — a JSON metadata format containing the agent name, operator information, expected request rate, and cryptographic keys.
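Consuming such a registry can be sketched as follows. The registry layout here (one key-directory URL per line, `#` for comments) and the Agent Card field names are illustrative assumptions based on the description above, not the normative schema.

```python
import json

# Hypothetical registry file: one key-directory URL per line.
REGISTRY_TEXT = """\
# Trusted agent key directories
https://operator.openai.com/.well-known/http-message-signatures-directory
https://agent.example.dev/.well-known/http-message-signatures-directory
"""

def parse_registry(text):
    """Return the key-directory URLs, skipping comments and blank lines."""
    return [line.strip() for line in text.splitlines()
            if line.strip() and not line.lstrip().startswith("#")]

# Hypothetical Signature Agent Card; field names assumed from the prose.
CARD_JSON = """{
  "name": "ExampleShopper",
  "operator": "Example Corp",
  "expected-request-rate": "10/s",
  "keys": [{"kty": "OKP", "crv": "Ed25519", "x": "AbCd"}]
}"""

urls = parse_registry(REGISTRY_TEXT)
card = json.loads(CARD_JSON)
print(len(urls))          # 2
print(card["operator"])   # Example Corp
```

Since the registry is plain text, diffing and reviewing changes works exactly like maintaining a robots.txt or an IP blocklist in version control.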

This decentralized key discovery mechanism enables agent identity verification without a centralized certificate authority.

IETF Standardization — The Web Bot Auth Working Group

Web Bot Auth has moved beyond a Cloudflare-only proposal. The IETF has formally established the WebBotAuth Working Group to develop it as an internet standard.

Co-chaired by David Schinazi and Rifaat Shekh-Yusef, the working group was chartered following a Birds of a Feather (BoF) session at IETF 123. The standardization scope covers traditional bots like search crawlers and web archivers, as well as AI training crawlers and AI agents that retrieve and interact with content on behalf of end users.

Two milestones have been set: standards track specifications for authentication techniques and bot information mechanisms due to the IESG by April 2026, and a Best Current Practice document on key management and deployment by August 2026. If standardization proceeds on schedule, RFC publication is possible in 2027.

Notably, the working group explicitly places "bot reputation tracking" and "end-user authentication" out of scope. Web Bot Auth is specialized for verifying "who sent this request," leaving the judgment of whether that agent is "trustworthy" to other layers such as the KYA (Know Your Agent) framework.

Agentic Commerce Application — Visa TAP and Mastercard Agent Pay

The most concrete application of Web Bot Auth is as the authentication foundation for agentic commerce. In October 2025, Cloudflare announced collaboration with Visa, Mastercard, and American Express to integrate Web Bot Auth into payment network agent authentication.

Visa TAP (Trusted Agent Protocol) extends the Web Bot Auth signature mechanism with an authentication layer for shopping agents. Specifically, it introduces two values for the tag field in the Signature-Input header: agent-browser-auth indicating the agent is browsing products, and agent-payer-auth indicating a payment attempt. A nonce field is also included in the signature to prevent replay attacks.

Cloudflare's verification process runs seven steps: confirming signature header presence, retrieving keys from the public key directory, validating timestamps, checking nonce uniqueness, verifying the tag type, performing Ed25519 cryptographic signature verification, and confirming the agent's registration status. By executing this entire sequence at Cloudflare's edge, infrastructure changes on the merchant side are minimized.
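The seven steps can be sketched as a single pipeline. Everything here is schematic: key retrieval and Ed25519 verification are stubbed, the nonce store is an in-memory set, and the registered-agent list is a placeholder — only the ordering of checks and the two TAP tag values come from the description above.

```python
import time

SEEN_NONCES = set()                                       # step 4 state
VALID_TAGS = {"agent-browser-auth", "agent-payer-auth"}   # step 5
REGISTERED_AGENTS = {"operator.openai.com"}               # step 7 (stub)

def verify_tap_request(req):
    # 1. Signature headers present?
    for h in ("Signature", "Signature-Input", "Signature-Agent"):
        if h not in req["headers"]:
            return "missing-signature"
    # 2. Key retrieval from the public key directory (stubbed).
    # 3. Timestamp validity window.
    if not (req["created"] <= time.time() < req["expires"]):
        return "expired"
    # 4. Nonce uniqueness (replay protection).
    if req["nonce"] in SEEN_NONCES:
        return "replay"
    SEEN_NONCES.add(req["nonce"])
    # 5. Tag must be a TAP-defined value.
    if req["tag"] not in VALID_TAGS:
        return "bad-tag"
    # 6. Ed25519 verification (stubbed as a precomputed flag).
    if not req["signature_ok"]:
        return "bad-signature"
    # 7. Agent registration status.
    if req["agent"] not in REGISTERED_AGENTS:
        return "unregistered"
    return "verified"

req = {"headers": {"Signature": "s", "Signature-Input": "i",
                   "Signature-Agent": "operator.openai.com"},
       "created": 0, "expires": 2**40, "nonce": "n1",
       "tag": "agent-payer-auth", "signature_ok": True,
       "agent": "operator.openai.com"}
print(verify_tap_request(req))  # verified
print(verify_tap_request(req))  # replay (same nonce twice)
```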

Mastercard takes a similar approach. Mastercard Agent Pay incorporates Web Bot Auth and links AI agents to individual users through agentic tokens. Visa handles "identity proof," Mastercard's Verifiable Intent handles "intent proof," and Web Bot Auth's cryptographic signatures serve as the foundation for both.

For merchants, this enables precise control: "allow Visa-approved shopping agents through while blocking unauthenticated scrapers." Cloudflare plans to integrate Visa and Mastercard protocol support directly into its Agent SDK, offering it as a managed ruleset.

Web Platform Adoption

The pace of adoption beyond Cloudflare is a key indicator of Web Bot Auth's impact.

| Platform | Support Status | Use Case |
| --- | --- | --- |
| Cloudflare | Native support (proposer) | WAF & Bot Management integration |
| AWS WAF | Supported since Nov 2025 | Bot Control rule integration |
| Vercel | Supported since Aug 2025 | Bot verification feature |
| Shopify | Crawler Access Keys support | Custom crawler authentication |
| Akamai | Web Bot Auth supported | Bot management & agentic commerce |

AWS WAF announced Web Bot Auth support in November 2025. In Bot Control rules, verified requests automatically receive the web_bot_auth:verified label, while failed verifications get web_bot_auth:invalid and expired keys receive web_bot_auth:expired. Amazon Bedrock AgentCore's browser service also supports Web Bot Auth, reducing the frequency of CAPTCHA encounters for AI agents.
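The label behavior can be modeled as a mapping from verification outcome to label, plus a rule that acts on it. The label names are the ones quoted above; the outcome names and the allow/block rule are illustrative assumptions.

```python
# Map a verification outcome to the AWS WAF Bot Control labels above.
LABELS = {
    "verified": "web_bot_auth:verified",
    "invalid": "web_bot_auth:invalid",
    "expired": "web_bot_auth:expired",
}

def label_request(outcome):
    """Unknown outcomes fall back to the invalid label (an assumption)."""
    return LABELS.get(outcome, "web_bot_auth:invalid")

def rule_action(label):
    """Example rule: let verified agents through, block the rest."""
    return "allow" if label == "web_bot_auth:verified" else "block"

print(rule_action(label_request("verified")))  # allow
print(rule_action(label_request("expired")))   # block
```

In practice the rule logic lives in WAF rule statements that match on labels, so merchants tune policy without touching application code.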

On the agent developer side, OpenAI has begun attaching HTTP Message Signatures to all Operator requests, with public keys published at operator.openai.com's /.well-known/http-message-signatures-directory. Cloudflare's signed agents program launched with ChatGPT agent, Block's Goose, Browserbase, and Anchor Browser as founding members.

This breadth of adoption demonstrates that Web Bot Auth is becoming internet-wide infrastructure rather than a vendor-specific lock-in.

The Migration Path from robots.txt to Web Bot Auth

How do traditional mechanisms relate to Web Bot Auth?

| Aspect | robots.txt | IP Address Verification | Web Bot Auth |
| --- | --- | --- | --- |
| Authentication | Self-declared (User-Agent) | IP address list matching | Cryptographic signatures (Ed25519) |
| Tamper Resistance | None (easily spoofed) | Low (IP spoofing/rotation) | High (cryptographically verifiable) |
| Enforceability | Voluntary compliance | Technically blockable | Technically verifiable |
| Scalability | High | Low (list management overhead) | High (public key directory) |
| Agent Identification | Bot name only | Organization-level | Individual agent-level |

Crucially, Web Bot Auth does not "replace" robots.txt but "complements" it. Robots.txt continues to function as a "policy" layer declaring crawl frequency and access scope. Web Bot Auth adds an "authentication" layer on top, cryptographically verifying that agents claiming to follow the policy are who they say they are.

Cloudflare has also proposed Content Signals, a mechanism for expressing content usage terms through HTTP response headers like Content-Signal: ai-train=yes, search=yes, ai-input=yes. Combined with Markdown for Agents, this creates a framework where AI agents can efficiently retrieve content while respecting publisher preferences.
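A Content-Signal header in the form shown above parses into a policy map in a few lines; this sketch assumes the comma-separated key=value syntax from the example and treats any value other than "yes" as a denial.

```python
def parse_content_signal(header):
    """Parse e.g. 'ai-train=yes, search=yes, ai-input=yes' into a dict."""
    policy = {}
    for part in header.split(","):
        key, _, value = part.strip().partition("=")
        policy[key] = value.strip() == "yes"
    return policy

policy = parse_content_signal("ai-train=no, search=yes, ai-input=yes")
print(policy["ai-train"])  # False
print(policy["search"])    # True
```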

The new web order Cloudflare envisions is a three-layer structure: Content Signals define content usage rules, robots.txt declares access policies, and Web Bot Auth verifies accessor identity. These three layers working together form the trust layer for agentic commerce.
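The three layers compose naturally into one access decision. The layering is from the article; the combination logic below, with each layer stubbed as a boolean or dict input, is an illustrative assumption.

```python
def access_decision(identity_verified, robots_allows, content_signal, purpose):
    """Combine the three layers: Web Bot Auth (identity),
    robots.txt (access policy), Content Signals (usage rules)."""
    if not identity_verified:
        return "deny: unverified agent"
    if not robots_allows:
        return "deny: path disallowed by robots.txt"
    if not content_signal.get(purpose, False):
        return f"deny: {purpose} not permitted by Content-Signal"
    return "allow"

signal = {"ai-train": False, "search": True, "ai-input": True}
print(access_decision(True, True, signal, "ai-input"))  # allow
print(access_decision(True, True, signal, "ai-train"))  # deny: ai-train not permitted by Content-Signal
```

Note the ordering: identity is checked first, because policy declarations are only meaningful once you know who is actually asking.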

Impact on E-Commerce Merchants

For e-commerce merchants, adapting to Web Bot Auth can follow two routes.

First, automatic platform-level support. If you already use Cloudflare, AWS, Vercel, or Shopify, platform updates will progressively deliver Web Bot Auth verification capabilities. Cloudflare users can identify signed agents and control traffic through the Bot Management and AI Audit dashboard features.

Second, indirect support through payment providers. Merchants using payment processors that support Visa TAP and Mastercard Agent Pay — including Stripe, Adyen, Nuvei, and Worldpay — will have agent authentication verification handled on the processor side.

Either way, the fundamental shift required is from "block everything" to "selective access management." AI agent traffic will only increase. Blocking legitimate agents translates directly to lost revenue. Using Web Bot Auth's cryptographic identification to open the door for trusted agents while closing it for others is the essential posture for operating e-commerce in the agent era.

Summary

Where robots.txt was a "request," Web Bot Auth is "proof." IETF standardization, implementation across major CDNs and WAFs, and integration into Visa and Mastercard payment protocols — this three-pronged adoption suggests Web Bot Auth is becoming a new infrastructure layer for the internet. The submission of IETF standards track specifications in the second half of 2026 will sharpen the contours of that future.