Chrome 146 just shipped WebMCP, the new W3C standard that lets AI agents read, understand, and interact with websites. Without it, your site is a black box to every AI agent, browser assistant, and autonomous workflow that visits it.
Scan any URL in seconds and get a full compliance report across 19 checks.
The web is shifting from human-first to agent-first. Sites that don't adapt will be left behind.
Without WebMCP tool definitions, AI agents visiting your site have no idea what actions they can take. Your forms, search, and checkout are all invisible. Agents bounce and move on to a compliant competitor.
AI assistants increasingly act as intermediaries between users and websites. If your site isn't agent-readable, you won't appear in AI-generated recommendations, summaries, or automated workflows, a growing source of traffic.
WebMCP is not a future spec: it's live in Chrome 146 right now. Early adopters who implement it today will have a significant first-mover advantage before it becomes a mainstream requirement.
The declarative WebMCP API requires just three attributes (toolname, tooldescription, and toolaction) on your existing HTML forms. No framework required.
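As a sketch, a search form annotated with these three attributes might look like the following. The attribute names come from the text above; the /search endpoint, the q field, and the attribute values are illustrative assumptions, not part of any confirmed schema.

```html
<!-- Illustrative only: /search and the "q" field are placeholder names. -->
<form action="/search" method="get"
      toolname="site_search"
      tooldescription="Search the product catalog by keyword"
      toolaction="/search">
  <input type="text" name="q" required>
  <button type="submit">Search</button>
</form>
```

Because the annotations sit on a working form, human visitors see no difference: the same markup serves both audiences.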
Chrome's Prompt API (LanguageModel) lets your website use on-device Gemini Nano: no API costs, no server calls, full privacy. Sites that leverage this will deliver smarter experiences than those relying on cloud AI alone.
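A hedged sketch of how a page might use on-device inference with a graceful fallback. The LanguageModel global and its availability()/create()/prompt() methods follow Chrome's Prompt API explainer and may change between releases, so the code feature-detects before calling anything:

```javascript
// Sketch only: the LanguageModel global exists in supporting Chrome
// builds; outside them this function falls back by returning null.
async function summarizeOnDevice(text) {
  // Feature-detect: the Prompt API only exists in supporting browsers.
  if (typeof LanguageModel === "undefined") {
    return null; // caller should fall back to a server-side summary
  }
  // The model may still be unavailable (e.g. not yet downloaded).
  const availability = await LanguageModel.availability();
  if (availability === "unavailable") {
    return null;
  }
  const session = await LanguageModel.create();
  try {
    return await session.prompt(`Summarize in one sentence: ${text}`);
  } finally {
    session.destroy(); // free the on-device session when done
  }
}
```

The null-fallback pattern matters here: on-device AI is a progressive enhancement, and the rest of the page should work identically when it is absent.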
Our audit shows 99%+ of websites currently score below 20/100. That gap is your opportunity: implement WebMCP now and be in the top 1% of agent-ready sites while competitors are still unaware it exists.
Add WebMCP compliance to your site in minutes with our WordPress plugin or get a done-for-you audit.
Everything you need to know about WebMCP and agent readiness.
What's the difference between the declarative and imperative WebMCP APIs?
The declarative approach adds three attributes (toolname, tooldescription, toolaction) to your existing <form> elements. The imperative approach, navigator.modelContext.registerTool(), requires JavaScript but gives you more control and richer tool definitions.

What is the /.well-known/webmcp file?
The /.well-known/webmcp file is a JSON discovery manifest that AI agents can fetch before visiting your site to understand what tools you offer. It's analogous to robots.txt for search engines, but for AI agents. It contains your tool names, descriptions, input schemas, and API endpoints.

What is llms.txt?
llms.txt is a plain-text file at the root of your site that describes your content, tools, and key pages for AI language models. Think of it as a human-readable sitemap designed for AI. It's not an official W3C standard yet, but it's increasingly recognised by AI crawlers including ChatGPT, Claude, and Perplexity.

How can my site tell agent submissions from human ones?
When an agent submits a form, the SubmitEvent includes an agentInvoked boolean flag. This lets your server or JavaScript distinguish between a human submission and an agent submission, allowing you to return structured JSON to the agent instead of a full HTML page response.

How does the scanner work?
The scanner fetches your site through a server-side proxy, including robots.txt and /.well-known/webmcp. All checks run against the real fetched content, not simulated data. The proxy includes SSRF protection and rate limiting.

What is requestUserInteraction()?
requestUserInteraction() is a WebMCP API that allows your site to pause an agent's action and ask the human user for confirmation before proceeding. This is the human-in-the-loop consent mechanism, critical for actions like form submissions, purchases, or deletions triggered by an AI agent.

How do I get started?
Add the three declarative attributes toolname, tooldescription, and toolaction to your key forms. Then create a /.well-known/webmcp JSON manifest. For a complete done-for-you implementation, visit ucpcompliant.com: we handle everything from attribute injection to manifest creation and API endpoint setup.
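As an illustration, a minimal /.well-known/webmcp manifest describing one tool might look like the sketch below. The exact schema is an assumption pieced together from the fields named above (tool names, descriptions, input schemas, and API endpoints); example.com and site_search are placeholder values.

```json
{
  "tools": [
    {
      "name": "site_search",
      "description": "Search the product catalog by keyword",
      "inputSchema": {
        "type": "object",
        "properties": {
          "q": { "type": "string", "description": "Search terms" }
        },
        "required": ["q"]
      },
      "endpoint": "https://example.com/api/search"
    }
  ]
}
```

Serving a manifest like this from /.well-known/webmcp lets an agent discover your tools before it ever loads a page, much as robots.txt lets a crawler learn the rules up front.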