Apr 9, 2026

What is the Model Context Protocol (MCP)? The Complete 2026 Guide to AI's 'USB-C Moment'

Key Takeaways

  1. MCP is an open standard for connecting AI agents to tools and data, with 97M monthly SDK downloads by the end of 2025.
  2. Anthropic donated MCP to the Linux Foundation's AAIF in December 2025, moving it to neutral industry governance.
  3. In agentic commerce it's the data and tool connectivity layer, with ACP, UCP, and AP2 stacked on top.

The USB-C for AI, Explained in One Protocol

In November 2024, Anthropic quietly published a GitHub repository that, in little more than a year, rewrote the wiring diagram of the entire AI industry. By the time the Linux Foundation announced its Agentic AI Foundation (AAIF) in December 2025, MCP stood at 97 million monthly SDK downloads, more than 10,000 active servers, and founding-project status in the new foundation.

How many times have you heard the name Model Context Protocol, or MCP, in the last six months? It's embedded in ChatGPT, Claude, Cursor, and Gemini. Shopify and commercetools now ship their commerce features through MCP servers. And yet very few articles answer the simple question: what is this thing, really? This piece treats MCP not only as a technical spec, but as the foundation layer of agentic commerce.

What is the Model Context Protocol (MCP)?

MCP is an open standard, released by Anthropic, for connecting AI applications to external tools and data sources. The official docs describe it as the "USB-C for AI", and the metaphor captures the goal precisely: unify the connector so that any AI can reach any tool or data source through the same protocol.

Anthropic's original announcement defines MCP as "an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools." The critical point is that MCP is not tied to a specific LLM. The spec is model-agnostic, so the same server can be called from Claude, GPT, or Gemini without modification.

That's exactly what happened on March 26, 2025, when OpenAI publicly adopted MCP. Anthropic's direct rival pulling Anthropic's spec into its Agents SDK, Responses API, and ChatGPT desktop app was, as TechCrunch put it, a signal that MCP had escaped vendor lock-in.

Why MCP Exists — The N×M Wiring Nightmare

Put simply, MCP solves the N×M problem. If you have N AI models and M tools or data sources, connecting each pair individually requires N×M integrations. Five models and twenty tools already mean a hundred bespoke integrations that somebody has to write and maintain forever.

MCP compresses that to N+M. A model implements the MCP client once and can talk to every MCP server. A tool implements an MCP server once and can be called by every MCP client. Moving from multiplication to addition is a massive drop in the total engineering cost of the ecosystem.
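The arithmetic is easy to sanity-check in a couple of lines:

```python
def integrations_without_mcp(models: int, tools: int) -> int:
    # Every model-tool pair needs its own bespoke integration.
    return models * tools

def integrations_with_mcp(models: int, tools: int) -> int:
    # Each model implements one MCP client; each tool one MCP server.
    return models + tools

print(integrations_without_mcp(5, 20))  # 100 bespoke integrations
print(integrations_with_mcp(5, 20))     # 25 protocol implementations
```

At five models and twenty tools the gap is already 4×, and it widens with every addition on either side.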

The backstory matches that framing. The inventors of MCP, David Soria Parra and Justin Spahr-Summers at Anthropic, built the prototype out of frustration. In July 2024, David got tired of manually copying context back and forth between Claude Desktop and his IDE. Six weeks of prototyping later, the first version of MCP existed. A personal frustration produced the de facto standard for AI infrastructure — and that origin matters, because it shows why the spec is small and pragmatic rather than bloated.

MCP Architecture — The Host / Client / Server Model

MCP splits responsibilities across three roles: Host, Client, and Server. The Host is the user-facing AI application — Claude Desktop, Cursor, VS Code, the ChatGPT desktop app. Inside the Host, a separate MCP Client is spawned for every MCP Server it connects to, and each client holds a one-to-one connection with its server.

The MCP Server provides a toolbox. The spec defines three server primitives: Tools (executable actions), Resources (contextual data the model can read), and Prompts (reusable templates). A GitHub MCP server might offer a "create pull request" Tool and a "read README.md" Resource.
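The three primitives can be sketched as a toy registry in plain Python. This is not the official SDK (which wraps the same idea in decorators such as FastMCP's `@mcp.tool()`), and names like `create_pull_request` are illustrative only:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ToyMCPServer:
    """A stand-in for an MCP server: a named registry of the three primitives."""
    tools: dict[str, Callable] = field(default_factory=dict)      # executable actions
    resources: dict[str, str] = field(default_factory=dict)       # readable context
    prompts: dict[str, str] = field(default_factory=dict)         # reusable templates

    def tool(self, name: str):
        def register(fn):
            self.tools[name] = fn
            return fn
        return register

server = ToyMCPServer()

# Tool: an action the model can invoke.
@server.tool("create_pull_request")
def create_pull_request(repo: str, title: str) -> str:
    return f"opened PR '{title}' on {repo}"

# Resource: contextual data the model can read.
server.resources["repo://readme"] = "# Demo project"

# Prompt: a reusable template the host can surface.
server.prompts["review"] = "Please review this diff:\n{diff}"

print(server.tools["create_pull_request"]("octo/demo", "Fix typo"))
# → opened PR 'Fix typo' on octo/demo
```

The real protocol adds schemas, discovery (`tools/list`, `resources/list`), and transport on top, but the mental model is exactly this: a server is a named bag of Tools, Resources, and Prompts.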

The Client side also exposes primitives back to the server. Sampling lets a server ask the host's LLM to run an inference, which keeps servers model-agnostic even when they need LLM capabilities inside their own logic. Elicitation lets a server request follow-up input from the user, and Logging routes log messages from server to client.

Under the hood, the data layer is JSON-RPC 2.0, carrying request, response, and notification messages. Two transports cover the deployment spectrum. For local execution, stdio pipes JSON over the standard input and output of a subprocess — the lightest possible binding. For remote execution, Streamable HTTP layers HTTP POST with optional Server-Sent Events for streaming. The earlier HTTP+SSE-only approach has been superseded by Streamable HTTP.
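A concrete `tools/call` exchange makes the framing tangible. The method name and result shape follow the published spec; the tool name and payload here are made up for illustration:

```python
import json

# One JSON-RPC 2.0 request down, one response back. Over the stdio
# transport, each message travels as a single line of JSON.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_pull_request",            # hypothetical tool
        "arguments": {"repo": "octo/demo", "title": "Fix typo"},
    },
}

response = {
    "jsonrpc": "2.0",
    "id": 1,  # must match the request id
    "result": {"content": [{"type": "text", "text": "PR opened"}]},
}

print(json.dumps(request))
```

Everything else in the protocol — discovery, sampling, notifications — is variations on this same request/response/notification envelope.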

| Layer | Role | Examples |
| --- | --- | --- |
| Host | The user-facing AI application | Claude Desktop / Cursor / VS Code / ChatGPT Desktop |
| Client | One-to-one connection to each Server, spawned inside the Host | MCP client implementation inside the Host |
| Server | Exposes Tools / Resources / Prompts | GitHub Server / Postgres Server / Shopify MCP |

The simplicity of this three-layer model is the single biggest reason MCP spread so fast. The official architecture documentation fits most of the spec into roughly ten pages. MCP doesn't try to win by being comprehensive. It wins by locking in a minimal shared vocabulary and leaving the rest to the ecosystem — a choice that pulled in even direct competitors like OpenAI and Google.

From Side Project to USB-C Moment

MCP has only existed for a year and a half. The density of that year and a half, however, is unusual.

The first release was November 25, 2024. Alongside the announcement, Anthropic shipped six reference servers — Google Drive, Slack, GitHub, Git, Postgres, and Puppeteer. Block, Apollo, Zed, Replit, Codeium, and Sourcegraph joined as early partners, and the developer tool industry responded instantly.

Through early 2025 the community scaled the SDKs. Python and TypeScript came first, then official implementations in Java, C#, Kotlin, Swift, Ruby, and Go. Third-party registries like Smithery and mcp.so emerged, and the ecosystem started self-replicating.

The inflection point came in March 2025. OpenAI's Sam Altman announced MCP support across the Agents SDK, Responses API, and ChatGPT desktop. Google Cloud announced MCP support for Google services. Microsoft brought MCP to GitHub Copilot, Copilot Studio, and VS Code. The three largest AI platform companies backing the same spec is not a common sight in this industry.

The defining moment arrived on December 9, 2025. Anthropic donated MCP itself to the Linux Foundation's Agentic AI Foundation (AAIF). The founding platinum members of AAIF include AWS, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI. The stats announced that day were remarkable: 97 million monthly SDK downloads, over 10,000 active servers, and first-class support across every major AI platform including ChatGPT, Claude, Cursor, Gemini, and Microsoft Copilot.

The act of handing the project away is what turned MCP from a vendor spec into neutral industry infrastructure. The first MCP Dev Summit North America, held in New York on April 2–3, 2026, with more than 95 sessions, was the visible payoff of that handoff.

Where MCP Fits in the Agentic Commerce Stack

Now for the part that matters most to commerce teams. What does MCP actually mean for retail?

The short answer: MCP is not a commerce protocol. It's the plumbing layer that lets AI agents reach tools and data. To move actual commerce workflows and real money, you need an additional layer of commerce-specific protocols on top. Blurring that line — assuming "MCP alone will let agents sell things" — is the most common misreading of the current landscape.

Borrowing commercetools' framing, the stack looks like this:

| Layer | Major protocols | Role |
| --- | --- | --- |
| Payment authorization | AP2 (Google + Mastercard/Visa/PayPal) | Cryptographic "mandates" for agent payments |
| Commerce workflow | ACP (Stripe + OpenAI) / UCP (Shopify + Google) | Product search, cart, checkout, and order flow |
| Agent-to-agent | A2A (Google) | Horizontal coordination between agents |
| Tool & data access | MCP (Anthropic → AAIF) | Vertical AI-to-tool/data connectivity |

MCP handles vertical connectivity at the bottom. A2A handles horizontal agent coordination. ACP and UCP standardize commerce workflows on top of AI surfaces. AP2 handles the trust layer for payments. These are not competitors. They're complements, and any merchant seriously building for agentic commerce will eventually implement more than one. Our full guide to agentic AI protocols maps the complete taxonomy in detail.

A concrete example makes the hierarchy clear. Shopify today ships multiple MCP servers — Dev MCP (for developers), Storefront MCP (product discovery and cart actions), and Checkout MCP (the payment flow) — and stacks UCP on top of them. MCP answers "how does the agent fetch product data?". UCP answers "how does the agent complete the purchase?". Each protocol has a clearly defined job. commercetools follows the same pattern, positioning its Commerce MCP as agent-ready infrastructure and pairing it with UCP. For the detailed implementation view, our Commerce MCP implementation guide walks through Shopify's four-MCP architecture and commercetools' approach.

The bigger point is that MCP is the foundation layer for the entire agentic commerce stack. As we laid out in our complete guide to agentic commerce, AI-driven transactions have moved past the experiment stage, and a handful of protocols are now dividing up the job by use case. Understanding MCP is partly about understanding how that division of labor reads on a wiring diagram.

The Production Story — OAuth 2.1 and Streamable HTTP

If you still picture MCP as a handy feature for Claude Desktop on your laptop, you're looking at 2024, not 2026. Since mid-2025, the main action has shifted to remote, authenticated, enterprise-grade production MCP.

The major turning point on the auth side was a spec update in March 2025. MCP formally adopted OAuth 2.1 as the authorization framework for the Streamable HTTP transport — PKCE, Dynamic Client Registration, and .well-known metadata discovery all included. As Auth0's deep-dive explains, this change was what made production deployments defensible from a security perspective.
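PKCE, one of the OAuth 2.1 pieces the spec mandates, is small enough to sketch with the standard library. This shows the `S256` challenge method: the client keeps a random verifier and sends only its SHA-256 challenge in the authorization request, so an intercepted authorization code is useless on its own:

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # 32 random bytes -> 43-char base64url verifier (padding stripped).
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # Challenge = base64url(SHA-256(verifier)), per the S256 method.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
print(len(verifier), len(challenge))  # 43 43
```

The MCP client sends the challenge when requesting authorization and reveals the verifier only when exchanging the code for a token; the server recomputes the hash to confirm they match.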

Enterprise deployments add more requirements on top: SSO, audit logging, gateways, policy enforcement. The MCP community has agreed to treat these as extensions rather than adding them to the core spec, and the 2026 roadmap explicitly restates the principle: keep the core small, solve enterprise concerns through extensions.

On scalability, the key work is making Streamable HTTP stateless. Current MCP is session-oriented, with state held between a specific client and server. That's convenient but blocks horizontal scaling. The 2026 roadmap prioritizes a standardized stateless Streamable HTTP pattern so that MCP servers can sit behind a load balancer and scale across instances — the baseline requirement for serious commercial workloads.

Getting Started with MCP

For developers, getting hands-on with MCP is surprisingly easy.

The lowest-friction entry point is opening Claude Desktop and adding an existing MCP server to the configuration file. Add the GitHub server, and Claude can immediately read and write your repositories. That "edit one file, AI gains a new capability" experience is the single best introduction to why MCP matters.
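The edit in question is a small JSON fragment in Claude Desktop's `claude_desktop_config.json`. The shape below matches the documented `mcpServers` format, but the exact package name and environment variable should be checked against the server's own README before use:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    }
  }
}
```

Restart Claude Desktop after saving, and the server's tools appear in the conversation.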

For building your own server, the official SDK coverage is broad. Python and TypeScript were there from day one; Java, C#, Kotlin, Swift, Ruby, and Go have all joined. The modelcontextprotocol GitHub organization collects the SDKs and reference implementations, and the inspector tool gives you a GUI for debugging server behavior.

To find existing servers, the official registry.modelcontextprotocol.io is the canonical source, but mcp.so, Smithery, Glama, and PulseMCP all offer curated lists. As of April 2026, mcp.so lists 19,687 servers and Glama 18,042, covering essentially every major SaaS — GitHub, Slack, Postgres, Stripe, Shopify, Notion, Linear, and so on.

For commerce specifically, Shopify's developer docs walk through the concrete use of Dev MCP, Storefront MCP, and Checkout MCP. commercetools' Commerce MCP launch post is equally useful for framing MCP inside an enterprise commerce architecture.

The 2026 Roadmap — MCP After the Chasm

The 2026 MCP Roadmap, published March 9, 2026, highlights four priorities: transport evolution and scalability, agent communication, governance, and enterprise readiness.

The most interesting item is the maturation of the Tasks primitive (SEP-1686). Until now MCP has been fundamentally synchronous and request-response. With Tasks, long-running jobs and deferred results become first-class protocol concerns. In a commerce context, that unlocks scenarios like "if the price drops in three days, buy it automatically" without keeping a session pinned open.

On the authorization side, two SEPs are in flight: DPoP (SEP-1932), which strengthens token theft resistance, and Workload Identity Federation (SEP-1933), which modernizes service-to-service identity. Both are squarely aimed at enterprise adoption — a clear signal that MCP is no longer considered an individual-developer toy.

The roadmap is just as explicit about what it is not doing this cycle. No new official transports. No enterprise extensions folded into the core spec. This is the kind of discipline that mature open-source specs settle into — protecting the core by subtracting rather than adding.

Conclusion — From Plumbing to Infrastructure

Back to the opening question. What is the Model Context Protocol?

Technically, it's an open connection protocol built on JSON-RPC 2.0 with a three-layer Host / Client / Server model. Ecosystem-wise, it's a piece of cross-industry infrastructure, born at Anthropic, adopted by OpenAI, Google, and Microsoft, and donated to the Linux Foundation's AAIF. And for commerce, it's the plumbing — the foundation layer that carries data and tools up to the higher layers of the agentic commerce stack.

In the 17 months between November 2024 and April 2026, MCP went from a side project born out of personal frustration to the de facto standard that ChatGPT, Claude, Shopify, and commercetools all implement. The spec's next chapter is now firmly focused on production, enterprise, and scalability.

The real decision for commerce teams in 2026 isn't whether to understand MCP. It's which commerce protocols to pair with MCP. Once you see the stack — MCP as plumbing, ACP/UCP/AP2 as the commerce logic flowing through it — the question becomes which layers your use case actually needs. The plumbing is done. Now the interesting choice is what flows through it.