Fundamentals

What Is Model Context Protocol (MCP)? The Complete Guide for 2026

Model Context Protocol is the open standard that connects AI agents to external tools. Here is everything you need to know: what MCP is, how it works, who created it, and where the ecosystem stands today.

April 15, 2026 · 10 min read · ToolRoute Team

Every AI model hits the same wall. It can reason, summarize, write code, and hold a conversation, but the moment it needs to check a database, send an email, or read a file, it has no hands. It cannot reach out and touch the world. The model needs tools.

Before 2024, connecting an AI model to external tools meant writing custom glue code for every integration. OpenAI had function calling. LangChain had its own tool abstraction. Every framework invented its own way to bridge the gap between "the model wants to do something" and "the tool actually does it." None of them talked to each other.

Model Context Protocol (MCP) is the open standard that fixes this. It defines a single, universal way for AI applications to discover and call external tools, regardless of which model, framework, or tool is involved.

What Is MCP? The One-Paragraph Definition

MCP (Model Context Protocol) is an open protocol, created by Anthropic and released in November 2024, that standardizes how AI models and agents connect to external tools, data sources, and services. It uses a client-server architecture: an MCP client inside the AI application connects to one or more MCP servers, each of which wraps a specific tool or API. Communication happens over JSON-RPC. The result is that any AI application with an MCP client can use any tool that has an MCP server, with zero custom integration code.

Think of MCP as USB for AI tools. Before USB, every peripheral needed its own proprietary connector. After USB, any device works with any computer. MCP does the same thing for the connection between AI models and the tools they use.

Why MCP Was Created

Anthropic announced MCP on November 25, 2024, alongside an open specification, SDKs for Python and TypeScript, and reference server implementations. The stated goal was to solve the M x N integration problem: if there are M AI applications and N tools, the industry was building M x N custom integrations. MCP reduces this to M + N. Each AI app implements one MCP client. Each tool implements one MCP server. They all work together.

The timing was not accidental. By late 2024, AI agents were moving from research demos to production systems. Companies needed their agents to interact with real infrastructure: databases, CRMs, payment processors, communication platforms. The lack of a standard protocol was becoming the primary bottleneck, not model capability.
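The M + N arithmetic is easy to check with a toy calculation (the counts below are illustrative, not real ecosystem numbers):

```python
# Illustrative: integration count before and after a shared protocol.
apps = 20   # M AI applications (hypothetical count)
tools = 50  # N tools (hypothetical count)

point_to_point = apps * tools   # every app wires up every tool by hand
with_mcp = apps + tools         # one MCP client per app, one MCP server per tool

print(point_to_point)  # 1000
print(with_mcp)        # 70
```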

How MCP Works: Client-Server Architecture

MCP has three layers: the host, the client, and the server.

| Layer | What It Is | Examples |
| --- | --- | --- |
| Host | The AI application the user interacts with | Claude Desktop, Cursor, Windsurf, Claude Code |
| MCP Client | The protocol handler inside the host that connects to servers | Built into the host application (one client per server connection) |
| MCP Server | A lightweight program that wraps a tool and speaks the MCP protocol | Supabase MCP, GitHub MCP, Playwright MCP, Stripe MCP |

Here is the flow: a user asks Claude to "check the latest orders in my database." The host (Claude Desktop) passes this to the language model. The model decides it needs the execute_sql tool from the Supabase MCP server. The MCP client sends a JSON-RPC request to the Supabase MCP server. The server executes the query, returns the results over JSON-RPC, and the model incorporates the data into its response. The user sees the answer. The model never touched the database directly.

The Three MCP Primitives: Tools, Resources, and Prompts

MCP servers can expose three types of capabilities to clients:

Tools

Actions the model can invoke. Each tool has a name, a description, and a typed JSON Schema for its inputs. Examples: execute_sql, browser_navigate, create_issue, send_email. Tools are the most commonly used primitive.
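A tool advertisement, as a server would return it from `tools/list`, pairs the name and description with a JSON Schema for inputs. A hypothetical `send_email` tool might look like this (the field names follow the MCP tool shape; the schema contents are invented for illustration):

```python
# Hypothetical tool definition in the MCP tools/list shape.
send_email_tool = {
    "name": "send_email",
    "description": "Send an email to a single recipient.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "to": {"type": "string", "description": "Recipient address"},
            "subject": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["to", "subject", "body"],
    },
}
```

Because the schema is machine-readable, the client can validate arguments before a request ever reaches the server, and the model can see exactly which fields are required.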

Resources

Data the model can read. Resources are identified by URIs and can be static (a configuration file) or dynamic (database rows matching a query). Unlike tools, resources are read-only and do not perform side effects.
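Reading a resource is a separate request (`resources/read`) keyed by URI rather than a tool call. A sketch, with a hypothetical file URI:

```python
# Hypothetical resource read; the URI scheme is server-defined.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "file:///app/config.json"},
}

# Servers return one or more content entries for the URI.
response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "contents": [
            {
                "uri": "file:///app/config.json",
                "mimeType": "application/json",
                "text": '{"debug": false}',
            }
        ]
    },
}
```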

Prompts

Templated instructions that servers provide to guide model behavior for specific workflows. A Supabase MCP server might include a prompt template for "generate a migration file" that pre-fills context about the database schema. Prompts are optional and less commonly implemented than tools and resources.

The Three Transport Types

MCP defines three ways for clients and servers to communicate. The transport layer is independent of the protocol itself, so the same tool semantics work over any transport.

| Transport | How It Works | Best For | Limitations |
| --- | --- | --- | --- |
| stdio | Client spawns the server as a child process. Communication over stdin/stdout. | Local development, CLI tools | One process per connection. Does not scale to many clients. |
| SSE (Server-Sent Events) | HTTP-based. Client sends POST requests, server pushes responses via SSE stream. | Remote servers, early MCP deployments | Being superseded by Streamable HTTP in newer implementations. |
| Streamable HTTP | Single HTTP endpoint. Requests and responses as JSON-RPC over HTTP. Supports streaming. | Production deployments, cloud-hosted servers, gateways | Newest transport. Requires HTTP infrastructure. |

In practice, stdio is what most developers encounter first: you configure a server in your mcp.json file, and the host application spawns it as a local process. Streamable HTTP is where the ecosystem is heading for production, because it allows MCP servers to run as remote services that multiple clients can connect to simultaneously. MCP gateways use Streamable HTTP to expose dozens of tools through a single remote endpoint.
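A typical stdio entry in a host's configuration looks like the following. Claude Desktop and Cursor both use a `mcpServers` map with a `command` to spawn, though the file's location and the server package name vary by host; the GitHub server and token shown here are illustrative:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here" }
    }
  }
}
```

On startup, the host spawns that command as a child process and speaks newline-delimited JSON-RPC to it over stdin/stdout.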

What MCP Servers Look Like in Practice

An MCP server is typically a small program (often a single file) that does three things: advertises its tools when a client connects, handles incoming tool-call requests, and returns structured results. Here are real examples from the ecosystem:

  • Supabase MCP Server: Exposes tools like execute_sql, list_tables, apply_migration. An agent can query any Supabase database, create tables, and deploy edge functions without the developer writing SQL integration code.
  • Playwright MCP Server: Exposes browser automation tools: browser_navigate, browser_click, browser_take_screenshot. An agent can browse websites, fill forms, and capture screenshots by calling standard MCP tools.
  • GitHub MCP Server: Exposes repository operations: create issues, open pull requests, read files, search code. An agent can manage an entire development workflow through MCP tool calls.
  • Stripe MCP Server: Exposes payment operations: create customers, manage subscriptions, generate invoices, process refunds. An agent can handle billing workflows end to end.

You can explore these and dozens more in the ToolRoute registry, which curates the best MCP-compatible tools across 14 categories.
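In code, those three responsibilities reduce to a small request handler. Below is a simplified, dependency-free sketch of a stdio-style server's dispatch loop. Real servers use the official SDKs, which also handle the initialization handshake, capability negotiation, and error framing; the `add` tool here is a stand-in:

```python
import json
import sys

# One illustrative tool, advertised with an MCP-style JSON Schema.
TOOLS = [{
    "name": "add",
    "description": "Add two numbers.",
    "inputSchema": {
        "type": "object",
        "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
        "required": ["a", "b"],
    },
}]

def handle_request(req: dict) -> dict:
    """Dispatch a single JSON-RPC request to a response."""
    if req["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif req["method"] == "tools/call" and req["params"]["name"] == "add":
        args = req["params"]["arguments"]
        result = {"content": [{"type": "text", "text": str(args["a"] + args["b"])}]}
    else:
        return {"jsonrpc": "2.0", "id": req.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": req.get("id"), "result": result}

def main() -> None:
    # stdio transport: newline-delimited JSON-RPC on stdin/stdout.
    for line in sys.stdin:
        if line.strip():
            print(json.dumps(handle_request(json.loads(line))), flush=True)

if __name__ == "__main__":
    main()
```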

What MCP Clients Are

An MCP client is the other side of the connection. It lives inside the AI application (the host) and handles the protocol-level communication with MCP servers. When a language model decides it needs to call a tool, the MCP client is the component that actually sends the JSON-RPC request, receives the response, and passes the result back to the model.
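The client side is the mirror image: serialize the model's tool choice as a request, read the server's reply, and hand the result back. Here is a minimal sketch operating over any pair of text streams; the framing matches the newline-delimited JSON of the stdio transport, and a real client would also perform the initialize handshake and match responses by `id` across interleaved traffic:

```python
import json
from typing import TextIO

def call_tool(writer: TextIO, reader: TextIO, req_id: int,
              name: str, arguments: dict) -> dict:
    """Send one tools/call request and return the matching result."""
    request = {"jsonrpc": "2.0", "id": req_id, "method": "tools/call",
               "params": {"name": name, "arguments": arguments}}
    writer.write(json.dumps(request) + "\n")
    writer.flush()
    response = json.loads(reader.readline())
    if response.get("id") != req_id:
        raise RuntimeError("response id mismatch")  # real clients match by id
    return response["result"]
```

With a stdio server, `writer` and `reader` would be the child process's stdin and stdout; with Streamable HTTP, the same JSON-RPC body travels in an HTTP request instead.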

As of 2026, MCP clients are built into: Claude Desktop, Claude Code, Cursor, Windsurf, Cline, Zed, and many other AI-powered development tools. OpenAI added MCP client support to ChatGPT and the Assistants API. Google integrated MCP client capabilities into Gemini. The protocol has become the de facto standard.

MCP vs. REST APIs

The most common question developers ask is: "How is this different from just calling a REST API?" The answer is that MCP and REST solve different problems.

| Dimension | REST API | MCP |
| --- | --- | --- |
| Designed for | App-to-app communication | AI-to-tool communication |
| Discovery | Manual (read docs, hardcode endpoints) | Automatic (client queries server for available tools) |
| Input schemas | Varies (Swagger, custom docs, none) | Standardized JSON Schema on every tool |
| Integration effort | Per-API custom code | Zero per-tool code (the client handles the protocol) |
| Who calls it | Developer writes the call | AI model decides when and how to call |
| Response format | Varies per API (JSON, XML, plain text) | Standardized content blocks (text, images, embedded resources) |
| Composability | Manual (developer chains calls) | Native (model chains tools in a single conversation) |

REST is not going away. MCP servers themselves often call REST APIs under the hood. The difference is that MCP provides the standard interface that lets an AI model dynamically discover and compose tools at runtime, rather than requiring a developer to hardcode every integration in advance.

The MCP Ecosystem in 2026

Eighteen months after its release, MCP has become the dominant standard for AI-tool integration. The ecosystem includes:

  • The Official MCP Registry at registry.modelcontextprotocol.io lists thousands of community-built servers
  • Third-party directories like Glama, mcpservers.org, PulseMCP, and Smithery index and categorize servers
  • MCP gateways like ToolRoute go beyond listing to provide unified execution, authentication, and billing across tools
  • Major AI providers (OpenAI, Google, Microsoft, Anthropic) all support MCP in their client applications
  • Enterprise adoption is accelerating as companies realize that standardizing on MCP reduces integration costs by an order of magnitude

The protocol itself continues to evolve. The Streamable HTTP transport (added in the 2025 revision) enabled remote MCP servers and gateways. Upcoming work focuses on authentication standards, server-to-server communication, and richer resource types.

How to Get Started with MCP

If you are building an AI agent that needs tools, here is the fastest path:

  1. Pick a host with MCP support. Claude Desktop, Claude Code, and Cursor all have built-in MCP clients.
  2. Find the servers you need. Check the ToolRoute catalog or the official registry for the best MCP servers in your category.
  3. Configure your servers. Add them to your mcp.json file for local stdio servers, or point your client at a remote Streamable HTTP endpoint.
  4. Or use a gateway. If you need many tools without managing many servers, an MCP gateway gives you one endpoint and one API key for everything.

Frequently Asked Questions

What is MCP?

MCP (Model Context Protocol) is an open standard created by Anthropic in November 2024 that defines how AI models and agents connect to external tools, data sources, and services. It uses a client-server architecture where MCP clients inside AI applications communicate with MCP servers wrapping tools using a standardized JSON-RPC protocol. MCP eliminates the need for custom integrations between every AI model and every tool.

Who created MCP?

Anthropic created and open-sourced the Model Context Protocol in November 2024. It was released under an open specification so any AI provider, tool builder, or developer can implement it. By 2026, MCP has been adopted across the AI industry by companies including OpenAI, Google, Microsoft, and thousands of independent developers.

What is an MCP server?

An MCP server is a lightweight program that wraps an external tool, API, or data source and exposes it through the Model Context Protocol. It advertises the tools it offers with names, descriptions, and input schemas. It handles incoming requests from MCP clients, executes the underlying operations, and returns structured results. Examples include MCP servers for Supabase, GitHub, Stripe, Playwright, and Slack.

What is an MCP client?

An MCP client is the component inside an AI application that connects to MCP servers, discovers available tools, and sends tool-call requests on behalf of the AI model. Claude Desktop, Cursor, Windsurf, and Claude Code all contain built-in MCP clients. When a language model decides it needs to call a tool, the MCP client handles the protocol communication with the server.

How is MCP different from REST APIs?

REST APIs are designed for application-to-application communication where the developer knows the endpoints in advance. MCP is designed for AI-to-tool communication where the model dynamically discovers what tools are available and what inputs they accept. MCP includes built-in tool discovery, typed input schemas, and a standardized protocol that works the same regardless of the underlying tool. REST requires per-API code; MCP provides a universal interface.

What are the best MCP servers?

The most widely used MCP servers in 2026 include Supabase (database), GitHub (code), Playwright (browser automation), Stripe (payments), Slack (messaging), and Brave Search (web search). The official registry at registry.modelcontextprotocol.io lists thousands of servers. See our curated list of the best MCP servers for 2026.

ToolRoute is an MCP gateway that gives AI agents access to 87 tools through one API key and one endpoint. Read the docs or explore the tool catalog.