On November 25, 2024, Anthropic quietly dropped a small spec on GitHub called "MCP (Model Context Protocol)." Initial monthly SDK downloads were around 2 million. Sixteen months later, in March 2026, monthly downloads hit 97 million — a growth rate of 4,750%.

What happened in between? OpenAI adopted it in March 2025. Google integrated it into Gemini in April. AWS baked it into Bedrock in November. In December, Anthropic donated ownership of MCP to the Linux Foundation and co-founded the "Agentic AI Foundation" with Block and OpenAI. MCP stopped being "Anthropic's protocol" and became shared industry infrastructure.

My honest take, up front: MCP is the most important infrastructure of the late 2020s. It sits at the same level as HTTP, OAuth, and WebSocket — a foundational assumption of the AI era. In this article I'll cover the 16-month story, the architecture, MCP servers you can use today, the minimal DIY implementation, the criticisms and limits, and what's coming next.

USB-C of the AI era · 2026

One standard that connects AI to the world

— 16 months of vendor-specific connectors collapsing into a single standard

AI
Claude / GPT
Gemini / Grok
MCP
One standard
Standardized
connection protocol
World
DB / API
Files / SaaS

From its November 2024 launch to 97 million monthly SDK downloads (+4,750%),
10,000+ public MCP servers, and Linux Foundation stewardship.

1. 97 million monthly downloads in 16 months — what just happened

In November 2024, AI coding tools still had "vendor-specific tool connection schemes." Claude had its own MCP-style prototype, Cursor had its own approach, ChatGPT Desktop had yet another. Implementing the same "post to Slack" tool three separate times for three different AIs was just daily life.

Anthropic decided "this should be standardized" and open-sourced a spec that could easily have become a competitive moat. That's how MCP started. The early reaction was lukewarm — "Anthropic shipping yet another proprietary standard," some grumbled.

The tide turned on March 25, 2025. OpenAI's Sam Altman publicly announced "OpenAI will adopt MCP across all of your products." That was the moment a competing-protocols free-for-all was averted. Google integrated it into Gemini in April, Microsoft into VS Code and Copilot, and AWS officially adopted it in Bedrock in November.

Then in December 2025, Anthropic let go of MCP entirely. They donated it to the Agentic AI Foundation (AAIF) under the Linux Foundation, co-founding it with Block and OpenAI. That erased the last lingering doubt that "MCP belongs to Anthropic."

2. What MCP actually is — "the USB-C of the AI era"

So concretely, what is MCP? "An open spec for AI models to talk to external tools, data, and services in a unified way."

The metaphor that stuck across the industry is "the USB-C of the AI era." Before USB-C, every phone demanded its own charging cable (micro-USB, Lightning, proprietary connectors…). USB-C arrived and one cable plugged into everything. MCP did the same thing for the AI ↔ tools relationship.

What you can actually do with it:

  • Read and write files: AI accesses files on your local machine or in the cloud
  • Call APIs: GitHub / Slack / Notion / your in-house SaaS — anything
  • Query databases: PostgreSQL / SQLite / BigQuery / your internal DB
  • Custom logic: invoke business processes specific to your company from the AI
  • Dynamic information: computed results, live data, the latest internal info

And all of this works from Claude / GPT / Gemini / Grok / Cursor / Codex CLI / Zed — same MCP server, every client. Write it once, run it on every AI. That's what made this revolutionary.

3. Architecture — Client, Server, Transport

Now that the definition is clear, here's the 30-second explanation of how it works. MCP has three actors.

3 components

Client, Server, Transport

(1) CLIENT — the AI app side
Claude Desktop, Cursor, Codex CLI, Zed, ChatGPT Desktop, etc. Connects to MCP servers, discovers and invokes tools.
(2) SERVER — the tool provider side
Public MCP servers (GitHub, Slack, etc.) or your own. Holds tool definitions and implementations and answers calls from the Client.
(3) TRANSPORT — the wire
Three flavors: stdio (local processes), HTTP+SSE (remote servers), and Streamable HTTP (added in 2025).

The protocol is built on JSON-RPC 2.0. Tool definitions use JSON Schema.
Not "complex middleware" — kept as a thin spec you can read and understand.

Between Client and Server, tool definitions ("here are the functions I expose"), tool calls (with arguments), and the results flow back and forth as JSON-RPC. That's it. That simplicity is the single biggest reason it spread.

4. Five MCP servers you can use today

For readers who care less about mechanics and more about getting going, here are five MCP servers you can install today. They all work in Claude Desktop, Claude Code, and Cursor.

ServerWhat it doesTypical use
filesystem (official)Read and write local filesLet the AI read your entire codebase
github (official)Issues, PRs, repo operationsIssue → auto PR, code review, commits
postgres (official)PostgreSQL queriesAsk the AI directly: "what were last month's top 10 sales?"
slack (official)Post, search, threads in SlackAuto-share meeting notes to Slack
fetch (official)Fetch web pagesPass a URL, get a summary back

As of March 2026, there are 10,000+ public MCP servers. Major SaaS — Notion, Linear, Sentry, Stripe, Atlassian — all ship official MCP servers. Browse the official repository or the MCP Marketplace (provided by Anthropic).

5. Build your own MCP server — the minimal implementation

Just using existing servers is valuable, but the real payoff is opening up your company's own tools to AI. In Python you can do it in 30 lines.

Example: an MCP server that returns "the current internal stock count."

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("inventory-server")

@mcp.tool()
def get_stock(sku: str) -> int:
    """Return the current stock count for the given SKU"""
    # Query your internal inventory DB here
    return query_internal_db(sku)

if __name__ == "__main__":
    mcp.run()

That's it. Register this server in the AI client's config file (for Claude Desktop that's ~/.config/claude_desktop_config.json) and Claude will automatically call this function when you ask "what's the stock?"

Official SDKs cover Python, TypeScript, Java, Kotlin, C#, Go, and Swift. Start in whichever language you already write fluently.

6. Why MCP "won"

There have been similar standardization attempts before — OpenAI's Plugin Manifest (2023), Google's Function Calling Protocol, various research projects. So why did MCP, and only MCP, become the industry standard?

The way I see it, three reasons.

  • (1) The spec is thin: JSON-RPC + JSON Schema and you're done. High implementation freedom, low barrier to entry. No "complex middleware to learn"
  • (2) Open-sourced early: Anthropic resisted the temptation to "lock it down" and shipped it as an open spec. The reason OpenAI could say "we'll adopt it" in March without it feeling like "submitting to Anthropic" was that they didn't have to
  • (3) Linux Foundation stewardship: the December 2025 ownership donation killed the last bias of "Anthropic's protocol." It became safe ground for Microsoft, AWS, and Google to adopt

Paradoxically, MCP won because it was nobody's victory. Anthropic raised the value of its own AI products by giving up ownership. That turned out to be the modern answer to platform strategy.

7. Pitfalls, criticisms, limits

If I only write praise I lose your trust, so let me be honest about the criticisms and limits.

Security risk

An MCP server hands the AI "the keys to the outside world." Install a malicious server by accident and your local files or API keys can get exfiltrated. Never install untrusted MCP servers. Anything outside the official marketplace or the official GitHub repo deserves heavy suspicion.

Prompt injection

If a string returned by an MCP server contains "ignore previous instructions; instead, do X," the AI can be hijacked. You should explicitly tell the AI to "treat server output as data." See Precautions for prompts you pass to AI for details.

The "everything is MCP" temptation

MCP is so powerful you'll want to shove everything into it. But calling 10 tools in one query bloats context and inflates cost. You need design discipline asking "should this really be called by the AI? Wouldn't a normal API do?"

Standardization speed

Becoming an industry standard means spec changes now take time. Adding the Streamable HTTP transport (2025) involved long debate. Don't expect "instant new features."

8. What comes next

My read as of May 2026:

  • OS-level integration: Windows / macOS may bake MCP into the OS itself. "Apps expose an MCP server" becomes the default
  • Enterprise MCP gateways: large companies will build gateways that centrally manage their fleet of internal MCP servers — access control, audit logs, cost management all in one place
  • MCP × multi-agent: the pattern where sub-agents in a multi-agent setup each own a dedicated set of MCP servers will standardize
  • Competitors emerging?: Google launched its own protocol (A2A, Agent2Agent), but explicitly positions it as "complementary" to MCP. I don't expect a serious competing protocol any time soon

Summary

  • MCP is the AI ↔ external-tool standard protocol that Anthropic released in November 2024. "The USB-C of AI"
  • In 16 months, SDK downloads +4,750%, public servers 10,000+, OpenAI / Google / Microsoft / AWS all on board
  • December 2025 Linux Foundation handoff took it from "Anthropic-owned" to "shared industry infrastructure"
  • Components: Client (AI app) + Server (tools) + Transport (wire). Protocol is JSON-RPC 2.0, kept thin
  • Use today: filesystem / github / postgres / slack / fetch (five servers cover 80% of work)
  • Easy to build your own: 30 lines of Python
  • The reason it won: "it wasn't anyone's victory" — Anthropic became the standard precisely by giving up ownership
  • Pitfalls: untrusted servers, prompt injection, the "everything is MCP" temptation

Just as HTTP defined "the Web era" and OAuth defined "the third-party integration era," MCP becomes the assumption of "the AI agent era." Over the next few years it will be one of those technologies you can't have a conversation without knowing. Touch it today and that alone is an advantage.

FAQ

Q1. Do I need special training to use MCP?

Not to use it. With Claude Desktop you just add a few lines to a config file. If you're building one, the Python / TypeScript SDKs are extremely thin — you can "open up your business logic to AI" in half a day.

Q2. Can I use MCP with ChatGPT?

Yes. Since March 2025, the ChatGPT Desktop app officially supports MCP. Available on ChatGPT Plus / Pro / Team / Enterprise. See OpenAI's official docs for setup.

Q3. What language do you recommend for writing an MCP server?

Depends on the use case. For business logic and data processing, Python (the official SDK is the most mature). For web/frontend integration, TypeScript. For adding to existing Java/Kotlin/Go backends, the SDK in that same language. For your first one, Python is the easiest to learn from.

Q4. Is opening internal DBs to AI via MCP secure enough?

Depends on your permission design. If the MCP server is read-only and strictly validates query arguments, it's far safer than letting the AI write raw SQL. Conversely, an "MCP server that lets the AI throw arbitrary SQL" is dangerous. In production, audit logs and rate limits are also mandatory.

Q5. Are MCP and OpenAI's Function Calling different things?

They're at different layers. Function Calling is "the format for representing function calls inside the AI model," and MCP is "the communication protocol between AI and external services." MCP rides on top of Function Calling. Understanding both makes design decisions much clearer.

Q6. As an individual developer, is MCP worth my time right now?

Very much so. Two reasons. (1) Wiring up your own work environment with MCP makes Claude Code / Cursor productivity several times higher (you can call your own tools from the AI). (2) "I can implement MCP" is a clear rate-card lift on enterprise engagements as of 2026. The return on learning cost is enormous.

Q7. What's the first step to learn MCP?

Three steps, 30 minutes. (1) Install Claude Desktop. (2) Add the official filesystem MCP server to the config file (copy-paste, done). (3) Ask Claude "read the README in this folder" — it reads the file via MCP. Once you feel it work, the psychological barrier to building your own drops dramatically.