Anthropic launched the Model Context Protocol (MCP) in November 2024 to solve a problem everyone else had been tiptoeing around: the NxM integration nightmare. Every AI platform needed its own custom connectors for every tool it wanted to work with. ChatGPT Plugins? ChatGPT only. OpenAI’s function-calling API? Same story. Want your tool to play nice with four different AI platforms? You were building four completely separate integrations.
Then within six months something interesting happened. OpenAI adopted MCP in March 2025. Google DeepMind came on board in April 2025. Microsoft announced preview support in May 2025. The MCP Registry grew 407% from September to November 2025, hitting nearly 2,000 server entries from the likes of GitHub, Stripe, Notion, and Salesforce.
When competing AI companies adopt a competitor’s standard that fast, something real is happening. For you, that vendor lock-in concern that probably had you hedging your bets across multiple platforms just got a whole lot smaller.
This article walks you through how the adoption played out, who’s running the show (and why it’s not just Anthropic), how the registry ecosystem works, and what it all means when you’re deciding which AI platform to commit to. For a comprehensive foundation on what the Model Context Protocol is and how its architecture works, start with our complete guide.
What is the Model Context Protocol and why did major platforms adopt it?
MCP is an open-source protocol Anthropic introduced in November 2024 to standardise how AI systems connect to external tools and data sources. It uses JSON-RPC 2.0 as its message format—if you’ve worked with APIs before, this should feel familiar. The core architecture takes its cues from Microsoft’s Language Server Protocol, which solved the exact same integration headache for IDEs years back.
The problem MCP tackles is pretty straightforward. Before MCP showed up, four AI platforms and fifty tools meant you needed 200 custom integrations. With MCP, you build 50 servers (one per tool) and 4 platform clients. That’s 54 implementations instead of 200.
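The arithmetic above is worth making concrete. A quick sketch of the integration counts, using the article’s own numbers:

```python
# Integration count before and after MCP, for N platforms and M tools.
platforms, tools = 4, 50

# Without a shared protocol: one custom integration per platform-tool pair.
custom_integrations = platforms * tools  # 4 x 50 = 200

# With MCP: one server per tool, plus one client per platform.
mcp_implementations = tools + platforms  # 50 + 4 = 54

print(custom_integrations, mcp_implementations)  # 200 54
```

The gap widens as either number grows: add a fifth platform and the proprietary count jumps by 50, while the MCP count goes up by one.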
So why would OpenAI and Google adopt something built by their competitor? Because cutting down integration friction helps everyone. Developers were getting tired of building the same integration three different ways just to cover multiple AI platforms. ChatGPT Plugins locked you into OpenAI-specific development. Custom GPTs tied integrations to specific GPT instances. Every vendor had built its own walled garden.
MCP gives you a universal interface for reading files, running functions, and managing contextual prompts. It handles stdio for local integrations and HTTP with Server-Sent Events for remote ones. By building on JSON-RPC 2.0 and supporting OAuth 2.0 for security, MCP slots into infrastructure you’ve probably already got running. You’re not ripping out your entire auth stack to use it.
Multi-vendor support also gives you something practical: you can actually negotiate with multiple AI vendors knowing your integrations stay portable. That kind of procurement leverage didn’t exist when every integration locked you to one platform.
Which major AI platforms have adopted MCP and when?
Anthropic kicked things off in November 2024 with Claude Desktop as the reference implementation, launching alongside SDKs for Python and TypeScript. That was day one.
OpenAI committed to MCP support in March 2025, rolling it out across ChatGPT Desktop, their Agents SDK, and Responses API. They were the first major competitor to get on board. Google DeepMind confirmed MCP support in April 2025 for upcoming Gemini models and infrastructure, though they also built the competing Agent2Agent protocol (more on that in a bit).
Microsoft announced at Build 2025 an early preview of Windows 11 embracing MCP, with integrations running through Copilot Studio, Azure OpenAI Service, and Semantic Kernel. That was May 2025. Salesforce joined in June 2025, building their Agentforce 3 platform around MCP interoperability and shipping servers for Salesforce DX, Heroku Platform, and MuleSoft.
Block and Apollo jumped in early. AWS started building MCP servers for code assistants. Dev tools companies like Zed, Replit, Codeium, and Sourcegraph implemented MCP for their AI-powered coding features.
Six months from launch to four major platform adopters. Compare that to OAuth 2.0, which took roughly four years, or OpenAPI at about five years. MCP became the de facto standard in less than twelve months.
That speed tells you something. When four competing platforms adopt something this fast, it’s because developers were demanding it and the alternative was watching everyone build integrations for whichever platform made it easiest.
What is the MCP Registry and why does it matter for standardisation?
The MCP Registry launched in September 2025 as a centralised directory of available MCP servers. Think of it as a discovery tool. Instead of digging through GitHub or docs hoping someone already built the connector you need, you just check the registry.
By November 2025, it had nearly 2,000 entries. That 407% growth in two months shows real developer momentum. You’ll find servers from GitHub for engineering automation, Stripe for payment workflows, Notion for productivity management, Hugging Face for model management, and Postman for API testing.
The registry sets MCP apart from proprietary approaches where each vendor runs their own marketplace. With ChatGPT Plugins, you went to OpenAI’s plugin store. With MCP, there’s one registry that works everywhere. That centralisation creates a network effect: more servers pull in more AI platform adopters, which brings more server developers, which builds more servers.
Salesforce contributed servers for Heroku Platform and MuleSoft. AWS is developing servers for code assistants. The registry quality gets maintained through a security vetting process run by Anthropic and the community, though you’ll still want to do your own security assessment before putting anything into production. If you’re considering building MCP servers for your organisation, the registry provides a submission pathway and discovery mechanism once they’re production-ready.
You can also run your own internal registry with your own governance controls. You don’t need to publish proprietary tool connectors publicly if that doesn’t make sense for your business.
Over 1,000 community-built servers exist because developers thought it was worth the time to build them. That only happens when people believe the standard’s going to stick.
Who governs MCP and how does multi-stakeholder control reduce vendor risk?
MCP is maintained by 9 core maintainers in the steering group, supported by 58 maintainers, with 2,900+ contributors hanging out on Discord. About 100 new contributors join every week. This governance model builds on MCP fundamentals established at launch but expands control beyond a single vendor.
The steering committee includes Anthropic maintainers and independent community leaders working together on spec changes. The Spec Enhancement Proposal (SEP) process drives collaborative evolution—the community processed 17 SEPs in roughly a quarter. Anyone can submit proposals. Maintainers review them, working groups talk them through, the steering committee votes.
One maintainer put it clearly: “People think the value of MCP is the protocol. The value is getting people to agree and do something.” Another said: “There is a focus on not overcomplicating the specification, and not designing ahead of need. When that’s the ethos driving decision-making, everyone’s voice matters.”
This multi-stakeholder setup reduces the risk that comes with vendor abandonment or sudden hostile changes. If Anthropic walks away or makes decisions the community hates, the openly licensed specs can be forked and carried forward. The Language Server Protocol followed this same path—from Microsoft-controlled to community-governed.
How does MCP compare to proprietary AI integration methods?
ChatGPT Plugins and Custom GPTs with Actions represent OpenAI’s proprietary approach. Build a ChatGPT Plugin and it only worked with ChatGPT. Create a Custom GPT with Actions and those integrations lock to that specific GPT instance. Want your tool to work with Claude or Gemini? Start from scratch.
The practical difference shows up when you want to switch platforms. With proprietary integrations, migration means rebuilding everything. With MCP, your existing integrations work straight away.
There’s a competing standard worth knowing about. Google donated the Agent2Agent (A2A) protocol to the Linux Foundation, with backing from AWS, Cisco, Microsoft, Salesforce, SAP, and ServiceNow. Over 100 companies now support A2A. It’s built on JSON-RPC 2.0 over HTTP, just like MCP.
The protocols look complementary on paper: A2A handles agent-to-agent coordination, while MCP manages AI-to-tool integration. But if they both grow to cover similar ground, they could end up competing. MCP’s four-platform adoption versus A2A’s single-vendor primary support suggests MCP is the dominant standard for tool integration right now, but things could shift.
Investment in proprietary integration development sits at risk if that vendor exits the market or dramatically changes their API. Investment in MCP development gets protected by multi-vendor support.
What enterprise security features make MCP production-ready?
The June 2025 spec update classified MCP servers as OAuth 2.0 resource servers. This gives you granular permissions and authorisation controls using infrastructure you likely already have running. If you’re using Auth0, Okta, or Azure Active Directory for identity management, those integrate with MCP without building new authentication systems.
The November 2025 spec introduced Client ID Metadata Documents (CIMD), replacing Dynamic Client Registration for scenarios with unbounded clients and servers. MCP uses OAuth 2.1 as the default authorisation approach, leveraging existing identity infrastructure with security guidelines that address confused deputy problems, token passthrough vulnerabilities, and session hijacking.
For transport security, MCP supports stdio for local-only communication (which never leaves the machine, so it carries no network exposure) and HTTP with Server-Sent Events for remote servers (which requires OAuth and TLS encryption).
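The resource-server model from the June 2025 spec boils down to a simple rule: a remote MCP server refuses any request that doesn’t carry a valid Bearer token, and tells rejected clients where to find its authorization metadata. Here’s a minimal sketch of that check, with token introspection stubbed out (a real server would verify a JWT’s signature and audience against your identity provider, and the metadata URL is a hypothetical example):

```python
# Sketch of the OAuth 2.0 resource-server pattern a remote MCP server follows.
# Token introspection is stubbed; real validation would check a JWT against
# Auth0, Okta, Azure AD, or whichever identity provider you already run.

def introspect(token: str) -> bool:
    # Hypothetical stand-in for signature, expiry, and audience checks.
    return token == "valid-example-token"

def authorize(headers: dict) -> tuple[int, dict]:
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer ") or not introspect(auth[len("Bearer "):]):
        # Reject with a pointer to the server's protected-resource metadata,
        # so the client knows which authorization server to talk to.
        metadata = 'Bearer resource_metadata="https://mcp.example.com/.well-known/oauth-protected-resource"'
        return 401, {"WWW-Authenticate": metadata}
    return 200, {}

status, _ = authorize({"Authorization": "Bearer valid-example-token"})
print(status)  # 200
```

The point of the pattern is that the MCP server never mints tokens itself; it only validates ones issued by identity infrastructure you already operate.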
The MCP gateway pattern is emerging for enterprise governance. Gateways like MCP Manager sit between clients and servers, providing centralised observability, rate limiting, JWT token validation, security headers, protocol version transformation, and caching. Cloudflare introduced MCP Server Portals to centralise, secure, and observe MCP connections across organisations.
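Of the gateway features listed above, rate limiting is the easiest to picture. A gateway typically keeps something like a token bucket per client and drops requests once the bucket empties; this is a generic sketch of that mechanism, not the implementation any particular gateway product uses:

```python
import time

# Minimal per-client token-bucket rate limiter, of the kind an MCP gateway
# applies before forwarding a request to a downstream server.
class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=5)
results = [bucket.allow() for _ in range(6)]
print(results)  # burst of 5 allowed, 6th denied until the bucket refills
```

A real gateway layers the other concerns (JWT validation, audit logging, caching) around the same choke point, which is exactly why putting one component between clients and servers is attractive for governance.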
What does MCP adoption mean for technical decision-makers?
MCP’s multi-vendor support reduces the risk that comes with picking an AI platform. You can commit to AI investments without the vendor lock-in worry that probably held you back before.
Your engineering team builds one MCP server instead of four proprietary integrations. Companies using MCP servers report reduced complexity in managing tool integrations and fewer errors. Even if your preferred AI vendor exits the market or jacks up pricing, your integrations work with alternative platforms.
The competitive dynamics signal something worth noting. When OpenAI and Google adopt Anthropic’s standard instead of building their own, that’s not typical vendor behaviour. It happens when the alternative is worse: fragmented ecosystems where developers pick one platform and ignore the rest.
There’s risk to keep in mind. The A2A protocol exists and has backing from major companies. If A2A gains momentum for agent-to-agent communication and MCP stays focused on tool integration, they might coexist happily. If they end up competing head-to-head, you’ll need to watch which one picks up more platform adoption. Right now, MCP’s four-platform support versus A2A’s single primary vendor suggests MCP is the standard for tool integration.
Next steps worth considering: evaluate the MCP registry for servers that match your needs. Assess whether migrating from proprietary integrations makes sense based on your switching costs and the ROI of multi-vendor support. Look at security requirements for production deployment, particularly around OAuth integration and gateway governance.
The standardisation trajectory shows real momentum: six months to four major platforms, registry growth at 407% over two months, active governance processing 17 proposals in a quarter. MCP is becoming infrastructure rather than vendor tooling.
FAQ Section
How quickly did MCP achieve industry-wide adoption?
MCP reached four major platform adopters (OpenAI, Google, Microsoft, Salesforce) within six months of the November 2024 launch. That’s significantly faster than OAuth 2.0 (roughly four years) or OpenAPI (about five years) to reach comparable cross-vendor adoption. The MCP Registry grew 407% from September to November 2025, which points to strong developer momentum.
Does Google support both MCP and Agent2Agent protocols?
Yes. Google DeepMind confirmed MCP support in April 2025 while also developing the Agent2Agent (A2A) protocol, which was donated to the Linux Foundation. The protocols might turn out to be complementary (A2A for agent coordination, MCP for tool integration) or they might compete. MCP’s four-platform support suggests it’s dominant for tool integration at the moment, but the situation could change.
Can organisations switch between AI platforms without rebuilding integrations?
Yes, and that’s MCP’s main selling point. A single MCP server implementation works across Anthropic Claude, OpenAI ChatGPT, Google Gemini, and Microsoft Copilot. Switching platforms doesn’t require any integration redevelopment, which cuts down vendor lock-in substantially. With proprietary integrations, you’d be rebuilding everything from scratch.
Who controls the MCP specification and prevents vendor takeover?
The MCP Steering Committee governs the protocol with 9 core maintainers, 58 supporting maintainers, and 2,900+ community contributors. The Spec Enhancement Proposal (SEP) process enables collaborative evolution, with 17 proposals processed in the first quarter. Multi-stakeholder control reduces the risk of Anthropic making unilateral changes. The protocol specifications are openly licensed, so the community can fork them if needed.
Is MCP secure enough for enterprise production deployments?
Yes, especially with the June 2025 spec update classifying MCP servers as OAuth 2.0 resource servers. This lets enterprise identity providers (Auth0, Okta, Azure AD) manage authentication using infrastructure you’ve already got. The November 2025 spec introduced Client ID Metadata Documents as a replacement for Dynamic Client Registration. MCP gateways provide governance and observability for production environments.
How many MCP servers are available in the registry?
The MCP Registry contained nearly 2,000 entries by November 2025, representing 407% growth from the September baseline. Available servers include GitHub for engineering automation, Stripe for payment workflows, Notion for productivity, Hugging Face for model management, Postman for API testing, plus Salesforce servers (Heroku, MuleSoft), AWS servers, and options built by the community.
What happened to OpenAI’s ChatGPT Plugins after MCP adoption?
ChatGPT Plugins represented OpenAI’s proprietary integration approach. OpenAI’s March 2025 MCP adoption signals a shift towards the open standard, though legacy plugins may stick around during migration. MCP supersedes the plugin architecture with multi-vendor compatibility. Custom GPTs & Actions currently remain as OpenAI-specific integration methods.
Does MCP work with existing OAuth identity providers?
Yes. The June 2025 spec classified MCP servers as OAuth 2.0 resource servers, enabling integration with Auth0, Okta, Azure Active Directory, and other enterprise identity providers. Organisations use existing authentication infrastructure without building new identity systems. MCP uses OAuth 2.1 as the default authorisation approach.
What is the cost reduction from using MCP vs proprietary integrations?
For organisations integrating with multiple AI platforms, MCP cuts development costs substantially. Instead of N platforms times M tools (4 platforms × 50 tools = 200 custom implementations), MCP requires building 50 servers plus 4 clients (54 implementations total). Companies using MCP servers report decreased complexity in managing tool integrations and fewer errors.
Can organisations build private MCP servers or must they use the public registry?
Organisations can build private MCP servers using Anthropic’s SDKs without submitting to the registry. The registry provides discovery for public servers but it’s not required. Enterprise deployments often use internal servers for proprietary data and tools, with MCP gateways handling governance. The registry supports enterprises adopting their own registries with self-managed governance controls.
What is the MCP gateway pattern for enterprise governance?
MCP gateways sit between clients and servers, providing centralised observability and governance. Features include team provisioning, security policies (rate limits, allowed servers), audit logging, identity management, JWT token validation, security headers, protocol version transformation, and caching. Tools like MCP Manager and Cloudflare’s MCP Server Portals handle production monitoring and compliance requirements.
How does MCP handle authentication for remote servers?
MCP supports multiple transports. Stdio for local-only communication never crosses the network. HTTP with Server-Sent Events for remote servers requires OAuth 2.0 plus TLS encryption. OAuth resource server classification enables granular permissions per server and tool with token-based authentication. Servers implement Protected Resource Metadata endpoints and token validation middleware for security.