Glossary
What is Model Context Protocol (MCP)?
Model Context Protocol (MCP) is an open standard for connecting AI tools to data sources, knowledge layers, and external services. Released by Anthropic in 2024, MCP lets any compatible AI client (Claude, ChatGPT, Cursor, Codex) query a server using a common protocol. MCP makes AI integrations vendor-neutral: build once, connect to any MCP-compatible client.
Origins
Where the term comes from
Anthropic released the Model Context Protocol as an open specification in 2024 alongside reference implementations and SDKs. The protocol was designed to address fragmentation: every AI tool had its own way of connecting to external data, which meant integrations had to be built per-tool. MCP standardizes the interface so a single server can serve many clients.
Capabilities
What Model Context Protocol (MCP) does
Standardizes AI tool integrations
One protocol works across Claude, ChatGPT, Cursor, Codex, and any other MCP-compatible client. No more per-tool integrations.
Exposes tools, resources, and prompts
MCP servers can expose callable tools (functions), resources (data), and prompts (templates) to AI clients in a structured way.
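Under the hood, MCP traffic is JSON-RPC 2.0. The sketch below hand-rolls a tiny server-side dispatcher for the spec's `tools/list` and `tools/call` methods, with no SDK and no transport, just to show the structured shape a client sees. The `get_weather` tool is a made-up placeholder; a real server would use an official MCP SDK and a transport such as stdio or HTTP.

```python
# Hypothetical tool: any callable the server wants to expose.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

# Tool registry: name -> handler, description, and a JSON Schema for input.
TOOLS = {
    "get_weather": {
        "handler": get_weather,
        "description": "Return a weather summary for a city.",
        "inputSchema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC 2.0 request the way an MCP server would."""
    method, params = request["method"], request.get("params", {})
    if method == "tools/list":
        # Discovery: advertise every tool with its schema.
        result = {"tools": [
            {"name": n, "description": t["description"], "inputSchema": t["inputSchema"]}
            for n, t in TOOLS.items()
        ]}
    elif method == "tools/call":
        # Invocation: run the named tool with the supplied arguments.
        tool = TOOLS[params["name"]]
        text = tool["handler"](**params.get("arguments", {}))
        result = {"content": [{"type": "text", "text": text}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": f"Unknown method: {method}"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# A client first discovers tools, then calls one:
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
call = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
               "params": {"name": "get_weather", "arguments": {"city": "Berlin"}}})
print(call["result"]["content"][0]["text"])  # Sunny in Berlin
```

Because every MCP client speaks this same request shape, the registry above never needs to know which AI tool is on the other end.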
Vendor-neutral architecture
Servers built once work across all AI vendors. AI clients can connect to many servers.
Powers Context OS integrations
A Context OS like DearTech-OS exposes its knowledge graph through an MCP server, so any compatible AI tool can query, search, and traverse the graph.
Distinctions
Model Context Protocol (MCP) vs adjacent concepts
Model Context Protocol (MCP) is often confused with related but distinct ideas. Here is how it differs.
| Concept | What it is | How Model Context Protocol (MCP) differs |
|---|---|---|
| OpenAI plugins | Per-tool plugin system, OpenAI-specific. | Open, vendor-neutral standard. Servers work across multiple AI clients. |
| Function calling | Per-prompt function definitions, in-conversation. | Persistent, server-based connections. The server lifecycle is independent of any single conversation. |
| Traditional API | Per-app integration. Each AI tool needs its own client code. | Standardized protocol. AI clients ship with built-in MCP support, so no per-app integration is needed. |
Who uses it
Who uses Model Context Protocol (MCP)
Anthropic (Claude), OpenAI (ChatGPT, where MCP support is rolling out), Cursor, Codex, and a growing list of AI tools support MCP natively. Companies building Context OS infrastructure, internal AI tools, or custom AI integrations use MCP to avoid per-tool integration work.
FAQ
Common questions about Model Context Protocol (MCP)
Which AI tools support MCP?
Claude (Pro, Team, Enterprise) supports MCP natively through Claude Desktop and Claude Code. Cursor supports MCP. ChatGPT MCP support is rolling out. Codex and other tools have varying levels of support, with the ecosystem expanding rapidly.
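As one concrete example of native support, Claude Desktop registers MCP servers in its `claude_desktop_config.json` file. The entry below is a sketch: the server label and path are placeholders, and the command launches Anthropic's reference filesystem server. Other clients use their own config locations, but the server being registered is the same.

```json
{
  "mcpServers": {
    "my-knowledge-server": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/docs"]
    }
  }
}
```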
How is MCP different from a traditional API?
An API is per-app: each AI tool needs its own client code to call your API. MCP is standardized: any MCP-compatible AI tool can connect to your MCP server without per-tool integration work. Build once, connect everywhere.
How does DearTech-OS use MCP?
DearTech-OS exposes the company knowledge graph as an MCP server. AI tools that support MCP (Claude, Cursor, etc.) can search, traverse, and query the graph through the standard protocol, without DearTech-OS having to build a per-tool integration.
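Because MCP messages are plain JSON-RPC 2.0, a graph query from any client looks the same on the wire. The sketch below builds such a request; the tool name `search_graph` and its arguments are illustrative placeholders, not documented DearTech-OS tool names.

```python
import json

# Hypothetical request an AI client might send to a knowledge-graph
# MCP server. "search_graph" is an illustrative tool name.
request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {
        "name": "search_graph",
        "arguments": {"query": "Q3 pricing decisions", "limit": 5},
    },
}

# Serialize and parse it exactly as any vendor's client would:
# the protocol layer is identical regardless of which AI tool sends it.
wire = json.dumps(request)
decoded = json.loads(wire)
print(decoded["method"])  # tools/call
```

The server answers with a standard `result` payload, so swapping Claude for Cursor (or any other client) changes nothing on the DearTech-OS side.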
Related terms
Keep going
See Model Context Protocol (MCP) in practice
DearTech-OS is a Context OS for founder-operators. Explore the product or talk through whether one is right for your team.