
MCP (Model Context Protocol): The 'USB Port for AI' Mental Model

What is MCP and why is every AI tool adopting it in 2026? A mastery guide to Anthropic's open standard for connecting AI models to the real world.

Before USB, every device needed a custom cable — keyboards used PS/2, mice used serial ports, printers used parallel ports. It was a mess.

USB standardized everything: one protocol, any device.

Before MCP (Model Context Protocol), every AI model needed custom integration code to connect to every tool — Slack, GitHub, Notion, your database. A custom plugin for every pair. An N×M integration nightmare.

MCP is the USB Port for AI. One open standard. Any AI model. Any data source or tool.


Part 1: Foundations (The Mental Model)

The N×M Integration Problem

Without MCP:

Claude ←→ [custom code] ←→ GitHub
Claude ←→ [custom code] ←→ Notion
Claude ←→ [custom code] ←→ PostgreSQL
GPT-4 ←→ [custom code] ←→ GitHub   (different custom code!)
GPT-4 ←→ [custom code] ←→ Notion   (yet another...)

With N models and M tools, you need N×M custom integrations.
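The arithmetic makes the pain concrete. A quick sketch (the model and tool counts are made up for illustration):

```python
# Hypothetical counts: 5 models, 8 tools.
models, tools = 5, 8

# Without a shared protocol, every (model, tool) pair needs its own glue code.
integrations_without_mcp = models * tools

# With MCP, each model and each tool implements the protocol exactly once.
components_with_mcp = models + tools

print(integrations_without_mcp)  # 40 custom integrations
print(components_with_mcp)       # 13 protocol implementations
```

The cost drops from multiplicative (N×M) to additive (N+M), and it keeps dropping as the ecosystem grows.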

MCP = The Universal Standard

With MCP:

Claude ──┐
GPT-4o ──┤── MCP Protocol ──► GitHub MCP Server
Gemini ──┘                ──► Notion MCP Server
                          ──► PostgreSQL MCP Server

Three roles make this work:

  • MCP Host: The AI application (Claude Desktop, your custom agent).
  • MCP Client: Lives inside the host; manages the connection to servers.
  • MCP Server: A lightweight server exposing tools, resources, and prompts.

Any MCP-compatible model can use any MCP-compatible server. Write the server once. All AI models benefit.


Part 2: The Investigation (What MCP Exposes)

An MCP Server exposes three types of capabilities:

1. Tools (Functions the AI Can Call)

# Example: A database MCP server exposing a query tool
{
    "name": "execute_sql",
    "description": "Execute a read-only SQL query on the production database",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "The SQL query to execute"}
        },
        "required": ["query"]
    }
}
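Under the hood, tool invocations travel as JSON-RPC 2.0 messages, with `tools/call` as the method name per the MCP spec. A sketch of what a call to this tool might look like on the wire (the query string is illustrative):

```python
import json

# MCP messages are JSON-RPC 2.0. "tools/call" is what the client sends
# when the model decides to invoke a tool; "arguments" must satisfy the
# tool's inputSchema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "execute_sql",
        "arguments": {"query": "SELECT count(*) FROM orders"},
    },
}
wire = json.dumps(request)
print(wire)
```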

2. Resources (Data the AI Can Read)

Resources are data sources the AI can access — like files, database records, or API responses.

# The AI can request: mcp://postgres/tables/orders
# The server returns: schema and sample data for the orders table
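On the wire, that resource fetch is a `resources/read` request naming the URI (again JSON-RPC 2.0 per the MCP spec; sketch only):

```python
import json

# The client asks for a resource by URI; the server answers with its
# contents (text or binary) plus a MIME type.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "mcp://postgres/tables/orders"},
}
print(json.dumps(request))
```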

3. Prompts (Reusable Prompt Templates)

Pre-built prompt templates that users can invoke. Example: “Analyze sales data for Q4” → triggers a pre-built analytical prompt with real data injected.
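Stripped of the SDK, a prompt template is just a parameterized function. A minimal sketch (the name, wording, and default table are hypothetical; a real server would register this so clients can invoke it by name):

```python
# Hypothetical prompt template. An MCP server would expose this under a
# name like "analyze_sales" and substitute the arguments on invocation.
def analyze_sales_prompt(quarter: str, table: str = "orders") -> str:
    return (
        f"Analyze sales data for {quarter} using the `{table}` table. "
        "Summarize revenue trends and flag anomalies."
    )

print(analyze_sales_prompt("Q4"))
```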


Part 3: The Diagnosis (Building Your Own MCP Server)

Using the Python mcp SDK:

from mcp.server.fastmcp import FastMCP
import psycopg2

# Create the MCP server
mcp = FastMCP("Company Data Server")

@mcp.tool()
def get_recent_orders(limit: int = 10) -> list[dict]:
    """Get the most recent orders from the database."""
    conn = psycopg2.connect("postgresql://...")
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, customer, total, status FROM orders "
                "ORDER BY created_at DESC LIMIT %s",
                (limit,),
            )
            rows = cur.fetchall()
        return [{"id": r[0], "customer": r[1], "total": r[2], "status": r[3]} for r in rows]
    finally:
        conn.close()

@mcp.tool()
def get_customer_stats(customer_id: int) -> dict:
    """Get lifetime value and order count for a customer."""
    ...

@mcp.resource("company://docs/{filename}")
def get_document(filename: str) -> str:
    """Retrieve a company document by filename."""
    with open(f"/company_docs/{filename}") as f:
        return f.read()

# Run as a standalone server
if __name__ == "__main__":
    mcp.run(transport="stdio")  # Claude Desktop uses stdio transport

Connecting to Claude Desktop

// claude_desktop_config.json (e.g. ~/Library/Application Support/Claude/ on macOS)
{
    "mcpServers": {
        "company-data": {
            "command": "python",
            "args": ["/path/to/my_mcp_server.py"]
        }
    }
}

Now Claude Desktop has access to your company database through natural language queries!


Part 4: The Resolution (MCP in Production)

Security: The Critical Rules

  • Never expose write operations without confirmation: Read-only tools by default.
  • Authenticate every request: JWT tokens or API keys per MCP server.
  • Scope per client: Different AI clients get different tool access levels.
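The first rule can be enforced in code, not just policy. A sketch of a guard you could place in front of an `execute_sql`-style tool (the exact checks are an assumption; a real deployment should also connect with a read-only database role rather than trust string inspection alone):

```python
import re

def ensure_read_only(query: str) -> str:
    """Reject anything that is not a single SELECT/WITH statement."""
    stripped = query.strip().rstrip(";").strip()
    # A semicolon remaining after trimming means stacked statements.
    if ";" in stripped:
        raise ValueError("multiple statements are not allowed")
    # Only allow queries that start with SELECT or WITH (case-insensitive).
    if not re.match(r"(?i)^(select|with)\b", stripped):
        raise ValueError("only read-only queries are allowed")
    return stripped

print(ensure_read_only("SELECT * FROM orders LIMIT 5"))
```

Defense in depth: the guard catches obvious abuse, while the database role guarantees the invariant even if the guard is bypassed.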

The MCP Ecosystem (2026)

Major MCP servers already available:

  • GitHub MCP: Read repos, issues, PRs, commit history.
  • Postgres MCP: Query your database.
  • Slack MCP: Read messages, send notifications.
  • Google Drive MCP: Read/write documents and sheets.
  • Filesystem MCP: Read local files (with sandboxing).

Using MCP Servers with LangChain

import asyncio

from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Point the client at the MCP server we built above
server_params = StdioServerParameters(
    command="python",
    args=["my_mcp_server.py"]
)

async def run_agent():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # These tools are now usable by any LangChain agent
            tools = await load_mcp_tools(session)
            agent = create_react_agent("openai:gpt-4o", tools)
            result = await agent.ainvoke({"messages": [
                {"role": "user", "content": "Show me the top 5 orders from today."}
            ]})
            print(result["messages"][-1].content)

asyncio.run(run_agent())

Final Mental Model

Pre-MCP    N models × M tools = N×M custom integrations. Chaos.
MCP        One protocol. Any model, any tool. Like USB.

MCP Tool       A function the AI can call (execute_sql, send_slack_message).
MCP Resource   Data the AI can read (files, DB schema, docs).
MCP Prompt     A pre-built prompt template the AI can invoke.
MCP Server     Your custom integration. Write once, all models benefit.

Security rule  Read-only by default. Authenticate everything.

MCP is the missing infrastructure layer between AI models and the real world. In 2026, every serious AI application is either built on MCP or building toward it.
