🕒 Estimated time: 10–15 minutes
🧠 What You'll Learn
The 3 core components of MCP: Host, Client, and Server
How these components interact to enable seamless tool use by AI
Real-world analogies to simplify your understanding
Hands-on: Simulate the Host-Client-Server architecture locally in Python to understand how the components communicate.
🏗️ The MCP Architecture at a Glance
The Model Context Protocol (MCP) uses a client-server architecture tailored to AI applications. It is built on three key components that communicate using standardized MCP messages:
Host – The app that interacts with the user (e.g., ChatGPT, Claude, Cursor)
Client – The protocol adapter inside the Host
Server – The backend that exposes tools, resources, and prompts
Each plays a distinct role in enabling the AI to access external capabilities.
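To make "standardized MCP messages" a bit more concrete: under the hood, MCP messages are JSON-RPC 2.0 requests and responses. As a simplified, illustrative sketch (real messages carry additional fields), a tool-call request can be pictured as a Python dict like this:
# A simplified picture of an MCP tool-call request (JSON-RPC 2.0).
# Illustrative only; real messages include more metadata.
tool_call_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"location": "Paris"},
    },
}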
🧩 1. Host: The AI Interface You Interact With
The Host is the user-facing application where the AI lives and interacts with human input.
Examples:
OpenAI's ChatGPT interface
Claude Desktop
Cursor IDE
A custom web app with an embedded LLM
Responsibilities:
Captures user input
Maintains chat or session history
Decides when external context is needed
Displays model responses
Contains the MCP Client (MCP Adapter)
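As a toy illustration of those responsibilities (not a real Host implementation; the helper names below are made up), you can picture a single Host "turn" like this: capture input, keep history, let the model decide whether a tool is needed, delegate to the Client, and show the reply.
# Toy sketch of one Host turn (illustrative only; helper names are hypothetical).
chat_history = []

def llm_decides_tool(user_input):
    """Hypothetical stand-in for the model deciding whether a tool is needed."""
    if "weather" in user_input.lower():
        return "get_weather", {"location": "Paris"}
    return None, {}

def handle_turn(user_input, call_tool):
    chat_history.append(("user", user_input))     # maintain session history
    tool, params = llm_decides_tool(user_input)   # decide if external context is needed
    if tool:
        result = call_tool(tool, **params)        # delegate the request to the MCP Client
        reply = f"Based on the tool result: {result}"
    else:
        reply = "I can answer that from my own knowledge."
    chat_history.append(("assistant", reply))     # record and display the response
    return reply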
🔌 2. Client: The MCP Adapter
The Client lives inside the Host and knows how to “speak MCP.” It’s the bridge between the Host and external Servers.
Responsibilities:
Discover available MCP servers and their capabilities
Format and send requests to those MCP servers
Handle results and pass them back to the Host or model
The Client is the protocol driver that makes the AI "tool-aware."
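As a rough sketch of that role (a hypothetical wrapper, not the official MCP SDK API), you can picture the Client as a thin layer that discovers a server's tools and forwards calls to it:
# Hypothetical sketch of an MCP Client wrapper (not the official SDK API).
class MCPClient:
    def __init__(self, server):
        # In practice this would hold a connection (e.g. stdio or HTTP), not a Python object.
        self.server = server

    def discover(self):
        # Real MCP: send a tools/list request and parse the advertised capabilities.
        return self.server.list_tools()

    def call(self, tool_name, **arguments):
        # Real MCP: send a tools/call request; here it is just a direct method call.
        result = self.server.call_tool(tool_name, arguments)
        return result  # handed back to the Host / model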
⚙️ 3. Server: The Capability Provider
The Server is an external program (local or remote) that wraps one or more capabilities. These could be:
Tools: Functions that perform an action.
Resources: Data sources that can be read.
Prompts: Predefined instructions or templates.
Responsibilities:
Advertise what it can do in a standard way
Receive and execute requests (e.g., run a Python function or query a database)
Return results to the client
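To make "advertise what it can do in a standard way" concrete: during discovery, a server returns a description of each tool, including a JSON Schema for its inputs. A simplified sketch of what a weather tool's advertisement might look like (real responses carry more metadata):
# Simplified sketch of a tool advertisement returned during discovery.
weather_tool_advertisement = {
    "name": "get_weather",
    "description": "Get the current weather for a given location",
    "inputSchema": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}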
🔁 How They Work Together
Here's a simplified step-by-step example of the communication flow:
User inputs a query to the Host app (e.g., "What's the weather in Paris?").
The LLM, embedded in the Host, determines it needs to use a tool to get the answer.
The Client discovers a Server with a weather tool.
The Client sends a standardized request to the Server to call the weather tool.
The Server runs the function (e.g., calls a weather API) and returns the result.
The Client feeds the result back to the LLM.
The Host formats the LLM's final response and shows it to the user.
✅ Hands-On: Simulate the MCP Architecture (Offline)
🔧 Goal:
You’ll simulate the three MCP components using basic Python functions to understand how they interact.
The complete source code can be found on GitHub.
📁 Step 1: Create a Mock Server (Tool Provider)
Create a file named server.py with the following code. This function simulates the work a real server would do.
# server.py
def get_weather(location):
    """Mock tool: pretend to look up the weather for a location."""
    return f"The weather in {location} is sunny and 25°C."
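For comparison only: a real MCP server would expose this same function over the protocol instead of a direct import. With the official Python SDK that looks roughly like the sketch below (treat the exact API as an assumption and check the SDK docs; the mock version above is all you need for this exercise).
# real_server_sketch.py (optional, for comparison; not needed for this exercise)
# Assumes the official "mcp" Python SDK; the exact API may differ between versions.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_weather(location: str) -> str:
    """Return a (fake) weather report for the given location."""
    return f"The weather in {location} is sunny and 25°C."

if __name__ == "__main__":
    mcp.run()  # serves the tool so an MCP client can discover and call it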
🖥️ Step 2: Create a Simple Client
Next, create a file named client.py. This simulates the Client's role in discovering and calling a tool on behalf of the Host.
# client.py
from server import get_weather

def call_tool(tool_name, **kwargs):
    """Route a tool request by name to the matching tool implementation."""
    if tool_name == "get_weather":
        # In a real client, this would be a network call to an MCP server
        return get_weather(**kwargs)
    return "Tool not found"
🧑‍💻 Step 3: Simulate the Host
Finally, create a file named host.py. This script simulates the user's input and the LLM's decision-making process.
# host.py
from client import call_tool

# The user's input to the Host app
user_input = "What's the weather in Paris?"

# Normally the LLM would decide which tool to use and with which arguments
tool_needed = "get_weather"
params = {"location": "Paris"}

# The Host uses the Client to call the tool and shows the result to the user
result = call_tool(tool_needed, **params)
print("AI Response:", result)
▶️ Run It
Run host.py in your terminal:
python host.py
Expected Output:
AI Response: The weather in Paris is sunny and 25°C.
🔑 Key Takeaways
Host manages interaction and reasoning
Client routes requests via MCP
Server offers executable or queryable capabilities
You now understand the communication flow at the heart of MCP