FastAPI + MCP — Zero‑to‑Hero Guide to Building AI‑Native Microservices
Jun 11, 2025
Reading time: ≈ 4 minutes • Audience: Python backend & AI engineers • Goal: Turn a plain FastAPI service into an agent‑ready tool using the Model Context Protocol (MCP).
Why Care? — From REST to Agentic APIs
Modern AI is racing beyond single‑shot predictions toward agentic architectures—LLMs that can plan, reason and call external tools. For an agent to use your API it must first understand what the API can do.
That’s exactly what MCP (Model Context Protocol) provides: a machine-readable manifest that describes your service’s capabilities—including endpoint names, purposes, input parameters, and response formats—in a format that LLMs and agents can natively understand and reason about.
Think of it like USB‑C for AI tools—a plug‑and‑play interface between smart clients and your backend logic.
FastAPI + MCP gives you:
Auto-discoverability: Agents hit /mcp once and instantly know every capability your service exposes.
Interoperability: Any MCP-compliant orchestrator (LangChain, Ardor Cloud Copilot, custom code) can plug and play without bespoke wrappers.
Single source of truth: Docs for humans (/docs) and metadata for AI (/mcp) are both generated from the same FastAPI routes.
Zero lock-in: MCP is an open standard, so your service stays portable across clouds, agents, and runtimes.
Project Overview
We’ll build a tiny health micro‑service that calculates Body Mass Index (BMI) and expose it as an AI tool.
This modular flow lets you:
Add new tools via FastAPI routes
Keep metadata and logic in sync
Allow AI agents to reason about and call your API
Let’s now walk through building it from scratch.

Hands‑On Tutorial
This section will guide you step-by-step through creating your FastAPI service and making it AI-ready with MCP.
Prerequisites
Before we begin, ensure you have the following installed:
Python 3.10+: Check with python --version.
uv: For fast, reproducible Python environments. https://docs.astral.sh/uv/getting-started/installation/
(Optional) Node.js ≥18: Required only if you want to use the MCP Inspector GUI tool for validation. You can check with node -v and install from https://nodejs.org/ if needed.
Bootstrap the project
uv handles both environment creation and dependency installation. No need for pip or venv; just uv venv and uv sync. We'll use four packages:
fastapi: The core web framework.
uvicorn: An ASGI server to run our FastAPI application.
fastapi-mcp: The library that integrates MCP with FastAPI.
pydantic: For data validation and serialization.
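One possible bootstrap flow with uv looks like this (the project name is from this tutorial; whether you use uv init plus uv add, or write pyproject.toml yourself and run uv venv and uv sync, is up to you):

```shell
# Create the project directory and initialize a uv-managed project.
mkdir fastapi_mcp_demo && cd fastapi_mcp_demo
uv init .                                      # writes pyproject.toml

# Declare and install the four dependencies in one step.
uv add fastapi uvicorn fastapi-mcp pydantic

# Equivalent alternative: list the dependencies in pyproject.toml by hand,
# then run `uv venv` followed by `uv sync`.
```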
Write main.py
Now, create a file named main.py in your fastapi_mcp_demo directory and add the following Python code. This code defines our BMI calculation logic and integrates MCP.
Key takeaways from main.py:
Pydantic Models (BMIResponse): Define clear data structures for request and response bodies. fastapi-mcp uses these to generate schemas for AI agents.
FastAPI Route (@app.get("/bmi")): This is standard FastAPI. The operation_id is used by fastapi-mcp as the unique tool_name for the agent. The docstring and summary provide descriptions that agents can use to understand the tool's purpose. Parameter descriptions and examples are also crucial for agents.
FastApiMCP Integration: With just a few lines, fastapi-mcp inspects your FastAPI app (specifically, its routes and Pydantic models) and prepares the MCP manifest. mcp_server.mount() adds the necessary MCP endpoints (like /mcp) to your application.
Run & verify
Now, let's run our FastAPI application using Uvicorn:
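Assuming the file is named main.py, start the development server with:

```shell
uvicorn main:app --reload
```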
main: Refers to main.py.
app: Refers to the app = FastAPI() instance in your code.
--reload: Uvicorn will automatically restart the server when code changes are detected.
Once the server is running (you should see output like Uvicorn running on http://127.0.0.1:8000), open your web browser or use a tool like curl to verify the endpoints:
Accessing http://localhost:8000/bmi?weight_kg=70&height_m=1.75 will show you {"bmi":22.86,"assessment":"Normal weight"}.
http://localhost:8000/docs will display the Swagger UI for humans.
http://localhost:8000/mcp will provide the JSON manifest for AI.
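From the command line, the same checks look like this (assuming the server from the previous step is still running on port 8000):

```shell
# BMI endpoint: should print {"bmi":22.86,"assessment":"Normal weight"}
curl "http://localhost:8000/bmi?weight_kg=70&height_m=1.75"

# MCP endpoint: should answer with HTTP 200
curl -i "http://localhost:8000/mcp"
```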

Peek through the eyes of an agent (optional but recommended)
To truly understand how an AI agent perceives and interacts with your service via MCP, it's helpful to use tools that simulate or replicate agent behavior. This section offers four ways to validate your /mcp manifest and invoke your tool as an agent would.
Goal: Prove that your /mcp manifest is live, correctly describes your calculate_bmi tool, and that an agent can successfully call this tool through the MCP interface.
Web Inspector (zero‑code)
Prerequisite: Node ≥ 18 (check with node -v). Install from https://nodejs.org/ if needed.

Run the Inspector:
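With Node available, the Inspector can be launched via npx (package name as published by the modelcontextprotocol project); it opens a local web UI in your browser:

```shell
npx @modelcontextprotocol/inspector
```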

Point it at your service: paste http://localhost:8000/mcp into the Server URL field and hit Connect. (Note that the /mcp endpoint speaks SSE out of the box, so select SSE as the Transport Type.)

Explore (by clicking List Tools):

Invoke: fill in weight_kg = 70 and height_m = 1.75, then hit Call → the response appears in the right pane, and your first terminal logs an HTTP 200.

Tip: your first terminal (running uvicorn) should log 200 GET /bmi, confirming the agent-style call reached the REST path.
Raw JSON‑RPC (curl / HTTPie)
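For reference, an MCP tools/call request has the JSON-RPC shape sketched below. Treat this as a sketch: depending on your fastapi-mcp version and transport (SSE vs. streamable HTTP), you may need to establish a session before POSTing directly to /mcp:

```shell
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
          "name": "calculate_bmi",
          "arguments": {"weight_kg": 70, "height_m": 1.75}
        }
      }'
```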
Prefer HTTPie? If you get a 400 Bad Request, check your JSON quoting and field names.
mcp-cli (power users & CI)
Add --verbose to any sub-command to see the underlying JSON-RPC exchange; perfect for debugging contract mismatches.
Automated health probe (optional)
Drop the snippet below into your monitoring stack; it simply checks that the manifest returns HTTP 200 and that calculate_bmi exists:
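A minimal probe might look like the sketch below (standard library only; the base URL and tool name come from this tutorial, and checking the service's OpenAPI spec for the operation_id is one assumed way to confirm the tool exists, since fastapi-mcp derives its manifest from those routes):

```python
import json
import urllib.request


def spec_has_tool(spec: dict, tool_name: str) -> bool:
    """True if an operationId matching tool_name appears anywhere in an OpenAPI spec."""
    return any(
        op.get("operationId") == tool_name
        for path in spec.get("paths", {}).values()
        for op in path.values()
        if isinstance(op, dict)
    )


def probe(base_url: str = "http://localhost:8000") -> bool:
    """Return True if /mcp answers HTTP 200 and calculate_bmi exists in the spec."""
    # 1. The MCP endpoint must be reachable.
    with urllib.request.urlopen(f"{base_url}/mcp", timeout=5) as resp:
        if resp.status != 200:
            return False
    # 2. The calculate_bmi tool must still be advertised.
    with urllib.request.urlopen(f"{base_url}/openapi.json", timeout=5) as resp:
        spec = json.load(resp)
    return spec_has_tool(spec, "calculate_bmi")


# In a cron job, exit non-zero on failure so the orchestrator can act on it:
# raise SystemExit(0 if probe() else 1)
```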
Use a cron job or health‑check endpoint so Kubernetes / Docker can restart the container if the probe fails.
Level‑Up Ideas
Now that you have a basic MCP-enabled tool, here are some ideas to expand its capabilities and integrate it further into AI workflows:
Add more tools: Try wrapping any business function (create_order, send_email, etc.) in a FastAPI route, and it's instantly AI-callable.
Use rich Pydantic models: Experiment with nested objects, lists, and enums; MCP will describe them for you.
Secure the toolbox: Apply FastAPI dependencies (API keys, OAuth) so agents run with the least privilege.
Run with OpenAI: use OpenAI SDK and let it orchestrate your tools.
Chain workflows: Enable agents to call multiple MCP tools in sequence. e.g., search → summarise → email.
Best Practices Before Production
Transitioning your MCP service from a demo to a production-ready application requires attention to detail. Here are key best practices:
Name & document every tool: A good operation_id and summary boost agent reasoning.
Validate hard: Pydantic guards against malformed input before your logic runs.
Rate-limit & log: Treat agents like power users; protect critical paths and capture audits.
Version: If you change schemas, bump to /mcp/v2 just like you would for REST.
Test with an agent: Use Ardor Cloud Copilot or an open-source agent to simulate real usage.
Wrap‑up
With fewer than 50 lines of Python, we've successfully transformed a standard FastAPI route into a self‑describing AI tool accessible via the Model Context Protocol. By simply adding more FastAPI endpoints and documenting them appropriately, you can rapidly build out a comprehensive toolbox of capabilities. This toolbox becomes instantly available to a wide range of LLM-powered agents - whether they run locally or in the cloud, and whether they are open-source or proprietary.
The path from a simple API to an AI-native service is now remarkably straightforward.
Your move: Clone the example repository https://github.com/Ardor-Cerebrum/ardor-ai-blog, adapt it by swapping in your own business logic, and then point an AI agent like Ardor Copilot (or your preferred agent framework) at your new /mcp endpoint. Witness firsthand how it can discover, understand, plan, and execute tasks by calling your service, much like a highly efficient, automated teammate.
Happy building! 🚀
References
Shamim Bhuiyan — “Modern AI Integrations” (3‑part series)
fastapi-mcp on PyPI · https://pypi.org/project/fastapi-mcp/0.1.3/
MCP Inspector · https://github.com/modelcontextprotocol/inspector
FastAPI docs · https://fastapi.tiangolo.com/