FastMCP

FastMCP is the standard Python framework for building MCP servers and clients — write a function, add a decorator, and your tool is ready for any AI agent to use.

What is FastMCP?

FastMCP is the standard Python framework for building MCP servers, clients, and applications. MCP, or Model Context Protocol, is an open standard that gives AI models a consistent way to connect with external tools, data sources, and services. FastMCP handles all the protocol complexity under the hood, so you write a Python function, decorate it, and your tool is immediately available to any MCP-compatible AI client. It powers over 70% of all MCP servers in production today and is downloaded more than a million times a day.

How FastMCP works

When an AI model needs to use a tool, it speaks MCP: a standardized communication format that tells a server what to do and receives structured results back. Building that server from scratch involves a lot of boilerplate protocol code. FastMCP removes all of it.

Here is how the framework is structured:

  • FastMCP class: The entry point for every server. You instantiate it once and attach all your tools, resources, and prompts to it using decorators.
  • Tools: Functions the AI can call to take action, like querying a database, sending a message, or running a calculation. You register them with @mcp.tool.
  • Resources: Read-only data the AI can pull into its context, like config files, user profiles, or live data from an API. You register them with @mcp.resource.
  • Prompts: Reusable message templates that guide how the AI interacts with your tools in consistent ways. You register them with @mcp.prompt.
  • Automatic schema generation: FastMCP reads your Python type hints and docstrings to generate the full MCP schema for each tool automatically. No manual configuration needed.
  • Transport layer: FastMCP handles how messages move between client and server, supporting stdio for local tools and HTTP for remote deployments, without you changing your application code.

A minimal working server looks like this: create a FastMCP instance, write a Python function, and add @mcp.tool. That is the entire setup.

What you can build with FastMCP

  • Internal admin tool server: Expose your company’s internal APIs as MCP tools so an AI assistant can query your database, update records, or trigger workflows directly from a conversation.
  • Document processing server: Build a server that reads PDFs, DOCX files, and other formats, then surfaces them to any connected AI client for analysis, summarisation, or extraction.
  • Custom AI agent backend: Wire up tools for web search, email sending, calendar access, and file management, then connect the server to Claude, Cursor, or any MCP-compatible agent.
  • Multi-server composition pipeline: Combine multiple FastMCP servers into one application using server mounting, so a single endpoint exposes tools from several specialised services.
  • Interactive dashboard tool: Use FastMCP’s Apps feature to return live charts, tables, and forms directly in the conversation instead of plain text, so users see visual outputs from tool calls.

  • FastAPI-to-MCP bridge: Wrap an existing FastAPI application with FastMCP so your current REST endpoints become agent-callable tools without duplicating any logic or schemas.

Key Features

  • Decorator-based tool registration turns any Python function into a fully compliant MCP tool in one line
  • Automatic schema and validation generation from type hints and docstrings, with full Pydantic support
  • Built-in MCP Inspector for testing tools, resources, and prompts in a browser before connecting to a client
  • Supports both stdio transport for local integrations and HTTP for remote, cloud-deployed servers
  • Server composition and proxying lets you combine or wrap multiple MCP servers into a single application
  • Authentication support including OAuth, JWT, and Azure Entra ID for production deployments
  • Native OpenTelemetry instrumentation for tracing tool calls and diagnosing latency in production
  • Apps feature lets tools return interactive UIs such as charts, tables, and forms rendered directly in the conversation

FAQ

Does FastMCP work with AI models other than Claude?

Yes. FastMCP implements the open Model Context Protocol standard, which means any MCP-compatible client can connect to your server. This includes Claude, Cursor, and custom agents built with OpenAI or Google models. You build the server once and it works across clients without modification.

What is the difference between FastMCP 2 and FastMCP 3?

FastMCP 2 focused on making it easy to build individual servers. FastMCP 3 is built for production systems, adding native OpenTelemetry tracing, granular OAuth controls, tool versioning, background task support, and the Apps feature for returning interactive UIs. For scripts and prototypes, v2 still works fine. For anything going to production, v3 is the right choice.

Do I need to know the MCP specification to use FastMCP?

No. FastMCP is designed specifically so you do not need to read the spec. You write normal Python functions, add decorators, and the framework handles protocol compliance, schema generation, transport, and error handling. Most developers get a working server running in under an hour.
