Implement the Atlan MCP server
The Model Context Protocol (MCP) is an open standard that enables AI agents to access contextual metadata from external systems.
Atlan provides a reference implementation of MCP through the Atlan MCP server. This server acts as a bridge between Atlan’s metadata platform and AI tools such as Claude and Cursor.
You can use the Atlan MCP server to support AI-driven use cases like searching for assets, understanding lineage, or updating metadata, all using real-time context from Atlan.
Get started
The Atlan MCP server can be configured in multiple environments depending on your preferred development setup and integration target. Follow the instructions below to set up the server with your desired tool:
- Cursor
- Claude
- Local development
Set up the Atlan MCP server in Cursor:
- Using uv: uv is a fast Python package manager designed to run and manage virtual environments locally without the overhead of Docker.
- Using Docker: run the MCP server in an isolated, containerized environment.
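As an illustration, a Cursor MCP configuration (typically `.cursor/mcp.json`) for the uv option might look like the sketch below. The package name `atlan-mcp-server` and the environment variable names are assumptions here; check the setup instructions for the exact values your deployment expects.

```json
{
  "mcpServers": {
    "atlan": {
      "command": "uvx",
      "args": ["atlan-mcp-server"],
      "env": {
        "ATLAN_API_KEY": "<your-api-key>",
        "ATLAN_BASE_URL": "https://your-tenant.atlan.com"
      }
    }
  }
}
```

Here `uvx` downloads and runs the server package in an ephemeral environment, so no manual virtual environment management is needed.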
Set up the Atlan MCP server in Claude Desktop:
- Using uv: uv is a fast Python package manager designed to run and manage virtual environments locally without the overhead of Docker.
- Using Docker: run the MCP server in an isolated, containerized environment.
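By way of example, a Claude Desktop entry (in `claude_desktop_config.json`) for the Docker option might look like the following sketch. The image name and environment variable names are assumptions; consult the repository's README for the actual values.

```json
{
  "mcpServers": {
    "atlan": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "ATLAN_API_KEY",
        "-e", "ATLAN_BASE_URL",
        "atlan-mcp-server:latest"
      ]
    }
  }
}
```

Passing `-e VAR` without a value forwards the variable from the host environment, which keeps credentials out of the config file.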
Build and run the Atlan MCP server locally:
- Local build guide: Walkthrough for cloning, building, and running the server locally.
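As a rough sketch of what that local workflow looks like for a uv-based Python project (the repository URL, entry-point command, and environment variable names below are placeholders; follow the local build guide for the real ones):

```shell
# Clone the server's source (placeholder URL -- use the actual repository).
git clone https://github.com/<org>/<atlan-mcp-server-repo>.git
cd <atlan-mcp-server-repo>

# Install dependencies into a local virtual environment with uv.
uv sync

# Provide Atlan credentials (variable names are assumptions).
export ATLAN_BASE_URL="https://your-tenant.atlan.com"
export ATLAN_API_KEY="<your-api-key>"

# Run the server; MCP clients typically communicate with it over stdio.
uv run atlan-mcp-server
```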
Available tools
The Atlan MCP server exposes a set of tools that enable AI agents to interact with your metadata programmatically.
- Search assets: Search for assets in Atlan using filters such as name, type, tags, and domains.
- Query by DSL: Retrieve specific assets using Atlan’s DSL (domain-specific language) queries.
- Explore lineage: Explore upstream or downstream lineage for a given asset.
- Update assets: Modify asset metadata, including descriptions and certification status.
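Under the hood, MCP clients invoke these tools with JSON-RPC 2.0 `tools/call` requests. The sketch below builds such a request in Python; the tool name `search_assets` and its argument names are assumptions for illustration — use the server's `tools/list` response to discover the exact tools it exposes.

```python
import json


def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as sent by MCP clients."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# Hypothetical tool name and filters -- check the server's `tools/list`
# response for the names and schemas your Atlan MCP server actually exposes.
request = make_tool_call(1, "search_assets", {"type_name": "Table", "limit": 5})
print(request)
```

The server replies with a matching JSON-RPC response whose `result` carries the tool output, which the AI agent then uses as context.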
Need help?
- For troubleshooting and feature requests, see the GitHub repo.
- Contact Atlan support for help with setup or integration.