Use Atlan MCP for agent automation

You can use Atlan MCP to automate catalog operations from Python agents, LangChain pipelines, n8n workflows, and other automation platforms. Unlike chat-based use cases where a person types a prompt, agent automation runs these operations programmatically: on a schedule, in response to external events, or as part of a larger pipeline. Common patterns include:

  • Governance sweeps: Scan for assets missing metadata and apply certifications or descriptions in bulk.
  • Lineage impact analysis: Trace downstream dependencies from a source asset automatically.
  • Glossary bootstrapping: Create glossaries, categories, and terms from an external source.
  • Stale asset reporting: Detect undescribed or uncertified assets on a schedule and alert teams.
  • Data quality automation: Apply and schedule DQ rules on newly discovered assets.
info

Make sure the Atlan MCP server is running and your agent is connected before using any of these patterns. For setup, see Set up Local MCP Server.

Governance sweep

Use this to scan for tables that have descriptions but no certification, and automatically apply DRAFT status to flag them for owner review.

To run a governance sweep, you need these Atlan MCP tools:

  • Search assets: to find tables with descriptions and no certification
  • Update assets: to apply certification status in bulk

Example (Python, where client is an initialized MCP client session):

```python
# Find tables that have a description but no certification status
tables = await client.call_tool("search_assets_tool", {
    "asset_type": "Table",
    "conditions": {"user_description": "has_any_value"},
    "negative_conditions": {"certificate_status": "has_any_value"},
    "limit": 50
})

# Flag every matching table as DRAFT for owner review
await client.call_tool("update_assets_tool", {
    "assets": [
        {
            "guid": t["guid"],
            "name": t["name"],
            "type_name": "Table",
            "qualified_name": t["qualified_name"]
        }
        for t in tables["results"]
    ],
    "attribute_name": "certificate_status",
    "attribute_values": ["DRAFT"] * len(tables["results"])
})
```
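When a sweep grows past a single page of results, it can help to factor the payload construction into a small pure function. The sketch below is illustrative: build_draft_update is a hypothetical helper name, and the result-dict shape is taken from the example above.

```python
def build_draft_update(results):
    """Build an update_assets_tool payload that flags each table as DRAFT.

    `results` is assumed to be the list returned under tables["results"]
    by search_assets_tool in the example above.
    """
    assets = [
        {
            "guid": t["guid"],
            "name": t["name"],
            "type_name": "Table",
            "qualified_name": t["qualified_name"],
        }
        for t in results
    ]
    return {
        "assets": assets,
        "attribute_name": "certificate_status",
        # One attribute value per asset, all DRAFT
        "attribute_values": ["DRAFT"] * len(assets),
    }
```

Keeping this step pure makes it easy to unit-test the sweep logic without a live MCP connection.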

Lineage impact analysis

Use this to automatically trace which downstream dashboards, models, and datasets are affected when an upstream table changes—for example, before a schema migration or deprecation.

To run a lineage impact analysis, you need these Atlan MCP tools:

  • Search assets: to locate the upstream asset by name
  • Traverse lineage: to walk downstream dependencies

Example (Python):

```python
# Locate the upstream table by name
source = await client.call_tool("search_assets_tool", {
    "asset_type": "Table",
    "conditions": {"name": {"operator": "match", "value": "raw_events"}},
    "limit": 1
})

# Walk up to 5 hops downstream from that table
consumers = await client.call_tool("traverse_lineage_tool", {
    "guid": source["results"][0]["guid"],
    "direction": "DOWNSTREAM",
    "depth": 5,
    "size": 100
})
```
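For an impact report, the traversal result can be grouped by asset type so the output reads as "dashboards affected, models affected", and so on. This is a sketch: summarize_impact is a hypothetical helper, and it assumes the tool's response lists affected assets under an "assets" key with "type_name" and "name" fields; adjust to the actual response shape.

```python
from collections import defaultdict

def summarize_impact(lineage_response):
    """Group downstream asset names by their type.

    Assumes a response like:
    {"assets": [{"type_name": "Dashboard", "name": "Revenue"}, ...]}
    """
    by_type = defaultdict(list)
    for asset in lineage_response.get("assets", []):
        by_type[asset["type_name"]].append(asset["name"])
    return dict(by_type)
```

The grouped result can then feed a migration checklist or a deprecation notice per consuming team.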

Glossary bootstrap

Use this to programmatically create a glossary, add categories, and populate terms from an external source such as a data dictionary, spreadsheet, or internal system.

To bootstrap a glossary, you need these Atlan MCP tools:

  • Create glossaries: to create the top-level glossary container
  • Create glossary categories: to add a category hierarchy
  • Create glossary terms: to populate individual business terms

Example (Python):

```python
# Create the top-level glossary container
glossary = await client.call_tool("create_glossaries", {
    "name": "Data Dictionary"
})

# Add a category under the new glossary
categories = await client.call_tool("create_glossary_categories", [{
    "name": "Customer",
    "glossary_guid": glossary[0]["guid"]
}])

# Populate a term inside that category
await client.call_tool("create_glossary_terms", [{
    "name": "Customer ID",
    "glossary_guid": glossary[0]["guid"],
    "category_guids": [categories[0]["guid"]]
}])
```
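When the terms come from an external source such as a CSV data dictionary, a small mapping function can turn rows into term payloads before the create call. This is a sketch: terms_from_rows and the row keys ("term", "definition") are hypothetical, as is the user_description field; match them to your source columns and to the actual tool parameters.

```python
def terms_from_rows(rows, glossary_guid, category_guid):
    """Map data-dictionary rows (e.g. csv.DictReader output) into
    create_glossary_terms payloads. Row keys are hypothetical."""
    return [
        {
            "name": row["term"],
            "user_description": row.get("definition", ""),
            "glossary_guid": glossary_guid,
            "category_guids": [category_guid],
        }
        for row in rows
    ]
```

The resulting list can be passed directly as the argument to create_glossary_terms, which accepts a batch of term payloads in the example above.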

Stale asset report

Use this to detect tables without descriptions on a daily schedule and post a summary to Slack, so your team can prioritize which assets to document next.

To build a stale asset report, you need these Atlan MCP tools:

  • Search assets: to find tables with missing descriptions

Example (n8n workflow):

Schedule Trigger (daily 08:00)
→ MCP Client node: Execute Tool — search_assets_tool
{ "asset_type": "Table", "negative_conditions": { "user_description": "has_any_value" }, "limit": 50 }
→ Code node: format asset names into a Slack message body
→ Slack node: post to #data-quality channel
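The Code node step might look like the sketch below (n8n's Code node supports Python as well as JavaScript). format_slack_message is a hypothetical helper, and the input shape is assumed from the search_assets_tool output in the earlier examples.

```python
def format_slack_message(assets):
    """Turn a list of undescribed tables into a Slack-ready summary."""
    if not assets:
        return "All tables are documented. Nothing to report."
    lines = [f"{len(assets)} tables are missing descriptions:"]
    lines += [f"- {a['name']}" for a in assets]
    return "\n".join(lines)
```

The returned string becomes the message body for the Slack node in the workflow above.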

For n8n MCP Client setup, see Set up n8n with Remote MCP.

Data quality sweep

Use this to add null-count DQ rules to all columns in a critical table and schedule them to run nightly alongside your ETL.

To run a data quality sweep, you need these Atlan MCP tools:

  • Search assets: to find columns in the target table
  • Create DQ rules: to define a null-count rule per column
  • Schedule DQ rules: to configure a recurring cron run

Example (Python):

```python
# Find all columns in the target table
columns = await client.call_tool("search_assets_tool", {
    "asset_type": "Column",
    "conditions": {
        "table_qualified_name": {
            "operator": "match",
            "value": "default/snowflake/<CONNECTION_ID>/DB/SCHEMA/ORDERS"
        }
    },
    "limit": 100
})

for col in columns["results"]:
    # Create a null-count rule for this column...
    rule = await client.call_tool("create_dq_rules_tool", {
        "rule_type": "Null Count",
        "asset_qualified_name": col["table_qualified_name"],
        "column_qualified_name": col["qualified_name"],
        "threshold_value": 0,
        "threshold_compare_operator": "EQUAL",
        "alert_priority": "HIGH"
    })

    # ...and schedule it inside the loop so every rule runs
    # nightly at 02:00, not just the last one created
    await client.call_tool("schedule_dq_rules_tool", {
        "rule_id": rule["id"],
        "cron": "0 2 * * *"
    })
```
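A sweep like this issues one create call and one schedule call per column, so transient failures partway through can leave the table half-covered. A simple retry wrapper can make the loop more robust; this is a sketch, where call_with_retry is a hypothetical helper and client is any object exposing the call_tool coroutine used above.

```python
import asyncio

async def call_with_retry(client, tool, args, attempts=3, delay=1.0):
    """Retry a tool call with linear backoff before giving up.

    Useful when a sweep issues many sequential create/schedule calls.
    """
    for attempt in range(attempts):
        try:
            return await client.call_tool(tool, args)
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            await asyncio.sleep(delay * (attempt + 1))
```

Each call in the loop above can then go through call_with_retry instead of calling client.call_tool directly.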