Atlan vs Databricks lineage
Frequently asked questions about how Atlan lineage differs from Databricks native lineage.
Understand how Data Quality Studio uses compute resources, how costs are calculated, and practical ways to track and optimize spend for Snowflake and Databricks.
Configure BYOC credentials and query settings on your BigQuery, Databricks, or Snowflake connection to enable failed rows queries.
Connect Databricks to the Atlan Lakehouse using foreign Iceberg tables in Unity Catalog and run your first query.
Connect your query engine to the Atlan Lakehouse and start querying metadata.
Complete field-by-field reference for the Snowflake Cortex Analyst and Databricks Metric View YAML schemas used in Context Engineering Studio.
Discover and catalog AI models registered in the Databricks Unity Catalog Model Registry using Atlan's Databricks connector.
Frequently asked questions about Databricks cross-workspace extraction setup and configuration.
Troubleshoot common issues in Databricks cross-workspace extraction with error, cause, and solution guidance.
Monitor and maintain data quality across your data sources with automated quality checks, alerts, and governance workflows.
Use Lakehouse to analyze database usage, optimize query performance, and manage storage and compute costs.
Integrate, catalog, and govern Databricks assets in Atlan.
Engine-specific setup, authentication, and behavior for Context Engineering Studio on Databricks Genie.
Set up and configure Databricks for data quality monitoring through Atlan.
Complete reference for the Snowflake Semantic View and Databricks Metric View DDL that Context Engineering Studio generates at deploy time.
Deploy a context repository to Databricks Genie: create Metric Views and a Genie Space, refine, certify, and ship to production.
Enable and configure data quality for your Databricks connection in Atlan.
Configure data quality rules to scan only the most recent day's data for faster, more efficient monitoring.
Set up OAuth SSO authentication for Databricks connections in Atlan.
Build upstream lineage between Databricks AI models and the datasets, tables, and functions they depend on.
Apply the Unity Catalog grants and workspace permissions that Context Engineering Studio needs before you can deploy to Databricks Genie.
Frequently asked questions about how Atlan constructs lineage for Databricks AI models, including lineage sources, relationships created, and asset behavior.
Understand how Atlan filters Databricks lineage data to show only valid lineage relationships.
Full reference of the privileges required to crawl AI models and extract lineage from Databricks Unity Catalog, including what each privilege enables and how to grant it.
View and export the actual data rows that failed data quality rules to investigate and resolve data quality issues.
Reference for restricting data quality rules to only the most recent day's data.
Trigger data quality rules immediately at the table or rule level without waiting for the next scheduled run. Supported for Snowflake and Databricks.
Configure a single service principal to crawl metadata from all workspaces within a Databricks metastore using system tables.
Configure Databricks to enable data quality monitoring through Atlan.
Common questions about Databricks data quality setup and configuration.
Resolve common Databricks errors when creating and querying foreign Iceberg tables in Unity Catalog.
Resolve common Databricks errors in Context Engineering Studio, including OAuth, Genie Space, Unity Catalog, and deployment issues.
Let Atlan suggest data quality rules automatically based on your asset's metadata structure, and apply them in a few clicks.