Connect a query engine
The Atlan Lakehouse makes all of the context about your data estate and Atlan tenant available through the Iceberg REST catalog, so you can query it with any Iceberg REST–compatible client.
Get connection details
You can connect any Iceberg REST–compatible client using the connection details provided in the Atlan UI. Here's how to find them:
1. In your Atlan workspace, navigate to Workflows in the left-hand navigation bar.
2. Open the Marketplace tab.
3. Select the Atlan Lakehouse tile. You can also search the Marketplace for Atlan Lakehouse.
4. Select View connection details.
5. Copy the values your engine needs: typically the catalog URI, catalog name, OAuth client ID, and OAuth client secret, plus any engine-specific values such as a reader role or region.
Connect your query engine
Setup guides for popular engines are listed below. The same connection details work with any other Iceberg REST–compatible client (for example, PyIceberg, Trino, or DuckDB).
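To show what those connection details look like in practice, here is a minimal PyIceberg sketch. The URI, client ID, client secret, and catalog name below are placeholders, not real values; substitute everything from View connection details in your tenant.

```python
# Sketch: connecting PyIceberg to the Atlan Lakehouse REST catalog.
# All values below are placeholders; copy the real ones from
# "View connection details" in the Atlan UI.

CATALOG_URI = "https://example.atlan.com/iceberg"  # placeholder catalog URI
CLIENT_ID = "my-client-id"                         # placeholder OAuth client ID
CLIENT_SECRET = "my-client-secret"                 # placeholder OAuth client secret


def catalog_properties() -> dict:
    """Assemble REST catalog properties in the shape PyIceberg expects."""
    return {
        "uri": CATALOG_URI,
        # PyIceberg sends this pair via the OAuth2 client-credentials flow.
        "credential": f"{CLIENT_ID}:{CLIENT_SECRET}",
    }


# With pyiceberg installed, load the catalog and browse it:
# from pyiceberg.catalog import load_catalog
# catalog = load_catalog("atlan_lakehouse", **catalog_properties())
# print(catalog.list_namespaces())
```

The `load_catalog` call is commented out so the sketch stands alone; uncomment it once PyIceberg is installed and the placeholders are replaced.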
Snowflake
Catalog integration: Connect Snowflake to Lakehouse using a catalog-linked database and query Atlan metadata using standard SQL.
Amazon Athena
REST catalog: Connect Amazon Athena to Lakehouse using the Iceberg REST catalog and query Atlan metadata from your AWS environment.
PySpark
REST catalog: Connect PySpark to Lakehouse through the Iceberg REST catalog with Polaris credential vending, and query Atlan metadata from a Spark environment.
Google BigQuery
External tables: Connect BigQuery to Lakehouse using external Iceberg tables pointing to Atlan Lakehouse metadata files stored in GCS.
Databricks
Foreign tables: Connect Databricks to Lakehouse using foreign Iceberg tables in Unity Catalog pointing to Atlan Lakehouse metadata files stored in S3 or ADLS.
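For engines configured through Spark properties, such as the PySpark setup with Polaris credential vending mentioned above, the REST catalog configuration typically reduces to a handful of `spark.sql.catalog.*` keys. This is a rough sketch using Iceberg Spark runtime property names; the catalog name `lakehouse` and all argument values are placeholders, and your connection details take precedence over anything shown here.

```python
# Sketch: Spark properties for an Iceberg REST catalog with credential
# vending. The catalog name and all values are placeholders; substitute
# the details copied from "View connection details" in the Atlan UI.

def spark_catalog_conf(name: str, uri: str, client_id: str, client_secret: str) -> dict:
    """Build the spark.sql.catalog.* properties for an Iceberg REST catalog."""
    return {
        f"spark.sql.catalog.{name}": "org.apache.iceberg.spark.SparkCatalog",
        f"spark.sql.catalog.{name}.type": "rest",
        f"spark.sql.catalog.{name}.uri": uri,
        f"spark.sql.catalog.{name}.credential": f"{client_id}:{client_secret}",
        # Ask the catalog (Polaris) to vend short-lived storage credentials:
        f"spark.sql.catalog.{name}.header.X-Iceberg-Access-Delegation": "vended-credentials",
    }


# Apply when building the session, for example:
# from pyspark.sql import SparkSession
# builder = SparkSession.builder.appName("atlan-lakehouse")
# for key, value in spark_catalog_conf("lakehouse", uri, cid, secret).items():
#     builder = builder.config(key, value)
# spark = builder.getOrCreate()
```

The session-building step is commented out so the sketch has no PySpark dependency; the dict itself can be passed to `SparkSession.builder.config` key by key.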