
Connect Snowflake to Lakehouse

This guide walks you through how to connect Snowflake to your Lakehouse using Snowflake’s catalog-linked database feature, so you can start running Snowflake queries and building Snowflake Cortex applications on your Atlan metadata.

Query engine compatibility

If you use a query engine that doesn't currently support querying external Iceberg REST catalogs, such as Databricks or Google BigQuery, contact your Atlan Customer Success team. Databricks and BigQuery support for Lakehouse is currently experimental and available through a custom setup.

GCP-hosted tenants

If your Atlan tenant is deployed on GCP, you must configure a GCS external volume before Snowflake can access Lakehouse Iceberg tables. Follow the steps in Snowflake: table is not initialized instead.

Prerequisites

Before you begin, make sure that:

  • You have enabled Lakehouse for your Atlan tenant. See Enable Lakehouse.

  • You have ACCOUNTADMIN privileges in your Snowflake account.

    Using a custom role instead of ACCOUNTADMIN?

    The setup commands require the CREATE INTEGRATION and CREATE DATABASE account-level privileges. Roles like SYSADMIN don't have these by default. If you can't use ACCOUNTADMIN, grant the required privileges to your role first:

    -- Run as ACCOUNTADMIN
    GRANT CREATE INTEGRATION ON ACCOUNT TO ROLE <your_role>;
    GRANT CREATE DATABASE ON ACCOUNT TO ROLE <your_role>;
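    To confirm the grants took effect before proceeding, you can list the privileges held by the role:

    ```sql
    -- Verify the role now holds the account-level privileges
    SHOW GRANTS TO ROLE <your_role>;
    ```

    The output should include rows for CREATE INTEGRATION and CREATE DATABASE on the account.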

Set up connection in Snowflake

  1. On the Lakehouse app page, click the Setup button.

  2. In the setup view, select the Snowflake tab.

  3. Click Copy to copy the Snowflake command. The command is pre-filled with your tenant-specific details, including the Catalog URL and configuration parameters for the Iceberg REST Catalog API.

  4. Open your Snowflake console and navigate to the Worksheets section.

  5. Create a new worksheet using the ACCOUNTADMIN role (or a custom role with the required privileges; see Prerequisites).

  6. Paste the command you copied from Atlan into the worksheet. It looks similar to this:

    -- Create catalog integration
    CREATE OR REPLACE CATALOG INTEGRATION context_store_catalog
      CATALOG_SOURCE = POLARIS
      TABLE_FORMAT = ICEBERG
      CATALOG_NAMESPACE = 'context_store'
      REST_CONFIG = (
        CATALOG_URI = 'https://<tenant_subdomain>.atlan.com/api/polaris/api/catalog'
        WAREHOUSE = 'context_store'
        ACCESS_DELEGATION_MODE = VENDED_CREDENTIALS
      )
      REST_AUTHENTICATION = (
        TYPE = OAUTH
        OAUTH_CLIENT_ID = '<polaris_reader_id>'
        OAUTH_CLIENT_SECRET = '************************'
        OAUTH_ALLOWED_SCOPES = ('PRINCIPAL_ROLE:lake_readers')
      )
      ENABLED = TRUE;

    -- Create database
    CREATE DATABASE context_store
      LINKED_CATALOG = (
        CATALOG = 'context_store_catalog',
        SYNC_INTERVAL_SECONDS = 60
      );
  7. Click Run in Snowflake to execute the commands. Your Snowflake account is now connected to the Lakehouse.

    Troubleshooting

    If you see an error like Failed to create catalog integration ... failed to parse response body into OAuthTokenResponse, the most likely cause is that your active Snowflake role lacks the CREATE INTEGRATION privilege. Switch to ACCOUNTADMIN or grant the required privileges as described in Prerequisites.

  8. In Snowflake, under DATABASES, you now see an entry for context_store alongside your other databases. You can now explore the contents of your Lakehouse using standard SQL commands, or use the data in the Lakehouse to build Snowflake Cortex applications.

    Example 1: To confirm that the setup worked and see available schemas in the Lakehouse, run:

    -- Use context_store database
    USE DATABASE context_store;

    -- Show schemas
    SHOW SCHEMAS IN DATABASE context_store;

    Example 2: To view metadata for tables in your Atlan tenant, run:

    -- Get metadata for tables registered in Atlan
    SELECT *
    FROM context_store.entity_metadata."table"
    LIMIT 10;
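    If context_store appears but its schemas stay empty, the catalog link may not have completed an initial sync yet. One way to check, assuming Snowflake's SYSTEM$CATALOG_LINK_STATUS function for catalog-linked databases is available in your account, is:

    ```sql
    -- Inspect the sync state of the catalog-linked database
    SELECT SYSTEM$CATALOG_LINK_STATUS('context_store');
    ```

    The result is a JSON document describing the link state; any reported failures typically point to credential or connectivity issues with the Iceberg REST catalog.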

Next steps

Now that Snowflake is connected to Lakehouse, you can:

  • Query Atlan metadata from Snowflake: See the available metadata tables in Entity metadata reference.
  • Use cases: Explore popular patterns such as metadata enrichment tracking, lineage impact analysis, and glossary alignment in Use cases.
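As a starting point for a Cortex application, the sketch below asks Snowflake's SNOWFLAKE.CORTEX.COMPLETE function to summarize table metadata pulled from the Lakehouse. The column names used here (name, description) are illustrative assumptions; check the Entity metadata reference for the actual schema, and substitute a Cortex model available in your region:

```sql
-- Illustrative only: the name and description columns are assumptions;
-- see the Entity metadata reference for the actual column names.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large2',
    'Summarize this table for a business user: ' || name || ' - ' || description
) AS summary
FROM context_store.entity_metadata."table"
LIMIT 5;
```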