Crawl SAP HANA
Create a crawler workflow to automatically discover and catalog your SAP HANA assets, including databases, schemas, tables, views, and columns.
Prerequisites
Before you begin, make sure you have:
- Configured SAP HANA user permissions with metadata read access
- SAP HANA database connection details (host, port, credentials)
- Reviewed the order of operations for workflow execution
Create crawler workflow
Create a new SAP HANA crawler workflow in Atlan by selecting the SAP HANA connector package, configuring your extraction method and connection details, and running the crawler to extract metadata.
- In the top navigation, click Marketplace.
- Search for SAP HANA Assets and select it.
- Click Install.
- Once installation completes, click Setup Workflow on the same tile.
If you navigated away before installation completed, go to New > New Workflow and select SAP HANA Assets to proceed.
Configure extraction
Select your extraction method and provide the connection details.
The offline extraction method has been sunset and is no longer available. For on-premises or network-restricted environments, use the Agent extraction method with Self-Deployed Runtime.
- Direct
- Agent
In Direct extraction, Atlan connects to your database and crawls metadata directly.
- For Host, enter the host name for your SAP HANA instance.
- For Port, enter the port number for your SAP HANA instance.
- For Username, enter the username you created for the instance.
- For Password, enter the password for the username.
- Click the Test Authentication button to confirm connectivity to SAP HANA.
- Once authentication succeeds, scroll to the bottom of the screen and click Next.
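Before entering the host and port, you may want to confirm the SAP HANA instance is reachable from a network location with similar access. A minimal sketch using Python's standard library (the hostname is a placeholder; SAP HANA's default SQL port is 3NN15, where NN is the instance number, e.g. 30015 for instance 00):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # DNS failure, connection refused, or timeout
        return False

# Placeholder values -- replace with your actual SAP HANA host and port:
# can_reach("hana.example.com", 30015)
```

This checks network reachability only; it does not validate credentials, which Test Authentication verifies.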
In Agent extraction, Self-Deployed Runtime executes metadata extraction within your organization's environment.
- Select the Agent tab.
- Install Self-Deployed Runtime if you haven't already.
- Store sensitive information in the secret store configured with the Self-Deployed Runtime and reference the secrets in the corresponding fields. For more information, see Configure secrets for workflow execution.
- For details on individual fields, refer to the Direct extraction tab.
- Click Next after completing the configuration.
Configure connection
Set up connection details including a descriptive name, admin access, and data access permissions.
- Provide a Connection Name that represents your source environment. For example, you might use values like production, development, gold, or analytics.
- To change the users able to manage this connection, update the users or groups listed under Connection Admins. If you don't specify any user or group, nobody can manage the connection, including admins.
- To prevent users from querying SAP HANA data, change Allow SQL Query to No. This option applies only to Direct extraction.
- To prevent users from previewing SAP HANA data, change Allow Data Preview to No. This option applies only to Direct extraction.
- Click Next at the bottom of the screen.
Configure crawler
Configure crawler settings to control which assets to include or exclude. If an asset appears in both filters, the exclude filter takes precedence.
- To select specific assets for crawling, click Include Metadata. By default, all assets are included.
- To exclude specific assets from crawling, click Exclude Metadata. By default, no assets are excluded.
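The precedence rule above can be sketched as follows (illustrative only; asset names are placeholders, and Atlan's actual filters operate on the schemas and tables you select, not bare strings):

```python
from typing import Optional

def is_crawled(asset: str, include: Optional[set], exclude: set) -> bool:
    """Mirror the filter semantics: exclude takes precedence over include.

    include=None models "no include filter set", i.e. all assets
    are included by default.
    """
    if asset in exclude:      # exclude filter wins, even if also included
        return False
    if include is None:       # no include filter: everything is crawled
        return True
    return asset in include   # include filter set: only listed assets
```

So an asset listed in both filters is excluded, matching the behavior described above.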
Run crawler
Run preflight checks to validate your configuration, then execute the crawler immediately or schedule it to run on a recurring basis.
- To verify permissions and configuration before running, click Preflight checks.
- Choose your run option:
- To run the crawler once immediately, click Run at the bottom of the screen.
- To schedule the crawler to run hourly, daily, weekly, or monthly, click Schedule Run at the bottom of the screen.
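As a point of reference, recurring schedules like these are conventionally expressed as cron expressions. The mapping below is illustrative only (standard five-field syntax: minute, hour, day of month, month, day of week; the specific times are arbitrary examples, not Atlan defaults):

```python
# Illustrative cron equivalents of the schedule options above.
SCHEDULES = {
    "hourly":  "0 * * * *",   # top of every hour
    "daily":   "0 2 * * *",   # 02:00 every day
    "weekly":  "0 2 * * 0",   # 02:00 every Sunday
    "monthly": "0 2 1 * *",   # 02:00 on the 1st of each month
}
```

Scheduling crawler runs outside business hours can reduce load on your SAP HANA instance.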
Once the crawler completes, you can view the crawled assets on Atlan's assets page.
See also
- What does Atlan crawl from SAP HANA: Complete reference of assets and metadata discovered during crawling
- Preflight checks for SAP HANA: Validation checks for permissions and configuration before running the crawler