Asset export (basic) package
The asset export (basic) package identifies all assets that can be enriched through Atlan's UI and extracts them. The resulting CSV file can be modified or enriched, and then loaded back using the asset import package.
All assets
In this example, we’re building and running the asset-export workflow to export all assets.
However, you can also use one of the following methods to customize the scope of your asset export workflow:

- `enriched_only()`: sets up the package to export only assets enriched by users.
- `glossaries_only()`: sets up the package to export only glossaries.
- `products_only()`: sets up the package to export only data products.
- `all_assets()`: sets up the package to export all assets, whether enriched by users or not.
- Java
- Python
- Kotlin
- Raw REST API
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.packages import AssetExportBasic

client = AtlanClient()

workflow = (
    AssetExportBasic()  # (1)
    .all_assets(  # (2)
        prefix="default",
        include_description=True,
        include_glossaries=True,
        include_data_products=True,
        include_archived=True,
    )
    .object_store(prefix="/test/prefix")  # (3)
    .s3(  # (4)
        access_key="test-access-key",
        secret_key="test-secret-key",
        bucket="my-bucket",
        region="us-west-1",
    )
).to_workflow()  # (5)
response = client.workflow.run(workflow)  # (6)
1. The `AssetExportBasic` package exports assets from Atlan.

2. In this example, we're building a workflow to export `all_assets()`. However, you can also use one of the following methods to customize the scope of your asset export workflow:

   - `enriched_only()`: sets up the package to export only assets enriched by users.
   - `glossaries_only()`: sets up the package to export only glossaries.
   - `products_only()`: sets up the package to export only data products.
   - `all_assets()`: sets up the package to export all assets, whether enriched by users or not.

   For `all_assets()`, you need to provide the following:

   - `prefix`: starting value for a `qualifiedName` that determines which assets to export (default: `default`, meaning all data assets).
   - `include_description`: whether to extract only user-entered descriptions (`False`), or to also include system-level descriptions (`True`).
   - `include_glossaries`: whether glossaries (and their terms and categories) should be exported (`True`) or not (`False`).
   - `include_data_products`: whether data products (and their domains) should be exported (`True`) or not (`False`).
   - `include_archived`: whether to include archived assets in the export (`True`) or only active assets (`False`).

3. To set up the package to export to an object storage location, you need to provide:

   - `prefix`: directory (path) within the object store where the exported file will be uploaded.

4. In this example, we're exporting assets to an object storage location using `s3()`. However, you can use other object storage methods such as `gcs()` or `adls()`. You can also configure a different export delivery method using one of the following:

   - `email()`: sets up the package to deliver the export via email.
   - `direct()`: sets up the package to deliver the export via direct download.

   For `s3()`, you need to provide the following:

   - `access_key`: AWS access key.
   - `secret_key`: AWS secret key.
   - `bucket`: S3 bucket to upload the export file to.
   - `region`: name of the AWS region.

5. Convert the package into a `Workflow` object.

6. Run the workflow by invoking the `run()` method on the workflow client, passing the created object.

:::info Workflows run asynchronously
Remember that workflows run asynchronously. See the packages and workflows introduction for details on how to check the status and wait until the workflow has been completed.
:::
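Because the `run()` call returns before the export finishes, you typically poll until the workflow reaches a terminal phase. The sketch below is a generic, standard-library-only illustration of that polling loop; the `get_status` callable and the phase names are assumptions standing in for whatever status lookup your client exposes, not part of the package's API.

```python
import time

# Hypothetical terminal phases; real phase names depend on your workflow engine.
TERMINAL_PHASES = frozenset({"Succeeded", "Failed", "Error"})

def wait_for_completion(get_status, interval=0.01, timeout=5.0):
    """Poll get_status() until it returns a terminal phase, or time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        phase = get_status()
        if phase in TERMINAL_PHASES:
            return phase
        time.sleep(interval)
    raise TimeoutError("workflow did not reach a terminal phase in time")

# Simulated status sequence standing in for repeated status lookups.
phases = iter(["Pending", "Running", "Running", "Succeeded"])
print(wait_for_completion(lambda: next(phases)))  # → Succeeded
```

In practice you would replace the simulated `get_status` with a real status lookup against your tenant, and pick an `interval` long enough to avoid hammering the API.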
We recommend creating the workflow only via the UI. To rerun an existing workflow, see the steps below.
Re-run existing workflow
To re-run an existing asset export basic workflow:
- Java
- Python
- Kotlin
- Raw REST API
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.enums import WorkflowPackage

client = AtlanClient()

existing = client.workflow.find_by_type(  # (1)
    prefix=WorkflowPackage.ASSET_EXPORT_BASIC, max_results=5
)

# Determine which asset export basic workflow (n)
# from the list of results you want to re-run.
response = client.workflow.rerun(existing[n])  # (2)
1. You can find workflows by their type using the workflow client `find_by_type()` method, providing the prefix for one of the packages. In this example, we do so for `AssetExportBasic`. (You can also specify the maximum number of matching workflows you want to retrieve as results.)

2. Once you've found the workflow you want to re-run, simply call the workflow client `rerun()` method.

   - Optionally, you can use `rerun(idempotent=True)` to avoid re-running a workflow that's already in a running or pending state. If such a run is found, its details are returned instead of starting a new run. By default, `idempotent` is `False`.

:::info Workflows run asynchronously
Remember that workflows run asynchronously. See the packages and workflows introduction for details on how you can check the status and wait until the workflow has been completed.
:::
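To make the idempotent re-run behavior concrete, here is a toy stand-in (not the pyatlan API) showing the decision `rerun(idempotent=True)` makes: return the details of a run already in flight, or submit a fresh one.

```python
class FakeWorkflowClient:
    """Toy stand-in illustrating idempotent re-run semantics (not the real API)."""

    def __init__(self):
        self.running = {"wf-1"}  # workflows currently in a running/pending state

    def rerun(self, name, idempotent=False):
        if idempotent and name in self.running:
            # Idempotent mode: return the existing run instead of starting another.
            return f"already-running:{name}"
        # Default mode: always submit a fresh run.
        return f"new-run:{name}"

client = FakeWorkflowClient()
print(client.rerun("wf-1"))                   # → new-run:wf-1
print(client.rerun("wf-1", idempotent=True))  # → already-running:wf-1
```

The names `FakeWorkflowClient` and its return strings are purely illustrative; only the branching logic mirrors the documented behavior.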
- Find the existing workflow.
- Send through the resulting re-run request.
{
  "from": 0,
  "size": 5,
  "query": {
    "bool": {
      "filter": [
        {
          "nested": {
            "path": "metadata",
            "query": {
              "prefix": {
                "metadata.name.keyword": {
                  "value": "csa-asset-export-basic" // (1)
                }
              }
            }
          }
        }
      ]
    }
  },
  "sort": [
    {
      "metadata.creationTimestamp": {
        "nested": {
          "path": "metadata"
        },
        "order": "desc"
      }
    }
  ],
  "track_total_hits": true
}
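If you're assembling this search body in code rather than by hand, a small helper keeps the package prefix and page size in one place. This sketch mirrors the JSON above using only the standard library; `build_workflow_search` is an illustrative name, not part of any SDK.

```python
import json

def build_workflow_search(prefix: str, size: int = 5) -> dict:
    """Build the search body used to find existing workflow runs by package prefix."""
    return {
        "from": 0,
        "size": size,
        "query": {
            "bool": {
                "filter": [
                    {
                        "nested": {
                            "path": "metadata",
                            "query": {
                                "prefix": {
                                    "metadata.name.keyword": {"value": prefix}
                                }
                            },
                        }
                    }
                ]
            }
        },
        # Newest runs first, so results[0] is the most recent.
        "sort": [
            {
                "metadata.creationTimestamp": {
                    "nested": {"path": "metadata"},
                    "order": "desc",
                }
            }
        ],
        "track_total_hits": True,
    }

body = build_workflow_search("csa-asset-export-basic")
print(body["sort"][0]["metadata.creationTimestamp"]["order"])  # → desc
```

You could then serialize the result with `json.dumps(body)` and send it as the request payload.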
1. Searching by the `csa-asset-export-basic` prefix ensures you only find existing asset export (basic) workflows.

:::info Name of the workflow
The name of the workflow will be nested within the `_source.metadata.name` property of the response object. (Remember that since this is a search, there could be multiple results, so you may want to use the other details in each result to determine which workflow you really want.)
:::
{
  "namespace": "default",
  "resourceKind": "WorkflowTemplate",
  "resourceName": "csa-asset-export-basic-1684500411" // (1)
}
1. Send the name of the workflow as the `resourceName` to re-run it.