Manage data quality rules
Create data quality rules
Data quality rules can be created using three different creator methods depending on the type of rule you want to create:
- Column level rules: For rules that apply to specific columns (for example, Freshness, Null Count)
- Table level rules: For rules that apply to entire tables (for example, Row Count)
- Custom SQL rules: For rules defined with a custom SQL query
Column level rules
Column level rules are used for data quality checks that apply to specific columns within an asset.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule, Table, Column
from pyatlan.model.enums import (
DataQualityRuleAlertPriority,
DataQualityRuleTemplateType,
DataQualityRuleThresholdCompareOperator,
DataQualityDimension,
DataQualityRuleThresholdUnit
)
client = AtlanClient()
# Create a Freshness rule for a specific column
dq_rule = DataQualityRule.column_level_rule_creator( # (1)
client=client, # (2)
rule_type=DataQualityRuleTemplateType.FRESHNESS, # (3)
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"), # (4)
column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring/evaluated_at"), # (5)
threshold_value=1, # (6)
alert_priority=DataQualityRuleAlertPriority.URGENT, # (7)
threshold_unit=DataQualityRuleThresholdUnit.DAYS # (8)
)
response = client.asset.save(dq_rule) # (9)
# Create a Null Count rule for a specific column
dq_rule_null = DataQualityRule.column_level_rule_creator(
client=client,
rule_type=DataQualityRuleTemplateType.NULL_COUNT,
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"),
column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary/catalog"),
threshold_compare_operator=DataQualityRuleThresholdCompareOperator.LESS_THAN_EQUAL, # (10)
threshold_value=5,
alert_priority=DataQualityRuleAlertPriority.URGENT,
row_scope_filtering_enabled=True # (11)
)
response = client.asset.save(dq_rule_null)
1. Use the column_level_rule_creator method to create column-level data quality rules.
2. Provide the Atlan client instance.
3. Specify the rule type using the DataQualityRuleTemplateType enum (for example, DataQualityRuleTemplateType.FRESHNESS, DataQualityRuleTemplateType.NULL_COUNT).
4. Reference the asset to which you want to apply this rule by its qualified name.
5. Reference the specific column of that asset by its qualified name. Make sure the column data type is compatible with the rule type (for example, date/time columns for Freshness rules).
6. Set the threshold value for the rule (same as you would in the UI).
7. Set the alert priority level (same as you would in the UI).
8. Optional: Specify the threshold unit (for example, DAYS, HOURS) for rules that support units (such as Freshness). For rules without units (such as Null Count), omit this parameter.
9. Save the data quality rule to Atlan.
10. Optional: Specify the threshold compare operator (same as you would in the UI).
11. Optional: Set row_scope_filtering_enabled=True to enable incremental data quality monitoring (check UI for availability).
You can also pass the rule_conditions parameter for rule types that support conditions (for example, String Length, Regex, Valid Values, and Reconciliation rules).
Rule types with conditions
These rule types (String Length, Regex, Valid Values) make data validation faster and easier by providing common checks that previously required custom SQL or complex setup. They support adding rule conditions along with thresholds.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule, Table, Column
from pyatlan.model.enums import (
DataQualityRuleAlertPriority,
DataQualityRuleTemplateType,
DataQualityRuleTemplateConfigRuleConditions
)
from pyatlan.model.dq_rule_conditions import DQRuleConditionsBuilder
client = AtlanClient()
# Create a String Length rule with conditions
rule_conditions = ( # (1)
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.STRING_LENGTH_BETWEEN,
min_value=5,
max_value=50,
)
.build()
)
dq_rule_string = DataQualityRule.column_level_rule_creator( # (2)
client=client,
rule_type=DataQualityRuleTemplateType.STRING_LENGTH,
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"),
column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring/country"),
threshold_value=6,
alert_priority=DataQualityRuleAlertPriority.URGENT,
rule_conditions=rule_conditions,
row_scope_filtering_enabled=True
)
response = client.asset.save(dq_rule_string) # (3)
# Create a Regex rule with pattern validation
rule_conditions_regex = ( # (4)
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.REGEX_NOT_MATCH,
value="^[A-Za-z]+$",
)
.build()
)
dq_rule_regex = DataQualityRule.column_level_rule_creator( # (5)
client=client,
rule_type=DataQualityRuleTemplateType.REGEX_MATCH,
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"),
column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring/country"),
threshold_value=2,
alert_priority=DataQualityRuleAlertPriority.URGENT,
rule_conditions=rule_conditions_regex,
row_scope_filtering_enabled=True
)
response = client.asset.save(dq_rule_regex) # (6)
# Create a Valid Values rule with allowed values list
rule_conditions_valid = ( # (7)
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.IN_LIST,
value=["United States", "Canada", "Mexico"],
)
.build()
)
dq_rule_valid = DataQualityRule.column_level_rule_creator( # (8)
client=client,
rule_type=DataQualityRuleTemplateType.VALID_STRING_VALUES,
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"),
column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring/country"),
threshold_value=2,
alert_priority=DataQualityRuleAlertPriority.URGENT,
rule_conditions=rule_conditions_valid,
row_scope_filtering_enabled=True
)
response = client.asset.save(dq_rule_valid) # (9)
# Create a Valid Values Reference rule using a reference table/column
rule_conditions_reference = ( # (10)
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.IN_LIST_REFERENCE,
reference_table="default/databricks/1750768309/dq/weather/valid_countries",
reference_column="default/databricks/1750768309/dq/weather/valid_countries/country_code",
)
.build()
)
dq_rule_reference = DataQualityRule.column_level_rule_creator( # (11)
client=client,
rule_type=DataQualityRuleTemplateType.VALID_STRING_VALUES_REFERENCE,
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"),
column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring/country"),
threshold_value=2,
alert_priority=DataQualityRuleAlertPriority.URGENT,
rule_conditions=rule_conditions_reference,
row_scope_filtering_enabled=True
)
response = client.asset.save(dq_rule_reference) # (12)
1. Create rule conditions using DQRuleConditionsBuilder with the String Length condition type.
2. Create a String Length rule with conditions using DataQualityRuleTemplateType.STRING_LENGTH.
3. Save the String Length rule to Atlan.
4. Create rule conditions for Regex pattern validation.
5. Create a Regex rule using DataQualityRuleTemplateType.REGEX_MATCH.
6. Save the Regex rule to Atlan.
7. Create rule conditions for Valid Values with an allowed values list.
8. Create a Valid Values rule using DataQualityRuleTemplateType.VALID_STRING_VALUES.
9. Save the Valid Values rule to Atlan.
10. Create rule conditions for Valid Values Reference using a reference table and column.
11. Create a Valid Values Reference rule using DataQualityRuleTemplateType.VALID_STRING_VALUES_REFERENCE.
12. Save the Valid Values Reference rule to Atlan.
Reconciliation rules
Reconciliation rules allow you to compare metrics between a base table/column and a target table/column. These rules support reconciliation conditions that specify the target table and column for comparison.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule, Table, Column, View
from pyatlan.model.enums import (
DataQualityRuleAlertPriority,
DataQualityRuleTemplateType,
DataQualityRuleTemplateConfigRuleConditions,
DataQualityRuleThresholdCompareOperator,
DataQualityRuleThresholdUnit
)
from pyatlan.model.dq_rule_conditions import DQRuleConditionsBuilder
client = AtlanClient()
# Create a Row Count Reconciliation rule at table level
rule_conditions_recon = ( # (1)
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.ROW_COUNT_RECON,
target_table="default/snowflake/1755428756/COVID19_DATASET_PROD/small_case/NYT_US_COVID19",
)
.build()
)
dq_rule_recon = DataQualityRule.table_level_rule_creator( # (2)
client=client,
rule_type=DataQualityRuleTemplateType.RECON_ROW_COUNT,
asset=View.ref_by_qualified_name(
qualified_name="default/snowflake/1755428756/COVID19_DATASET_PROD/PUBLIC/CDC_INPATIENT_BEDS_ALL_VIEW"
),
threshold_value=3,
threshold_unit=DataQualityRuleThresholdUnit.PERCENTAGE,
alert_priority=DataQualityRuleAlertPriority.NORMAL,
rule_conditions=rule_conditions_recon, # (3)
)
response = client.asset.save(dq_rule_recon) # (4)
# Create a Unique Count Reconciliation rule at column level
rule_conditions_unique_recon = ( # (5)
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.UNIQUE_COUNT_RECON,
target_table="default/snowflake/1755428756/COVID19_DATASET_PROD/small_case/KFF_HCP_CAPACITY",
target_column="default/snowflake/1755428756/COVID19_DATASET_PROD/small_case/KFF_HCP_CAPACITY/TOTAL_CHCS",
)
.build()
)
dq_rule_unique_recon = DataQualityRule.column_level_rule_creator( # (6)
client=client,
rule_type=DataQualityRuleTemplateType.RECON_UNIQUE_COUNT,
asset=View.ref_by_qualified_name(
qualified_name="default/snowflake/1755428756/COVID19_DATASET_PROD/PUBLIC/CDC_INPATIENT_BEDS_ALL_VIEW"
),
column=Column.ref_by_qualified_name(
qualified_name="default/snowflake/1755428756/COVID19_DATASET_PROD/PUBLIC/CDC_INPATIENT_BEDS_ALL_VIEW/INPATIENT_BEDS_OCCUPIED"
),
threshold_value=2,
threshold_unit=DataQualityRuleThresholdUnit.PERCENTAGE,
alert_priority=DataQualityRuleAlertPriority.NORMAL,
rule_conditions=rule_conditions_unique_recon,
row_scope_filtering_enabled=True # (7)
)
response = client.asset.save(dq_rule_unique_recon) # (8)
1. Create reconciliation rule conditions. For ROW_COUNT_RECON, only target_table is required. For other reconciliation types (AVERAGE_RECON, SUM_RECON, DUPLICATE_COUNT_RECON, UNIQUE_COUNT_RECON), both target_table and target_column are required.
2. Create a Row Count Reconciliation rule using DataQualityRuleTemplateType.RECON_ROW_COUNT at the table level. The threshold_compare_operator is optional and defaults to the template config value if not provided.
3. Pass rule_conditions as a parameter to table_level_rule_creator.
4. Save the reconciliation rule to Atlan.
5. Create reconciliation rule conditions for column-level rules. Both target_table and target_column are required.
6. Create a Unique Count Reconciliation rule using DataQualityRuleTemplateType.RECON_UNIQUE_COUNT at the column level.
7. Optional: Enable row scope filtering for reconciliation rules. When enabled, the system validates that both the base asset and the target table have row scope filter columns configured.
8. Save the reconciliation rule to Atlan.
Table level rules
Table level rules are used for data quality checks that apply to entire tables.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule, Table
from pyatlan.model.enums import (
DataQualityRuleAlertPriority,
DataQualityRuleTemplateType,
DataQualityRuleThresholdCompareOperator
)
client = AtlanClient()
# Create a Row Count rule for a table
dq_rule = DataQualityRule.table_level_rule_creator( # (1)
client=client, # (2)
rule_type=DataQualityRuleTemplateType.ROW_COUNT, # (3)
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary"), # (4)
threshold_compare_operator=DataQualityRuleThresholdCompareOperator.EQUAL, # (5)
threshold_value=15, # (6)
alert_priority=DataQualityRuleAlertPriority.URGENT # (7)
)
response = client.asset.save(dq_rule) # (8)
1. Use the table_level_rule_creator method to create table-level data quality rules.
2. Provide the Atlan client instance.
3. Specify the rule type using the DataQualityRuleTemplateType enum (for example, DataQualityRuleTemplateType.ROW_COUNT).
4. Reference the asset to which you want to apply this rule by its qualified name.
5. Optional: Set the threshold compare operator (for example, EQUAL, LESS_THAN_EQUAL).
6. Set the threshold value for the rule (same as you would in the UI).
7. Set the alert priority level (same as you would in the UI).
8. Save the data quality rule to Atlan.
Custom SQL rules
Custom SQL rules allow you to define data quality checks using custom SQL queries.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule, Table
from pyatlan.model.enums import (
DataQualityRuleAlertPriority,
DataQualityRuleCustomSQLReturnType,
DataQualityRuleThresholdCompareOperator,
DataQualityDimension
)
client = AtlanClient()
# Create a Custom SQL rule
dq_rule = DataQualityRule.custom_sql_creator( # (1)
client=client, # (2)
rule_name="Test SQL Rule", # (3)
asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary"), # (4)
custom_sql="SELECT count(*) FROM `dq_poc`.`accuweather`.`_quality_monitoring_summary`", # (5)
threshold_compare_operator=DataQualityRuleThresholdCompareOperator.LESS_THAN_EQUAL, # (6)
threshold_value=10, # (7)
alert_priority=DataQualityRuleAlertPriority.URGENT, # (8)
dimension=DataQualityDimension.COMPLETENESS, # (9)
custom_sql_return_type=DataQualityRuleCustomSQLReturnType.ROW_COUNT, # (10)
description="Custom SQL rule for completeness check" # (11)
)
response = client.asset.save(dq_rule) # (12)
1. Use the custom_sql_creator method to create custom SQL data quality rules.
2. Provide the Atlan client instance.
3. Provide a name for the custom rule (same as you would in the UI).
4. Reference the asset to which you want to apply this rule by its qualified name.
5. Provide the custom SQL query for the rule (same as you would in the UI).
6. Set the threshold compare operator (same as you would in the UI).
7. Set the threshold value for the rule (same as you would in the UI).
8. Set the alert priority level (same as you would in the UI).
9. Set the data quality dimension (for example, COMPLETENESS, ACCURACY), same as you would in the UI.
10. Optional: Specify the return type of the custom SQL query (ROW_COUNT or NUMERIC_VALUE). This indicates whether the SQL returns a row count or a numeric value.
11. Optional: Provide a description for the rule.
12. Save the data quality rule to Atlan.
Update data quality rules
To update an existing data quality rule, you only need to provide the qualified name and the Atlan client. All other parameters are optional and are only updated if provided.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule
from pyatlan.model.enums import (
DataQualityRuleAlertPriority,
DataQualityRuleCustomSQLReturnType,
DataQualityRuleThresholdCompareOperator,
DataQualityDimension,
DataQualityRuleThresholdUnit,
DataQualityRuleTemplateConfigRuleConditions
)
from pyatlan.model.dq_rule_conditions import DQRuleConditionsBuilder
client = AtlanClient()
# Update specific fields of an existing data quality rule
updated_rule = DataQualityRule.updater( # (1)
client=client, # (2)
qualified_name="default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary/rule/40e01c39-dcb8-4348-9259-041f353a8348", # (3)
threshold_compare_operator=DataQualityRuleThresholdCompareOperator.LESS_THAN_EQUAL, # (4)
threshold_value=20, # (5)
alert_priority=DataQualityRuleAlertPriority.HIGH, # (6)
threshold_unit=DataQualityRuleThresholdUnit.DAYS, # (7)
dimension=DataQualityDimension.COMPLETENESS, # (8)
custom_sql="SELECT count(*) FROM updated_table", # (9)
custom_sql_return_type=DataQualityRuleCustomSQLReturnType.NUMERIC_VALUE, # (10)
rule_name="Updated Rule Name", # (11)
description="Updated description for the rule" # (12)
)
response = client.asset.save(updated_rule)
# Update rule conditions and enable row filtering for a String Length rule
updated_rule_conditions = (
DQRuleConditionsBuilder()
.add_condition(
type=DataQualityRuleTemplateConfigRuleConditions.STRING_LENGTH_BETWEEN,
min_value=10,
max_value=100,
)
.build()
)
updated_string_rule = DataQualityRule.updater(
client=client,
qualified_name="default/databricks/1750768309/dq/weather/monitoring/rule/40e01c39-dcb8-4348-9259-041f353a8348",
threshold_value=15,
alert_priority=DataQualityRuleAlertPriority.HIGH,
rule_conditions=updated_rule_conditions, # (13)
row_scope_filtering_enabled=True # (14)
)
response = client.asset.save(updated_string_rule) # (15)
1. Use the updater method to update an existing data quality rule.
2. Provide the Atlan client instance.
3. Provide the qualified name of the existing rule.
4. Optional: Update the threshold compare operator.
5. Optional: Update the threshold value for the rule.
6. Optional: Update the alert priority level.
7. Optional: Update the threshold unit.
8. Optional: Update the data quality dimension (for custom SQL rules).
9. Optional: Update the custom SQL query (for custom SQL rules).
10. Optional: Update the custom SQL return type (ROW_COUNT or NUMERIC_VALUE) for custom SQL rules.
11. Optional: Update the name of the rule (for custom SQL rules).
12. Optional: Update the description of the rule (for custom SQL rules).
13. Optional: Update the rule conditions (check UI for availability).
14. Optional: Enable row scope filtering (check UI for availability).
15. Save the updated data quality rule to Atlan.
When updating data quality rules, only update parameters that are applicable to your specific rule type as shown in the UI. Updating parameters that don't apply to your rule type may cause the operation to fail or produce unexpected results.
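For instance, here is a minimal sketch of updating only the fields that apply to an existing table-level Row Count rule; the rule's qualified name below is a hypothetical placeholder.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule
from pyatlan.model.enums import DataQualityRuleAlertPriority
client = AtlanClient()
# Update only the threshold value and alert priority of an existing Row Count rule
# (the qualified name below is a hypothetical placeholder)
minimal_update = DataQualityRule.updater(
    client=client,
    qualified_name="default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary/rule/<rule-guid>",
    threshold_value=25,
    alert_priority=DataQualityRuleAlertPriority.NORMAL
)
response = client.asset.save(minimal_update)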
Retrieve data quality rules
To retrieve data quality rules, you can use fluent search.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import Asset, Connection, DataQualityRule
from pyatlan.model.fluent_search import FluentSearch
client = AtlanClient()
# Example 1: Retrieve all data quality rules on a connection
search_request = ( # (1)
FluentSearch()
.select(include_archived=False)
.where_some(Connection.QUALIFIED_NAME.eq("default/databricks/1750768309"))
.where_some(Asset.TYPE_NAME.eq("DataQualityRule"))
.include_on_results(DataQualityRule.GUID)
.include_on_results(DataQualityRule.QUALIFIED_NAME)
).to_request()
results = client.asset.search(search_request) # (2)
for result in results: # (3)
    print(f"Rule GUID: {result.guid}")
    print(f"Rule Qualified Name: {result.qualified_name}")
# Example 2: Retrieve all information of a specific data quality rule using its qualified name
search_request = (
FluentSearch()
.where(DataQualityRule.QUALIFIED_NAME.eq("default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary/rule/a481d03a-7fb9-48c1-a752-3aad4f6a98c1"))
.include_on_results(DataQualityRule.GUID)
.include_on_results(DataQualityRule.QUALIFIED_NAME)
.include_on_results(DataQualityRule.DQ_RULE_BASE_COLUMN_QUALIFIED_NAME)
.include_on_results(DataQualityRule.DQ_RULE_ALERT_PRIORITY)
.include_on_results(DataQualityRule.DQ_RULE_DIMENSION)
).to_request()
result = client.asset.search(search_request)
search_result = result.current_page()[0]
print(f"GUID: {search_result.guid}")
print(f"Qualified Name: {search_result.qualified_name}")
print(f"Column Qualified Name: {search_result.dq_rule_base_column_qualified_name}")
print(f"Alert Priority: {search_result.dq_rule_alert_priority}")
print(f"Dimension: {search_result.dq_rule_dimension}")
1. Create a Fluent Search request to retrieve data quality rules from a specific connection.
2. Execute the search request to retrieve the data quality rules.
3. Iterate through all matching data quality rules and print their details.
Delete data quality rules
To delete data quality rules, you can use the standard asset deletion method.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule
client = AtlanClient()
response = client.asset.delete_by_guid(guid="b4113341-251b-4adc-81fb-2420501c30e6") # (1)
if deleted := response.assets_deleted(asset_type=DataQualityRule): # (2)
    rule = deleted[0] # (3)
1. Use the asset.delete_by_guid() method to delete a data quality rule. Provide the GUID of the rule you want to delete.
2. The assets_deleted(asset_type=DataQualityRule) method returns a list of the assets of the given type that were deleted.
3. If an asset of the given type was deleted, the deleted form of the asset is available.
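If you only know the rule's qualified name rather than its GUID, you can combine the retrieval and deletion patterns above. A minimal sketch, where the qualified name is a hypothetical placeholder:
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule
from pyatlan.model.fluent_search import FluentSearch
client = AtlanClient()
# Hypothetical qualified name of the rule to delete
rule_qn = "default/databricks/1750768309/dq/weather/monitoring/rule/<rule-guid>"
# Look up the rule's GUID first, then delete it by GUID
search_request = (
    FluentSearch()
    .where(DataQualityRule.QUALIFIED_NAME.eq(rule_qn))
    .include_on_results(DataQualityRule.GUID)
).to_request()
for rule in client.asset.search(search_request):
    response = client.asset.delete_by_guid(guid=rule.guid)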
Set row scope filter column
To configure incremental data quality monitoring, use the set_dq_row_scope_filter_column method. This enables rules to scan only the most recent day's data instead of entire datasets, making monitoring faster and more efficient for batch-based ingestions.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import Table
client = AtlanClient()
response = client.asset.set_dq_row_scope_filter_column( # (1)
asset_type=Table, # (2)
asset_name="monitoring", # (3)
asset_qualified_name="default/databricks/1750768309/dq/weather/monitoring", # (4)
row_scope_filter_column_qualified_name="default/databricks/1750768309/dq/weather/monitoring/updated_at" # (5)
)
1. Use the set_dq_row_scope_filter_column method to configure incremental data quality monitoring.
2. Specify the asset type (for example, Table).
3. Provide the name of the asset as it appears in Atlan.
4. Provide the qualified name of the asset.
5. Provide the qualified name of the column to use for tracking row updates (typically a timestamp column).
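Putting the pieces together: because rules with row_scope_filtering_enabled=True expect a row scope filter column to be configured on the asset, a typical flow is to set the filter column first and then create the rule. A sketch reusing the monitoring table and columns from the examples above:
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import DataQualityRule, Table, Column
from pyatlan.model.enums import (
    DataQualityRuleAlertPriority,
    DataQualityRuleTemplateType,
    DataQualityRuleThresholdUnit
)
client = AtlanClient()
# Step 1: configure the row scope filter column on the table
client.asset.set_dq_row_scope_filter_column(
    asset_type=Table,
    asset_name="monitoring",
    asset_qualified_name="default/databricks/1750768309/dq/weather/monitoring",
    row_scope_filter_column_qualified_name="default/databricks/1750768309/dq/weather/monitoring/updated_at"
)
# Step 2: create a rule with incremental (row scope filtered) monitoring enabled
dq_rule = DataQualityRule.column_level_rule_creator(
    client=client,
    rule_type=DataQualityRuleTemplateType.FRESHNESS,
    asset=Table.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring"),
    column=Column.ref_by_qualified_name(qualified_name="default/databricks/1750768309/dq/weather/monitoring/evaluated_at"),
    threshold_value=1,
    threshold_unit=DataQualityRuleThresholdUnit.DAYS,
    alert_priority=DataQualityRuleAlertPriority.URGENT,
    row_scope_filtering_enabled=True
)
response = client.asset.save(dq_rule)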
Schedule data quality rules
To add a schedule for data quality rules on an asset, you can use the add_dq_rule_schedule method. This method allows you to set up the schedule for data quality rule execution.
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.assets import Table
client = AtlanClient()
response = client.asset.add_dq_rule_schedule( # (1)
asset_type=Table, # (2)
asset_name="_quality_monitoring_summary", # (3)
asset_qualified_name="default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary", # (4)
schedule_crontab="41 20 * 1 *", # (5)
schedule_time_zone="Europe/Paris" # (6)
)
1. Use the add_dq_rule_schedule method to add a schedule for data quality rules on an asset.
2. Specify the asset type (for example, Table).
3. Provide the name of the asset as it appears in Atlan.
4. Provide the qualified name of the asset (same as you would see in the Atlan UI).
5. Provide the cron schedule string in standard cron format (for example, "41 20 * 1 *" runs at 20:41 every day in January).
6. Provide the timezone string in the format used by the Atlan UI (for example, "Europe/Paris", "Asia/Calcutta").
For the raw REST API, the equivalent request body for adding a schedule looks like this:
{
"entities": [
{
"guid": "e971e35d-5d45-4d6c-a8e5-e2bc6a1e1c74", // (1)
"typeName": "Table", // (2)
"attributes": {
"name": "_quality_monitoring_summary", // (3)
"qualifiedName": "default/databricks/1750768309/dq_poc/accuweather/_quality_monitoring_summary", // (4)
"assetDQScheduleType": "CRON", // (5)
"assetDQScheduleCrontab": "41 20 * * 0,1,4-6", // (6)
"assetDQScheduleTimeZone": "Asia/Calcutta" // (7)
}
}
]
}
1. The GUID of the asset to which the data quality rule schedule applies.
2. The type of the asset (for example, "Table").
3. The name of the asset.
4. The qualified name of the asset.
5. Set to "CRON" for cron-based scheduling.
6. The cron schedule string (for example, "41 20 * * 0,1,4-6").
7. The timezone string (for example, "Asia/Calcutta").
The standard cron schedule format consists of five fields, separated by spaces:
- Minute (0-59): The minute of the hour when the command will run
- Hour (0-23): The hour of the day when the command will run (0 is midnight, 23 is 11 PM)
- Day of Month (1-31): The day of the month when the command will run
- Month (1-12): The month of the year when the command will run (1 is January, 12 is December)
- Day of Week (0-6): The day of the week when the command will run (0 is Sunday, 1 is Monday, and so on up to 6 for Saturday)
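To sanity-check a crontab before passing it to add_dq_rule_schedule, you can preview its next run times locally. A minimal sketch, assuming the third-party croniter package is installed (it is not part of pyatlan or Atlan):
from datetime import datetime
from croniter import croniter
# "41 20 * * 0,1,4-6" = minute 41, hour 20, any day of month, any month,
# on Sunday, Monday, and Thursday through Saturday
schedule = croniter("41 20 * * 0,1,4-6", datetime(2025, 1, 1))
for _ in range(3):
    print(schedule.get_next(datetime))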