
refactor(reconcile): consolidate data compare (join, mismatch, output) — #745#2380

Open
BesikiML wants to merge 3 commits into main from 745-tech-debt-data-compare-consolidation

Conversation

@BesikiML
Contributor

Summary

Refactors compare.py so hash reconciliation, aggregate reconciliation, and the capture (inner-join) path share the same building blocks where appropriate: aliased joins, join → prepare → persist, and value-mismatch filtering.

Related: #745

What changed

  • Shared join / persist: _aliased_join, _join_prepare_persist; hash flow uses prefixed selectExpr; aggregate flow uses prepare_persisted_aggregate_join (replaces join_aggregate_data).
  • Shared mismatch filtering: _filter_to_value_mismatches; hash path uses _value_mismatch_where_both_present and _mismatch_rows_for_prefixed_compare_column; aggregate path uses _mismatch_rows_for_aggregate_mappings (replaces _get_mismatch_agg_data).
  • Shared output assembly: _data_reconcile_output for reconcile_data and reconcile_agg_data_per_rule.
  • Hash helpers: _mismatch_projection_for_prefixed_columns, _joined_rows_missing_on_side (optional compare_basename), generic prefixed compare + _get_mismatch_data wrapper.
  • Capture path: capture_mismatch_data_and_columns / _get_mismatch_df use _inner_join_for_capture_mismatch (same _aliased_join primitive); projections split into small helpers.
  • Docs: module docstring describing the three flows (hash, aggregate, capture).

API / migration

  • Prefer prepare_persisted_aggregate_join instead of join_aggregate_data.
  • _get_mismatch_agg_data removed; aggregate mismatch logic lives in _mismatch_rows_for_aggregate_mappings (module-private).
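Note this PR removes `join_aggregate_data` outright rather than deprecating it. For comparison, a temporary deprecation alias (not part of this PR) could keep old call sites importable during migration; sketched here with the real function body elided:

```python
import warnings


def prepare_persisted_aggregate_join(*args, **kwargs):
    ...  # real implementation lives in compare.py


def join_aggregate_data(*args, **kwargs):
    # Hypothetical back-compat shim: warn, then delegate to the new name.
    warnings.warn(
        "join_aggregate_data is deprecated; use prepare_persisted_aggregate_join",
        DeprecationWarning,
        stacklevel=2,
    )
    return prepare_persisted_aggregate_join(*args, **kwargs)
```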

Testing

  • Reconcile / compare unit and integration tests pass in CI

Checklist for reviewers

  • Grep full repo for join_aggregate_data / _get_mismatch_agg_data and update any remaining imports or references
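The sweep above can be run as, for example (from the repo root; any hit is a leftover reference that still needs updating before merge):

```shell
grep -rn --include='*.py' \
  -e 'join_aggregate_data' \
  -e '_get_mismatch_agg_data' \
  . || echo "no leftover references"
```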

…cile output assembly; route capture mismatch through _inner_join_for_capture_mismatch; add
module docstring for hash vs aggregate vs capture flows.
@BesikiML BesikiML requested a review from a team as a code owner April 15, 2026 13:18
@BesikiML BesikiML linked an issue Apr 15, 2026 that may be closed by this pull request
1 task
@BesikiML BesikiML self-assigned this Apr 15, 2026
@github-actions

github-actions Bot commented Apr 29, 2026

❌ 120/122 passed, 2 failed, 3 skipped, 54m11s total

❌ test_recon_databricks_job_succeeds: databricks.sdk.errors.sdk.OperationFailed: failed to reach TERMINATED or SKIPPED, got RunLifeCycleState.INTERNAL_ERROR: Task run_reconciliation failed with message: Workload failed, see run output for details. (7m45.705s)
... (skipped 14479 bytes)
_cdypgvfin.dummy_spaa1vurh
17:29 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:29 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_metric in dummy_cdypgvfin.dummy_spaa1vurh
17:29 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:29 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_detai in dummy_cdypgvfin.dummy_spaa1vurh
17:29 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:29 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_rule in dummy_cdypgvfin.dummy_spaa1vurh
17:29 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:29 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation dashboards.
17:29 INFO [databricks.labs.lakebridge.deployment.dashboard] Deploying dashboards from base folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards
17:29 INFO [databricks.labs.lakebridge.deployment.dashboard] Reading dashboard folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards/aggregate_reconciliation_metrics
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f148a7f8e71f71a9c0b5045f99186d
17:29 INFO [databricks.labs.lakebridge.deployment.dashboard] Reading dashboard folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards/reconciliation_metrics
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:29 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f148a7fa671259a76a79f63264244f
17:29 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation jobs.
17:29 INFO [databricks.labs.lakebridge.deployment.job] Deploying reconciliation job.
17:29 DEBUG [databricks.labs.lakebridge.deployment.job] Applying deployment overrides: ReconcileJobConfig(existing_cluster_id='0505-172304-0nd9vn5s', tags={'lakebridge': 'reconcile_test'})
17:29 WARNING [databricks.labs.lakebridge.deployment.job] Parsed package name databricks_labs_lakebridge does not match product name, using TEST_SCHEMA.
17:29 DEBUG [databricks.labs.lakebridge.deployment.job] Reconciliation job task cluster: existing: 0505-172304-0nd9vn5s or name: None
17:29 INFO [databricks.labs.lakebridge.deployment.job] Creating new job configuration for job `Reconciliation Runner`
17:29 INFO [databricks.labs.lakebridge.deployment.job] Reconciliation job deployed with job_id=392374794475571
17:29 INFO [databricks.labs.lakebridge.deployment.job] Job URL: https://DATABRICKS_HOST#job/392374794475571
17:29 INFO [databricks.labs.lakebridge.deployment.recon] Installation of reconcile components completed successfully.
17:29 INFO [tests.integration.reconcile.conftest] Application context setup complete for recon tests
17:29 DEBUG [databricks.labs.lakebridge.reconcile.runner] Reconcile job id found in the install state.
17:29 INFO [databricks.labs.lakebridge.reconcile.runner] Triggering the reconcile job with job_id: `392374794475571`
17:29 INFO [databricks.labs.lakebridge.reconcile.runner] 'RECONCILE' job started. Please check the job_url `https://DATABRICKS_HOST/jobs/392374794475571/runs/849902229945788` for the current status.
17:30 INFO [tests.integration.reconcile.test_recon_e2e] Reconcile job run had 1 tasks
17:30 INFO [tests.integration.reconcile.test_recon_e2e] Task run_reconciliation has error message: ImportError: cannot import name 'join_aggregate_data' from 'databricks.labs.lakebridge.reconcile.compare' (/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/compare.py)
17:30 INFO [tests.integration.reconcile.test_recon_e2e] Task run_reconciliation has error trace:
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
File ~/.ipykernel/2114/command--1-1441419722:18
     15 entry = [ep for ep in metadata.distribution("databricks_labs_lakebridge").entry_points if ep.name == "reconcile"]
     16 if entry:
     17   # Load and execute the entrypoint, assumes no parameters
---> 18   entry[0].load()()
     19 else:
     20   import importlib

File /databricks/python/lib/python3.12/site-packages/importlib_metadata/__init__.py:208, in EntryPoint.load(self)
    203 """Load the entry point from its definition. If only a module
    204 is indicated by the value, return that module. Otherwise,
    205 return the named object.
    206 """
    207 match = self.pattern.match(self.value)
--> 208 module = import_module(match.group('module'))
    209 attrs = filter(None, (match.group('attr') or '').split('.'))
    210 return functools.reduce(getattr, attrs, module)

File /usr/lib/python3.12/importlib/__init__.py:90, in import_module(name, package)
     88             break
     89         level += 1
---> 90 return _bootstrap._gcd_import(name[level:], package, level)

File <frozen importlib._bootstrap>:1387, in _gcd_import(name, package, level)

File <frozen importlib._bootstrap>:1360, in _find_and_load(name, import_)

File <frozen importlib._bootstrap>:1331, in _find_and_load_unlocked(name, import_)

File <frozen importlib._bootstrap>:935, in _load_unlocked(spec)

File <frozen importlib._bootstrap_external>:995, in exec_module(self, module)

File <frozen importlib._bootstrap>:488, in _call_with_frames_removed(f, *args, **kwds)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/execute.py:12
     10 from databricks.labs.lakebridge.config import ReconcileConfig, TableRecon
     11 from databricks.labs.lakebridge.reconcile.recon_config import AGG_RECONCILE_OPERATION_NAME, RECONCILE_OPERATION_NAME
---> 12 from databricks.labs.lakebridge.reconcile.trigger_recon_aggregate_service import TriggerReconAggregateService
     13 from databricks.labs.lakebridge.reconcile.trigger_recon_service import TriggerReconService
     15 logger = logging.getLogger(__name__)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/trigger_recon_aggregate_service.py:21
     14 from databricks.labs.lakebridge.reconcile.recon_config import Table
     15 from databricks.labs.lakebridge.reconcile.recon_output_config import (
     16     ReconcileProcessDuration,
     17     AggregateQueryOutput,
     18     DataReconcileOutput,
     19     ReconcileOutput,
     20 )
---> 21 from databricks.labs.lakebridge.reconcile.reconciliation import Reconciliation
     22 from databricks.labs.lakebridge.reconcile.trigger_recon_service import TriggerReconService
     24 logger = logging.getLogger(__name__)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/reconciliation.py:10
      4 from sqlglot import Dialect
      6 from databricks.labs.lakebridge.config import (
      7     DatabaseConfig,
      8     ReconcileMetadataConfig,
      9 )
---> 10 from databricks.labs.lakebridge.reconcile.compare import (
     11     capture_mismatch_data_and_columns,
     12     reconcile_data,
     13     join_aggregate_data,
     14     reconcile_agg_data_per_rule,
     15 )
     16 from databricks.labs.lakebridge.reconcile.connectors.data_source import DataSource
     17 from databricks.labs.lakebridge.reconcile.connectors.dialect_utils import DialectUtils

ImportError: cannot import name 'join_aggregate_data' from 'databricks.labs.lakebridge.reconcile.compare' (/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/compare.py)
17:23 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy-uspwO3Ya: https://DATABRICKS_HOST/compute/clusters/0505-172304-0nd9vn5s
17:23 DEBUG [databricks.labs.pytester.fixtures.baseline] added cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f53931db520>
17:29 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_cdypgvfin catalog: https://DATABRICKS_HOST/#explore/data/dummy_cdypgvfin
17:29 DEBUG [databricks.labs.pytester.fixtures.baseline] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1778002144841, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_cdypgvfin', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_cdypgvfin', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026050519'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1778002144841, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
17:29 INFO [tests.integration.reconcile.conftest] Created catalog dummy_cdypgvfin for recon tests
17:29 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_cdypgvfin.dummy_spaa1vurh schema: https://DATABRICKS_HOST/#explore/data/dummy_cdypgvfin/dummy_spaa1vurh
17:29 DEBUG [databricks.labs.pytester.fixtures.baseline] added schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_cdypgvfin', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh', metastore_id=None, name='dummy_spaa1vurh', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
17:29 INFO [tests.integration.reconcile.conftest] Created schema dummy_spaa1vurh in catalog dummy_cdypgvfin for recon tests
17:29 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_spaa1vurh volume: https://DATABRICKS_HOST/#explore/data/dummy_cdypgvfin/dummy_spaa1vurh/dummy_spaa1vurh
17:29 DEBUG [databricks.labs.pytester.fixtures.baseline] added volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_cdypgvfin', comment=None, created_at=1778002151673, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh.dummy_spaa1vurh', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_spaa1vurh', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_spaa1vurh', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/c49f60de-ad27-4fd6-bfb2-2e8aca816972', updated_at=1778002151673, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='c49f60de-ad27-4fd6-bfb2-2e8aca816972', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
17:29 INFO [tests.integration.reconcile.conftest] Using recon job overrides: ReconcileJobConfig(existing_cluster_id='0505-172304-0nd9vn5s', tags={'lakebridge': 'reconcile_test'})
17:29 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_cdypgvfin.dummy_spaa1vurh.dummy_twfhs6pur schema: https://DATABRICKS_HOST/#explore/data/dummy_cdypgvfin/dummy_spaa1vurh/dummy_twfhs6pur
17:29 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_cdypgvfin', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh.dummy_twfhs6pur', metastore_id=None, name='dummy_twfhs6pur', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_spaa1vurh', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_spaa1vurh/dummy_twfhs6pur', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:29 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_cdypgvfin.dummy_spaa1vurh.dummy_t4a5ciaze schema: https://DATABRICKS_HOST/#explore/data/dummy_cdypgvfin/dummy_spaa1vurh/dummy_t4a5ciaze
17:29 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_cdypgvfin', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh.dummy_t4a5ciaze', metastore_id=None, name='dummy_t4a5ciaze', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_spaa1vurh', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_spaa1vurh/dummy_t4a5ciaze', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:29 INFO [tests.integration.reconcile.conftest] Created recon tables dummy_twfhs6pur, dummy_t4a5ciaze in schema dummy_spaa1vurh
17:29 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_twfhs6pur and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
17:29 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_t4a5ciaze and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
17:29 INFO [tests.integration.reconcile.conftest] Setting up application context for recon tests
17:29 INFO [tests.integration.reconcile.conftest] Installing app and recon configuration into workspace
17:29 DEBUG [databricks.labs.lakebridge.install] No existing version found in workspace; assuming fresh installation.
17:29 INFO [databricks.labs.lakebridge.install] Installing Lakebridge reconcile Metadata components.
17:29 INFO [databricks.labs.lakebridge.deployment.recon] Installing reconcile components.
17:29 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation metadata tables.
17:29 INFO [databricks.labs.lakebridge.deployment.table] Deploying table main in dummy_cdypgvfin.dummy_spaa1vurh
17:29 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:29 INFO [databricks.labs.lakebridge.deployment.table] Deploying table metric in dummy_cdypgvfin.dummy_spaa1vurh
17:29 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:29 INFO [databricks.labs.lakebridge.deployment.table] Deploying table detai in dummy_cdypgvfin.dummy_spaa1vurh
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 2 table fixtures
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_cdypgvfin', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh.dummy_twfhs6pur', metastore_id=None, name='dummy_twfhs6pur', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_spaa1vurh', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_spaa1vurh/dummy_twfhs6pur', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_cdypgvfin', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh.dummy_t4a5ciaze', metastore_id=None, name='dummy_t4a5ciaze', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_spaa1vurh', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_spaa1vurh/dummy_t4a5ciaze', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 volume fixtures
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] removing volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_cdypgvfin', comment=None, created_at=1778002151673, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh.dummy_spaa1vurh', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_spaa1vurh', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_spaa1vurh', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/c49f60de-ad27-4fd6-bfb2-2e8aca816972', updated_at=1778002151673, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='c49f60de-ad27-4fd6-bfb2-2e8aca816972', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 schema fixtures
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_cdypgvfin', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_cdypgvfin.dummy_spaa1vurh', metastore_id=None, name='dummy_spaa1vurh', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 catalog fixtures
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1778002144841, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_cdypgvfin', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_cdypgvfin', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026050519'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1778002144841, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 cluster fixtures
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] removing cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f53931db520>
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_sql_server_job_succeeds: databricks.sdk.errors.sdk.OperationFailed: failed to reach TERMINATED or SKIPPED, got RunLifeCycleState.INTERNAL_ERROR: Task run_reconciliation failed with message: Workload failed, see run output for details. (4m13.579s)
... (skipped 14485 bytes)
17:33 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:33 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_metric in dummy_c3rnek4gs.dummy_sj8degx9j
17:33 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:33 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_detai in dummy_c3rnek4gs.dummy_sj8degx9j
17:33 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:33 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_rule in dummy_c3rnek4gs.dummy_sj8degx9j
17:33 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:33 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation dashboards.
17:33 INFO [databricks.labs.lakebridge.deployment.dashboard] Deploying dashboards from base folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards
17:33 INFO [databricks.labs.lakebridge.deployment.dashboard] Reading dashboard folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards/aggregate_reconciliation_metrics
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f148a88fa216aaa4c2fc07e29fa06f
17:33 INFO [databricks.labs.lakebridge.deployment.dashboard] Reading dashboard folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards/reconciliation_metrics
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:33 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f148a890fa162aad63bacfbd5103f6
17:33 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation jobs.
17:33 INFO [databricks.labs.lakebridge.deployment.job] Deploying reconciliation job.
17:33 DEBUG [databricks.labs.lakebridge.deployment.job] Applying deployment overrides: ReconcileJobConfig(existing_cluster_id='0505-173051-xj489x4w', tags={'lakebridge': 'reconcile_test'})
17:33 WARNING [databricks.labs.lakebridge.deployment.job] Parsed package name databricks_labs_lakebridge does not match product name, using TEST_SCHEMA.
17:33 DEBUG [databricks.labs.lakebridge.deployment.job] Reconciliation job task cluster: existing: 0505-173051-xj489x4w or name: None
17:33 INFO [databricks.labs.lakebridge.deployment.job] Creating new job configuration for job `Reconciliation Runner`
17:33 INFO [databricks.labs.lakebridge.deployment.job] Reconciliation job deployed with job_id=1004155833642890
17:33 INFO [databricks.labs.lakebridge.deployment.job] Job URL: https://DATABRICKS_HOST#job/1004155833642890
17:33 INFO [databricks.labs.lakebridge.deployment.recon] Installation of reconcile components completed successfully.
17:33 INFO [tests.integration.reconcile.conftest] Application context setup complete for recon tests
17:33 DEBUG [databricks.labs.lakebridge.reconcile.runner] Reconcile job id found in the install state.
17:33 INFO [databricks.labs.lakebridge.reconcile.runner] Triggering the reconcile job with job_id: `1004155833642890`
17:33 INFO [databricks.labs.lakebridge.reconcile.runner] 'RECONCILE' job started. Please check the job_url `https://DATABRICKS_HOST/jobs/1004155833642890/runs/27112492040686` for the current status.
17:34 INFO [tests.integration.reconcile.test_recon_e2e] Reconcile job run had 1 tasks
17:35 INFO [tests.integration.reconcile.test_recon_e2e] Task run_reconciliation has error message: ImportError: cannot import name 'join_aggregate_data' from 'databricks.labs.lakebridge.reconcile.compare' (/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/compare.py)
17:35 INFO [tests.integration.reconcile.test_recon_e2e] Task run_reconciliation has error trace:
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
File ~/.ipykernel/2169/command--1-1231361596:18
     15 entry = [ep for ep in metadata.distribution("databricks_labs_lakebridge").entry_points if ep.name == "reconcile"]
     16 if entry:
     17   # Load and execute the entrypoint, assumes no parameters
---> 18   entry[0].load()()
     19 else:
     20   import importlib

File /databricks/python/lib/python3.12/site-packages/importlib_metadata/__init__.py:208, in EntryPoint.load(self)
    203 """Load the entry point from its definition. If only a module
    204 is indicated by the value, return that module. Otherwise,
    205 return the named object.
    206 """
    207 match = self.pattern.match(self.value)
--> 208 module = import_module(match.group('module'))
    209 attrs = filter(None, (match.group('attr') or '').split('.'))
    210 return functools.reduce(getattr, attrs, module)

File /usr/lib/python3.12/importlib/__init__.py:90, in import_module(name, package)
     88             break
     89         level += 1
---> 90 return _bootstrap._gcd_import(name[level:], package, level)

File <frozen importlib._bootstrap>:1387, in _gcd_import(name, package, level)

File <frozen importlib._bootstrap>:1360, in _find_and_load(name, import_)

File <frozen importlib._bootstrap>:1331, in _find_and_load_unlocked(name, import_)

File <frozen importlib._bootstrap>:935, in _load_unlocked(spec)

File <frozen importlib._bootstrap_external>:995, in exec_module(self, module)

File <frozen importlib._bootstrap>:488, in _call_with_frames_removed(f, *args, **kwds)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/execute.py:12
     10 from databricks.labs.lakebridge.config import ReconcileConfig, TableRecon
     11 from databricks.labs.lakebridge.reconcile.recon_config import AGG_RECONCILE_OPERATION_NAME, RECONCILE_OPERATION_NAME
---> 12 from databricks.labs.lakebridge.reconcile.trigger_recon_aggregate_service import TriggerReconAggregateService
     13 from databricks.labs.lakebridge.reconcile.trigger_recon_service import TriggerReconService
     15 logger = logging.getLogger(__name__)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/trigger_recon_aggregate_service.py:21
     14 from databricks.labs.lakebridge.reconcile.recon_config import Table
     15 from databricks.labs.lakebridge.reconcile.recon_output_config import (
     16     ReconcileProcessDuration,
     17     AggregateQueryOutput,
     18     DataReconcileOutput,
     19     ReconcileOutput,
     20 )
---> 21 from databricks.labs.lakebridge.reconcile.reconciliation import Reconciliation
     22 from databricks.labs.lakebridge.reconcile.trigger_recon_service import TriggerReconService
     24 logger = logging.getLogger(__name__)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/reconciliation.py:10
      4 from sqlglot import Dialect
      6 from databricks.labs.lakebridge.config import (
      7     DatabaseConfig,
      8     ReconcileMetadataConfig,
      9 )
---> 10 from databricks.labs.lakebridge.reconcile.compare import (
     11     capture_mismatch_data_and_columns,
     12     reconcile_data,
     13     join_aggregate_data,
     14     reconcile_agg_data_per_rule,
     15 )
     16 from databricks.labs.lakebridge.reconcile.connectors.data_source import DataSource
     17 from databricks.labs.lakebridge.reconcile.connectors.dialect_utils import DialectUtils

ImportError: cannot import name 'join_aggregate_data' from 'databricks.labs.lakebridge.reconcile.compare' (/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/lakebridge/reconcile/compare.py)
17:30 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy-QB2PDhlb: https://DATABRICKS_HOST/compute/clusters/0505-173051-xj489x4w
17:30 DEBUG [databricks.labs.pytester.fixtures.baseline] added cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f5393842350>
17:33 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c3rnek4gs catalog: https://DATABRICKS_HOST/#explore/data/dummy_c3rnek4gs
17:33 DEBUG [databricks.labs.pytester.fixtures.baseline] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1778002398922, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_c3rnek4gs', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_c3rnek4gs', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026050519'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1778002398922, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
17:33 INFO [tests.integration.reconcile.conftest] Created catalog dummy_c3rnek4gs for recon tests
17:33 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c3rnek4gs.dummy_sj8degx9j schema: https://DATABRICKS_HOST/#explore/data/dummy_c3rnek4gs/dummy_sj8degx9j
17:33 DEBUG [databricks.labs.pytester.fixtures.baseline] added schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_c3rnek4gs', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j', metastore_id=None, name='dummy_sj8degx9j', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
17:33 INFO [tests.integration.reconcile.conftest] Created schema dummy_sj8degx9j in catalog dummy_c3rnek4gs for recon tests
17:33 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_sj8degx9j volume: https://DATABRICKS_HOST/#explore/data/dummy_c3rnek4gs/dummy_sj8degx9j/dummy_sj8degx9j
17:33 DEBUG [databricks.labs.pytester.fixtures.baseline] added volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_c3rnek4gs', comment=None, created_at=1778002400839, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j.dummy_sj8degx9j', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_sj8degx9j', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_sj8degx9j', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/8c746eba-2187-412a-91b6-d304720c2fbd', updated_at=1778002400839, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='8c746eba-2187-412a-91b6-d304720c2fbd', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
17:33 INFO [tests.integration.reconcile.conftest] Using recon job overrides: ReconcileJobConfig(existing_cluster_id='0505-173051-xj489x4w', tags={'lakebridge': 'reconcile_test'})
17:33 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c3rnek4gs.dummy_sj8degx9j.dummy_tkw0gd3sa schema: https://DATABRICKS_HOST/#explore/data/dummy_c3rnek4gs/dummy_sj8degx9j/dummy_tkw0gd3sa
17:33 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c3rnek4gs', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j.dummy_tkw0gd3sa', metastore_id=None, name='dummy_tkw0gd3sa', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_sj8degx9j', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sj8degx9j/dummy_tkw0gd3sa', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:33 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c3rnek4gs.dummy_sj8degx9j.dummy_tm6fs8ezc schema: https://DATABRICKS_HOST/#explore/data/dummy_c3rnek4gs/dummy_sj8degx9j/dummy_tm6fs8ezc
17:33 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c3rnek4gs', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j.dummy_tm6fs8ezc', metastore_id=None, name='dummy_tm6fs8ezc', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_sj8degx9j', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sj8degx9j/dummy_tm6fs8ezc', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:33 INFO [tests.integration.reconcile.conftest] Created recon tables dummy_tkw0gd3sa, dummy_tm6fs8ezc in schema dummy_sj8degx9j
17:33 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_tkw0gd3sa and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
17:33 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_tm6fs8ezc and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
17:33 INFO [tests.integration.reconcile.conftest] Setting up application context for recon tests
17:33 INFO [tests.integration.reconcile.conftest] Installing app and recon configuration into workspace
17:33 DEBUG [databricks.labs.lakebridge.install] No existing version found in workspace; assuming fresh installation.
17:33 INFO [databricks.labs.lakebridge.install] Installing Lakebridge reconcile Metadata components.
17:33 INFO [databricks.labs.lakebridge.deployment.recon] Installing reconcile components.
17:33 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation metadata tables.
17:33 INFO [databricks.labs.lakebridge.deployment.table] Deploying table main in dummy_c3rnek4gs.dummy_sj8degx9j
17:33 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:33 INFO [databricks.labs.lakebridge.deployment.table] Deploying table metric in dummy_c3rnek4gs.dummy_sj8degx9j
17:33 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:33 INFO [databricks.labs.lakebridge.deployment.table] Deploying table detai in dummy_c3rnek4gs.dummy_sj8degx9j
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 2 table fixtures
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c3rnek4gs', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j.dummy_tkw0gd3sa', metastore_id=None, name='dummy_tkw0gd3sa', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_sj8degx9j', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sj8degx9j/dummy_tkw0gd3sa', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c3rnek4gs', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j.dummy_tm6fs8ezc', metastore_id=None, name='dummy_tm6fs8ezc', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026050519'}, row_filter=None, schema_name='dummy_sj8degx9j', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sj8degx9j/dummy_tm6fs8ezc', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 volume fixtures
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] removing volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_c3rnek4gs', comment=None, created_at=1778002400839, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j.dummy_sj8degx9j', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_sj8degx9j', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_sj8degx9j', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/8c746eba-2187-412a-91b6-d304720c2fbd', updated_at=1778002400839, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='8c746eba-2187-412a-91b6-d304720c2fbd', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 schema fixtures
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_c3rnek4gs', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_c3rnek4gs.dummy_sj8degx9j', metastore_id=None, name='dummy_sj8degx9j', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 catalog fixtures
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1778002398922, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_c3rnek4gs', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_c3rnek4gs', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026050519'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1778002398922, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 cluster fixtures
17:35 DEBUG [databricks.labs.pytester.fixtures.baseline] removing cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f5393842350>
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python

Running from acceptance #4235
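The teardown log above shows `databricks.labs.pytester` removing fixtures in dependency order: tables and volumes before their schema, the schema before its catalog, then the cluster. A minimal sketch of that cleanup pattern — `FixturePool` and its method names are hypothetical illustrations, not the actual pytester API:

```python
import logging

logger = logging.getLogger("fixtures.baseline")


class FixturePool:
    """Tracks fixtures created during a test and removes them in dependency order
    (leaf objects such as tables first, containers such as catalogs last)."""

    # Removal order mirroring the log: tables/volumes, then schema, catalog, cluster.
    ORDER = ("table", "volume", "schema", "catalog", "cluster")

    def __init__(self):
        self._groups = {}  # kind -> list of fixture descriptions

    def add(self, kind, info):
        # Register a created fixture so teardown can find it later.
        self._groups.setdefault(kind, []).append(info)
        return info

    def clear(self):
        # Emit the same "clearing N ... / removing ..." debug lines seen above
        # and return what was removed, in removal order.
        removed = []
        for kind in self.ORDER:
            items = self._groups.pop(kind, [])
            logger.debug("clearing %d %s fixtures", len(items), kind)
            for info in items:
                logger.debug("removing %s fixture: %s", kind, info)
                removed.append((kind, info))
        return removed
```

Regardless of creation order, `clear()` always drops dependents before their containers, which is why the log never tries to remove a catalog while its schema still exists.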



Development

Successfully merging this pull request may close these issues.

[TECH DEBT]: Data Compare Consolidation

2 participants