
Use internal Permissions Migration API by default #3230

Merged (5 commits, Nov 11, 2024)

Conversation

@nfx (Collaborator) commented on Nov 8, 2024

This pull request adds support for both the legacy and the new permission migration workflows in the Databricks UCX project. The key changes are a configuration option to toggle between the two workflows, renames of classes and methods to distinguish the legacy workflow from the new one, and corresponding test updates.

  • Added a configuration option use_legacy_permission_migration to WorkspaceConfig to toggle between legacy and new permission migration workflows.
  • Updated methods in workflows.py to skip certain steps if use_legacy_permission_migration is not enabled. [1] [2]
  • Renamed GroupMigration to LegacyGroupMigration and updated related method names to reflect the legacy workflow. [1] [2] [3] [4] [5] [6] [7] [8] [9]
  • Updated integration and unit tests to use the new configuration option and renamed classes/methods. [1] [2] [3] [4] [5] [6] [7]
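The toggle described above can be sketched as follows. This is a hypothetical illustration, not the actual UCX source: only the `use_legacy_permission_migration` field name comes from the PR; the other field and the `migrate_groups` helper are assumptions.

```python
from dataclasses import dataclass


@dataclass
class WorkspaceConfig:
    """Illustrative subset of the workspace configuration."""

    inventory_database: str
    # When True, run the legacy group-permission migration workflow;
    # when False (the default), use the internal Permissions Migration API.
    use_legacy_permission_migration: bool = False


def migrate_groups(config: WorkspaceConfig) -> str:
    """Pick the workflow based on the toggle; legacy-only steps are
    skipped when the flag is disabled."""
    if config.use_legacy_permission_migration:
        return "legacy-group-migration"
    return "permissions-migration-api"
```

Defaulting the flag to `False` means existing deployments pick up the new API-based workflow automatically, while the legacy path remains available as an opt-in.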

@nfx nfx requested a review from a team as a code owner November 8, 2024 17:32

github-actions bot commented Nov 8, 2024

❌ 88/89 passed, 1 failed, 5 skipped, 2h22m44s total

❌ test_permission_for_files_anonymous_func: AssertionError: assert 'sdk-ABcu-ra78a57c12' in {} (33m58.265s)
AssertionError: assert 'sdk-ABcu-ra78a57c12' in {}
 +  where 'sdk-ABcu-ra78a57c12' = Group(display_name='sdk-ABcu-ra78a57c12', entitlements=[], external_id=None, groups=[], id='880205116801771', members=...[], schemas=[<GroupSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_GROUP: 'urn:ietf:params:scim:schemas:core:2.0:Group'>]).display_name
[gw7] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
20:42 DEBUG [tests.integration.workspace_access.test_tacl] old=sdk-ABcu-ra78a57c12, new=sdk-nn6H-ra78a57c12
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.grants] fetching grants inventory
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] Inventory table not found
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/grants.py", line 236, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 344, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 268, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 478, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_s1num`.`grants` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
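The traceback above is expected on a fresh run: the crawler first tries to read cached inventory rows and, when the inventory table does not exist yet, falls back to crawling a new snapshot. A minimal sketch of that fetch-or-crawl pattern, with simplified names that only mirror the log output (not the real UCX implementation):

```python
class NotFound(Exception):
    """Raised when the inventory table does not exist yet
    (stands in for databricks.sdk.errors.platform.NotFound)."""


def snapshot(fetcher, loader):
    """Return cached inventory rows, or crawl fresh data when none exist."""
    try:
        # "fetching ... inventory": read the previously persisted snapshot.
        return list(fetcher())
    except NotFound:
        # "Inventory table not found": crawl a new set of snapshot data.
        return list(loader())
```

Because the fallback is part of normal operation, the DEBUG-level traceback is noise rather than a failure; the actual test failure here is the later assertion about the migrated group.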
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.grants] crawling new set of snapshot data for grants
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.tables] fetching tables inventory
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] Inventory table not found
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/tables.py", line 411, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 344, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 268, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 478, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_s1num`.`tables` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.tables] crawling new set of snapshot data for tables
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] Skipping UCX inventory schema: dummy_s1num
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s01j7] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s04jx] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s0bpy] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s1ye7] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s2ajt] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s4f1o] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_scp40] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgjgn] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sglnz] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_slge2] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sq8je] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ssa4e] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ssuy7] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_svxyl] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sxyof] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sy8b4] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_szdcu] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_szdcu.pipelines] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sglnz.dummy_tpahr] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_scp40.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s1ye7.dummy_t3lb5] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel.tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel.udfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s01j7.dummy_trqu6] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.CLOUD_ENV_service_principals] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.clusters] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.code_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.directfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.directfs_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.directfs_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.external_locations] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.global_init_scripts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.grant_detail] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.inferred_grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.jobs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.logs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.misc_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.mounts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.permissions] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.pipelines] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.policies] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.query_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.recon_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.reconciliation_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.submit_runs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.table_estimates] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.table_failures] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.table_size] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.udfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.used_tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.used_tables_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.used_tables_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.workspace_objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s0bpy.dummy_tye6d] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sxyof.dummy_te5mv] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sq8je.dummy_t8p12] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_svxyl.dummy_tknp1] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.concurrent_testing] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.concurrently_testing] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo01] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo02] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo03] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo04] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo05] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo2] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_factiously_unsartorial] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_baculitic_louvering] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_contextual_concinnous] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_entailable_underedge] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_fimbricated_archvestryman] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_globuloid_doctrinarity] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_imperialistically_interceder] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_inhumorously_sobriquet] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_intercrescence_cyanopsia] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_interparty_nephrocoele] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_metaluminate_municipality] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_postposition_slipgibbet] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_preinterpretation_buddhistical] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_preliminary_petiolular] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_squawtits_suprapubian] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_turbinate_apprehendingly] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_unblackened_recompilement] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_unsalability_bimaxillary] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_xanthosis_weariedly] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.some] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tacopyrine_bartonella] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tburlap_yellow] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tbuteo_datolite] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tcried_thundering] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tepitactic_jasminewood] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.test_truncate] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tforbid_greenside] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tgazon_subdivisible] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tlilywood_osteophone] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.towner_pericardiosymphysis] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tperilabyrinth_sociologism] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tplottage_planera] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.trerobe_shradh] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tromaean_distalwards] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tstylishly_recrystallize] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tzeuglodontidae_centrical] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_cvg2] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_oeto] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_tfuor] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_tmvgz] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_tuhas] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_yafg] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgjgn.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.CLOUD_ENV_service_principals] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.clusters] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.code_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.directfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.directfs_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.directfs_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.external_locations] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.global_init_scripts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.grant_detail] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.inferred_grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.jobs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.logs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.misc_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.mounts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.permissions] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.pipelines] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.policies] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.query_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.recon_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.reconciliation_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.submit_runs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.table_estimates] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.table_failures] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.table_size] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.udfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.used_tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.used_tables_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.used_tables_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.workspace_objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ssuy7.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s04jx.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.tables] found 145 new records for tables
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.udfs] fetching udfs inventory
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] Inventory table not found
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/udfs.py", line 63, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 344, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 268, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 478, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_s1num`.`udfs` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.udfs] crawling new set of snapshot data for udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.TEST_SCHEMA] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s01j7] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s04jx] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s0bpy] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s1num] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s1ye7] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s2ajt] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s4f1o] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s6kel] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_scp40] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sddxx] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sdrjn] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sgd9a] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sgjgn] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sglnz] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_slge2] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sq8je] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_ss6nj] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_ssa4e] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_ssuy7] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_svxyl] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sxyof] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sy8b4] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sygrv] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_szdcu] listing udfs
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.udfs] found 0 new records for udfs
20:44 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_sgjgn no longer existed
20:45 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_s04jx no longer existed
20:45 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_ssuy7 no longer existed
20:45 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_svxyl no longer existed
20:54 ERROR [databricks.labs.ucx.hive_metastore.grants] Couldn't fetch grants for object ANONYMOUS FUNCTION : TEMPORARILY_UNAVAILABLE: The service at /api/2.0/sql-acl/get-permissions is taking too long to process your request. Please try again later or try a faster operation. [TraceId: 00-43858af57f8bd461d1afc4a9b51a675f-4e32870fbffde579-00]
20:54 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.grants] found 174 new records for grants
21:05 ERROR [databricks.labs.ucx.hive_metastore.grants] Couldn't fetch grants for object ANY FILE : Read timed out
21:16 ERROR [databricks.labs.ucx.hive_metastore.grants] Couldn't fetch grants for object ANY FILE : TEMPORARILY_UNAVAILABLE: The service at /api/2.0/sql-acl/get-permissions is taking too long to process your request. Please try again later or try a faster operation. [TraceId: 00-5c09a548b09dd13875879c438639dcf2-55d53c832b8291c4-00]
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/tables.py", line 411, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 344, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 268, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 478, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_s1num`.`tables` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.tables] crawling new set of snapshot data for tables
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] Skipping UCX inventory schema: dummy_s1num
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s01j7] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s04jx] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s0bpy] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s1ye7] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s2ajt] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s4f1o] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_scp40] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgjgn] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sglnz] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_slge2] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sq8je] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ssa4e] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ssuy7] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_svxyl] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sxyof] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sy8b4] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_szdcu] listing tables and views
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_szdcu.pipelines] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sglnz.dummy_tpahr] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_scp40.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s1ye7.dummy_t3lb5] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel.tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s6kel.udfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sddxx.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s01j7.dummy_trqu6] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ss6nj.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.CLOUD_ENV_service_principals] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.clusters] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.code_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.directfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.directfs_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.directfs_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.external_locations] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.global_init_scripts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.grant_detail] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.inferred_grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.jobs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.logs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.misc_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.mounts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.permissions] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.pipelines] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.policies] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.query_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.recon_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.reconciliation_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.submit_runs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.table_estimates] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.table_failures] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.table_size] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.udfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.used_tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.used_tables_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.used_tables_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sygrv.workspace_objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s0bpy.dummy_tye6d] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sxyof.dummy_te5mv] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sq8je.dummy_t8p12] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sdrjn.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_svxyl.dummy_tknp1] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.concurrent_testing] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.concurrently_testing] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo01] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo02] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo03] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo04] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo05] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.foo2] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_factiously_unsartorial] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_baculitic_louvering] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_contextual_concinnous] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_entailable_underedge] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_fimbricated_archvestryman] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_globuloid_doctrinarity] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_imperialistically_interceder] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_inhumorously_sobriquet] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_intercrescence_cyanopsia] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_interparty_nephrocoele] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_metaluminate_municipality] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_postposition_slipgibbet] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_preinterpretation_buddhistical] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_preliminary_petiolular] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_squawtits_suprapubian] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_turbinate_apprehendingly] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_unblackened_recompilement] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_unsalability_bimaxillary] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.lsql_test_xanthosis_weariedly] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.some] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tacopyrine_bartonella] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tburlap_yellow] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tbuteo_datolite] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tcried_thundering] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tepitactic_jasminewood] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.test_truncate] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tforbid_greenside] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tgazon_subdivisible] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tlilywood_osteophone] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.towner_pericardiosymphysis] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tperilabyrinth_sociologism] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tplottage_planera] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.trerobe_shradh] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tromaean_distalwards] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tstylishly_recrystallize] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.tzeuglodontidae_centrical] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_cvg2] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_oeto] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_tfuor] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_tmvgz] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_tuhas] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.TEST_SCHEMA.ucx_yafg] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgjgn.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.CLOUD_ENV_service_principals] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.clusters] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.code_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.directfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.directfs_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.directfs_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.external_locations] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.global_init_scripts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.grant_detail] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.inferred_grants] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.jobs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.logs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.migration_status] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.misc_patterns] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.mounts] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.permissions] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.pipelines] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.policies] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.query_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.recon_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.reconciliation_results] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.submit_runs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.table_estimates] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.table_failures] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.table_size] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.udfs] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.used_tables] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.used_tables_in_paths] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.used_tables_in_queries] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.workflow_problems] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_sgd9a.workspace_objects] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_ssuy7.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.hive_metastore.tables] [hive_metastore.dummy_s04jx.groups] fetching table metadata
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.tables] found 145 new records for tables
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.udfs] fetching udfs inventory
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] Inventory table not found
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/udfs.py", line 63, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 344, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 268, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 478, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_s1num`.`udfs` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.udfs] crawling new set of snapshot data for udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.TEST_SCHEMA] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s01j7] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s04jx] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s0bpy] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s1num] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s1ye7] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s2ajt] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s4f1o] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_s6kel] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_scp40] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sddxx] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sdrjn] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sgd9a] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sgjgn] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sglnz] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_slge2] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sq8je] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_ss6nj] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_ssa4e] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_ssuy7] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_svxyl] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sxyof] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sy8b4] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_sygrv] listing udfs
20:42 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_szdcu] listing udfs
20:42 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.udfs] found 0 new records for udfs
20:44 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_sgjgn no longer existed
20:45 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_s04jx no longer existed
20:45 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_ssuy7 no longer existed
20:45 WARNING [databricks.labs.ucx.hive_metastore.grants] Schema hive_metastore.dummy_svxyl no longer existed
20:54 ERROR [databricks.labs.ucx.hive_metastore.grants] Couldn't fetch grants for object ANONYMOUS FUNCTION : TEMPORARILY_UNAVAILABLE: The service at /api/2.0/sql-acl/get-permissions is taking too long to process your request. Please try again later or try a faster operation. [TraceId: 00-43858af57f8bd461d1afc4a9b51a675f-4e32870fbffde579-00]
20:54 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_s1num.grants] found 174 new records for grants
21:05 ERROR [databricks.labs.ucx.hive_metastore.grants] Couldn't fetch grants for object ANY FILE : Read timed out
21:16 ERROR [databricks.labs.ucx.hive_metastore.grants] Couldn't fetch grants for object ANY FILE : TEMPORARILY_UNAVAILABLE: The service at /api/2.0/sql-acl/get-permissions is taking too long to process your request. Please try again later or try a faster operation. [TraceId: 00-5c09a548b09dd13875879c438639dcf2-55d53c832b8291c4-00]
[gw7] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from acceptance #7298

nfx merged commit 2901471 into main Nov 11, 2024
6 of 7 checks passed
nfx deleted the feat/remove-legacy-group-migration-workflow branch November 11, 2024 21:38
* Added `pytesseract` to known list ([#3235](#3235)). The `known.json` file, which tracks packages with native code, now includes `pytesseract`, an Optical Character Recognition (OCR) tool for Python. Registering the package improves how the codebase handles `pytesseract` and its native dependencies, and addresses part of issue [#1931](#1931). The diff itself does not show how `pytesseract` is used within the project, so additional documentation may be needed for a complete picture of the integration.
* Added hyperlink to database names in database summary dashboard ([#3310](#3310)). The `Database Summary` dashboard now renders database names as clickable links that open the corresponding database page in a new tab. This is achieved by adding a `linkUrlTemplate` property to the `database` field in the `encodings` object within the `overrides` property of the dashboard configuration. The commit includes tests that verify the new behavior in the labs environment and addresses issue [#3258](#3258). Several other statistics, such as the number of tables, views, and grants, are now rendered as links as well, improving the dashboard's overall usability and navigation.
* Bump codecov/codecov-action from 4 to 5 ([#3316](#3316)). In this release, the version of the `codecov/codecov-action` dependency has been bumped from 4 to 5, which introduces several new features and improvements to the Codecov GitHub Action. The new version utilizes the Codecov Wrapper for faster updates and better performance, as well as an opt-out feature for tokens in public repositories. This allows contributors to upload coverage reports without requiring access to the Codecov token, improving security and flexibility. Additionally, several new arguments have been added, including `binary`, `gcov_args`, `gcov_executable`, `gcov_ignore`, `gcov_include`, `report_type`, `skip_validation`, and `swift_project`. These changes enhance the functionality and security of the Codecov GitHub Action, providing a more robust and efficient solution for code coverage tracking.
* Depend on a Databricks SDK release compatible with 0.31.0 ([#3273](#3273)). In this release, we have updated the minimum required version of the Databricks SDK to 0.31.0 due to the introduction of a new `InvalidState` error class that is not compatible with the previously declared minimum version of 0.30.0. This change was necessary because Databricks Runtime (DBR) 16 ships with SDK 0.30.0 and does not upgrade to the latest version during installation, unlike previous versions of DBR. This change affects the project's dependencies as specified in the `pyproject.toml` file. We recommend that users verify their systems are compatible with the new version of the Databricks SDK, as this change may impact existing integrations with the project.
* Eliminate redundant migration-index refresh and loads during view migration ([#3223](#3223)). In this pull request, we have optimized the view migration process in the `databricks/labs/ucx/hive_metastore/table_metastore.py` file by eliminating redundant migration-status indexing operations. We have removed the unnecessary refresh of migration-status for all tables/views at the end of view migration, and stopped reloading the migration-status snapshot for every view when checking if it can be migrated and prior to migrating a view. We have introduced a new class `TableMigrationIndex` and imported the `TableMigrationStatusRefresher` class. The `_migrate_views` method now takes an additional argument `migration_index`, which is used in the `ViewsMigrationSequencer` and in the `_migrate_view` method. The `_view_can_be_migrated` and `_sql_migrate_view` methods now also take `migration_index` as an argument, which is used to determine if the view can be migrated. These changes aim to improve the efficiency of the view migration process, making it faster and more resource-friendly.
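The refactoring above amounts to building the migration-status index once and passing it down, instead of reloading a snapshot inside every per-view check. A minimal sketch of that pattern (class and method names are illustrative, not the actual UCX API):

```python
# Build the migration-status lookup once; reuse it for every view check.
class TableMigrationIndex:
    """Immutable lookup of which tables have already been migrated."""

    def __init__(self, migrated: set[tuple[str, str]]):
        self._migrated = migrated

    def is_migrated(self, schema: str, table: str) -> bool:
        return (schema, table) in self._migrated


def migrate_views(views: list[tuple[str, str]], index: TableMigrationIndex) -> list[tuple[str, str]]:
    # The caller builds the index once; no per-view snapshot reloads here.
    return [view for view in views if index.is_migrated(*view)]


index = TableMigrationIndex({("sales", "orders"), ("sales", "customers")})
migrated = migrate_views([("sales", "orders"), ("hr", "people")], index)
```

The design choice is the same one the PR describes: the index is refreshed by the caller exactly once, so checking N views costs one snapshot load instead of N.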
* Fixed backwards compatibility breakage from Databricks SDK ([#3324](#3324)). In this release, we have addressed a backwards compatibility issue (Issue [#3324](#3324)) that was caused by an update to the Databricks SDK. This was done by adding new methods to the `databricks.sdk.service` module to interact with dashboards. Additionally, we have fixed bug [#3322](#3322) and updated the `create` function in the `conftest.py` file to utilize the new `dashboards` module and its `Dashboard` class. The function now returns the dashboard object as a dictionary and calls the `publish` method on this object to publish the dashboard. These changes also include an update to the pyproject.toml file, which affects the test and coverage scripts used in the default environment. The coverage threshold enforced by the test suite has been lowered from 90% to 89%: the test command now includes the `--cov-fail-under=89` flag so that coverage stays above that level as part of our continuous integration process.
* Fixed issue with cleanup of failed `create-missing-principals` command ([#3243](#3243)). In this update, we have improved the `create_uc_roles` method within the `access.py` file of the `databricks/labs/ucx/aws` directory to handle failures during role creation caused by permission issues. If a failure occurs, the method now deletes any created roles before raising the exception, restoring the system to its initial state. This ensures that the system remains consistent and prevents the accumulation of partially created roles. The update includes a try-except block around the code that creates the role and adds a policy to it, and it logs an error message, deletes any previously created roles, and raises the exception again if a `PermissionDenied` or `NotFound` exception is raised during this process. We have also added unit tests to verify the behavior of the updated method, covering the scenario where a failure occurs and the roles are successfully deleted. These changes aim to improve the robustness of the `databricks labs ucx create-missing-principals` command by handling permission errors and restoring the system to its initial state.
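The cleanup behaviour described above is a rollback-on-failure pattern: if creating any role (or attaching its policy) fails part-way through, the roles created so far are deleted before the exception is re-raised. A hedged sketch, with placeholder exception types and helper callables rather than the real AWS/UCX API:

```python
class PermissionDenied(Exception):
    pass


def create_uc_roles(names, create_role, delete_role):
    """Create roles; on a permission failure, delete what was created and re-raise."""
    created = []
    try:
        for name in names:
            create_role(name)  # may raise PermissionDenied
            created.append(name)
    except PermissionDenied:
        for name in reversed(created):  # restore the initial state
            delete_role(name)
        raise


store = []

def _create(name):
    if name == "forbidden":
        raise PermissionDenied(name)
    store.append(name)

def _delete(name):
    store.remove(name)

try:
    create_uc_roles(["a", "b", "forbidden"], _create, _delete)
except PermissionDenied:
    pass  # the two roles created before the failure have been rolled back
```

After the failed run, `store` is empty again, which mirrors the "restoring the system to its initial state" guarantee in the PR description.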
* Improve error handling for `assess_workflows` task ([#3255](#3255)). This pull request introduces improvements to the `assess_workflows` task in the `databricks/labs/ucx` module, focusing on error handling and logging. A new error type, `DatabricksError`, has been added to handle Databricks-specific exceptions in the `_temporary_copy` method, ensuring proper handling and re-raising of Databricks-related errors as `InvalidPath` exceptions. Additionally, log levels for various errors have been updated to better reflect their severity. Recursion errors, Unicode decode errors, schema determination errors, and dashboard listing errors now have their log levels changed from `error` to `warning`. These adjustments provide more fine-grained control over error messages' severity and help avoid unnecessary alarm when these issues occur. These changes improve the robustness, error handling, and logging of the `assess_workflows` task, ensuring appropriate handling and logging of any errors that may occur during execution.
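The error-translation pattern described above can be sketched as follows: catch a service-specific error, log it at `WARNING` rather than `ERROR`, and re-raise it as a domain-level exception. `DatabricksError` and `InvalidPath` here are stand-ins for the real exception classes:

```python
import logging

logger = logging.getLogger("assess_workflows")


class DatabricksError(Exception):
    pass


class InvalidPath(Exception):
    pass


def temporary_copy(path: str) -> str:
    try:
        # Placeholder for the real workspace read; fails for one path prefix.
        if path.startswith("dbfs:/missing"):
            raise DatabricksError(f"cannot read {path}")
        return path
    except DatabricksError as e:
        # Recoverable problem: warn instead of error, then re-raise as a
        # domain exception so callers handle one uniform type.
        logger.warning("Skipping unreadable path: %s", e)
        raise InvalidPath(path) from e


try:
    temporary_copy("dbfs:/missing/file")
except InvalidPath as e:
    caught = str(e)
```

Chaining with `from e` preserves the original Databricks error in the traceback while presenting callers with the single `InvalidPath` type.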
* Require at least 4 cores for UCX VMs ([#3229](#3229)). In this release, the selection of `node_type_id` in the `policy.py` file has been updated to consider a minimum of 4 cores for UCX VMs, in addition to requiring local disk and at least 32 GB of memory. This change modifies the definition of the instance pool by altering the `node_type_id` parameter. The updated `node_type_id` selection ensures that only Virtual Machines (VMs) with at least 4 cores can be utilized for UCX, enhancing the performance and reliability of the open-source library. This improvement requires a minimum of 4 cores to function properly.
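The selection rule above (local disk, at least 32 GB of memory, and now at least 4 cores) can be sketched as a simple filter-and-pick over available node types. The `NodeType` shape here is illustrative, not the Databricks SDK model:

```python
from dataclasses import dataclass


@dataclass
class NodeType:
    node_type_id: str
    num_cores: int
    memory_mb: int
    local_disk: bool


def select_node_type(node_types):
    """Pick the smallest node with local disk, >= 32 GB memory, and >= 4 cores."""
    candidates = [
        nt for nt in node_types
        if nt.local_disk and nt.memory_mb >= 32 * 1024 and nt.num_cores >= 4
    ]
    if not candidates:
        return None
    # Prefer the smallest qualifying node to keep cluster costs down.
    return min(candidates, key=lambda nt: (nt.num_cores, nt.memory_mb)).node_type_id


chosen = select_node_type([
    NodeType("small-2core", 2, 32768, True),   # rejected: too few cores
    NodeType("no-disk", 8, 65536, False),      # rejected: no local disk
    NodeType("fit-4core", 4, 32768, True),     # qualifies
    NodeType("big-8core", 8, 131072, True),    # qualifies, but larger
])
```

With the 4-core floor in place, a 2-core node that previously satisfied the disk and memory checks is now excluded.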
* Skip `test_feature_tables` integration test ([#3326](#3326)). The `test_feature_tables` integration test has been temporarily skipped while the underlying failures are investigated; see issues [#3304](#3304) and [#3](#3) for context.
* Speed up `update_migration_status` jobs by eliminating lots of redundant SQL queries ([#3200](#3200)). In this release, the `_retrieve_acls` method in the `grants.py` file has been updated to remove the `_is_migrated` method and inline its functionality, resulting in improved performance for `update_migration_status` jobs. The `_is_migrated` method previously queried the migration status index for each table, but the updated method now refreshes the index once and then uses it for all checks, eliminating redundant SQL queries. Affected workflows include `migrate-tables`, `migrate-external-hiveserde-tables-in-place-experimental`, `migrate-external-tables-ctas`, `scan-tables-in-mounts-experimental`, and `migrate-tables-in-mounts-experimental`, all of which have been updated to utilize the refreshed migration status index and remove dead code. This release also includes updates to existing unit tests and integration tests to ensure the changes' correctness.
* Tech Debt: Fixed issue with Incorrect unit test practice ([#3244](#3244)). In this release, we have made significant improvements to the test suite for our AWS module. Specifically, the test case for `test_get_uc_compatible_roles` in `tests/unit/aws/test_access.py` has been updated to remove mocking code and directly call the `save_uc_compatible_roles` method, improving the accuracy and reliability of the test. Additionally, the MagicMock for the `load` method in the `mock_installation` object has been removed, further simplifying the test code and making it easier to understand. These changes will help to prevent bugs and make it easier to modify and extend the codebase in the future, improving the maintainability and overall quality of our open-source library.
* Updated `migration-progress-experimental` workflow to crawl tables from the `main` cluster ([#3269](#3269)). In this release, we have updated the `migration-progress-experimental` workflow to crawl tables from the `main` cluster instead of the `tacl` one. This change resolves issue [#3268](#3268) and addresses the problem of the Py4j bridge required for crawling not being available in the `tacl` cluster, leading to failures. The `setup_tacl` job task has been removed, and the `crawl_tables` task has been updated to no longer rely on the TACL cluster, instead refreshing the inventory directly. A new dependency has been added to ensure that the `crawl_tables` task runs after the `verify_prerequisites` task. The `refresh_table_migration_status` task and `update_tables_history_log` task have also been updated to assume that the inventory and migration status have been refreshed in the previous step. A TODO has been added to avoid triggering an implicit refresh if either the table or migration-status inventory is empty.
* Updated databricks-labs-lsql requirement from <0.13,>=0.5 to >=0.5,<0.14 ([#3241](#3241)). In this pull request, we have updated the `databricks-labs-lsql` requirement in the `pyproject.toml` file to a range of greater than 0.5 and less than 0.14, allowing the use of the latest version of this library. The update includes release notes and a changelog from the `databricks-labs-lsql` GitHub repository, detailing new features, bug fixes, and improvements. Notable changes include the addition of the `escape_name` and `escape_full_name` functions, various dependency updates, and modifications to the `as_dict()` method in the `Row` class. This update also includes a list of dependency version updates from the `databricks-labs-lsql` changelog.
* Updated databricks-labs-lsql requirement from <0.14,>=0.5 to >=0.5,<0.15 ([#3321](#3321)). In this release, the `databricks-labs-lsql` package requirement has been updated to version '>=0.5,<0.15' in the pyproject.toml file. This update addresses multiple issues and includes several improvements, such as bug fixes, dependency updates, and the addition of go-git libraries. The `RuntimeBackend` component has been improved with better exception handling, and new `escape_name` and `escape_full_name` functions have been added for SQL name escaping. The 'Row.as_dict()' method has been deprecated in favor of 'asDict()'. The `SchemaDeployer` class now allows overwriting the default `hive_metastore` catalog, and the `MockBackend` component has been improved to properly mock the `savetable` method in `append` mode. Filter specification files have been converted from JSON to YAML format for improved readability. Additionally, the test suite has been expanded, and various methods have been updated to improve codebase readability, maintainability, and ease of use.
* Updated sqlglot requirement from <25.30,>=25.5.0 to >=25.5.0,<25.32 ([#3320](#3320)). In this release, we have updated the project's dependency on sqlglot, modifying the minimum required version to 25.5.0 and setting the maximum allowed version to below 25.32. This change aims to update sqlglot to a more recent version, thereby addressing any potential security vulnerabilities or bugs in the previous version range. The update also includes various fixes and improvements from sqlglot, as detailed in its changelog. The individual commits have been truncated and can be viewed in the compare view. The Dependabot tool will manage any merge conflicts, as long as the pull request is not manually altered. Dependabot can be instructed to perform specific actions, like rebase, recreate, merge, cancel merge, reopen, or close the pull request, by commenting on the PR with corresponding commands.
* Use internal Permissions Migration API by default ([#3230](#3230)). This pull request introduces support for both legacy and new permission migration workflows in the Databricks UCX project. A new configuration option, `use_legacy_permission_migration`, has been added to `WorkspaceConfig` to toggle between the two workflows. When the legacy workflow is not enabled, certain steps in `workflows.py` are skipped. The `GroupMigration` class has been renamed to `LegacyGroupMigration`, related method names have been updated to reflect the legacy workflow, and integration and unit tests have been updated to use the new configuration option and renamed classes/methods. The new workflow no longer queries the `hive_metastore`.`ucx`.`groups` table in certain methods, which changes the behavior of the `test_runtime_workspace_listing` and `test_runtime_crawl_permissions` tests. Overall, these changes let users choose between the two permission migration workflows.
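The toggle described above boils down to a boolean on the workspace config that gates the legacy-only workflow steps. A rough sketch, with the field name taken from the PR description and the step bodies as placeholders:

```python
from dataclasses import dataclass


@dataclass
class WorkspaceConfig:
    # New internal Permissions Migration API is the default; the legacy
    # workflow must be opted into explicitly.
    use_legacy_permission_migration: bool = False


def run_permission_migration(config: WorkspaceConfig) -> list[str]:
    """Return the workflow steps that would run for the given config."""
    steps = []
    if config.use_legacy_permission_migration:
        steps.append("crawl_permissions")   # legacy-only step
        steps.append("apply_permissions")   # LegacyGroupMigration path
    else:
        steps.append("migrate_permissions")  # internal Permissions Migration API
    return steps


legacy = run_permission_migration(WorkspaceConfig(use_legacy_permission_migration=True))
default = run_permission_migration(WorkspaceConfig())
```

Defaulting the flag to `False` is what makes the internal API the out-of-the-box behaviour while keeping the legacy path available for rollback.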

Dependency updates:

 * Updated databricks-labs-lsql requirement from <0.13,>=0.5 to >=0.5,<0.14 ([#3241](#3241)).
 * Updated databricks-labs-lsql requirement from <0.14,>=0.5 to >=0.5,<0.15 ([#3321](#3321)).
 * Updated sqlglot requirement from <25.30,>=25.5.0 to >=25.5.0,<25.32 ([#3320](#3320)).
 * Bump codecov/codecov-action from 4 to 5 ([#3316](#3316)).