Database ... not found error terminating run on spark clusters #277

Closed
jeremyyeo opened this issue Feb 22, 2023 · 0 comments · Fixed by #281

Labels
bug Something isn't working

jeremyyeo commented Feb 22, 2023

Describe the bug

Due to a recent fix (#270), runs on Spark clusters are now terminating when they previously weren't (on dbt-databricks==1.4.1).

Note that, in typical fashion, dbt builds the relation cache at the start of the run via some show databases / show tables in ... queries. So I don't think the issue here is that dbt is running show ... queries, but rather that an error resulting from a show ... query previously did not cause the run to terminate, and now does.
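
For reference, these are the cache-building queries dbt issues, quoted from the debug logs below; only the last one errors:

show databases
show tables in `dbt_jyeo`
show views in `dbt_jyeo`
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`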

Steps To Reproduce

  1. Set up a Spark cluster in Databricks.
  2. dbt project setup:
# ~/.dbt/profiles.yml
databricks:
  target: dev
  outputs:
    dev:
      type: databricks
      schema: dbt_jyeo
      host: <REDACT>.cloud.databricks.com
      token: <REDACT>
      # Spark Cluster
      http_path: sql/protocolv1/o/<REDACT>/<REDACT>

# dbt_project.yml
name: "my_dbt_project"
version: "1.0.0"
config-version: 2
profile: "databricks"
models:
  my_dbt_project:
    +materialized: table
    bravo:
      +schema: doesnt_exist_or_unauthorized

To simulate this, ensure that the schema you want to use does not exist or that your token has no permissions on it; here I'm using doesnt_exist_or_unauthorized (dbt's default generate_schema_name concatenates the target schema with the custom schema, yielding dbt_jyeo_doesnt_exist_or_unauthorized).

-- models/alpha/a.sql
select 1 as id

-- models/bravo/b.sql
select 1 as id
  3. Using dbt-databricks==1.4.2, build only one of the models.
$ pip install dbt-databricks==1.4.2 && dbt run -s a
22:42:32  Running with dbt=1.4.1
22:42:33  Found 2 models, 0 tests, 0 snapshots, 0 analyses, 372 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
22:42:33  
22:42:44  
22:42:44  Finished running  in 0 hours 0 minutes and 11.22 seconds (11.22s).
22:42:45  Encountered an error:
Runtime Error
  Runtime Error
    Runtime Error
      Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
  4. Downgrade to dbt-databricks==1.4.1 and rerun the same command as above:
$ pip install dbt-databricks==1.4.1 && dbt run -s a
22:43:41  Running with dbt=1.4.1
22:43:43  Found 2 models, 0 tests, 0 snapshots, 0 analyses, 372 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
22:43:43  
22:43:54  Concurrency: 1 threads (target='dev')
22:43:54  
22:43:54  1 of 1 START sql table model dbt_jyeo.a ........................................ [RUN]
22:44:00  1 of 1 OK created sql table model dbt_jyeo.a ................................... [OK in 6.36s]
22:44:02  
22:44:02  Finished running 1 table model in 0 hours 0 minutes and 19.59 seconds (19.59s).
22:44:02  
22:44:02  Completed successfully
22:44:02  
22:44:02  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1

Expected behavior

Behaviour should be consistent between dbt-databricks==1.4.2 and dbt-databricks==1.4.1: models should still be built even if a database is not found.
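
For illustration only, a minimal sketch of the kind of guard that would restore the 1.4.1 behaviour: this is not the actual adapter code, the function name is hypothetical, and it assumes a cursor from databricks-sql-connector plus the ServerOperationError type that appears in the debug logs below.

# Hypothetical sketch, not the real dbt-databricks fix (see #281 for that).
from databricks.sql.exc import ServerOperationError

def list_tables_tolerant(cursor, schema):
    """Return the rows of `show tables in <schema>`, or [] if the schema is missing."""
    try:
        cursor.execute(f"show tables in `{schema}`")
        return cursor.fetchall()
    except ServerOperationError as exc:
        message = str(exc)
        # Spark clusters report: Database '<name>' not found
        # SQL endpoints report: [SCHEMA_NOT_FOUND] The schema `<name>` cannot be found.
        if "SCHEMA_NOT_FOUND" in message or "not found" in message.lower():
            return []  # treat a missing/unauthorized schema as empty so the run continues
        raise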

Debug log output

dbt-databricks==1.4.2
22:41:18.349706 [info ] [MainThread]: Running with dbt=1.4.1
22:41:18.359202 [debug] [MainThread]: running dbt with arguments {'debug': True, 'write_json': True, 'use_colors': True, 'printer_width': 80, 'version_check': True, 'partial_parse': False, 'static_parser': True, 'profiles_dir': '/Users/jeremy/.dbt', 'send_anonymous_usage_stats': True, 'quiet': False, 'no_print': False, 'cache_selected_only': False, 'select': ['a'], 'which': 'run', 'rpc_method': 'run', 'indirect_selection': 'eager'}
22:41:18.360330 [debug] [MainThread]: Tracking: tracking
22:41:18.419267 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x103c5ad70>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x115ee55d0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x115ee56c0>]}
22:41:18.459252 [debug] [MainThread]: Partial parsing not enabled
22:41:19.580888 [debug] [MainThread]: 1699: static parser successfully parsed alpha/a.sql
22:41:19.599505 [debug] [MainThread]: 1699: static parser successfully parsed bravo/b.sql
22:41:19.660847 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'a66a40e3-a8a2-471a-a676-9a8c262c13ff', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x116030be0>]}
22:41:19.670863 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'a66a40e3-a8a2-471a-a676-9a8c262c13ff', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x115ee4a90>]}
22:41:19.671647 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 372 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
22:41:19.672335 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'a66a40e3-a8a2-471a-a676-9a8c262c13ff', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x115eeb7c0>]}
22:41:19.674869 [info ] [MainThread]: 
22:41:19.678934 [debug] [MainThread]: Acquiring new databricks connection 'master'
22:41:19.681020 [debug] [ThreadPool]: Acquiring new databricks connection 'list_schemas'
22:41:19.696414 [debug] [ThreadPool]: Using databricks connection "list_schemas"
22:41:19.697450 [debug] [ThreadPool]: On list_schemas: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_schemas"} */

    show databases
  
22:41:19.697996 [debug] [ThreadPool]: Opening a new connection, currently in state init
22:41:23.136677 [debug] [ThreadPool]: SQL status: OK in 3.44 seconds
22:41:23.159911 [debug] [ThreadPool]: On list_schemas: Close
22:41:24.290983 [debug] [ThreadPool]: Acquiring new databricks connection 'list_None_dbt_jyeo_doesnt_exist_or_unauthorized'
22:41:24.314675 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
22:41:24.315783 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"
22:41:24.316620 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"} */
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`
  
22:41:24.317347 [debug] [ThreadPool]: Opening a new connection, currently in state closed
22:41:27.216931 [debug] [ThreadPool]: Databricks adapter: Error while running:
/* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"} */
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`
  
22:41:27.218425 [debug] [ThreadPool]: Databricks adapter: <class 'databricks.sql.exc.ServerOperationError'>: Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:41:27.219944 [debug] [ThreadPool]: Databricks adapter: diagnostic-info: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
	at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:47)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:435)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:257)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:123)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:48)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:52)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:235)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:220)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:269)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.requireDbExists(SessionCatalog.scala:647)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.doListTables(SessionCatalog.scala:1506)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:1222)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.listTables(ManagedCatalogSessionCatalog.scala:1318)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.$anonfun$listTables$1(UnityCatalogV2Proxy.scala:184)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.assertSingleNamespace(UnityCatalogV2Proxy.scala:114)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.listTables(UnityCatalogV2Proxy.scala:183)
	at org.apache.spark.sql.execution.datasources.v2.ShowTablesExec.run(ShowTablesExec.scala:42)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:160)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:239)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:386)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:186)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
	at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:141)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:336)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:160)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:156)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:590)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:168)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:590)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:566)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:156)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:156)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:141)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:132)
	at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:186)
	at org.apache.spark.sql.execution.QueryExecution.optimizedPlan$lzycompute(QueryExecution.scala:191)
	at org.apache.spark.sql.execution.QueryExecution.optimizedPlan(QueryExecution.scala:188)
	at org.apache.spark.sql.execution.QueryExecution.assertOptimized(QueryExecution.scala:206)
	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:225)
	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:222)
	at org.apache.spark.sql.execution.QueryExecution.assertExecutedPlanPrepared(QueryExecution.scala:240)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$compileQuery$2(SparkExecuteStatementOperation.scala:351)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$compileQuery$1(SparkExecuteStatementOperation.scala:334)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:327)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.compileQuery(SparkExecuteStatementOperation.scala:334)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:390)
	... 16 more

22:41:27.222413 [debug] [ThreadPool]: Databricks adapter: operation-id: b'B\x9e&\x86\xc3\x0bE8\xad9\xfc\x8d1\xb5\xb8*'
22:41:27.223873 [debug] [ThreadPool]: Databricks adapter: Error while running:
macro show_tables
22:41:27.225883 [debug] [ThreadPool]: Databricks adapter: Runtime Error
  Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:41:27.227094 [debug] [ThreadPool]: Databricks adapter: Error while running:
macro list_relations_without_caching
22:41:27.228204 [debug] [ThreadPool]: Databricks adapter: Runtime Error
  Runtime Error
    Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:41:27.229536 [debug] [ThreadPool]: Databricks adapter: Error while retrieving information about `dbt_jyeo_doesnt_exist_or_unauthorized`: Runtime Error
  Runtime Error
    Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:41:27.230441 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: ROLLBACK
22:41:27.231252 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
22:41:27.232297 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: Close
22:41:28.312478 [debug] [ThreadPool]: Acquiring new databricks connection 'list_None_dbt_jyeo'
22:41:28.319469 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
22:41:28.320065 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo"
22:41:28.320649 [debug] [ThreadPool]: On list_None_dbt_jyeo: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo"} */
show tables in `dbt_jyeo`
  
22:41:28.321144 [debug] [ThreadPool]: Opening a new connection, currently in state closed
22:41:30.169989 [debug] [ThreadPool]: SQL status: OK in 1.85 seconds
22:41:30.194616 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo"
22:41:30.195302 [debug] [ThreadPool]: On list_None_dbt_jyeo: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo"} */
show views in `dbt_jyeo`
  
22:41:31.244088 [debug] [ThreadPool]: SQL status: OK in 1.05 seconds
22:41:31.252343 [debug] [ThreadPool]: On list_None_dbt_jyeo: ROLLBACK
22:41:31.253307 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
22:41:31.254899 [debug] [ThreadPool]: On list_None_dbt_jyeo: Close
22:41:32.145839 [debug] [MainThread]: Connection 'master' was properly closed.
22:41:32.146470 [debug] [MainThread]: Connection 'list_None_dbt_jyeo' was properly closed.
22:41:32.147036 [info ] [MainThread]: 
22:41:32.147549 [info ] [MainThread]: Finished running  in 0 hours 0 minutes and 12.47 seconds (12.47s).
22:41:32.148152 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x103c5ad70>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x11619ccd0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x11619ce50>]}
22:41:32.148804 [debug] [MainThread]: Flushing usage events
22:41:33.253042 [error] [MainThread]: Encountered an error:
Runtime Error
  Runtime Error
    Runtime Error
      Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
dbt-databricks==1.4.1
22:43:41.938456 [info ] [MainThread]: Running with dbt=1.4.1
22:43:41.942147 [debug] [MainThread]: running dbt with arguments {'write_json': True, 'use_colors': True, 'printer_width': 80, 'version_check': True, 'partial_parse': False, 'static_parser': True, 'profiles_dir': '/Users/jeremy/.dbt', 'send_anonymous_usage_stats': True, 'quiet': False, 'no_print': False, 'cache_selected_only': False, 'select': ['a'], 'which': 'run', 'rpc_method': 'run', 'indirect_selection': 'eager'}
22:43:41.942590 [debug] [MainThread]: Tracking: tracking
22:43:41.959946 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1236ff7c0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1236f3dc0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1236f3e80>]}
22:43:41.981801 [debug] [MainThread]: Partial parsing not enabled
22:43:43.078277 [debug] [MainThread]: 1699: static parser successfully parsed alpha/a.sql
22:43:43.094397 [debug] [MainThread]: 1699: static parser successfully parsed bravo/b.sql
22:43:43.147373 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'd561e661-8158-4aff-ae67-df9168000a58', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x12384c8e0>]}
22:43:43.175334 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'd561e661-8158-4aff-ae67-df9168000a58', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1236f02e0>]}
22:43:43.175942 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 372 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
22:43:43.176475 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'd561e661-8158-4aff-ae67-df9168000a58', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1236ff820>]}
22:43:43.178442 [info ] [MainThread]: 
22:43:43.181505 [debug] [MainThread]: Acquiring new databricks connection 'master'
22:43:43.182876 [debug] [ThreadPool]: Acquiring new databricks connection 'list_schemas'
22:43:43.197433 [debug] [ThreadPool]: Using databricks connection "list_schemas"
22:43:43.198253 [debug] [ThreadPool]: On list_schemas: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_schemas"} */

    show databases
  
22:43:43.198598 [debug] [ThreadPool]: Opening a new connection, currently in state init
22:43:45.615494 [debug] [ThreadPool]: SQL status: OK in 2.42 seconds
22:43:45.627996 [debug] [ThreadPool]: On list_schemas: Close
22:43:46.851182 [debug] [ThreadPool]: Acquiring new databricks connection 'list_None_dbt_jyeo'
22:43:46.865802 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
22:43:46.866257 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo"
22:43:46.866590 [debug] [ThreadPool]: On list_None_dbt_jyeo: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo"} */
show tables in `dbt_jyeo`
  
22:43:46.866887 [debug] [ThreadPool]: Opening a new connection, currently in state closed
22:43:48.863467 [debug] [ThreadPool]: SQL status: OK in 2.0 seconds
22:43:48.883134 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo"
22:43:48.883730 [debug] [ThreadPool]: On list_None_dbt_jyeo: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo"} */
show views in `dbt_jyeo`
  
22:43:49.843810 [debug] [ThreadPool]: SQL status: OK in 0.96 seconds
22:43:49.849162 [debug] [ThreadPool]: On list_None_dbt_jyeo: ROLLBACK
22:43:49.849805 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
22:43:49.850214 [debug] [ThreadPool]: On list_None_dbt_jyeo: Close
22:43:50.946685 [debug] [ThreadPool]: Acquiring new databricks connection 'list_None_dbt_jyeo_doesnt_exist_or_unauthorized'
22:43:50.967412 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
22:43:50.967956 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"
22:43:50.968379 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"} */
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`
  
22:43:50.968753 [debug] [ThreadPool]: Opening a new connection, currently in state closed
22:43:53.108998 [debug] [ThreadPool]: Databricks adapter: Error while running:
/* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"} */
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`
  
22:43:53.109943 [debug] [ThreadPool]: Databricks adapter: <class 'databricks.sql.exc.ServerOperationError'>: Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:43:53.110648 [debug] [ThreadPool]: Databricks adapter: diagnostic-info: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
	at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:47)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:435)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:257)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:123)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:48)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:52)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:235)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:220)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:269)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.requireDbExists(SessionCatalog.scala:647)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.doListTables(SessionCatalog.scala:1506)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:1222)
	at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.listTables(ManagedCatalogSessionCatalog.scala:1318)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.$anonfun$listTables$1(UnityCatalogV2Proxy.scala:184)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.assertSingleNamespace(UnityCatalogV2Proxy.scala:114)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.listTables(UnityCatalogV2Proxy.scala:183)
	at org.apache.spark.sql.execution.datasources.v2.ShowTablesExec.run(ShowTablesExec.scala:42)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:160)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:239)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:386)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:186)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
	at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:141)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:336)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:160)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:156)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:590)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:168)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:590)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:566)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:156)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:156)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:141)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:132)
	at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:186)
	at org.apache.spark.sql.execution.QueryExecution.optimizedPlan$lzycompute(QueryExecution.scala:191)
	at org.apache.spark.sql.execution.QueryExecution.optimizedPlan(QueryExecution.scala:188)
	at org.apache.spark.sql.execution.QueryExecution.assertOptimized(QueryExecution.scala:206)
	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:225)
	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:222)
	at org.apache.spark.sql.execution.QueryExecution.assertExecutedPlanPrepared(QueryExecution.scala:240)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$compileQuery$2(SparkExecuteStatementOperation.scala:351)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$compileQuery$1(SparkExecuteStatementOperation.scala:334)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:327)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.compileQuery(SparkExecuteStatementOperation.scala:334)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:390)
	... 16 more

22:43:53.111294 [debug] [ThreadPool]: Databricks adapter: operation-id: b'\xe6\xb8L\xf4\x07{E\x95\x90 &\x1c@\xdb\xa0$'
22:43:53.112036 [debug] [ThreadPool]: Databricks adapter: Error while running:
macro show_tables
22:43:53.113573 [debug] [ThreadPool]: Databricks adapter: Runtime Error
  Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:43:53.114359 [debug] [ThreadPool]: Databricks adapter: Error while running:
macro list_relations_without_caching
22:43:53.114917 [debug] [ThreadPool]: Databricks adapter: Runtime Error
  Runtime Error
    Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:43:53.115719 [debug] [ThreadPool]: Databricks adapter: Error while retrieving information about `dbt_jyeo_doesnt_exist_or_unauthorized`: Runtime Error
  Runtime Error
    Database 'dbt_jyeo_doesnt_exist_or_unauthorized' not found
22:43:53.116323 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: ROLLBACK
22:43:53.116867 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
22:43:53.117579 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: Close
22:43:54.094517 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'd561e661-8158-4aff-ae67-df9168000a58', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1239841f0>]}
22:43:54.095598 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
22:43:54.096217 [debug] [MainThread]: Spark adapter: NotImplemented: commit
22:43:54.097640 [info ] [MainThread]: Concurrency: 1 threads (target='dev')
22:43:54.098506 [info ] [MainThread]: 
22:43:54.108159 [debug] [Thread-1 (]: Began running node model.my_dbt_project.a
22:43:54.109023 [info ] [Thread-1 (]: 1 of 1 START sql table model dbt_jyeo.a ........................................ [RUN]
22:43:54.110378 [debug] [Thread-1 (]: Acquiring new databricks connection 'model.my_dbt_project.a'
22:43:54.111092 [debug] [Thread-1 (]: Began compiling node model.my_dbt_project.a
22:43:54.113992 [debug] [Thread-1 (]: Writing injected SQL for node "model.my_dbt_project.a"
22:43:54.115515 [debug] [Thread-1 (]: Timing info for model.my_dbt_project.a (compile): 2023-02-22 22:43:54.111485 => 2023-02-22 22:43:54.115345
22:43:54.116069 [debug] [Thread-1 (]: Began executing node model.my_dbt_project.a
22:43:54.136688 [debug] [Thread-1 (]: Spark adapter: NotImplemented: add_begin_query
22:43:54.137295 [debug] [Thread-1 (]: Using databricks connection "model.my_dbt_project.a"
22:43:54.137771 [debug] [Thread-1 (]: On model.my_dbt_project.a: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "node_id": "model.my_dbt_project.a"} */

      describe extended `dbt_jyeo`.`a`
  
22:43:54.138083 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
22:43:56.287096 [debug] [Thread-1 (]: SQL status: OK in 2.15 seconds
22:43:56.359603 [debug] [Thread-1 (]: Writing runtime sql for node "model.my_dbt_project.a"
22:43:56.360706 [debug] [Thread-1 (]: Using databricks connection "model.my_dbt_project.a"
22:43:56.361063 [debug] [Thread-1 (]: On model.my_dbt_project.a: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.1", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "node_id": "model.my_dbt_project.a"} */

  
    
        create or replace table `dbt_jyeo`.`a`
      
      
    using delta
      
      
      
      
      
      
      as
      select 1 as id
  
22:43:59.551516 [debug] [Thread-1 (]: SQL status: OK in 3.19 seconds
22:43:59.584524 [debug] [Thread-1 (]: Timing info for model.my_dbt_project.a (execute): 2023-02-22 22:43:54.116408 => 2023-02-22 22:43:59.584457
22:43:59.584934 [debug] [Thread-1 (]: On model.my_dbt_project.a: ROLLBACK
22:43:59.585193 [debug] [Thread-1 (]: Databricks adapter: NotImplemented: rollback
22:43:59.585417 [debug] [Thread-1 (]: On model.my_dbt_project.a: Close
22:44:00.474004 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'd561e661-8158-4aff-ae67-df9168000a58', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x123987190>]}
22:44:00.475594 [info ] [Thread-1 (]: 1 of 1 OK created sql table model dbt_jyeo.a ................................... [OK in 6.36s]
22:44:00.480760 [debug] [Thread-1 (]: Finished running node model.my_dbt_project.a
22:44:00.483989 [debug] [MainThread]: Acquiring new databricks connection 'master'
22:44:00.484800 [debug] [MainThread]: On master: ROLLBACK
22:44:00.485286 [debug] [MainThread]: Opening a new connection, currently in state init
22:44:01.830538 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
22:44:01.831472 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
22:44:01.832375 [debug] [MainThread]: Spark adapter: NotImplemented: commit
22:44:01.832989 [debug] [MainThread]: On master: ROLLBACK
22:44:01.833526 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
22:44:01.834044 [debug] [MainThread]: On master: Close
22:44:02.765282 [debug] [MainThread]: Connection 'master' was properly closed.
22:44:02.765992 [debug] [MainThread]: Connection 'model.my_dbt_project.a' was properly closed.
22:44:02.769017 [info ] [MainThread]: 
22:44:02.769841 [info ] [MainThread]: Finished running 1 table model in 0 hours 0 minutes and 19.59 seconds (19.59s).
22:44:02.770862 [debug] [MainThread]: Command end result
22:44:02.789908 [info ] [MainThread]: 
22:44:02.790647 [info ] [MainThread]: Completed successfully
22:44:02.791589 [info ] [MainThread]: 
22:44:02.792312 [info ] [MainThread]: Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
22:44:02.793243 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1236f0a00>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x123830ca0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1238e0a60>]}
22:44:02.794201 [debug] [MainThread]: Flushing usage events

System information

The output of dbt --version:

Core:
  - installed: 1.4.1
  - latest:    1.4.1 - Up to date!

Plugins:
  - databricks: 1.4.2 - Up to date!

The operating system you're using:
macOS

The output of python --version:

Python 3.10.10

Additional context

Note that there is no issue when using SQL endpoints instead:

# ~/.dbt/profiles.yml
databricks:
  target: dev
  outputs:
    dev:
      type: databricks
      schema: dbt_jyeo
      host:  <REDACT>.cloud.databricks.com
      token: <REDACT>
      # SQL Endpoint
      http_path: /sql/1.0/endpoints/<REDACT>
$ pip install dbt-databricks==1.4.2 && dbt --debug run -s a

22:51:54.490469 [info ] [MainThread]: Running with dbt=1.4.1
22:51:54.494788 [debug] [MainThread]: running dbt with arguments {'debug': True, 'write_json': True, 'use_colors': True, 'printer_width': 80, 'version_check': True, 'partial_parse': False, 'static_parser': True, 'profiles_dir': '/Users/jeremy/.dbt', 'send_anonymous_usage_stats': True, 'quiet': False, 'no_print': False, 'cache_selected_only': False, 'select': ['a'], 'which': 'run', 'rpc_method': 'run', 'indirect_selection': 'eager'}
22:51:54.495345 [debug] [MainThread]: Tracking: tracking
22:51:54.520472 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120aff3d0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120af3e20>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120af3e80>]}
22:51:54.544075 [debug] [MainThread]: Partial parsing not enabled
22:51:55.611167 [debug] [MainThread]: 1699: static parser successfully parsed alpha/a.sql
22:51:55.629224 [debug] [MainThread]: 1699: static parser successfully parsed bravo/b.sql
22:51:55.689219 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '5cddf83f-cef3-44bf-be5c-16d87db45aef', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120c4d660>]}
22:51:55.699627 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '5cddf83f-cef3-44bf-be5c-16d87db45aef', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120af0460>]}
22:51:55.700505 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 372 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
22:51:55.701045 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '5cddf83f-cef3-44bf-be5c-16d87db45aef', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120aff8e0>]}
22:51:55.703388 [info ] [MainThread]: 
22:51:55.706536 [debug] [MainThread]: Acquiring new databricks connection 'master'
22:51:55.708344 [debug] [ThreadPool]: Acquiring new databricks connection 'list_schemas'
22:51:55.723368 [debug] [ThreadPool]: Using databricks connection "list_schemas"
22:51:55.724855 [debug] [ThreadPool]: On list_schemas: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_schemas"} */

    show databases
  
22:51:55.725628 [debug] [ThreadPool]: Opening a new connection, currently in state init
22:55:41.416332 [debug] [ThreadPool]: SQL status: OK in 225.69 seconds
22:55:42.660434 [debug] [ThreadPool]: On list_schemas: Close
22:55:43.530143 [debug] [ThreadPool]: Acquiring new databricks connection 'list_None_dbt_jyeo'
22:55:43.553594 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
22:55:43.554351 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo"
22:55:43.555043 [debug] [ThreadPool]: On list_None_dbt_jyeo: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo"} */
show tables in `dbt_jyeo`
  
22:55:43.555774 [debug] [ThreadPool]: Opening a new connection, currently in state closed
22:55:46.290224 [debug] [ThreadPool]: SQL status: OK in 2.73 seconds
22:55:46.306179 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo"
22:55:46.306809 [debug] [ThreadPool]: On list_None_dbt_jyeo: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo"} */
show views in `dbt_jyeo`
  
22:55:47.681592 [debug] [ThreadPool]: SQL status: OK in 1.37 seconds
22:55:47.689817 [debug] [ThreadPool]: On list_None_dbt_jyeo: ROLLBACK
22:55:47.690888 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
22:55:47.691848 [debug] [ThreadPool]: On list_None_dbt_jyeo: Close
22:55:48.631531 [debug] [ThreadPool]: Acquiring new databricks connection 'list_None_dbt_jyeo_doesnt_exist_or_unauthorized'
22:55:48.648015 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
22:55:48.649010 [debug] [ThreadPool]: Using databricks connection "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"
22:55:48.649994 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"} */
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`
  
22:55:48.650580 [debug] [ThreadPool]: Opening a new connection, currently in state closed
22:55:50.952165 [debug] [ThreadPool]: Databricks adapter: Error while running:
/* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "connection_name": "list_None_dbt_jyeo_doesnt_exist_or_unauthorized"} */
show tables in `dbt_jyeo_doesnt_exist_or_unauthorized`
  
22:55:50.953217 [debug] [ThreadPool]: Databricks adapter: <class 'databricks.sql.exc.ServerOperationError'>: [SCHEMA_NOT_FOUND] The schema `dbt_jyeo_doesnt_exist_or_unauthorized` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS.
22:55:50.954117 [debug] [ThreadPool]: Databricks adapter: diagnostic-info: org.apache.hive.service.cli.HiveSQLException: Error running query: [SCHEMA_NOT_FOUND] org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: [SCHEMA_NOT_FOUND] The schema `dbt_jyeo_doesnt_exist_or_unauthorized` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS.
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:585)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
        at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:484)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:353)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
        at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:60)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:331)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:316)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:365)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: [SCHEMA_NOT_FOUND] The schema `dbt_jyeo_doesnt_exist_or_unauthorized` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS.
        at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.requireDbExists(SessionCatalog.scala:720)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.doListTables(SessionCatalog.scala:1617)
        at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:1396)
        at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.listTables(ManagedCatalogSessionCatalog.scala:1492)
        at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.$anonfun$listTables$1(UnityCatalogV2Proxy.scala:189)
        at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.assertSingleNamespace(UnityCatalogV2Proxy.scala:104)
        at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.listTables(UnityCatalogV2Proxy.scala:188)
        at org.apache.spark.sql.execution.datasources.v2.ShowTablesExec.run(ShowTablesExec.scala:42)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$1(V2CommandExec.scala:47)
        at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:47)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:45)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:54)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:250)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:245)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:414)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:190)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1003)
        at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:144)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:364)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:250)
        at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:235)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:248)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:241)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:519)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:106)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:519)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:495)
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:241)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
        at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:241)
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:195)
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:186)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:248)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:478)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1003)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:460)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:446)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:460)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:519)
        at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:697)
        at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
        at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:60)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:519)
        ... 20 more

22:55:50.956160 [debug] [ThreadPool]: Databricks adapter: operation-id: b'\x01\xed\xb3\x04\x0c0\x1b[\x82\xf4\x93\x95\x16\n@\xd3'
22:55:50.957554 [debug] [ThreadPool]: Databricks adapter: Error while running:
macro show_tables
22:55:50.959436 [debug] [ThreadPool]: Databricks adapter: Runtime Error
  [SCHEMA_NOT_FOUND] The schema `dbt_jyeo_doesnt_exist_or_unauthorized` cannot be found. Verify the spelling and correctness of the schema and catalog.
  If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
  To tolerate the error on drop use DROP SCHEMA IF EXISTS.
22:55:50.961659 [debug] [ThreadPool]: Databricks adapter: Error while running:
macro list_relations_without_caching
22:55:50.962557 [debug] [ThreadPool]: Databricks adapter: Runtime Error
  Runtime Error
    [SCHEMA_NOT_FOUND] The schema `dbt_jyeo_doesnt_exist_or_unauthorized` cannot be found. Verify the spelling and correctness of the schema and catalog.
    If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
    To tolerate the error on drop use DROP SCHEMA IF EXISTS.
22:55:50.963485 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: ROLLBACK
22:55:50.964250 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
22:55:50.965034 [debug] [ThreadPool]: On list_None_dbt_jyeo_doesnt_exist_or_unauthorized: Close
22:55:51.831212 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '5cddf83f-cef3-44bf-be5c-16d87db45aef', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120db1330>]}
22:55:51.832800 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
22:55:51.833999 [debug] [MainThread]: Spark adapter: NotImplemented: commit
22:55:51.836358 [info ] [MainThread]: Concurrency: 1 threads (target='dev')
22:55:51.837479 [info ] [MainThread]: 
22:55:51.847881 [debug] [Thread-1 (]: Began running node model.my_dbt_project.a
22:55:51.849383 [info ] [Thread-1 (]: 1 of 1 START sql table model dbt_jyeo.a ........................................ [RUN]
22:55:51.851129 [debug] [Thread-1 (]: Acquiring new databricks connection 'model.my_dbt_project.a'
22:55:51.852742 [debug] [Thread-1 (]: Began compiling node model.my_dbt_project.a
22:55:51.856114 [debug] [Thread-1 (]: Writing injected SQL for node "model.my_dbt_project.a"
22:55:51.858518 [debug] [Thread-1 (]: Timing info for model.my_dbt_project.a (compile): 2023-02-22 22:55:51.853566 => 2023-02-22 22:55:51.858330
22:55:51.859587 [debug] [Thread-1 (]: Began executing node model.my_dbt_project.a
22:55:51.881371 [debug] [Thread-1 (]: Spark adapter: NotImplemented: add_begin_query
22:55:51.882096 [debug] [Thread-1 (]: Using databricks connection "model.my_dbt_project.a"
22:55:51.883442 [debug] [Thread-1 (]: On model.my_dbt_project.a: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "node_id": "model.my_dbt_project.a"} */

      describe extended `dbt_jyeo`.`a`
  
22:55:51.883944 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
22:55:57.838643 [debug] [Thread-1 (]: SQL status: OK in 5.95 seconds
22:55:57.911002 [debug] [Thread-1 (]: Writing runtime sql for node "model.my_dbt_project.a"
22:55:57.912187 [debug] [Thread-1 (]: Using databricks connection "model.my_dbt_project.a"
22:55:57.912730 [debug] [Thread-1 (]: On model.my_dbt_project.a: /* {"app": "dbt", "dbt_version": "1.4.1", "dbt_databricks_version": "1.4.2", "databricks_sql_connector_version": "2.4.0", "profile_name": "databricks", "target_name": "dev", "node_id": "model.my_dbt_project.a"} */

  
    
        create or replace table `dbt_jyeo`.`a`
      
      
    using delta
      
      
      
      
      
      
      as
      select 1 as id
  
22:56:33.518119 [debug] [Thread-1 (]: SQL status: OK in 35.6 seconds
22:56:34.508670 [debug] [Thread-1 (]: Timing info for model.my_dbt_project.a (execute): 2023-02-22 22:55:51.860437 => 2023-02-22 22:56:34.508605
22:56:34.509304 [debug] [Thread-1 (]: On model.my_dbt_project.a: ROLLBACK
22:56:34.509804 [debug] [Thread-1 (]: Databricks adapter: NotImplemented: rollback
22:56:34.510315 [debug] [Thread-1 (]: On model.my_dbt_project.a: Close
22:56:35.651712 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '5cddf83f-cef3-44bf-be5c-16d87db45aef', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120d8f310>]}
22:56:35.652999 [info ] [Thread-1 (]: 1 of 1 OK created sql table model dbt_jyeo.a ................................... [OK in 43.80s]
22:56:35.656267 [debug] [Thread-1 (]: Finished running node model.my_dbt_project.a
22:56:35.658587 [debug] [MainThread]: Acquiring new databricks connection 'master'
22:56:35.659297 [debug] [MainThread]: On master: ROLLBACK
22:56:35.659941 [debug] [MainThread]: Opening a new connection, currently in state init
22:56:36.644359 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
22:56:36.644938 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
22:56:36.645447 [debug] [MainThread]: Spark adapter: NotImplemented: commit
22:56:36.645941 [debug] [MainThread]: On master: ROLLBACK
22:56:36.646421 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
22:56:36.646896 [debug] [MainThread]: On master: Close
22:56:37.774027 [debug] [MainThread]: Connection 'master' was properly closed.
22:56:37.775075 [debug] [MainThread]: Connection 'model.my_dbt_project.a' was properly closed.
22:56:37.778418 [info ] [MainThread]: 
22:56:37.779448 [info ] [MainThread]: Finished running 1 table model in 0 hours 4 minutes and 42.07 seconds (282.07s).
22:56:37.780522 [debug] [MainThread]: Command end result
22:56:37.799242 [info ] [MainThread]: 
22:56:37.800714 [info ] [MainThread]: Completed successfully
22:56:37.802326 [info ] [MainThread]: 
22:56:37.803135 [info ] [MainThread]: Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
22:56:37.804150 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120bdfee0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120bdee90>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x120c30ca0>]}
22:56:37.805183 [debug] [MainThread]: Flushing usage events

Notably, Spark clusters and SQL endpoints return different error messages for the same underlying problem.
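
Anything matching on the message text would therefore need to cover both variants; a hedged sketch (the helper name is illustrative, not from the adapter):

import re

# Covers the Spark-cluster wording ("Database 'x' not found") and the
# SQL-endpoint wording ("[SCHEMA_NOT_FOUND] The schema `x` cannot be found").
MISSING_SCHEMA_RE = re.compile(r"Database '[^']+' not found|\[SCHEMA_NOT_FOUND\]")

def is_missing_schema_error(message):
    return bool(MISSING_SCHEMA_RE.search(message))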

jeremyyeo added the bug label on Feb 22, 2023
andrefurlan-db self-assigned this on Feb 28, 2023