Things to check first

I have searched the existing issues and didn't find my feature already requested there
Feature description
I'm filing this as a feature request because I'm not sure it is supported yet; at least, it broke in a way that suggested it may not have been considered when I tried it.

I ran:

```
sqlacodegen --generator sqlmodels "databricks://my_connection_string"
```

where the URL/connection string was constructed as follows:
```python
# How my connection string is constructed
# From https://docs.databricks.com/en/dev-tools/sqlalchemy.html#authentication
url = (
    f"databricks://token:{ACCESS_TOKEN}@{SERVER_HOSTNAME}?"
    + f"http_path={HTTP_PATH}&catalog={CATALOG}&schema={SCHEMA}"
)
```
Which resulted in the following errors:
```
databricks.sql.exc.ServerOperationError: [NO_SUCH_CATALOG_EXCEPTION] Catalog 'none' was not found.
Please verify the catalog name and then retry the query or command again.

sqlalchemy.exc.DatabaseError: (databricks.sql.exc.ServerOperationError)
[NO_SUCH_CATALOG_EXCEPTION] Catalog 'none' was not found. Please verify the catalog name and then retry the query or command again.
[SQL: SHOW TABLES FROM `None`.`None`]
```
This suggests that the catalog and schema aren't being parsed correctly from the Databricks connection string. It may be as simple as handling this specific URL format, but there may be more to it that I'm not aware of.
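For what it's worth, SQLAlchemy itself does parse the catalog and schema out of a URL of this shape; they land in the URL's query dictionary rather than in `url.database`. A minimal sketch (the hostname, token, and parameter values below are made-up placeholders, not real credentials):

```python
from sqlalchemy.engine import make_url

# Parse a Databricks-style URL; all values here are illustrative placeholders
url = make_url(
    "databricks://token:dapi123@adb-123.azuredatabricks.net?"
    "http_path=/sql/1.0/warehouses/abc&catalog=main&schema=sales"
)

# There is no path component, so url.database is None...
print(url.database)          # None

# ...but catalog and schema are available as query parameters
print(url.query["catalog"])  # main
print(url.query["schema"])   # sales
```

So the information appears to be recoverable at the URL level; if a tool only reads `url.database`, it would see `None`, which would be consistent with the `SHOW TABLES FROM `None`.`None`` statement in the error above.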
Use case
I want to generate SQLModel classes from an existing schema on Databricks, where the tables were created directly from Spark dataframes during the ingest process.