[BUG] fs.azure.account.key invalid configuration issue while reading from Unity Catalog Tables on Azure DB
#10318
Comments
What is the table format -- is it a Delta Lake table, a raw Parquet table, or something else? A stacktrace of the error would help. Assuming this is with a table that's ultimately comprised of Parquet files, does this happen even with the config spark.rapids.sql.format.parquet.reader.type=PERFILE? If it works with the PERFILE reader, then that tells us the issue is with setting up the proper context for the multithreaded readers.
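For anyone else reproducing this: the PERFILE reader suggested above is set through ordinary Spark configuration. A minimal sketch, assuming it is set in the Databricks cluster's Spark config (the property name is from the RAPIDS Accelerator config documentation; where you set it is up to your setup):

```
# Force the RAPIDS Accelerator to use the per-file Parquet reader
# instead of the multithreaded/coalescing readers.
spark.rapids.sql.format.parquet.reader.type PERFILE
```

The same property can be set per session, e.g. `spark.conf.set("spark.rapids.sql.format.parquet.reader.type", "PERFILE")` in a notebook.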
Yes, it's a Delta Lake table. It didn't work with the PERFILE reader either. Stack trace:
@mattahrens @sameerz Is this related to #8242?
Yes, it is related. We do not need to test with Alluxio, but with the file cache.
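For reference, the file cache mentioned here is the RAPIDS Accelerator's local file caching feature, which (per the plugin's configuration docs) is toggled with a single property. A sketch, assuming the property name matches your plugin version -- verify against the docs for the release you run:

```
# Enable the RAPIDS Accelerator file cache so cloud reads are
# cached on local storage (assumed property name; check your version's docs).
spark.rapids.filecache.enabled true
```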
Describe the bug
While using Azure Databricks and attempting to read a Managed Table from the Unity Catalog Metastore with the RAPIDS Accelerator, I encountered invalid credentials issue with the following message:
Failure to initialize configuration for storage account databricksmetaeast.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key
However, this error doesn't occur when the RAPIDS Accelerator is disabled.

Notes
Adding the credentials of the storage container to the Spark configuration properties can serve as an interim solution. However, this approach is not scalable when there are multiple storage containers.
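A sketch of the interim workaround described above, using the standard per-account ABFS key property in the cluster's Spark config. The storage account name and secret scope/key names are placeholders, not values from this issue:

```
# Hypothetical example: supply the storage account key directly.
# Replace <storage-account> with your account; the {{secrets/...}} syntax
# is the Databricks secret reference form (scope and key are placeholders).
fs.azure.account.key.<storage-account>.dfs.core.windows.net {{secrets/my-scope/storage-key}}
```

Because the property is scoped to a single storage account, one such line is needed per container's account, which is why this does not scale to many containers.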
Environment details
Managed Tables on Azure Databricks with Unity Catalog and RAPIDS Accelerator