
[Databricks] Security exception is thrown on a high concurrency cluster with auth passthrough. #22787

Closed
markbaas opened this issue Jul 6, 2021 · 3 comments
Labels
Client — This issue points to a problem in the data-plane of the library. · customer-reported — Issues that are reported by GitHub users external to the Azure organization. · Data Bricks

Comments

markbaas commented Jul 6, 2021

In Databricks, I'm trying to connect to my Cosmos DB account from a high-concurrency cluster with AAD passthrough.

After setting the Spark configuration:

```python
%python
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", "https://blabla.documents.azure.com:443/")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", dbutils.secrets.get("main", "cosmosMasterKey"))
spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
```

I'm running the following SQL query:

```sql
select * from cosmosCatalog.bladb.blacollection limit 10
```

The following exception is thrown:

```
Error in SQL statement: SecurityException: Only default session catalog is supported for Credential Passthrough or Table ACL enabled cluster. Try to load: com.azure.cosmos.spark.CosmosCatalog
```
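The error indicates that Databricks refuses to load any non-default Spark catalog when credential passthrough or Table ACLs are enabled. One possible way to sidestep the catalog restriction (a sketch, not a verified fix, assuming the azure-cosmos-spark connector's `cosmos.oltp` DataFrame data source; the endpoint, secret scope, database, and container names below are placeholders from this thread) is to skip the custom catalog entirely and read through the DataFrame reader API:

```python
# Sketch: read the container via the `cosmos.oltp` data source instead of
# registering com.azure.cosmos.spark.CosmosCatalog, which a passthrough /
# Table-ACL cluster refuses to load. Runs only inside a Databricks notebook
# (dbutils and spark are notebook-provided globals).
read_config = {
    "spark.cosmos.accountEndpoint": "https://blabla.documents.azure.com:443/",
    "spark.cosmos.accountKey": dbutils.secrets.get("main", "cosmosMasterKey"),
    "spark.cosmos.database": "bladb",
    "spark.cosmos.container": "blacollection",
}

df = spark.read.format("cosmos.oltp").options(**read_config).load()
df.limit(10).show()
```

Note this only avoids the catalog check; whether key-based auth is workable alongside passthrough on a given cluster type is a separate question, and markbaas reports below that passthrough clusters cause trouble more broadly.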
markbaas (Author) commented Jul 6, 2021

It seems it won't work with a regular cluster in combination with passthrough either.

joshfree added the Client, customer-reported, and Data Bricks labels Jul 6, 2021
ralphke commented Oct 4, 2021

@joshfree I get the same error message on a Databricks premium cluster (runtime 9.1 LTS) when trying to access a Cosmos DB database. Neither reading from the dataset nor writing works as expected.
This is the code artifact I am using:
```python
cosmosEndpoint = "https://.documents.azure.com:443/"
cosmosMasterKey = ""
cosmosDatabaseName = "southridge"
cosmosContainerName = "movies"
cosmosApplicationName = "Databricks"

spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", cosmosEndpoint)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", cosmosMasterKey)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.container", cosmosContainerName)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.applicationName", cosmosApplicationName)

spark.sql("CREATE DATABASE IF NOT EXISTS cosmosCatalog.{};".format(cosmosDatabaseName))
spark.sql("CREATE TABLE IF NOT EXISTS cosmosCatalog.{}.{} using cosmos.oltp TBLPROPERTIES(partitionKeyPath = '/id', manualThroughput = '400')".format(cosmosDatabaseName, cosmosContainerName))
```
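The `CREATE DATABASE` / `CREATE TABLE` statements above go through the custom catalog, so they hit the same restriction on a passthrough or Table-ACL cluster. A hedged alternative sketch (assuming the database and container are created outside Spark, e.g. via the Azure portal, CLI, or SDK, and reusing the variable names from the snippet above) is to write with the DataFrame writer and the `cosmos.oltp` data source, which needs no catalog registration:

```python
# Sketch: write rows through the cosmos.oltp data source so no custom
# catalog has to be registered. Assumes `df` is an existing DataFrame and
# that the target database/container were created outside Spark, since
# CREATE DATABASE / CREATE TABLE go through the blocked catalog.
(df.write
   .format("cosmos.oltp")
   .option("spark.cosmos.accountEndpoint", cosmosEndpoint)
   .option("spark.cosmos.accountKey", cosmosMasterKey)
   .option("spark.cosmos.database", cosmosDatabaseName)
   .option("spark.cosmos.container", cosmosContainerName)
   .mode("append")
   .save())
```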

azure-sdk pushed a commit to azure-sdk/azure-sdk-for-java that referenced this issue Mar 1, 2023
[Hub Generated] Publish private branch 'release-managednetworkfabric-Microsoft.ManagedNetworkFabric-2023-02-01-preview' (Azure#22787)

* add or modify files

* Adding custom-words
Updating *.md files

* Update readme.python.md

* update go readme and remove ruby

---------

Co-authored-by: Yuchao Yan <[email protected]>
Co-authored-by: ArcturusZhang <[email protected]>
joshfree (Member) commented: Closing old issue with no activity since Oct 4, 2021

github-actions bot locked and limited the conversation to collaborators Jun 12, 2024