Add support for Databricks backend #1377
```
> class(con)
[1] "Spark SQL"
attr(,"package")
[1] ".GlobalEnv"
```
Adding the two links I included in the issue.
First off, thanks -- super helpful in a lot of ways. Do we anticipate never being able to upload data to Databricks in a permanent way (i.e. when setting …)? And then something I just noticed: it isn't possible to pass a catalog-qualified name for a temporary view with this just yet either, in case that's something worth addressing.
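For context, a minimal sketch of the kind of call being asked about here, assuming an existing Databricks connection `con`; the catalog and schema names are placeholders, and the comment above reports that this does not work yet:

```r
library(dplyr)
library(dbplyr)

# Hypothetical: upload mtcars as a permanent (non-temporary) table under a
# fully qualified catalog.schema.table name; "some_catalog" and "some_schema"
# are placeholders.
copy_to(
  con, mtcars,
  name      = in_catalog("some_catalog", "some_schema", "mtcars"),
  temporary = FALSE
)
```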
@kmishra9 can you please file a new issue? You got lucky this time, but generally you shouldn't expect folks to be reading closed issues.
Thanks to the default "DBI" translation, most dplyr/dbplyr operations work. However, some operations that Databricks does support, for example `var()`, are currently marked as not supported by dbplyr, because `var()` is not in the default translation. Databricks uses "Spark SQL", which is underpinned by Hive SQL syntax, so my suggestion would be to reuse the variance translation already in place for the Hive backend.

The second issue uncovered is that `copy_to()` does not work. The Databricks backend does not support transactions, so I think a custom `db_copy_to()` method will be needed here.
Reprex

The following reprex contains the specific code that can be used to connect to Databricks using a PAT. It also contains confirmation that `var()` is supported in Databricks: the SQL is called directly via DBI, and then the same operation is attempted via dplyr, which errors out. It also contains the error received when trying to use `copy_to()`.
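Roughly, the connection and the DBI-versus-dplyr comparison look like the sketch below; every connection parameter, the driver name, and the example table are placeholders rather than values from the original reprex:

```r
library(DBI)
library(dplyr)
library(dbplyr)

# Illustrative ODBC connection to a Databricks SQL warehouse authenticated
# with a personal access token (PAT); driver name, host, and HTTP path are
# placeholders.
con <- dbConnect(
  odbc::odbc(),
  Driver          = "Simba Spark ODBC Driver",
  Host            = Sys.getenv("DATABRICKS_HOST"),
  HTTPPath        = Sys.getenv("DATABRICKS_HTTP_PATH"),
  Port            = 443,
  AuthMech        = 3,
  UID             = "token",
  PWD             = Sys.getenv("DATABRICKS_TOKEN"),  # the PAT
  ThriftTransport = 2,
  SSL             = 1
)

# Calling VARIANCE() directly through DBI works ...
dbGetQuery(con, "SELECT VARIANCE(x) AS v FROM VALUES (1.0), (2.0), (4.0) AS t(x)")

# ... but the equivalent dplyr pipeline errors, because var() is not part of
# the default "DBI" translation used for this connection class.
tbl(con, in_catalog("samples", "nyctaxi", "trips")) %>%   # placeholder table
  summarise(v = var(trip_distance))
```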
Created on 2023-10-24 with reprex v2.0.2