Publicize the Databricks Delta Lake destination connector #6043
Comments
We do want to adopt Airbyte over RudderStack, but the destination of choice is not completely supported.

Just following up on this.
@nikhilalmeida, thank you for the interest in this connector. Currently, to use the Databricks connector in your own deployment of Airbyte, you can build the image yourself and upload it to a private registry. The build process has been simplified and is now the same as for any other connector. See the developer doc for details.
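For illustration, here is a minimal sketch of that build-and-push flow. The Gradle task name, registry, and tag are assumptions based on the developer docs of that era, not exact official values:

```python
# Minimal sketch: build the Databricks destination image from an Airbyte
# checkout and push it to a private registry. The Gradle task name and the
# registry/tag below are assumptions for illustration, not official values.
import subprocess

CONNECTOR = "destination-databricks"
PRIVATE_REGISTRY = "registry.example.com/airbyte"  # hypothetical registry
TAG = "0.1.0-custom"                               # hypothetical tag

def run(cmd: list[str]) -> None:
    """Run a command from the Airbyte repo root and fail loudly on error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Build the connector image (assumes the repo's airbyteDocker Gradle task).
run(["./gradlew", f":airbyte-integrations:connectors:{CONNECTOR}:airbyteDocker"])

# Retag and push to the private registry so your Airbyte deployment can pull it.
run(["docker", "tag", f"airbyte/{CONNECTOR}:dev", f"{PRIVATE_REGISTRY}/{CONNECTOR}:{TAG}"])
run(["docker", "push", f"{PRIVATE_REGISTRY}/{CONNECTOR}:{TAG}"])
```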
+1 for adopting Airbyte and using the Databricks connector. I will follow @tuliren's build process for now, but we should definitely have a mechanism to pull the image after agreeing to the license agreement. I'll also bring this up with my Databricks CSR at the next call. EDIT: I finally got the connector to build. For anyone trying, I used an AWS Ubuntu 20.04 Lightsail server and ran this script that I made.
@klogdog, sorry that I missed your question.
No, unfortunately it is not. The license requires users to accept the terms before they can download the driver file, but to use the connector, users would technically download the image, including the driver file, first. The good news is that Databricks released a new public JDBC driver earlier this month. I am working on migrating our Databricks connector to it and will make this connector public this week.
The Databricks destination connector is now available to everyone. The next Airbyte version will include it by default, and you can also add it manually. Its docker image can be found on Docker Hub.
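As a sketch of the manual route, one way to register a custom connector in a self-hosted Airbyte instance is through its configuration API. The endpoint path, image name, and tag below are assumptions for illustration; check the API docs for your Airbyte version:

```python
# Minimal sketch: register the Databricks destination in a self-hosted Airbyte
# instance via its configuration API. The endpoint, image name, and tag are
# assumptions for illustration, not confirmed values.
import requests

AIRBYTE_API = "http://localhost:8000/api/v1"  # hypothetical local deployment

payload = {
    "name": "Databricks Delta Lake",
    "dockerRepository": "airbyte/destination-databricks",  # assumed image name
    "dockerImageTag": "0.1.0",                              # assumed tag
    "documentationUrl": "https://docs.airbyte.com/integrations/destinations/databricks",
}

resp = requests.post(f"{AIRBYTE_API}/destination_definitions/create", json=payload)
resp.raise_for_status()
print("Registered destination definition:", resp.json().get("destinationDefinitionId"))
```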
@tuliren, I am attempting to use the latest connector you've pushed to Docker Hub. I'm getting an error when I try to set up the destination connection.
@tuliren The comment above is resolved. I was attempting to connect using a SQL Endpoint cluster. The issue is resolved when I use a Databricks cluster instead (as per the documentation here: https://docs.airbyte.com/integrations/destinations/databricks#requirements).
@jonathanneo, thank you for the update. I will update our documentation to include this in a troubleshooting section.
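For a quick way to spot this misconfiguration, here is a minimal sketch that checks whether a Databricks HTTP path points at an all-purpose cluster rather than a SQL endpoint. The path patterns are assumptions based on how Databricks typically formats them, not an official validation rule:

```python
# Minimal sketch: sanity-check that a Databricks HTTP path refers to an
# all-purpose cluster (supported by the connector) rather than a SQL endpoint.
# The path patterns below are assumptions for illustration.
import re

CLUSTER_PATTERN = re.compile(r"^/?sql/protocolv1/o/\d+/[\w-]+$")    # typical cluster path
SQL_ENDPOINT_PATTERN = re.compile(r"^/?sql/1\.0/endpoints/[\w-]+$")  # typical SQL endpoint path

def check_http_path(http_path: str) -> str:
    """Return a human-readable verdict for the given Databricks HTTP path."""
    if CLUSTER_PATTERN.match(http_path):
        return "Looks like an all-purpose cluster path: supported."
    if SQL_ENDPOINT_PATTERN.match(http_path):
        return "Looks like a SQL endpoint path: use a Databricks cluster instead."
    return "Unrecognized HTTP path format: double-check the cluster's JDBC/ODBC settings."

# Example usage with hypothetical paths:
print(check_http_path("sql/protocolv1/o/1234567890123456/0123-456789-abcde123"))
print(check_http_path("/sql/1.0/endpoints/abc123def456"))
```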
Tell us about the problem you're trying to solve
Currently, the Databricks destination connector is private due to legal reasons. We should find a way to publicize it.
This relates to #2075, and is a follow-up issue from PR #5998.
TODO