Feature/adbc flight sql #5010
Conversation
Thanks @prmoore77! We'll need to set up some test infrastructure around this -- I'm going to look at generating the required tables and inserting them into the dockerized flight-sql server on demand.
Codecov Report
@@ Coverage Diff @@
## master #5010 +/- ##
==========================================
- Coverage 95.10% 87.61% -7.50%
==========================================
Files 401 211 -190
Lines 44705 23289 -21416
Branches 4379 3246 -1133
==========================================
- Hits 42516 20404 -22112
- Misses 1690 2463 +773
+ Partials 499 422 -77
@@ -0,0 +1,106 @@
# Copyright 2015 Cloudera Inc.
I don't think you need this license header.
@@ -167,6 +167,7 @@ snowflake = "ibis.backends.snowflake"
spark = "ibis.backends.pyspark"
sqlite = "ibis.backends.sqlite"
trino = "ibis.backends.trino"
adbc = "ibis.backends.adbc"
You also need to add `adbc` to the `extras` section and declare its list of optional dependencies. Looking at your code, I think that's at least `adbc = ["pyarrow", "sqlalchemy"]`, and possibly more if there's an ADBC dependency somewhere.
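For illustration, the extras entry the reviewer describes might look like this in `pyproject.toml` (a hedged sketch assuming a poetry-style layout; the exact dependency list depends on which ADBC driver the backend actually imports):

```toml
[tool.poetry.extras]
# hypothetical: actual dependency names should match the backend's imports
adbc = ["pyarrow", "sqlalchemy"]
```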
adbc = pytest.importorskip("pyarrow.flight_sql")

@staticmethod
@functools.lru_cache(maxsize=None)
I wouldn't worry about caching yet.
def connect(data_directory: Path) -> BaseBackend:

flight_password = os.environ["FLIGHT_PASSWORD"]
authorization_header = f"Basic {str(base64.b64encode(bytes(f'flight_username:{flight_password}', encoding='utf-8')), encoding='utf-8')}"
Pull the f-string expression out into a variable :)
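One way to apply this suggestion is to break the nested expression into named steps, for example as a small helper (a sketch, not the PR's code; the function name is hypothetical):

```python
import base64


def basic_auth_header(username: str, password: str) -> str:
    """Build an HTTP Basic Authorization header value."""
    # Encode "username:password" as UTF-8 bytes, then base64-encode it.
    credentials = f"{username}:{password}".encode("utf-8")
    token = base64.b64encode(credentials).decode("utf-8")
    return f"Basic {token}"


authorization_header = basic_auth_header("flight_username", "secret")
```

Each intermediate value gets a name, which is easier to read and debug than one long f-string expression.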
""" | ||
Create an Ibis client connected via ADBC. |
""" | |
Create an Ibis client connected via ADBC. | |
"""Create an Ibis client connected via ADBC. |
Example: grpc+tls://localhost:31337
dialect
The SQLAlchemy dialect name (the scheme of a connection URI).
db_kwargs
Why not accept `**kwargs` and just thread that through?
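The reviewer's suggestion can be sketched like this (hypothetical names; `driver_connect` stands in for whatever the ADBC driver's connect call is):

```python
from typing import Any


def driver_connect(uri: str, **db_kwargs: Any) -> dict:
    # Stand-in for the real driver connection; just records what it received.
    return {"uri": uri, "db_kwargs": db_kwargs}


def do_connect(uri: str, **kwargs: Any) -> dict:
    # Instead of a dedicated `db_kwargs` dict parameter, accept arbitrary
    # keyword arguments and thread them through to the driver unchanged.
    return driver_connect(uri, **kwargs)


conn = do_connect("grpc+tls://localhost:31337", username="u", tls_skip_verify=True)
```

Callers then pass driver options directly as keyword arguments rather than packing them into a dict first.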
return result

def _sqla_connect(self, dialect, conn_rec, conn_args, conn_params):
Does this need to be a method as opposed to a function defined inline in `do_connect`? You're not using `self` anywhere here.
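The shape the reviewer is describing might look like this (a hypothetical sketch, not the PR's actual code; the class and return values are placeholders):

```python
class ADBCBackend:
    def do_connect(self, uri: str) -> str:
        # A helper that never touches `self` can be a plain local function
        # inside do_connect instead of a method on the class.
        def _sqla_connect(dialect: str) -> str:
            return f"{dialect}://{uri}"

        return _sqla_connect("adbc")


backend = ADBCBackend()
```

Keeping the helper local signals that it is an implementation detail of `do_connect` and keeps the class's public surface smaller.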
The other thing is, I think in the previous ADBC-related PRs we discussed whether ADBC should be a backend or a base backend (like SQLAlchemy) -- so are we ok having it as a toplevel backend here? (The latter makes slightly more sense to me.)
Superseded by #5475.
I have added a new ADBC Flight SQL back-end with the help of @lidavidm and @gforsyth . I haven't added tests as of yet - so this PR isn't ready to merge, but I will work with the Ibis team to learn what to do :). Thank you!