Error when calling `chart.to_dict` when using `__dataframe__` protocol #3110
Comments
Thanks for reporting. Yeah, we shouldn't be using `len` here. We'd also want to avoid calling …
I made a local change to get around the error with:

```python
elif hasattr(data, "__dataframe__"):
    pi = import_pyarrow_interchange()
    pa_table = pi.from_dataframe(data)
    assert pa_table.num_rows < max_rows
    return pa_table
```

This works with a pyarrow table, but fails with Ibis:

It looks like Ibis isn't expecting the … Can repro without Altair with:

```python
from pyarrow.interchange import from_dataframe
import ibis

data = ibis.memtable(
    {'x': ['A', 'B', 'C', 'D', 'E'],
     'y': [5, 3, 6, 7, 2]}
)
from_dataframe(data)
```

If pyarrow is passing …
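For comparison (this snippet is not from the original thread): the same pyarrow interchange conversion succeeds when the producer is a pandas DataFrame, which suggests the failure is on the Ibis `__dataframe__` side rather than in pyarrow's consumer. This assumes pandas >= 1.5 and pyarrow >= 11.0, where both ends of the interchange protocol are implemented.

```python
import pandas as pd
from pyarrow.interchange import from_dataframe

# Same data as the repro above, but produced by pandas, which implements
# the __dataframe__ protocol.
pdf = pd.DataFrame({'x': ['A', 'B', 'C', 'D', 'E'],
                    'y': [5, 3, 6, 7, 2]})

tbl = from_dataframe(pdf)  # succeeds: returns a pyarrow.Table
print(tbl.num_rows)        # 5
```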
Thanks Jon! Yeah, I discovered that earlier today, fixed here (that's why I listed …)
Haha, missed that 👍
Here's a quick PR to try: #3111. It needs some testing before we merge, but it would be great if you want to start playing with it in case you run into other similar issues.
Thanks y'all!
Similar to #3109, except using `ibis` as the input and calling `chart.to_dict` (as is done automatically when rendering charts in a notebook). This outputs:
Ibis tables don't support `len`, while many other dataframe-like objects (`pyarrow`/`pandas`/`polars`/...) do. I suspect this has masked the issue when using other dataframe-like inputs.

A few possible fixes:
- Convert inputs that implement `__dataframe__` to pandas up front (e.g. via `pd.api.interchange.from_dataframe`). This would let you then write only pandas-compatible code, since things are automatically converted at the input to `altair`.
- Use `data.__dataframe__().num_rows()` instead of `len(data)` when dealing with a `__dataframe__` protocol input (sketched below).

Versions:
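A minimal sketch of the second option (this is not Altair's actual code; the helper name and fallback structure are illustrative): count rows through the interchange protocol when it is available, and only fall back to `len` for other inputs.

```python
def _interchange_num_rows(data):
    """Row count for the max-rows check, without requiring len() support."""
    if hasattr(data, "__dataframe__"):
        # num_rows() is part of the DataFrame interchange protocol; the spec
        # allows a producer to return None when the count isn't cheaply known.
        return data.__dataframe__().num_rows()
    return len(data)
```

The first option sidesteps the question entirely by materializing the input (via `pd.api.interchange.from_dataframe` or `pyarrow.interchange.from_dataframe`) before any row counting, at the cost of an eager conversion.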