What happened?
I believe this is somewhat related to #121, where up-front computation is slowed down because the batch_dims keyword argument is ignored, meaning that batches are essentially duplicated and very large to compute. This is reproducible in a Google Colab notebook just by switching the installed version between 0.1.0 and 0.2.0.

Minimal Complete Verifiable Example
Environment
On version 0.1.0 I get the output:
On version 0.2.0 I get the output:
Yes, this is connected to #121, with an explanation of the problem in #121 (comment). In summary, batch_dims in v0.2.0 erroneously controls only the number of batches but not their size. I'm aiming to open a proposed fix today.
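To make the regression concrete, here is a minimal, self-contained sketch in plain NumPy. The iter_batches_* helpers are hypothetical and stand in for the library's actual batch generator; the sketch only contrasts the intended behaviour, where batch_dims determines the size of each batch, with the behaviour described above, where only the number of batches is respected and every batch duplicates the full data.

```python
import numpy as np

def iter_batches_with_size(data, batch_size):
    # Intended behaviour (what v0.1.0 appears to do): each batch is a
    # slice of at most `batch_size` rows along the leading dimension.
    for start in range(0, data.shape[0], batch_size):
        yield data[start:start + batch_size]

def iter_batches_number_only(data, batch_size):
    # Regressed behaviour as described above (v0.2.0): the batch *count*
    # is correct, but each batch spans the entire array, so the data is
    # effectively duplicated once per batch.
    n_batches = int(np.ceil(data.shape[0] / batch_size))
    for _ in range(n_batches):
        yield data

data = np.arange(1000).reshape(100, 10)

print([b.shape for b in iter_batches_with_size(data, batch_size=20)])
# [(20, 10), (20, 10), (20, 10), (20, 10), (20, 10)]
print([b.shape for b in iter_batches_number_only(data, batch_size=20)])
# [(100, 10), (100, 10), (100, 10), (100, 10), (100, 10)]
```

On a real dataset the second pattern would account for both the slow up-front computation and the very large batches reported here.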