Batching is not supported? #410
Unanswered
DavidGOrtega asked this question in Q&A
Replies: 1 comment
-
Technically it is possible, but we do not share this code, because for some reason it is hard for people to wrap their heads around it. Though of course, without clever post-processing it is much easier.
Right. There are discussions where people experiment with this stuff.
-
I'm looking at the code, and evidently batching cannot be supported as-is: inference updates `h`, `c`, and `sr`, which are then reused in the next inference call. At most, I presume batches would have to be formed over chunks along the time axis, right?
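One common way around the stateful-inference problem described above is to keep a separate `(h, c)` state per audio stream and gather/scatter those states around each batched step. The sketch below is purely illustrative: `toy_step`, `BatchedStreams`, and the arithmetic inside them are hypothetical stand-ins, not the actual model's code, which is not shared in this thread.

```python
# Hypothetical sketch: batch several independent streams through a
# stateful model by tracking one (h, c) pair per stream, gathering the
# states for each batched step, and scattering the updates back.
# The "model" here is a toy recurrence, NOT the real inference code.
from typing import Dict, List, Tuple

State = Tuple[float, float]  # (h, c) for a single stream


def toy_step(chunks: List[float], states: List[State]) -> Tuple[List[float], List[State]]:
    """Toy batched step: each stream's output depends only on its own state."""
    outs: List[float] = []
    new_states: List[State] = []
    for x, (h, c) in zip(chunks, states):
        c = c + x           # carried-over cell state, updated by the new chunk
        h = 0.5 * (h + c)   # hidden state doubles as the step's output
        outs.append(h)
        new_states.append((h, c))
    return outs, new_states


class BatchedStreams:
    """Keeps per-stream recurrent state so unrelated streams can share a batch."""

    def __init__(self) -> None:
        self.states: Dict[str, State] = {}

    def process(self, batch: Dict[str, float]) -> Dict[str, float]:
        ids = list(batch)
        # Gather each stream's state (zero-initialized for new streams).
        states = [self.states.get(i, (0.0, 0.0)) for i in ids]
        outs, new_states = toy_step([batch[i] for i in ids], states)
        # Scatter updated states back so the next call continues each stream.
        self.states.update(zip(ids, new_states))
        return dict(zip(ids, outs))
```

The point of the sketch is that batching is orthogonal to statefulness: as long as states are gathered and scattered per stream, one stream's chunks never contaminate another's `h`/`c`, and processing a partial batch later leaves the absent streams' states untouched.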