Implement interface for bulk inferencing in TF models #8560
Conversation
Commit: ba17484. The full report is available as an artifact.
@dakshvar22 I can see failed training on Sara with
Is this something to be fixed in this PR, or is it unrelated?
I don't think that's related, but it's weird that it fails specifically on that dataset and config pair.
Hey @dakshvar22! 👋 To run model regression tests, comment with the
Tips 💡: The model regression test will be run on
Tips 💡: Every time you want to change a configuration, you should edit the comment with the previous configuration. You can copy this in your comment and customize:
/modeltest include:
- dataset: ["Sara"]
config: ["Sparse + DIET(seq) + ResponseSelector(t2t)"]
The model regression tests have started. It might take a while, please be patient. The configuration used can be found in the comment.
Commit: ba17484. The full report is available as an artifact.
@samsucik Okay, it ran successfully this time.
Thanks, Daksh, especially for adding the tests! A few tiny things may require changes, but nothing serious as far as I can see 🙂
Co-authored-by: Sam Sucik <[email protected]>
@samsucik I also made
Looks good 🚀
Co-authored-by: Sam Sucik <[email protected]>
Thanks for the good discussion @samsucik 🙌
Proposed changes:
- Add a `run_inference` method to generate predictions through the model.
- `run_inference` is also meant to perform batch inferencing, which means it implements the batching and the combining of output for different batches. The specific TF models like `TED` and `DIET` do not need to know the implementation details of this batch inferencing.
- `IntentTEDPolicy`.
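The batching-and-combining idea described above can be sketched generically. This is a minimal illustration, not Rasa's actual implementation: `model_predict` is a hypothetical stand-in for a model's per-batch predict step, and the function name and signature are assumptions for the sketch.

```python
import numpy as np


def run_inference(model_predict, inputs, batch_size=64):
    """Run `model_predict` over `inputs` in batches and combine the outputs.

    `model_predict` is any callable mapping a batch of inputs to a batch of
    predictions (e.g. a TF model's predict step). Callers get a single
    combined output and never see the batching details.
    """
    outputs = []
    for start in range(0, len(inputs), batch_size):
        # Slice out one batch and run the model on it.
        batch = inputs[start:start + batch_size]
        outputs.append(np.asarray(model_predict(batch)))
    # Combine the per-batch outputs along the batch axis.
    return np.concatenate(outputs, axis=0)


# Illustrative usage with a trivial "model" that doubles its input:
preds = run_inference(lambda b: b * 2, np.arange(10), batch_size=4)
```

With this shape, a specific model only supplies the per-batch predict callable; the batching loop and the concatenation of results live in one place.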
Status (please check what you already did):
- reformatted the code with `black` (please check Readme for instructions)