
[Feature Request] Support batch processing for predictSoftly #28

Closed
LostMekka opened this issue Dec 21, 2020 · 0 comments
Labels: enhancement (New feature or request)
Milestone: 0.1.1

Comments

@LostMekka (Contributor) commented:

Sequential.predictSoftly should also have a batch version, similar to Sequential.predict.

This came up in #25:

c.)
There should be a means to call predictSoftly on a batch of inputs. At the moment this is only possible for predict.

That would be awesome as well. My current pet project for trying out this library is a board game AI that very crudely learns through self-play, with the neural net serving as the search heuristic. The AI plays a semi-random game sequence, and for each pair of successive moves I create one training data point (which requires one predictSoftly call per data point). Building these self-play datasets would probably benefit greatly from batch soft-predicting 😄

Originally posted by @LostMekka in #25 (comment)
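As a rough sketch of what the requested API could look like (all names below are hypothetical stand-ins, not KotlinDL's actual API), a batch variant could initially just map the single-sample `predictSoftly` over the inputs, which at least gives callers the batch signature now and lets a true vectorized implementation be swapped in later:

```kotlin
// Hypothetical sketch: SoftPredictor, predictSoftly, and predictSoftlyBatch
// are illustrative names, not KotlinDL's real interfaces.
interface SoftPredictor {
    // Returns class probabilities for a single input sample.
    fun predictSoftly(input: FloatArray): FloatArray
}

// Batch wrapper: until a native batched forward pass exists,
// fall back to looping over the single-sample call.
fun SoftPredictor.predictSoftlyBatch(inputs: List<FloatArray>): List<FloatArray> =
    inputs.map { predictSoftly(it) }

// Toy stand-in model for demonstration: softmax over the raw input values.
class ToySoftmaxModel : SoftPredictor {
    override fun predictSoftly(input: FloatArray): FloatArray {
        val exps = input.map { kotlin.math.exp(it.toDouble()) }
        val sum = exps.sum()
        return exps.map { (it / sum).toFloat() }.toFloatArray()
    }
}

fun main() {
    val model = ToySoftmaxModel()
    val batch = listOf(floatArrayOf(1f, 2f, 3f), floatArrayOf(0f, 0f, 0f))
    // One probability vector per input in the batch.
    val probs = model.predictSoftlyBatch(batch)
    probs.forEach { println(it.joinToString()) }
}
```

For the self-play use case described above, this would let a whole game's worth of positions be soft-predicted in one call, and a later native implementation could batch the tensor computation without changing call sites.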

@zaleslaw zaleslaw added the enhancement New feature or request label Dec 21, 2020
@zaleslaw zaleslaw added this to the 0.1.1 milestone Dec 21, 2020