
batch processing/parallel processing #585

Open
oldcpple opened this issue Jun 17, 2024 · 1 comment
@oldcpple

Hi there, does Petals currently support batch processing/parallel processing? For example, to increase resource usage and system throughput, we would like servers to process multiple prompts at the same time, i.e., batch processing. Is this possible?
Thanks a lot.

@justheuristic
Collaborator

justheuristic commented Jul 11, 2024

Hi! Both forward/backward passes and autoregressive inference can run with any batch size, provided you have enough memory for it.

In our training examples we use batched training; e.g., this one https://github.com/bigscience-workshop/petals/blob/main/examples/prompt-tuning-sst2.ipynb uses a batch size of 32.
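For reference, here is a minimal sketch of batched autoregressive inference with the Petals client API, assuming `petals` and `transformers` are installed. The model name follows the Petals README; the prompts and generation length are placeholders, not recommendations.

```python
# Minimal sketch: batched generation with the Petals client.
# All prompts in the batch are processed by the swarm in a single call.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "petals-team/StableBeluga2"  # any model hosted on the swarm
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.padding_side = "left"  # left-pad so generation starts right after real tokens
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

prompts = ["A cat sat on", "The capital of France is"]  # placeholder prompts
inputs = tokenizer(prompts, return_tensors="pt", padding=True)["input_ids"]

# One batched generate() call instead of one call per prompt.
outputs = model.generate(inputs, max_new_tokens=8)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```

The same idea applies to training: pad a batch of examples to a common length and run forward/backward on the whole batch, limited only by the memory available on the servers.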
