Parallel requests #50
Currently I'm using:

```python
async def get_async_results(**kwargs):
    ...
    queries = []
    for i in ...:
        # modify params dict
        queries.append(async_session.get("/services.json", params=params))
    return await asyncio.gather(*queries)

results = asyncio.run(get_async_results(**kwargs))
return sorted(inner for outer in results for inner in outer)
```

And then I join the results together and return the resulting formatted data. I recently came across the requests-toolbelt again and found it has a solution for threading as well. Is the second example something I could use in my current code? I'd imagine something as simple as:

```python
results = []
with client.parallel() as parallel:
    for page_number in range(0, 10):
        parallel.get(f"http://example.com/{page_number}")
    while parallel.pending:
        results.extend(parallel.next_response())
results.sort()
```

would get me far.
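The gather-based approach above can be shown end to end with a minimal, self-contained sketch. Here `fetch_page` is a hypothetical stand-in for the real `async_session.get(...)` call (no actual network I/O), used only to illustrate the scheduling, flattening, and sorting pattern:

```python
import asyncio

# Hypothetical fetch standing in for async_session.get(...); in real code
# this would be an HTTP call whose parsed JSON body is a list.
async def fetch_page(page_number: int) -> list:
    await asyncio.sleep(0)  # yield control, simulating network I/O
    return [page_number * 10, page_number * 10 + 1]

async def get_all_pages(n: int) -> list:
    # Schedule every request at once; gather returns results in input order.
    pages = await asyncio.gather(*(fetch_page(i) for i in range(n)))
    # Flatten the per-page lists and sort, mirroring the snippet above.
    return sorted(inner for outer in pages for inner in outer)

results = asyncio.run(get_all_pages(3))
print(results)  # [0, 1, 10, 11, 20, 21]
```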
Just curious, why not just …
The ability to loop through several generated requests and extend a list of results is appealing, and I haven't figured out how to do that with …
Let's just leave this up to the primitives of whatever concurrency framework the user is working with.
We could consider adding a concurrency API to simplify making multiple requests in parallel.
I'll give sync examples here, but we'd also have equivalent async cases too.
If you have a number of requests that you'd like to send in parallel, then...
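One way the ordered sync case might look, as a sketch: `fetch` below is a hypothetical stand-in for a real HTTP GET (e.g. `client.get(url)`), and `ThreadPoolExecutor.map` returns results in the same order as its inputs even though the calls run concurrently:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical fetch; a real version would issue an HTTP GET for the URL.
def fetch(url: str) -> str:
    return f"response for {url}"

urls = [f"http://example.com/{n}" for n in range(5)]

# executor.map preserves input order: responses[i] corresponds to urls[i],
# regardless of which request finished first.
with ThreadPoolExecutor(max_workers=5) as executor:
    responses = list(executor.map(fetch, urls))

print(responses[0])  # response for http://example.com/0
```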
Alternatively, if you don't mind the responses coming back out of order, then:
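For the out-of-order case, a sketch using `concurrent.futures.as_completed`, again with `fetch` as a hypothetical stand-in for a real HTTP GET:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical fetch; a real version would issue an HTTP GET for the URL.
def fetch(url: str) -> str:
    return f"response for {url}"

urls = [f"http://example.com/{n}" for n in range(5)]

results = []
with ThreadPoolExecutor(max_workers=5) as executor:
    futures = [executor.submit(fetch, url) for url in urls]
    # as_completed yields each future as soon as it finishes, in completion
    # order, so a slow request never blocks handling of the fast ones.
    for future in as_completed(futures):
        results.append(future.result())

# All responses arrive, just not necessarily in input order.
print(len(results))  # 5
```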
Nice things here:
asyncio.gather
or whatevs.