Make it possible to disable batch requests support #744

Merged
niklasad1 merged 1 commit into paritytech:master on May 4, 2022

Conversation

popzxc
Contributor

@popzxc commented Apr 28, 2022

While it doesn't comply with the JSON-RPC spec, some servers may explicitly decide not to support batched requests for various reasons. This PR makes that possible.
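For reference, a minimal sketch of how a server could opt out of batches with this change. The `batch_requests_supported(false)` builder option, the listen address, and the `say_hello` method are illustrative assumptions; the exact builder API may differ between jsonrpsee versions.

```rust
// Assumed Cargo.toml deps: jsonrpsee (with the ws-server feature) and tokio.
use jsonrpsee::ws_server::WsServerBuilder;
use jsonrpsee::RpcModule;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build a WebSocket server that declines batch requests entirely.
    // `batch_requests_supported(false)` is assumed to be the option this PR
    // introduces; leaving it unset keeps batches enabled (the default).
    let server = WsServerBuilder::default()
        .batch_requests_supported(false)
        .build("127.0.0.1:9944")
        .await?;

    let mut module = RpcModule::new(());
    module.register_method("say_hello", |_params, _ctx| Ok("hello"))?;

    // A JSON array of requests (a batch) will now be answered with an error
    // instead of being executed call-by-call.
    let _handle = server.start(module)?;
    std::future::pending().await
}
```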

@popzxc requested a review from a team as a code owner April 28, 2022 13:00
Member

@niklasad1 left a comment


LGTM.

I wonder what the reasons are for rejecting batch requests, because I thought max_response_body_size could work for that purpose too, but I realize now that it seems we only apply it to individual calls.

@popzxc
Contributor Author

popzxc commented Apr 28, 2022

Well, the server may have computationally expensive requests, for example.
An example of a real-world RPC server that explicitly disables batch requests is Amazon Managed Blockchain.

@niklasad1
Member

Fair enough, I see.

@jsdw
Collaborator

jsdw commented Apr 29, 2022

I'm not sure, offhand, what avoiding computationally expensive code has to do with batches; what's to stop a user from submitting the same requests as multiple individual messages, which is basically how jsonrpsee treats batches anyway?

Do you have a real use case for this that you might be able to share so that I can better understand the need for it?

@popzxc
Contributor Author

popzxc commented Apr 30, 2022

> what's to stop a user from submitting the same requests as multiple individual messages, which is basically how jsonrpsee treats batches anyway?

Throttling/rate limiting algorithms, basically. Obviously, it's possible to implement those on the application layer, but it would be clunkier.

> Do you have a real use case for this that you might be able to share so that I can better understand the need for it?

Sure. We have RPC methods that invoke a virtual machine to execute smart contracts. This means these methods can be arbitrarily "heavy", and we need to protect our servers from being overloaded. We're working on a composite solution that involves throttling/rate limiting, and the ability to send batch requests kind of stands in the way right now.

@niklasad1
Member

So actually, I think the resource limiting stuff could be a good alternative to disabling batch requests completely; however, I think it's fine to have an option to disable batch requests as well.

@popzxc
Contributor Author

popzxc commented May 1, 2022

I mean, they are not mutually exclusive. We will have the full set: both resource limiting and disabled batches. It's just that with rate limiting but batches enabled, it's possible to get a weird result: imagine that you sent a batch with requests that mutate state, and only some of them were executed. That would be confusing to users, so it's better to prevent this situation.
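To make the concern concrete, here is an illustrative sketch of the kind of mixed batch response a client could see when per-call rate limiting kicks in mid-batch. The method names, params, error code, and throttling behaviour are hypothetical, not jsonrpsee's actual output.

```rust
// Assumed Cargo.toml dep: serde_json.
use serde_json::json;

fn main() {
    // A batch of state-mutating calls (hypothetical method and params).
    let batch = json!([
        { "jsonrpc": "2.0", "id": 1, "method": "transfer", "params": ["alice", "bob", 10] },
        { "jsonrpc": "2.0", "id": 2, "method": "transfer", "params": ["alice", "carol", 10] },
        { "jsonrpc": "2.0", "id": 3, "method": "transfer", "params": ["alice", "dave", 10] }
    ]);

    // With per-call rate limiting but batches enabled, only part of the batch
    // may execute: the first transfer went through, the rest were throttled.
    let mixed_response = json!([
        { "jsonrpc": "2.0", "id": 1, "result": "ok" },
        { "jsonrpc": "2.0", "id": 2, "error": { "code": -32000, "message": "rate limit exceeded" } },
        { "jsonrpc": "2.0", "id": 3, "error": { "code": -32000, "message": "rate limit exceeded" } }
    ]);

    println!("{}\n{}", batch, mixed_response);
}
```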

@jsdw
Collaborator

jsdw commented May 3, 2022

At the end of the day, the code change is fairly minimal, and it's well tested and looks good to me, so I'm happy to approve :)

@niklasad1
Member

Sorry @popzxc, can you fix the conflicts so we can merge this?

@niklasad1 merged commit 19aaf65 into paritytech:master May 4, 2022
@popzxc deleted the disable-batch-requests branch May 4, 2022 09:53