Set a default max batch size for SQL Server #9270
cc @roji since we recently talked about similar observations.
What is the default for the SQL Server MaxBatchSize option?
@ErikEJ No limit, which in practice means we split to a new batch every 2100 parameters.
For this test it ends up being about 130.
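The ~130 figure follows from SQL Server's cap of 2100 parameters per command. A minimal sketch of the arithmetic (the per-row column count here is an assumption for illustration, not taken from the benchmark):

```csharp
// SQL Server allows at most 2100 parameters in a single command.
int maxParameters = 2100;

// Hypothetical: each inserted row binds one parameter per mapped column.
int parametersPerRow = 16;

// With no explicit MaxBatchSize, the provider splits batches only when
// the parameter limit would be exceeded.
int rowsPerBatch = maxParameters / parametersPerRow; // ~131 rows per batch
```

With a wider entity the effective batch size would shrink proportionally, which is why the observed split point depends on the table's shape.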
Hmmm... I always told people you had a default of 1000... There you go. I would strongly suggest that anyone doing mass inserts lower it.
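Lowering the batch size is done through the SQL Server provider's `MaxBatchSize` option. A hedged sketch (the connection string is a placeholder, and 100 is just an example value):

```csharp
// Inside a DbContext subclass: cap batches at 100 statements instead of
// letting the provider batch up to the 2100-parameter limit.
protected override void OnConfiguring(DbContextOptionsBuilder options)
    => options.UseSqlServer(
        "<connection string>",
        sqlOptions => sqlOptions.MaxBatchSize(100));
```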
Interesting stuff, it would be great to be able to run this benchmark as-is on PostgreSQL. I think I ran something similar a while ago and found simpler results, i.e. that batching only improves performance (or at least doesn't degrade it), but it would be good to make sure.
I updated the code. It sets different batch sizes and runs each one twice (the first run is ignored as warmup).
Team decision: default size = 'E' + 'F' |
Set it to 42! |
And update the XML documentation |
@smitpatel ran a few custom batching benchmarks recently that showed interesting results. His benchmarks focus on INSERT batching for the same entity, but a similar analysis could be performed for other operations.
The test inserts 1000 rows into a table in SQL Server. There are two separate series, one for a local instance and one for a remote instance. The chart shows the elapsed time (on Y) for different batch sizes (on X):
To make sense of this chart, keep in mind that at a maximum batch size of 1 there is no batching, and we resort to different SQL that has a smaller fixed cost.
There are a few interesting things to observe:
Possible conclusions and follow-up actions:
The code of the test follows:
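The original benchmark code is not included in this excerpt. A minimal sketch of such a benchmark, under the assumptions described above (1000 inserts, varying `MaxBatchSize`, a discarded warmup run; entity, context, and connection string are hypothetical):

```csharp
using System;
using System.Diagnostics;
using Microsoft.EntityFrameworkCore;

public class Blog
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class BenchContext : DbContext
{
    private readonly int _maxBatchSize;
    public BenchContext(int maxBatchSize) => _maxBatchSize = maxBatchSize;

    public DbSet<Blog> Blogs { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer(
            "<connection string>",                     // placeholder
            sql => sql.MaxBatchSize(_maxBatchSize));
}

public static class Program
{
    public static void Main()
    {
        foreach (var batchSize in new[] { 1, 4, 10, 42, 100, 1000 })
        {
            RunOnce(batchSize);                        // warmup, discarded
            var elapsed = RunOnce(batchSize);          // measured run
            Console.WriteLine($"MaxBatchSize={batchSize}: {elapsed.TotalMilliseconds:F0} ms");
        }
    }

    private static TimeSpan RunOnce(int batchSize)
    {
        using var db = new BenchContext(batchSize);
        db.Database.EnsureDeleted();
        db.Database.EnsureCreated();

        for (var i = 0; i < 1000; i++)
            db.Blogs.Add(new Blog { Title = "Row " + i });

        var sw = Stopwatch.StartNew();
        db.SaveChanges();                              // all inserts flushed and batched here
        sw.Stop();
        return sw.Elapsed;
    }
}
```

Timing only `SaveChanges()` isolates the batching behavior from change-tracking overhead, which is the quantity the chart above compares across batch sizes.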