Use sql_select_limit for CommandBehavior.SingleRow #679
Comments
I can't reproduce the problem with the sample code you've provided. (I'm guessing there might be significant latency between your application and the DB server? How many rows would be returned from the query?) However, I do see the underlying problem: Connector/NET executes a `sql_select_limit` statement for `CommandBehavior.SingleRow`, which MySqlConnector currently does not.
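For context, the session-variable approach the issue title refers to can be sketched at the SQL level. This is a minimal sketch; the exact statements Connector/NET issues are not quoted in this thread, and the table name is hypothetical:

```sql
-- Ask the server to cap subsequent SELECTs at one row
SET sql_select_limit = 1;

-- The server now sends at most one row over the wire,
-- instead of streaming the entire result set to the client
SELECT * FROM big_table;

-- Restore the default once the single-row command completes
SET sql_select_limit = DEFAULT;
```

The benefit is that the cap is enforced server-side, so a `CommandBehavior.SingleRow` reader does not have to download (or discard) rows it will never expose.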
It would return no more than 1000 rows, but execution time is more than 300 seconds. It seems like the entire result set is being read.

Actually, I see the same issue with the default command behavior. @bgrainger, I am going to adjust the sample app so the issue can be reproduced locally.
@segor Are you able to test 0.57.0-beta8 and let me know if there's a performance improvement?
@bgrainger Yes, with 0.57.0-beta8 the execution time is improved. But I still see a significant difference in the execution time of `ExecuteReaderAsync` between the two drivers.
Are you reading all 1000 (or so) rows from the reader? Are you measuring the time it takes for just that method to return, or something else?
Is that the case? 600 ms just to return one row seems like a very long time.
I measure the time of `ExecuteReaderAsync`:

```csharp
var sw = new Stopwatch();
sw.Start();
// ExecuteReaderAsync is about 100 times slower when MySqlConnector is referenced
var reader = await cmd.ExecuteReaderAsync();
Console.WriteLine($"ExecuteReader: {sw.ElapsedMilliseconds}ms");
```
@bgrainger I have retested again.
During migration from MySql.Data we got a performance issue in code that reads the first row of a SQL query using the Dapper method `QueryFirstOrDefaultAsync`. With the Oracle driver the execution time is about 600 ms, but with MySqlConnector it is about 50 seconds. After some investigation I found that the problem is inside `MySqlCommand.ExecuteReaderAsync`. Here is the app I use to reproduce the issue:
A workaround is adding `LIMIT 1` to the query, but that would require reviewing all the code we have.
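The per-query workaround mentioned above looks like this (table and column names are hypothetical):

```sql
-- Without a server-side cap, the full result set is streamed to the client,
-- even if the reader only consumes the first row
SELECT id, name FROM orders;

-- Workaround: cap the result set in the statement itself
SELECT id, name FROM orders LIMIT 1;
```

The drawback is exactly the one noted in the comment: every single-row query in the codebase has to be edited, whereas a driver-level `sql_select_limit` fix applies the cap automatically whenever `CommandBehavior.SingleRow` is requested.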