IN() list queries are not parameterized, causing increased SQL Server CPU usage #13617
Comments
Related issue: #12777

I wrote an extension to EF6 to handle this exact problem for our product, and apparently it is extremely similar to what Nick is proposing above. It rewrites a Contains-based where clause so that a single expression effectively becomes two queryables, where the values are supplied as captured variables that allow EF to parameterize the query rather than embed them as constants; EF then converts those LINQ expressions into a parameterized SQL expression. Check out the source here: https://gist.github.com/kroymann/e57b3b4f30e6056a3465dbf118e5f13d
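A minimal sketch of that idea, with hypothetical names (this is not the gist's exact code; see the link above for the real implementation):

// Today: the ids end up inlined into the generated SQL as literals.
var users = dbContext.Users.Where(u => ids.Contains(u.Id)).ToList();

// The rewrite builds an equivalent predicate out of captured variables,
// roughly u => u.Id == v0 || u.Id == v1 || ... (one Where per batch),
// so EF emits parameters and the SQL text - and therefore the cached
// plan - can be reused across different value lists, e.g.:
// ... WHERE [u].[Id] = @p0 OR [u].[Id] = @p1 OR ...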
@kroymann Works like a charm! Big Thanks!

public static class QueryableExtension<TQuery>
{
internal static IEnumerable<IQueryable<TQuery>> WhereIn<TKey>(IQueryable<TQuery> queryable,
Expression<Func<TQuery, TKey>> keySelector, IEnumerable<TKey> values, int batchSize)
{
List<TKey> distinctValues = values.Distinct().ToList();
int lastBatchSize = distinctValues.Count % batchSize;
if (lastBatchSize != 0)
{
distinctValues.AddRange(Enumerable.Repeat(distinctValues.Last(), batchSize - lastBatchSize));
}
int count = distinctValues.Count;
for (int i = 0; i < count; i += batchSize)
{
var body = distinctValues
.Skip(i)
.Take(batchSize)
.Select(v =>
{
// Create an expression that captures the variable so EF can turn this into a parameterized SQL query
Expression<Func<TKey>> valueAsExpression = () => v;
return Expression.Equal(keySelector.Body, valueAsExpression.Body);
})
.Aggregate((a, b) => Expression.OrElse(a, b));
if (body == null)
{
yield break;
}
var whereClause = Expression.Lambda<Func<TQuery, bool>>(body, keySelector.Parameters);
yield return queryable.Where(whereClause);
}
}
// doesn't use batching
internal static IQueryable<TQuery> WhereIn<TKey>(IQueryable<TQuery> queryable,
Expression<Func<TQuery, TKey>> keySelector, IEnumerable<TKey> values)
{
TKey[] distinctValues = values.Distinct().ToArray();
int count = distinctValues.Length;
for (int i = 0; i < count; ++i)
{
var body = distinctValues
.Select(v =>
{
// Create an expression that captures the variable so EF can turn this into a parameterized SQL query
Expression<Func<TKey>> valueAsExpression = () => v;
return Expression.Equal(keySelector.Body, valueAsExpression.Body);
})
.Aggregate((a, b) => Expression.OrElse(a, b));
var whereClause = Expression.Lambda<Func<TQuery, bool>>(body, keySelector.Parameters);
return queryable.Where(whereClause);
}
return Enumerable.Empty<TQuery>().AsQueryable();
}
}

Usage:

int[] a = Enumerable.Range(1, 10).ToArray();
var queries = QueryableExtension<User>.WhereIn(dbContext.Users, t => t.Id, a, 5);
foreach (var queryable in queries)
{
_ = queryable.ToArray();
}
var everything = QueryableExtension<User>.WhereIn(dbContext.Users, t => t.Id, a);
_ = everything.ToArray();
Also, such queries don't aggregate well in logging systems like Application Insights, because every query has unique SQL.
I just wanted to share here some experiments I did some time ago with different alternative approaches to Enumerable.Contains using existing EF Core APIs: https://github.com/divega/ContainsOptimization/

The goal was both to find possible workarounds and to explore how we could deal with Contains in the future. I timed the initial construction and first execution of the query with different approaches. This isn't a representative benchmark because there is no data and it doesn't measure the impact of caching, but I think it is still interesting to see the perf sensitivity to the number of parameters. The code tests 4 approaches:

Test("standard Contains", context.People.Where(p => (ids.Contains(p.Id))));

Test("Parameter rewrite", context.People.In(ids, p => p.Id));

Test("Split function",
    context.People.FromSql(
        $"select * from People p where p.Id in(select value from string_split({string.Join(",", ids)}, ','))"));

Test("table-valued parameter",
    context.People.FromSql(
        $"select * from People p where p.Id in(select * from {context.CreateTableValuedParameter(ids, "@p0")})"));

Standard Contains: This is included as a baseline and is what happens today when you call Contains with a list: the elements of the list get inlined into the SQL as literals and the SQL cannot be reused.

Parameter rewrite: This approach tries to rewrite the expression to mimic what the compiler would produce for a query like this:

var p1 = 1;
var p2 = 2;
var p3 = 3;
context.People.Where(p => (new List<int> {p1, p2, p3}.Contains(p.Id))).ToList();

It also "bucketizes" the list, meaning that it only produces lists of specific lengths (powers of 2), repeating the last element as necessary, with the goal of favoring caching. This approach only works within SQL Server's limit of 2100 parameters, and sp_executesql itself takes a couple of parameters, so the size of the last possible bucket is 2098. Overall this seems to be the most expensive approach using EF Core, at least on initial creation and execution.

Split function: This approach was mentioned by @NickCraver and @mgravell as something that Dapper can leverage. I implemented it using FromSql. It is interesting because it leads to just one (potentially very long) string parameter and it seems to perform very well (at least in the first query), but the STRING_SPLIT function is only available on newer versions of SQL Server.

Table-valued parameter: For this I took advantage of the ability to create a table-valued parameter containing the list of values and reference it from FromSql.
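A small sketch of the bucketization step described above (my own illustration under the stated constraints, not code from the linked repo): pad the value list up to the next power of two, capped at 2098, by repeating the last element.

using System;
using System.Collections.Generic;

static class ContainsBuckets
{
    // Round the value list up to the next power of two, capped at 2098
    // (SQL Server's 2100-parameter limit minus the parameters that
    // sp_executesql itself uses), repeating the last element as padding.
    public static List<T> Bucketize<T>(IReadOnlyList<T> values)
    {
        const int maxBucket = 2098;
        if (values.Count == 0) return new List<T>();

        int bucket = 1;
        while (bucket < values.Count) bucket *= 2;   // next power of two
        bucket = Math.Min(bucket, maxBucket);        // longer lists would need batching

        var padded = new List<T>(values);
        while (padded.Count < bucket) padded.Add(values[values.Count - 1]);
        return padded;
    }
}

Because only a handful of bucket sizes ever reach the server, only that many SQL shapes (and plans) need to be compiled and cached, regardless of how many distinct value lists the application sends.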
Here's an alternative solution I discovered thanks to a semi-related SO post by @davidbaxterbrowne:

var records = await _context.Items
.AsNoTracking()
.FromSql(
"SELECT * FROM dbo.Items WHERE Id IN (SELECT value FROM OPENJSON({0}))",
JsonConvert.SerializeObject(itemIds.ToList()))
.ToListAsync();

It's specific to SQL Server 2017+ (or Azure SQL Server), and in my testing with a list of 50,000 IDs against a remote Azure SQL database it was 5x faster than a standard batching solution (dividing into lists of 1,000 IDs).
@roji Have you had a chance to revisit the
@michaelmesser I haven't, and unfortunately there's little chance I'd be able to for EF Core 8.0 as there are many other highly-prioritized work items. Can you please open a new issue for this, so that it's tracked? It would be most helpful if you could attach the full repro given the latest SQL currently generated by EF (best to use a daily build). If there's a very clear, reproducible improvement given the latest SQL and it's not too hard, I may try to slip it in...
Interesting discussion. It would be interesting to see benchmark results with the upcoming JSON native type, which "will allow you to store JSON documents in a native binary format that is optimized for storage and query performance".

DECLARE @json JSON = N'
[
{
"OrderNumber": "S043659",
"Date":"2022-05-24T08:01:00",
"AccountNumber":"AW29825",
"Price":59.99,
"Quantity":1
},
{
"OrderNumber": "S043661",
"Date":"2022-05-20T12:20:00",
"AccountNumber":"AW73565",
"Price":24.99,
"Quantity":3
}
]';

Although I'm not sure if it's possible to use this type via C# code.
@clement911 note that the above IN/Contains discussion is likely somewhat orthogonal to the new JSON native type, since with IN/Contains there's no column containing JSON data - only a parameter (though it's possible parameter transfer may be affected as well). But it absolutely could affect cases where one searches for a value inside a JSON column (rather than a parameter). The continuing improvement of JSON support across databases (e.g. this new SQL Server JSON type) was one of the factors that led us to go in this direction here. Regardless, I've opened #32150 to track EF support for the new JSON native type, when it's available.
Does this, now leveraging OPENJSON, allow us to use a composite key in Contains? E.g.:
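A hypothetical illustration of the kind of query being asked about (entity, property, and variable names are made up; whether EF Core can translate this is exactly the question):

// Hypothetical: filter by a set of (CustomerId, OrderId) pairs in one predicate.
var keys = new[]
{
    new { CustomerId = 1, OrderId = 10 },
    new { CustomerId = 2, OrderId = 20 }
};

var orders = await dbContext.Orders
    .Where(o => keys.Contains(new { o.CustomerId, o.OrderId }))
    .ToListAsync();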
@drdamour, you can give https://github.com/yv989c/BlazarTech.QueryableValues a try. With it, you can compose in-memory collections of complex objects in your EF queries. An example can be seen in the project's documentation.
Yeah, that's what I use today... was hoping it made its way into EF Core :(
Hello everyone, I just tested the feature and found out that OPENJSON is not used when the collection is a HashSet. I am using EF Core version 8.0.1.

LINQ with HashSet:

var productSets = await dbContext.ProductSets()
.Where(ps => ids.ToHashSet().Contains(ps.ProductId))
.ToListAsync(cancellationToken);

QUERY with HashSet:

SELECT [p].[Id], [p].[ProductId], [p].[ReactivateIfSubProductsReappear], [p].[Version], [p0].[Id], [p0].[IsMainProduct], [p0].[ProductId], [p0].[ProductSetId], [p0].[Quantity]
FROM [externaldata].[ProductSet] AS [p]
LEFT JOIN [externaldata].[ProductSetProduct] AS [p0] ON [p].[Id] = [p0].[ProductSetId]
WHERE [p].[ProductId] IN (CAST(213171 AS bigint), CAST(234107 AS bigint), CAST(254039 AS bigint), CAST(252018 AS bigint), CAST(257030 AS bigint), CAST(282759 AS bigint), CAST(290215 AS bigint), CAST(352284 AS bigint), CAST(354318 AS bigint), CAST(354179 AS bigint))
ORDER BY [p].[Id]

LINQ with List:

var productSets = await dbContext.ProductSets()
.Where(ps => ids.ToList().Contains(ps.ProductId))
.ToListAsync(cancellationToken);

QUERY with List:

SELECT [p].[Id], [p].[ProductId], [p].[ReactivateIfSubProductsReappear], [p].[Version], [p0].[Id], [p0].[IsMainProduct], [p0].[ProductId], [p0].[ProductSetId], [p0].[Quantity]
FROM [externaldata].[ProductSet] AS [p]
LEFT JOIN [externaldata].[ProductSetProduct] AS [p0] ON [p].[Id] = [p0].[ProductSetId]
WHERE [p].[ProductId] IN (
SELECT [i].[value]
FROM OPENJSON(@__ids_0) WITH ([value] bigint '$') AS [i]
)
ORDER BY [p].[Id]

Would it be possible to add support for HashSets?
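Until that's addressed, a workaround sketch based on the example above (reusing its names; assumes ids is a collection of longs, matching the bigint column): materialize the set into a List before the query so Contains is translated to the parameterized OPENJSON form.

// Materialize the HashSet into a List so the collection is sent as a
// single JSON parameter (the "QUERY with List" shape above) instead of
// being inlined as constants.
List<long> idList = ids.ToList();
var productSets = await dbContext.ProductSets()
    .Where(ps => idList.Contains(ps.ProductId))
    .ToListAsync(cancellationToken);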
For everyone interested, the issue can be tracked here: #32365
Is this extension still good to go?
Currently, EF Core is not parameterizing IN(...) queries created from .Contains() (and maybe other cases). This has a very detrimental impact on SQL Server itself, because every distinct list of values produces different SQL text, and therefore a separate query hash, plan compilation, and plan cache entry. Note: when SQL Server has memory pressure, plan cache is the first thing to empty. So this has a profound impact at scale, doubly so when things have gone sideways.
Steps to reproduce
Here's a reduced-down version of the problem and the SQL it results in:
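The original snippet wasn't preserved above; a minimal sketch of the pattern being reported, with hypothetical entity and context names, looks like this:

// Hypothetical repro: a Contains over a local list of ids.
var ids = new List<int> { 1, 2, 3 };
var users = context.Users
    .Where(u => ids.Contains(u.Id))
    .ToList();

// The generated SQL (at the time of this issue) inlines the values as
// literals, so every distinct list produces a different query text and plan:
// SELECT [u].[Id], [u].[Name]
// FROM [Users] AS [u]
// WHERE [u].[Id] IN (1, 2, 3)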
Further technical details
EF Core version: 2.1.4
Database Provider: Microsoft.EntityFrameworkCore.SqlServer
Operating system: Windows 2016/Windows 10
IDE: Visual Studio 2017 15.9
Proposal
EF Core should parameterize here. Instead of IN (1, 2, 3, ...) we should see IN (@__ids1, @__ids2, @__ids3, ...) or similar. This would allow the query plan cache to be shared. For example, if we ran this 1,000 times to fetch 1,000,000 users in batches, we'd have 1 plan in cache, whereas today we have 1,000 plans. Let's say a user gets removed (or added!) on page 2 of 1,000... today we'd calculate and cache another 999 plans on the next run.

To further address the cardinality problem, an approach similar to what Dapper does would be at least a starting point: we generate only lists of certain sizes, let's just say 5, 10, 50, 100, 500, 1000. Here's an example with 3 values in a 5-parameter bucket:
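The example itself wasn't preserved above, but given the parameter assignments that follow, the bucketed query would presumably look something like:

... WHERE [u].[Id] IN (@ids1, @ids2, @ids3, @ids4, @ids5)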
The 5 Ids are so that 1-5 values all use the same plan. Anything < n in length repeats the last value (don't use null!). In this case let's say our values are 1, 2, and 3. Our parameterization would be:

@ids1 = 1;
@ids2 = 2;
@ids3 = 3;
@ids4 = 3;
@ids5 = 3;
This fetches the users we want, is friendly to the cache, and lessens the number of generated permutations for all layers.
To put this in perspective, at Stack Overflow scale we're generating millions of one-time-use plans needlessly in EF Core (the parameterization in Linq2Sql lowered this to only cardinality permutations). We alleviate the cache issue by enabling ad-hoc query mode on our SQL Servers, but that doesn't lessen the CPU use from all the query hashes involved except in the (very rare) reuse case of the exact same list.
This problem is dangerous because it's also hard to see. If you're looking at the plan cache, you're not going to see it by any sort of impact query analysis. Each query, hash, and plan is different. There's no sane way to group them. It's death by a thousand cuts you can't see. I'm currently finding and killing as many of these in our app as I can, but we should fix this at the source for everyone.