Allow developers to use built-in meters for tracking stats on caches #67770
Tagging subscribers to this area: @dotnet/area-extensions-caching
Some feedback from my recent experimentation with the new cache statistics. This may or may not be the best place for this feedback, but I was too slow to comment on #50406 😄 I really appreciate the work here to add further metrics for `MemoryCache`.

My comment is around API naming and the possible need to clearly set expectations in docs/announcements regarding this new feature, because I think many developers will wrongly assume what type of value the estimated size represents.

I was rather excited to try out this feature, with the mistaken expectation that the estimated size would be a measurement of real-world memory usage. Knowing that historically it was quite a challenge to calculate an estimation of real-world memory usage for the in-memory cache, I was excited at the prospect of finally having a quick way to gauge this! (My fault for not reading the docs / changes closely enough!) In retrospect, it seems totally understandable that the value reflects the cache's existing "Size" concept rather than bytes. However, I think there are a couple of issues here which I believe will lead a lot of developers (perhaps most) to make the same incorrect assumption as me.

Firstly, the use of "Estimated" led me to assume this was a measure of real-world memory usage, because it wasn't initially intuitive to me that the exact deterministic count of cache entries is already knowable. The second challenge is the use of the terminology "Size". The combination of "estimated" and "size" may lead a lot of developers to assume that this is a memory footprint estimation. (Perhaps with a dose of wishful thinking driving this too...)

Having watched the code review video regarding this, where the estimated size property was initially proposed, I noticed that the first assumption by one of the review team was also that this size property would be a "bytes" measurement (followed by lengthy discussion/clarification about what the "Size" terminology actually means in the context of IMemoryCache!).

Lastly, there was a comment in #50406 which suggested the addition of an estimated cache size. I'm not sure if this was the "request" that was mentioned (in the video) to support adding the property. For me, in a Blazor Server app, where memory usage is a primary concern and I am working on ways to both effectively monitor and regulate this usage, an estimated measurement of real-world memory usage would have been very useful. I do also fully understand why this is difficult to achieve.

In summary, I think the default assumption for most people, when encountering this property for the first time (based on all of the above), is that it estimates memory usage in bytes. Having said all that, these new metrics are very useful and a great addition, thank you team! If this feedback is best left elsewhere, please do let me know.
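To illustrate the distinction being drawn above, here is a small sketch of how the cache's Size accounting works today: both the per-entry `Size` and the cache-level `SizeLimit` are arbitrary units chosen by the application, not bytes. The key, value, and numbers below are purely illustrative.

```csharp
using Microsoft.Extensions.Caching.Memory;

// SizeLimit is expressed in application-defined units, not bytes.
var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 100 });

// Each entry's Size is likewise an arbitrary unit; here this entry "costs" 1.
cache.Set("answer", 42, new MemoryCacheEntryOptions { Size = 1 });
```

Because the statistics aggregate these caller-supplied units, the resulting "estimated size" cannot be read as a byte count.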
Thank you, @basecde, for the feedback! Do you happen to have a suggestion that you think would alleviate the problem? Can you think of a name that would better describe the property?
Well, you've called my bluff there @eerhardt 😄 No, given the historical context and the "Size" terminology we already have, I can't think of a better name.

On the assumption that it's not possible to provide a property with a real memory usage estimate (and I'm guessing it's not, at least for .NET 7), my concrete suggestion would just be around setting developer expectations carefully in the announcement of this feature for .NET 7 and in the accompanying docs. Specifically, calling out (or reminding) potential consumers of this API that the Size in question is the total "arbitrary Size" value and not an absolute measurement in bytes. I think there is no problem at all here if the potential confusion is anticipated and expectations are set up-front in the docs!

Here's a suggested tweak to the current feature announcement wording.

Current version:

Proposed updated version:

If helpful, I'm happy to make concrete wording suggestions for any proposed docs for this feature too (not sure if these are written yet?)
@noahfalk @davidfowl, as per an offline conversation, moving this out of 7.0 to first investigate a proper design for metrics in DI-aware systems in a non-static way.
This not only helps try out the library but also encourages future contributions to issue dotnet#67770.
This request has been stale for a while. I assume at this point that it won't make .NET 9 anymore, as the window for that has now closed, so this is at least .NET 10 timeframe? Or perhaps it isn't necessarily tied to the .NET releases, since this is all library code.

Our team has been struggling a bit with finding optimization opportunities, and I stumbled upon this whole memory caching metrics topic after thinking about adding custom metrics to our caches and searching for something native first before doing that. So, my take thus far is that there is an idea to provide native metrics from the library itself, but that's not yet available (basically, this issue here). Thus, in the meantime, the expectation is that consumers create their own custom instrumentation leveraging the `GetCurrentStatistics()` API.

Assuming we went with our own custom metrics for now, would the team recommend anything in particular, for example around how to name the meter and instruments? The reason I ask is that I want to avoid unnecessary clashing when the "official" meters are exposed, in case we need something before then, which is looking very likely now that this is seemingly out of scope for v9 of the library. I'd also like to know whether all of this work also applies beyond the in-memory cache.
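For what it's worth, here is a minimal sketch of the kind of interim custom instrumentation described above, assuming statistics tracking is enabled on the cache. The meter and instrument names are placeholders, not any official convention, and the cast to `MemoryCache` assumes the default registration from `AddMemoryCache`.

```csharp
using System.Diagnostics.Metrics;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();
// GetCurrentStatistics only returns data when statistics tracking is enabled.
services.AddMemoryCache(options => options.TrackStatistics = true);
using var provider = services.BuildServiceProvider();

// AddMemoryCache registers a MemoryCache instance as the IMemoryCache singleton.
var cache = (MemoryCache)provider.GetRequiredService<IMemoryCache>();

// Placeholder meter name; a built-in meter would presumably ship with its own
// naming convention, which is exactly what this issue tracks.
var meter = new Meter("MyApp.Caching");
meter.CreateObservableGauge("cache-hits",
    () => cache.GetCurrentStatistics()?.TotalHits ?? 0);
meter.CreateObservableGauge("cache-estimated-size",
    () => cache.GetCurrentStatistics()?.CurrentEstimatedSize ?? 0);
```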
Summary
Today, the user has to write their own metrics retrieval system. By adding built-in metrics, the library could hook into the existing list of caches and publish all stats, handling everything the user would otherwise do in their own code to support multiple caches. To help identify caches, `MemoryCache` would need a `Name` property to support the meter scenario. With the built-in metrics the name is reported per memory cache, and the onus is on the user to provide a unique name; otherwise the library could warn and either pick one or update the duplicated name.
To learn more, check out this gist.
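For a sense of the experience this summary describes, here is a hypothetical sketch. The `Name` property shown in a comment does not exist in the library today; its exact shape is what this issue is meant to decide. `TrackStatistics`, by contrast, already exists.

```csharp
using Microsoft.Extensions.Caching.Memory;

// Statistics tracking already exists; a per-cache name does not.
var catalogCache = new MemoryCache(new MemoryCacheOptions
{
    TrackStatistics = true,
    // Proposed by this issue (not yet available): a name that a built-in Meter
    // could use to tag measurements, with duplicate names triggering a warning.
    // Name = "catalog"
});
```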
Goal
This issue focuses on a good user experience for developers who wish to track statistics for multiple memory caches by having built-in meters added to the library. Today, getting statistics for multiple caches is possible but requires the developer to write their own meter.

Our focus here is to add support for cache names, a built-in Meter, and a default naming convention for the cache created by `AddMemoryCache()` in M.E.C (tracked in #67769).

NOTE: This issue also needs to address #67769. For more information refer to #66479 (comment).
What is already available in Preview 4:

The `GetCurrentStatistics()` API (based on #50406) allows app developers to use either event counters or metrics APIs to track statistics for one or more memory caches. With `IMemoryCache.GetCurrentStatistics()`, the user now has support for the following use cases.

Using `IMemoryCache.GetCurrentStatistics()` for one memory cache

Use the `AddMemoryCache` API to instantiate a single memory cache and have it injected via DI so that `GetCurrentStatistics` can be called on it.

Sample usage for an event counter:
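The original collapsed snippet is not reproduced here; the following is a rough sketch of the kind of event-counter wiring it described, assuming statistics tracking is enabled on the cache. The event source and counter names are illustrative only.

```csharp
using System.Diagnostics.Tracing;
using Microsoft.Extensions.Caching.Memory;

// Illustrative event source name; not an official counter source.
[EventSource(Name = "MyApp-Cache-Counters")]
public sealed class CacheEventSource : EventSource
{
    private readonly MemoryCache _cache;
    private PollingCounter? _hits;
    private PollingCounter? _misses;

    public CacheEventSource(MemoryCache cache) => _cache = cache;

    protected override void OnEventCommand(EventCommandEventArgs command)
    {
        if (command.Command != EventCommand.Enable)
            return;

        // PollingCounter invokes the callback each time a listener collects counters.
        _hits ??= new PollingCounter("cache-hits", this,
            () => _cache.GetCurrentStatistics()?.TotalHits ?? 0);
        _misses ??= new PollingCounter("cache-misses", this,
            () => _cache.GetCurrentStatistics()?.TotalMisses ?? 0);
    }
}
```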
This helps them view the stats with the `dotnet-counters` tool:
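For example, assuming the illustrative event source name from the sketch above, the counters could be monitored with something like this (the process id is a placeholder):

```
dotnet-counters monitor --process-id <pid> --counters MyApp-Cache-Counters
```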
Using `IMemoryCache.GetCurrentStatistics()` for multiple memory caches

In order to get stats for more than one memory cache in the app, the user may use metrics APIs in their own code, so long as they have a way of distinguishing their caches by name or ID.

Sample usage for multiple caches using the metrics APIs:
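The collapsed sample is not reproduced here; below is a rough sketch of the pattern described, assuming the application keeps its own registry of named caches. The meter name, instrument names, and the cache registry are all made up for illustration.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using Microsoft.Extensions.Caching.Memory;

public static class MultiCacheMetrics
{
    // Application-maintained registry of named caches (hypothetical).
    public static readonly ConcurrentDictionary<string, MemoryCache> Caches = new();

    private static readonly Meter s_meter = new("MyApp.Caching");

    public static void Initialize()
    {
        // One observable gauge per statistic; each callback walks the cache
        // registry and emits one measurement per cache, tagged with its name.
        s_meter.CreateObservableGauge("cache-hits", () => Observe(s => s.TotalHits));
        s_meter.CreateObservableGauge("cache-misses", () => Observe(s => s.TotalMisses));
        s_meter.CreateObservableGauge("cache-entry-count", () => Observe(s => s.CurrentEntryCount));
    }

    private static IEnumerable<Measurement<long>> Observe(Func<MemoryCacheStatistics, long> selector)
    {
        foreach (var (name, cache) in Caches)
        {
            var stats = cache.GetCurrentStatistics();
            if (stats is not null)
                yield return new Measurement<long>(selector(stats),
                    new KeyValuePair<string, object?>("cache-name", name));
        }
    }
}
```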
Sample stats with the `dotnet-counters` tool:

Each metric would need its own observable gauge (one for hits, one for misses, etc.), and each gauge's callback function iterates through the list of caches creating measurements, as in the sketch above.