Cache memory management #3
@js-choi You could map

@michaelficarra: If we used a WeakMap cache and

Oh yep, I forgot that the boxed primitives will be unique for the purposes of WeakMap keys.
I think that managing the cache is an implementation detail best left to the browser, which has the best information about how much memory is available and what the "hot paths" are for cache lookups (e.g., to switch between LFU and LRU). The fact that users of memoization libraries have to set up and manage the cache themselves is a leaky abstraction. If the cache could grow and shrink as the system determines the best use of memory, the developer would only have to worry about whether or not the function to be memoized is pure. The browser could determine that some memoized functions actually run fast without memoization, or have mostly cache misses (which greatly grows the cache), and opt them out of memoization. Or in a memory-constrained environment, like IoT devices,

```js
// heavyFn has a sizeable cache, but it greatly improves repeated calls
// and has mostly cache hits, so it is truly memoized
const heavyFn = Function.memoize(heavy_fn);

// oneValueFn has a small cache, only one key-value pair, but the repeated
// calls are faster, so it is truly memoized
const oneValueFn = Function.memoize(only_called_with_same_arguments_fn);

// fastFn actually runs almost as fast as a cache lookup, so it is eventually
// opted out of true memoization and run on every call
const fastFn = Function.memoize(fast_fn);

// mostlyCacheMisses keeps growing its cache with few repeated hits,
// so it is opted out of true memoization and run on every call
const mostlyCacheMisses = Function.memoize(lots_of_varied_calls_fn);

// rareCall is called a few times on page load, but then is never called again.
// After some time, the browser opts it out of true caching and frees its memory
const rareCall = Function.memoize(on_page_load);
```

On a technical level,
While I think this could be a great default behavior, there is something to be said for managing your own cache to prevent entries from going stale. For example, if you're caching an async

Still, all of those examples are related to you manually choosing to eject something from the cache. It may be nice to have your custom ejection behaviors layered on top of the system behavior, where the system will likewise auto-eject items from the cache when memory starts to get full. So, perhaps we still don't need to give people the power to go so low-level as to actually choose between LRU and LFU, dunno.
How should cache memory management work? Is there a way to get garbage collection by default? (Using WeakMaps for the caches would be ideal…except that WeakMaps do not support primitives as keys.)
Should we just use Maps and make the developer manage the cache memory themselves? (Related: I am planning to propose built-in LRUMaps and LFUMaps.)
There is also the compositeKeys proposal (#1).