Update README/introduction docs #54
Comments
Maybe the docs should state more prominently that multi-threaded access to cache instances must be locked properly. Seems to be a recurring issue, e.g. #80, http://www.paulsprogrammingnotes.com/2016/08/python-cachetools-lrucache-keyerror.html.
E.g. "contrary to popular belief, cache instances are not thread-safe - neither is, for example, `OrderedDict`".
BTW, is it mentioned anywhere that the
Is this still true? I want to read with `lrucache.get(keystring, None)` and only write to it using `lrucache[keystring] = array`. I don't mind if reads are stale (correct for some time in the past), but I do want to be sure that I don't corrupt the state with concurrent access. Can I make these assumptions, or do I need to lock it just to be safe? (Locking with my current cache implementation is a performance bottleneck.)
You need to lock all access to the cache. |
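To illustrate "lock all access": a minimal sketch of guarding every cache operation with a single `threading.Lock`. Reads must be locked too, because an `LRUCache` lookup mutates internal bookkeeping (the recency order). The `expensive` function here is a hypothetical stand-in for whatever computes the cached value.

```python
import threading

from cachetools import LRUCache

cache = LRUCache(maxsize=128)
lock = threading.Lock()


def expensive(key):
    # Placeholder for a costly computation.
    return key * 2


def get_or_compute(key):
    # Even a plain lookup reorders the LRU bookkeeping,
    # so reads need the lock as well as writes.
    with lock:
        try:
            return cache[key]
        except KeyError:
            pass
    # Compute outside the lock so other threads aren't blocked;
    # note this allows duplicate computation under contention.
    value = expensive(key)
    with lock:
        cache[key] = value
    return value
```

Computing outside the lock keeps the critical section short, at the cost of possibly computing the same value twice if two threads miss concurrently.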
Okay, thanks for the info. |
cachetools now features much more than cache classes, so the introduction samples should also show use of the decorators and maybe custom keys.