
Update README/introduction docs #54

Closed
tkem opened this issue Sep 29, 2015 · 6 comments

@tkem (Owner) commented Sep 29, 2015

cachetools now features much more than cache classes, so the introductory pycon samples should also show use of the decorators and maybe custom keys.
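
A minimal sketch of what such introductory samples might look like, using the `cached` decorator and a custom key built with `cachetools.keys.hashkey` (the `fib` and `greet` functions are made up for illustration):

```python
from cachetools import LRUCache, cached
from cachetools.keys import hashkey

# Cache expensive calls in an LRU cache shared by all invocations.
@cached(cache=LRUCache(maxsize=32))
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# A custom key function, e.g. to make cache lookups case-insensitive.
@cached(cache=LRUCache(maxsize=32), key=lambda name: hashkey(name.lower()))
def greet(name):
    return "Hello, %s!" % name

print(fib(42))         # computed once, then answered from the cache
print(greet("World"))  # computes and caches "Hello, World!"
print(greet("WORLD"))  # hits the same cache entry as greet("World")
```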

@tkem tkem added this to the v1.2.0 milestone Oct 1, 2015
@tkem tkem modified the milestones: v2.1.0, v2.0.0 Oct 3, 2016
@tkem (Owner, Author) commented Dec 14, 2016

Maybe the docs should state more prominently that multi-threaded access to cache instances must be locked properly. Seems to be a recurring issue, e.g. #80, http://www.paulsprogrammingnotes.com/2016/08/python-cachetools-lrucache-keyerror.html.
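
A minimal sketch of the kind of locking the docs could recommend; `expensive()` is a hypothetical stand-in for whatever produces the cached values:

```python
import threading
from cachetools import LRUCache

cache = LRUCache(maxsize=128)
lock = threading.Lock()

def expensive(key):
    return key * 2  # stand-in for a costly computation

def get_or_compute(key):
    # Every read and write goes through the same lock: even a plain
    # lookup mutates an LRUCache's internal ordering.
    with lock:
        try:
            return cache[key]
        except KeyError:
            pass
    value = expensive(key)  # computed outside the lock, so other threads aren't blocked
    with lock:
        cache[key] = value
    return value
```

Note that computing outside the lock means two threads may occasionally compute the same value; that trades a little duplicate work for not serializing all computations.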

@tkem (Owner, Author) commented Oct 24, 2017

E.g. "contrary to popular belief, cache instances are not thread-safe - neither is, for example, OrderedDict".

@tkem (Owner, Author) commented Oct 24, 2017

BTW, is it mentioned anywhere that the func decorators are thread-safe by default?
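
If "func decorators" refers to the functools-style decorators in `cachetools.func`, a sketch of what such a note could demonstrate (assuming, per this comment, that they do their own locking):

```python
import threading
import cachetools.func

# Per this comment, the cachetools.func decorators synchronize
# internally, so no external lock is needed around concurrent calls.
@cachetools.func.ttl_cache(maxsize=128, ttl=600)
def lookup(key):
    return key.upper()  # stand-in for an expensive operation

threads = [threading.Thread(target=lookup, args=("spam",)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```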

@tkem tkem modified the milestones: v2.1.0, v3.0.0 May 12, 2018
@jpivarski commented

Is this still true? I want to use cachetools.LRUCache (not the function decorator; the dict-like class) and share that cache across threads. I can guarantee that I will only read from it using

```python
lrucache.get(keystring, None)
```

and only write to it using

```python
lrucache[keystring] = array
```

I don't mind if reads are stale (correct for some time in the past), but I do want to be sure that I don't corrupt the state with concurrent `__setitem__` or read an invalid result if the read is concurrent with a write.

Can I make these assumptions or do I need to lock it just to be safe? (Locking with my current cache implementation is a performance bottleneck.)

@tkem (Owner, Author) commented Jul 10, 2018

You need to lock all access to the cache.
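
Applied to the access pattern above, a minimal sketch (the `read`/`write` wrappers are illustrative, not part of the cachetools API):

```python
import threading
from cachetools import LRUCache

lrucache = LRUCache(maxsize=1024)
cache_lock = threading.Lock()

def read(keystring):
    # LRUCache reorders its internal bookkeeping on lookups,
    # so reads need the lock just as much as writes do.
    with cache_lock:
        return lrucache.get(keystring, None)

def write(keystring, array):
    with cache_lock:
        lrucache[keystring] = array
```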

@jpivarski commented

Okay, thanks for the info.

tkem added a commit that referenced this issue Nov 3, 2018
tkem added a commit that referenced this issue Nov 4, 2018
tkem added a commit that referenced this issue Nov 4, 2018
@tkem tkem closed this as completed in a1a6d7c Nov 4, 2018