Support for ConnectionPool (with Dalli) #105
Comments
+1; support for connection pools would be beneficial when running in a multi-threaded web server such as Puma, for the same reasons a connection pool is recommended when using Dalli for the Rails cache. Please correct me if using Dalli for Rack::Cache cannot be compared with that and this is somehow not an issue. I don't see how the suggested solution would work, though. Looking at it, I would opt for a more generic solution, since ConnectionPool is very generic in itself and not tied to Dalli, and could therefore also be used to wrap a bunch of Memcached clients. A rough sketch of that idea follows below.
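A minimal sketch of the pooled setup described above, assuming the connection_pool and dalli gems and a memcached instance on the default local port; the pool size, timeout, and cache key are placeholder values, not anything taken from rack-cache itself:

```ruby
require "connection_pool"
require "dalli"

# Wrap a Dalli client in a generic ConnectionPool so that each thread in a
# multi-threaded server such as Puma checks out its own connection instead
# of all threads sharing a single client.
pool = ConnectionPool.new(size: 5, timeout: 3) do
  Dalli::Client.new("localhost:11211")
end

# Callers borrow a client only for the duration of the block.
pool.with do |client|
  client.set("rack-cache:example", "hello")
  client.get("rack-cache:example")
end
```

Because ConnectionPool only needs a block that builds a connection, the same wrapping works for any Memcached client, which is why a generic solution seems preferable to one tied specifically to Dalli.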
Hello, thanks for the reply. Sadly it has been almost two years and I don't really remember much about this anymore. I can't look into it in the near future, so feel free to close this issue. :)
If you keep it open, maybe someone else will pick it up... I'd be happy to give it a shot some time.
FYI, a funky issue that would also need fixing is that we always write to and read from the cache instead of only writing.
Okay, no probs, I'll reopen it. :)
Hello,
I think it would be a good idea to support ConnectionPool (same author as Dalli) as the storage instance. I was pointed to https://github.com/rtomayko/rack-cache/blob/master/lib/rack/cache/storage.rb#L41-L52, so I hacked around and added another if branch there, and it looks like it works. I had to try this on the 1.2 version because I was not getting any messages on the master branch. Would you like a pull request? It looks like this gem has not been updated in a while, though. :-( A rough sketch of the idea is included after this comment.
Thanks.
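The actual branch added to storage.rb isn't shown in the thread, but the idea could look roughly like the sketch below. The PooledDalli class and its method set are illustrative, not part of rack-cache or Dalli: it simply checks every call out of a ConnectionPool, so the new branch would only have to accept an object that responds to the usual get/set/delete calls instead of a bare Dalli::Client.

```ruby
require "connection_pool"
require "dalli"

# Illustrative wrapper (not from rack-cache): exposes the handful of
# client methods a cache store tends to need, but borrows a connection
# from the pool for each call so it stays thread-safe under Puma.
class PooledDalli
  def initialize(pool)
    @pool = pool
  end

  def get(key)
    @pool.with { |client| client.get(key) }
  end

  def set(key, value, ttl = nil)
    @pool.with { |client| client.set(key, value, ttl) }
  end

  def delete(key)
    @pool.with { |client| client.delete(key) }
  end
end

pool = ConnectionPool.new(size: 5, timeout: 3) { Dalli::Client.new("localhost:11211") }
storage = PooledDalli.new(pool)
# `storage` could then be handed to whatever branch is added in storage.rb.
```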