The docs suggest using the `keys` argument to support multiple limits:

```python
@ratelimit(keys=lambda x: 'min', rate='1/m')
@ratelimit(keys=lambda x: 'hour', rate='10/h')
@ratelimit(keys=lambda x: 'day', rate='50/d')
def post(request):
    # Stack them.
    # Note: once a decorator limits the request, the ones after
    # won't count the request for limiting.
    return HttpResponse()
```
Without that `keys` argument, all three limits use the same cache key, so whichever counter gets cleared first clears them all.
It'd be nice if stacking ratelimits worked better with cache keys.
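The collision can be illustrated with a toy in-memory counter. This is a hypothetical sketch, not django-ratelimit's actual implementation; `cache`, `bump`, and the key strings are made-up stand-ins for its real internals:

```python
# Toy stand-in for the cache behind a rate limiter (hypothetical).
cache = {}

def bump(key):
    """Increment and return the hit counter stored under `key`."""
    cache[key] = cache.get(key, 0) + 1
    return cache[key]

# With distinct keys (what keys=lambda x: 'min' etc. achieves),
# each stacked limit tracks its own independent count:
assert bump("rl:min") == 1
assert bump("rl:hour") == 1
assert bump("rl:day") == 1

# Without distinct keys, all three decorators map to the same cache
# key, so they share one counter -- the "hourly" limit sees the
# per-minute hits, and clearing the key resets every limit at once:
assert bump("rl:shared") == 1
assert bump("rl:shared") == 2
assert bump("rl:shared") == 3
```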
Closing in favor of #48.