
Rate-limiting plugin not executed after re-creation of the Consumer it is attached to (db mode) #6266

Closed · pamiel opened this issue Aug 21, 2020 · 7 comments


pamiel (Contributor) commented Aug 21, 2020

Summary

The rate-limiting plugin is no longer executed if the Consumer (on which the plugin has been defined) is deleted and then re-created (together with the rate-limiting plugin, of course!).
Tested on Kong 1.5.2 and 2.0.5, single instance, in DB mode (Postgres).
Rate-limiting plugin configured in "local" mode.

Steps To Reproduce

  1. Create a Service and a Route, and attach an authentication plugin to the Route (any auth plugin will do...)
  2. Create a Consumer "myconsumer" and declare a rate-limiting plugin on this Consumer
  3. Check that rate limiting works by calling the Route path with the correct authentication credential: everything is ok; the X-RateLimit-* headers are present
  4. Delete the Consumer, re-create it (with the same username: "myconsumer"), and re-create a rate-limiting plugin on this Consumer
  5. Call the Route again with the same auth credentials (thus targeting the same Consumer): rate limiting is not applied (the X-RateLimit-* headers are absent) while it should be!

If you stop and restart Kong and call the Route again, rate limiting is applied again (which makes me think the issue is related to the cache...).

Instead of deleting the Consumer:

  • If you just update the rate-limiting plugin, everything works well (rate limiting continues to be applied).
  • If you just delete the rate-limiting plugin from the Consumer and then re-create it (without deleting the Consumer itself), everything works well (rate limiting continues to be applied).

Additional Details & Logs

  • Kong version: tested on 1.5.2 and 2.0.5
  • Operating system: Alpine
  • Logs (Kong 2.0.5):
    As I suspect the cache management to be involved, here is the log extract when deleting the Consumer:
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:23: Loading Admin API endpoints
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: correlation-id
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: pre-function
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: cors
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: ldap-auth
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: loggly
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: hmac-auth
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: zipkin
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: request-size-limiting
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: azure-functions
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: request-transformer
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: oauth2
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: response-transformer
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: ip-restriction
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: statsd
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: jwt
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:106: Loading API endpoints for plugin: proxy-cache
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: basic-auth
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: key-auth
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: http-log
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: file-log
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: datadog
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: tcp-log
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: rate-limiting
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: post-function
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:106: Loading API endpoints for plugin: prometheus
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:106: Loading API endpoints for plugin: acl
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: syslog
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:106: Loading API endpoints for plugin: acme
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: udp-log
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: response-ratelimiting
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: aws-lambda
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: session
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: bot-detection
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] init.lua:115: No API endpoints loaded for plugin: request-termination
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event(): worker-events: handling event; source=dao:crud, event=delete, pid=nil, data=table: 0x7fe18be16008
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] cache.lua:307: invalidate_local(): [DB cache] invalidating (local): 'consumers:22f47195-3450-40dd-8d2f-9d93f52af973::::'
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:22f47195-3450-40dd-8d2f-9d93f52af973::::
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] cache.lua:323: invalidate(): [DB cache] broadcasting (cluster) invalidation for key: 'consumers:22f47195-3450-40dd-8d2f-9d93f52af973::::'
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=consumers, pid=nil, data=table: 0x7fe18be16008
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=consumers:delete, pid=nil, data=table: 0x7fe18be16008
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event(): worker-events: handling event; source=dao:crud, event=delete, pid=nil, data=table: 0x7fe18be18788
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] cache.lua:307: invalidate_local(): [DB cache] invalidating (local): 'plugins:rate-limiting:::22f47195-3450-40dd-8d2f-9d93f52af973:'
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::22f47195-3450-40dd-8d2f-9d93f52af973:
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] cache.lua:323: invalidate(): [DB cache] broadcasting (cluster) invalidation for key: 'plugins:rate-limiting:::22f47195-3450-40dd-8d2f-9d93f52af973:'
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=plugins, pid=nil, data=table: 0x7fe18be18788
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] handler.lua:254: [events] Plugin updated, invalidating plugins iterator
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] cache.lua:307: invalidate_local(): [DB cache] invalidating (local): 'plugins_iterator:version'
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] cache.lua:323: invalidate(): [DB cache] broadcasting (cluster) invalidation for key: 'plugins_iterator:version'
2020/08/21 13:36:25 [debug] 32#0: *285 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=plugins:delete, pid=nil, data=table: 0x7fe18be18788
2020/08/21 13:36:25 [info] 32#0: *285 client 172.17.0.1 closed keepalive connection
2020/08/21 13:36:26 [debug] 35#0: *302 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:22f47195-3450-40dd-8d2f-9d93f52af973::::
2020/08/21 13:36:26 [debug] 35#0: *302 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::22f47195-3450-40dd-8d2f-9d93f52af973:
2020/08/21 13:36:26 [debug] 35#0: *302 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:36:26 [debug] 34#0: *304 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:22f47195-3450-40dd-8d2f-9d93f52af973::::
2020/08/21 13:36:26 [debug] 34#0: *304 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::22f47195-3450-40dd-8d2f-9d93f52af973:
2020/08/21 13:36:26 [debug] 34#0: *304 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:36:26 [debug] 33#0: *308 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:22f47195-3450-40dd-8d2f-9d93f52af973::::
2020/08/21 13:36:26 [debug] 33#0: *308 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::22f47195-3450-40dd-8d2f-9d93f52af973:
2020/08/21 13:36:26 [debug] 33#0: *308 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version

Here are the logs when re-creating the Consumer:

2020/08/21 13:39:21 [debug] 32#0: *3264 [lua] events.lua:211: do_event(): worker-events: handling event; source=dao:crud, event=create, pid=nil, data=table: 0x7fe18baea638
2020/08/21 13:39:21 [debug] 32#0: *3264 [lua] cache.lua:307: invalidate_local(): [DB cache] invalidating (local): 'consumers:fe06beef-6179-4ce5-a665-c5b44a90379b::::'
2020/08/21 13:39:21 [debug] 32#0: *3264 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:fe06beef-6179-4ce5-a665-c5b44a90379b::::
2020/08/21 13:39:21 [debug] 32#0: *3264 [lua] cache.lua:323: invalidate(): [DB cache] broadcasting (cluster) invalidation for key: 'consumers:fe06beef-6179-4ce5-a665-c5b44a90379b::::'
2020/08/21 13:39:21 [debug] 32#0: *3264 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=consumers, pid=nil, data=table: 0x7fe18baea638
2020/08/21 13:39:21 [debug] 32#0: *3264 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=consumers:create, pid=nil, data=table: 0x7fe18baea638
2020/08/21 13:39:21 [info] 32#0: *3264 client 172.17.0.1 closed keepalive connection
2020/08/21 13:39:22 [debug] 33#0: *3266 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:fe06beef-6179-4ce5-a665-c5b44a90379b::::
2020/08/21 13:39:22 [debug] 34#0: *3265 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:fe06beef-6179-4ce5-a665-c5b44a90379b::::
2020/08/21 13:39:22 [debug] 35#0: *3273 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_db_cache, pid=32, data=consumers:fe06beef-6179-4ce5-a665-c5b44a90379b::::
2020/08/21 13:39:23 [debug] 33#0: *3281 [lua] init.lua:288: [cluster_events] polling events from: 1598016988.428

Finally, here are the logs when re-creating the rate-limiting plugin:

2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] events.lua:211: do_event(): worker-events: handling event; source=dao:crud, event=create, pid=nil, data=table: 0x7fe18bab81a8
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] cache.lua:307: invalidate_local(): [DB cache] invalidating (local): 'plugins:rate-limiting:::fe06beef-6179-4ce5-a665-c5b44a90379b:'
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::fe06beef-6179-4ce5-a665-c5b44a90379b:
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] cache.lua:323: invalidate(): [DB cache] broadcasting (cluster) invalidation for key: 'plugins:rate-limiting:::fe06beef-6179-4ce5-a665-c5b44a90379b:'
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=plugins, pid=nil, data=table: 0x7fe18bab81a8
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] handler.lua:254: [events] Plugin updated, invalidating plugins iterator
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] cache.lua:307: invalidate_local(): [DB cache] invalidating (local): 'plugins_iterator:version'
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] cache.lua:323: invalidate(): [DB cache] broadcasting (cluster) invalidation for key: 'plugins_iterator:version'
2020/08/21 13:41:00 [debug] 32#0: *4928 [lua] events.lua:211: do_event(): worker-events: handling event; source=crud, event=plugins:create, pid=nil, data=table: 0x7fe18bab81a8
2020/08/21 13:41:00 [info] 32#0: *4928 client 172.17.0.1 closed keepalive connection
2020/08/21 13:41:01 [debug] 33#0: *4935 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::fe06beef-6179-4ce5-a665-c5b44a90379b:
2020/08/21 13:41:01 [debug] 35#0: *4934 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::fe06beef-6179-4ce5-a665-c5b44a90379b:
2020/08/21 13:41:01 [debug] 34#0: *4936 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins:rate-limiting:::fe06beef-6179-4ce5-a665-c5b44a90379b:
2020/08/21 13:41:01 [debug] 33#0: *4935 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:41:01 [debug] 35#0: *4934 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:41:01 [debug] 34#0: *4936 [lua] events.lua:211: do_event_json(): worker-events: handling event; source=mlcache, event=mlcache:invalidations:kong_core_db_cache, pid=32, data=plugins_iterator:version
2020/08/21 13:41:03 [debug] 35#0: *4967 [lua] init.lua:288: [cluster_events] polling events from: 1598017163.472

No specific logs are available when calling the Route after the re-creation process...

fairyqb (Contributor) commented Aug 23, 2020

Is this incorrect usage?

kong.client.get_consumer()

Returns the consumer entity of the currently authenticated consumer. If not set yet, it returns nil.

The Consumer and the authentication (basic / API key / HMAC / OAuth2 / JWT) are both indispensable.
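
A minimal probe (hypothetical, placed in an access handler after the auth plugin has run) shows what limit_by = consumer will resolve to:

```lua
-- Hypothetical probe: kong.client.get_consumer() returns nil unless an
-- auth plugin has already called kong.client.authenticate() for this request.
local consumer = kong.client.get_consumer()
kong.log.debug("authenticated consumer: ", consumer and consumer.id or "nil")
```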

pamiel (Contributor, Author) commented Aug 24, 2020

Hi @fairyqb
I don't think this is incorrect usage, and I'm not talking here about an issue in the PDK, but about an issue in the application of the rate-limiting plugin.

Just to try to clarify:

  • At step #3 of the "Steps To Reproduce" above, the Consumer exists together with its credential. The rate-limiting plugin is declared on the Consumer itself, with the config.limit_by parameter set to consumer => the configuration is correct, so authentication and rate limiting are applied correctly.
  • At step #5, the exact same configuration is in place, except that the Consumer (and thus its credential and the rate-limiting plugin that was attached to it) was first deleted in step #4 and then re-created.

So the configuration is exactly the same... but in step #5, while authentication works (the Consumer is found and attached to the request), rate limiting is not applied. Nothing related to the PDK, I think.

fairyqb (Contributor) commented Aug 24, 2020

I couldn't reproduce this problem; I used version v2.1.0.

pamiel (Contributor, Author) commented Aug 24, 2020

Oh, you are right: I initially tested with a custom authentication plugin. I thought I had also tested with the standard basic-auth plugin… but I must have made a mistake: testing it again, I confirm that everything works properly with the standard plugins; my apologies!
That said, I don't understand what happens with my custom authentication plugin (with which the test still fails). Maybe you could help with that:

The main difference compared to the basic-auth implementation is that I'm not using credentials stored in Kong… so at Consumer retrieval time, I only have the Consumer username in hand, not its uuid. So, instead of (https://github.com/Kong/kong/blob/master/kong/plugins/basic-auth/access.lua#L189):

local consumer_cache_key = kong.db.consumers:cache_key(credential.consumer.id)
local consumer, err      = kong.cache:get(consumer_cache_key, nil, kong.client.load_consumer, credential.consumer.id)

my code is:

local consumer_cache_key = kong.db.consumers:cache_key(<the Consumer username>)
local consumer, err      = kong.cache:get(consumer_cache_key, nil, kong.client.load_consumer, <the Consumer username>, true)

And then, instead of (https://github.com/Kong/kong/blob/master/kong/plugins/basic-auth/access.lua#L112):

kong.client.authenticate(consumer, credential)

my code is:

kong.client.authenticate(consumer, nil)

I thought it would be valid to use <the Consumer username> instead of a Consumer uuid in the cache_key method, because this method is only meant to generate a string that serves as a key in the cache, not to run the query in the db. I then pass true as the last parameter of the kong.cache:get call, since the kong.client.load_consumer callback looks Consumers up by username when true is set.
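
Assembled, the lookup sketch looks like this (username is a placeholder for <the Consumer username>; error handling reduced to plain exits):

```lua
-- Sketch assembling the snippets above; "username" stands in for
-- <the Consumer username> extracted from the request by the custom auth plugin.
local consumer_cache_key = kong.db.consumers:cache_key(username)
local consumer, err = kong.cache:get(consumer_cache_key, nil,
                                     kong.client.load_consumer,
                                     username, true)  -- true: search by username
if err then
  return kong.response.exit(500, { message = "Unexpected error" })
end
if not consumer then
  return kong.response.exit(401, { message = "Unauthorized" })
end

kong.client.authenticate(consumer, nil)
```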

Strangely, the logs I put in my code show that no error happens during step #3 or step #5 (err is always nil and consumer always contains something)… but I now suspect that the "caching" issue I mentioned in my first post lies in this piece of code!... Still, I don't understand why rate limiting ends up working in step #3 but not in step #5!

So:

  • Is my usage of the cache_key method correct? If not, how should this be done when you don't have the Consumer uuid?
  • Should I directly use kong.db.consumers:select_by_username instead? (But then I would not benefit from the caching mechanism?)

fairyqb (Contributor) commented Aug 24, 2020

Inspect the identifier value in the rate-limiting plugin's get_identifier (kong/plugins/rate-limiting/handler.lua):

```lua
local function get_identifier(conf)
  local identifier

  if conf.limit_by == "service" then
    identifier = (kong.router.get_service() or
                  EMPTY).id
  elseif conf.limit_by == "consumer" then
    identifier = (kong.client.get_consumer() or
                  kong.client.get_credential() or
                  EMPTY).id
  elseif conf.limit_by == "credential" then
    identifier = (kong.client.get_credential() or
                  EMPTY).id
  elseif conf.limit_by == "header" then
    identifier = kong.request.get_header(conf.header_name)
  end

  return identifier or kong.client.get_forwarded_ip()
end
```
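
To see it per request, a temporary debug line could be added just before the return (hypothetical addition):

```lua
-- Hypothetical debug line, inserted just before the return above:
kong.log.debug("rate-limiting identifier: ", identifier or "nil")
```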

pamiel (Contributor, Author) commented Aug 25, 2020

Yes, @fairyqb, that's exactly why I call kong.client.authenticate at the end of the authentication process: it sets the internal ngx.ctx.authenticated_consumer, while in the rate-limiting plugin get_identifier calls kong.client.get_consumer (conf.limit_by is set to consumer), which reads that same ngx.ctx.authenticated_consumer => this part is ok.

Indeed, I think I found where the problem is: I printed the id of the Consumer retrieved from the cache (i.e. returned by the call to kong.cache:get, and thus set in ngx.ctx.authenticated_consumer):

  • In step #3, just after its creation and when calling the Route for the first time, it is set (say) to "5d340687-653d-4c35-b435-90938c9ca76e". Rate limiting applies correctly.
  • Then, in step #5, this Consumer has been deleted and a new one created => the cache should have been purged of this Consumer instance, and the call to kong.cache:get should return the new Consumer instance, which has a different id... but kong.cache:get actually returns the same Consumer, with the same id "5d340687-653d-4c35-b435-90938c9ca76e", i.e. a Consumer that no longer exists in the database... but is still present in the cache!

My understanding is that the deletion of the Consumer through the Admin API removes the Consumer from the cache using a cache key built from the uuid of the Consumer... while my plugin built its cache key from the username (as it doesn't have the uuid, just the username). The two cache keys can never match, so a deletion request through the Admin API will never remove the other entry from the cache.
Any call to kong.cache:get in step #5 will then return the deleted Consumer instead of the new one!
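
Illustrated with the values observed above, the two keys indeed can never match:

```lua
-- Values taken from the steps above; key format as seen in the debug logs.
local key_by_id   = kong.db.consumers:cache_key("5d340687-653d-4c35-b435-90938c9ca76e")
-- -> "consumers:5d340687-653d-4c35-b435-90938c9ca76e::::"
local key_by_name = kong.db.consumers:cache_key("myconsumer")
-- -> "consumers:myconsumer::::"
-- The Admin API invalidates key_by_id on delete; key_by_name stays stale.
```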

My conclusion is that I have 2 options:

  1. Either I abandon the cache and run a db query every time I want to get the Consumer (using kong.db.consumers:select_by_username). This solution costs one DB query for each call to the Route.
  2. Or I keep relying on the cache, with a trick such as the following (see the sketch after this list): a/ First get the Consumer from the cache using a cache key built from the username. b/ Then take the id from this Consumer and query the cache again with a new key built from this uuid. c/ Finally compare the ids of both retrieved Consumers: if they are the same, fine; if not, remove the first cache entry (it corresponds to a Consumer that no longer exists in the DB) and use the second Consumer in the kong.client.authenticate call.
     The cost of this solution: if the Consumer is not in the cache (should happen only once), two cache accesses plus two DB queries. If the Consumer is already in the cache (most of the time), two cache accesses. If the Consumer in the cache is a deleted one (should happen very rarely), two cache accesses plus one DB query plus one cache invalidation.
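
A rough sketch of option 2 (placeholder names; nil/error handling trimmed; note that the uuid lookup either returns the same Consumer or nil, so the comparison in c/ reduces to a nil check):

```lua
-- a/ resolve the Consumer through the username-based key
local name_key = kong.db.consumers:cache_key(username)
local consumer = kong.cache:get(name_key, nil,
                                kong.client.load_consumer, username, true)

-- b/ resolve it again through the uuid-based key
-- (this is the key the Admin API invalidates on delete)
local id_key = kong.db.consumers:cache_key(consumer.id)
local by_id  = kong.cache:get(id_key, nil,
                              kong.client.load_consumer, consumer.id)

-- c/ if the uuid lookup finds nothing, the username-keyed entry is stale:
-- drop it and resolve the username again (one DB hit, then cached anew)
if not by_id then
  kong.cache:invalidate(name_key)
  consumer = kong.cache:get(name_key, nil,
                            kong.client.load_consumer, username, true)
end

kong.client.authenticate(consumer, nil)
```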

What would be your advice? Which of the two solutions looks better? (In steady state, option 1 costs one DB access per request while option 2 costs only two cache accesses... but maybe the DB access has a cache of its own that I don't see!)

Any alternate solution you would recommend instead?

Thanks a lot for your help !

@kikito removed the task/bug label Sep 16, 2020
hishamhm (Contributor) commented

@pamiel DB access does not have a cache, so solution 2 is better!

I think we're good to close this issue, since the rate-limiting plugin itself is okay and the matter now lies in your custom plugin, but I think you're well on the path to a solution! If you spot any other issue on the Kong side, make sure to let us know!
