
feat: Cache OAuth2 introspection client #482

Closed
wants to merge 1 commit into from

Conversation

ptescher
Contributor

Related issue

#293
#424

Proposed changes

#424 cached the responses of OAuth2 introspection, but with pre_authorization enabled Oathkeeper still fetches a new access token for every request, even for repeated, similar requests. This can generate a massive number of access tokens!

Checklist

  • I have read the contributing guidelines.
  • I have read the security policy.
  • I confirm that this pull request does not address a security
    vulnerability. If this pull request addresses a security vulnerability, I
    confirm that I got the green light from the maintainers (please contact
    [email protected]) to push the changes.
  • I have added tests that prove my fix is effective or that my feature
    works.
  • I have added or changed the documentation.

Further comments

Although the configuration for authenticators.oauth2_introspection.config.pre_authorization is unlikely to change, the way the authenticators are built means the configuration is not available at initialization. To work around this limitation, this patch introduces a "cache" for the client. That cache uses a mutex, which could be quite heavy for this use case; if there is a lighter solution that works here, it might be preferable.

@ptescher ptescher changed the title Cache OAuth2 introspection client feat: Cache OAuth2 introspection client Jul 15, 2020
Member

@aeneasr aeneasr left a comment


I think this is a great idea! Regarding the implementation, I have some ideas on how to improve the correctness of the code. It is also imperative to add tests which, at a minimum:

  • Check if the cache is used in general
  • Check that there are no collisions (e.g. the same cache used for two different configs)
  • Check that the cache is invalidated when requests fail

To check whether the cache is used, the best approach would be to mock a pre-auth endpoint and count how often that endpoint is called while exercising the cache.

Keep in mind that, if you use ristretto, this counter will sometimes be off because ristretto does not always cache all items, which would make the test flaky. If you want to use ristretto, the way to approach this is to do multiple requests (e.g. 10 or 20) and check that the total number of actual HTTP requests is lower (e.g. 2 or 3).

        Scopes:   c.PreAuth.Scope,
        TokenURL: c.PreAuth.TokenURL,
    }).Client(context.Background()).Transport
    rt = a.preAuthCache.Client(context.Background(), c.PreAuth).Transport
Member


c.PreAuth is a parameter which may change per route, but the authenticator is instantiated only once with one global cache, which will cause errors when more than one PreAuth is configured. Instead, this should probably be cached using the contents of c.PreAuth as the key (e.g. by crc32-hashing a string representation of c.PreAuth).

Another issue is that the cache is never invalidated, even if requests start failing. This is problematic when, for example, the configuration is changed and subsequently reloaded.
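The keying scheme suggested above, crc32-hashing a string representation of the pre-auth config, could look roughly like this. The struct and field names are placeholders for illustration, not Oathkeeper's real types.

```go
package main

import (
	"fmt"
	"hash/crc32"
)

// preAuth is a stand-in for the per-route pre-authorization settings.
type preAuth struct {
	TokenURL     string
	ClientID     string
	ClientSecret string
	Scope        []string
}

// cacheKey hashes a string representation of the config contents, so two
// routes with different pre-auth settings never share one cached client.
func cacheKey(c preAuth) uint32 {
	s := fmt.Sprintf("%s|%s|%s|%v", c.TokenURL, c.ClientID, c.ClientSecret, c.Scope)
	return crc32.ChecksumIEEE([]byte(s))
}

func main() {
	a := cacheKey(preAuth{TokenURL: "https://idp/token", ClientID: "a"})
	b := cacheKey(preAuth{TokenURL: "https://idp/token", ClientID: "b"})
	fmt.Println(a != b) // different config contents produce different keys
}
```

Note that crc32 is a checksum, not a collision-resistant hash; it is cheap and fine as a cache key, but the invalidation concern above (evicting an entry when its client starts failing) still has to be handled separately.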

@aeneasr aeneasr closed this Mar 29, 2021
@tarjanik

@aeneasr Has this feature been resolved in any other way? To me it seems pre_authorization is not usable in production without this. Thanks

3 participants