Metadata API: (maybe) Enforce public key content uniqueness in keys #1429
Comments
I think in the end the exact method of producing the keyids is uninteresting (and the spec probably should not specify it): the one thing that matters is that the keyid must be unique for the key -- or in other words the same public key content must not be present under multiple keyids.
I'm really not familiar with this but are you saying the current key representation in tuf allows two seemingly different keys to actually be representations of the same key? I was assuming we could just compare the keytype/scheme/keyval of each key and accept the keys as unique if the combination of keytype/scheme/keyval is unique, but maybe I was being naive.
The issue is that the keyval can differ for the same key. As an example, PEM formatting supports additional whitespace, optional headers, etc.
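To illustrate, a minimal sketch of that formatting problem, using the `cryptography` package (not necessarily what python-tuf itself uses for this; older `cryptography` releases also want a `backend` argument), showing that two textually different keyval strings can carry identical key material:

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# Generate a throwaway key and serialize its public half to PEM.
public_key = ed25519.Ed25519PrivateKey.generate().public_key()
pem_a = public_key.public_bytes(
    serialization.Encoding.PEM,
    serialization.PublicFormat.SubjectPublicKeyInfo,
)
pem_b = pem_a + b"\n"  # same key, extra trailing newline

assert pem_a != pem_b  # naive string comparison treats these as different keys


def canonical_spki(pem: bytes) -> bytes:
    """Parse the PEM and re-serialize to DER so formatting differences vanish."""
    return serialization.load_pem_public_key(pem).public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo,
    )


assert canonical_spki(pem_a) == canonical_spki(pem_b)  # identical key material
```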
That being said, de-duplicating on just the keytype/scheme/keyval would probably still prevent accidental key reuse.
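A sketch of that simpler check, assuming keys are plain dicts shaped like the key objects in TUF metadata (field access here is illustrative, not python-tuf's Metadata API):

```python
def has_duplicate_keys(keys: list) -> bool:
    """Return True if two keys share the same keytype/scheme/keyval combination."""
    seen = set()
    for key in keys:
        # keyval is a dict like {"public": "..."}; compare the public value verbatim.
        fingerprint = (key["keytype"], key["scheme"], key["keyval"]["public"])
        if fingerprint in seen:
            return True
        seen.add(fingerprint)
    return False
```

This misses keys whose keyval differs only in formatting (the PEM caveat above), but it does catch verbatim reuse of the same key content.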
Thanks. So to recap:
That sounds right! I don't think the attack is very serious for the reasons you mention, but I don't want to oversell our protection against duplicate keys.
From the comments, I understand that we cannot enforce key uniqueness based on the public key content, because the same key can be represented by different public key values.
Maybe yes?
Okay so the core findings are:
With this in mind, TUF should not try to enforce uniqueness or look at keyval contents at all:
So I'm closing this.
Root and targets have a `keys` dictionary with `keyid` as key. It would be nice to know for sure that these keys are actually unique keys (and not e.g. the same key content with two different keyids).

One solution would be to validate that the keyid is correct according to spec:
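(Per the specification, the keyid is the hexdigest of the SHA-256 hash over the canonical form of the key.) A hedged sketch of what recomputing and checking that could look like, using `json.dumps` with sorted keys as a rough stand-in for the canonical JSON encoding, so the digests will not necessarily match what securesystemslib actually produces:

```python
import hashlib
import json


def expected_keyid(key: dict) -> str:
    """SHA-256 hexdigest over an approximately-canonical serialization of the key."""
    canonical = json.dumps(key, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def keyids_match_spec(keys: dict) -> bool:
    """keys maps keyid -> key object, as in root/targets metadata."""
    return all(keyid == expected_keyid(key) for keyid, key in keys.items())
```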
Having looked at the securesystemslib implementation I don't think we can consider keyid reproducible... what SSLib considers canonical form depends on the settings that were used on the machine that generated the key, and those settings are not stored on the key itself.
The other solution would be to validate key content uniqueness in the keys dict ourselves. This might make sense: we could just not accept a keys dictionary that contained two keys with identical key content -- I don't think there's a non-malicious, non-error case where that would happen.
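A sketch of what such a check could look like over the raw keys dictionary (plain dicts rather than Metadata API classes; the whitespace normalization is only a heuristic, not full PEM/DER canonicalization):

```python
def reject_duplicate_key_content(keys: dict) -> None:
    """keys maps keyid -> key object; raise if two keyids carry the same key content."""
    seen = {}
    for keyid, key in keys.items():
        # Strip whitespace so trivially reformatted copies of a keyval still collide.
        public = "".join(key["keyval"]["public"].split())
        content = (key["keytype"], key["scheme"], public)
        if content in seen:
            raise ValueError(
                f"keys {seen[content]!r} and {keyid!r} contain the same public key"
            )
        seen[content] = keyid
```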