
Metadata API: (maybe) Enforce public key content uniqueness in keys #1429

Closed
jku opened this issue May 31, 2021 · 10 comments

jku (Member) commented May 31, 2021

Root and targets metadata have a keys dictionary with keyid as the key. It would be nice to know for sure that these entries are actually unique keys (and not e.g. the same key content under two different keyids).

One solution would be to validate that the keyid is correct according to spec:

KEYID: ... a hexdigest of the SHA-256 hash of the canonical form of the key.

Having looked at the securesystemslib implementation, I don't think we can consider keyids reproducible: what SSLib considers the canonical form depends on the settings that were used on the machine that generated the key, and those settings are not stored in the key itself.

The other solution would be to validate key content uniqueness in the keys dict ourselves. This might make sense: we could just not accept a keys dictionary that contained two keys with identical key content -- I don't think there's a non-malicious, non-error case where that would happen.
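A minimal sketch of that check in plain Python (the function name and the dict layout here are illustrative, not the actual python-tuf API):

```python
def check_key_content_unique(keys: dict) -> None:
    """Raise ValueError if two keyids map to identical key content.

    `keys` maps keyid -> dict with "keytype", "scheme" and "keyval"
    fields, as in TUF root/targets metadata.
    """
    seen = {}
    for keyid, key in keys.items():
        # Build a hashable fingerprint of the key content.
        content = (key["keytype"], key["scheme"], key["keyval"]["public"])
        if content in seen:
            raise ValueError(
                f"keyids {seen[content]!r} and {keyid!r} share key content"
            )
        seen[content] = keyid

# Same (made-up) key content registered under two keyids:
keys = {
    "aaaa": {"keytype": "ed25519", "scheme": "ed25519",
             "keyval": {"public": "edcd0a32a840dcc5"}},
    "bbbb": {"keytype": "ed25519", "scheme": "ed25519",
             "keyval": {"public": "edcd0a32a840dcc5"}},
}
try:
    check_key_content_unique(keys)
except ValueError as e:
    print(e)
```

Note that this only compares keyval byte-for-byte; as discussed below, the same key material can have more than one representation.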

mnm678 (Contributor) commented Jun 1, 2021

This would be a nice check to have. I thought about something similar while working on TAP 12 (see #1084). This would require using a standardized representation of keys, like the modulus and exponent for rsa, as the representation of the key can differ for the same key material.

jku (Member, Author) commented Jun 1, 2021

I think in the end the exact method of producing keyids is uninteresting (and the spec probably should not specify it): the one thing that matters is that a keyid must be unique to its key -- in other words, the same public key content must not be present under multiple keyids.

jku (Member, Author) commented Jun 1, 2021

This would require using a standardized representation of keys, like the modulus and exponent for rsa, as the representation of the key can differ for the same key material.

I'm really not familiar with this, but are you saying the current key representation in tuf allows two seemingly different keys to actually be representations of the same key? I was assuming we could just compare the keytype/scheme/keyval of each key and accept the keys as unique if that combination is unique, but maybe I was being naive.

mnm678 (Contributor) commented Jun 1, 2021

The issue is that the keyval can differ for the same key. As an example, PEM formatting supports additional whitespace, optional headers, etc.
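To illustrate with made-up key material: two PEM encodings can differ as strings while carrying the same payload, so a byte-for-byte keyval comparison can miss a duplicate. (A real canonical comparison would parse the key and compare e.g. the RSA modulus and exponent; stripping the PEM armor here is only enough to make the point.)

```python
pem_a = """-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZI
zj0DAQcDQgAE
-----END PUBLIC KEY-----
"""

pem_b = """-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAE
-----END PUBLIC KEY-----

"""  # same base64 payload, different line wrapping and trailing whitespace

def strip_armor(pem: str) -> str:
    """Drop PEM header/footer lines and whitespace, keep the payload."""
    lines = [line.strip() for line in pem.splitlines()]
    return "".join(l for l in lines if l and not l.startswith("-----"))

print(pem_a == pem_b)                            # False: raw strings differ
print(strip_armor(pem_a) == strip_armor(pem_b))  # True: payloads match
```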

mnm678 (Contributor) commented Jun 1, 2021

That being said, de-duplicating on just the keytype/scheme/keyval would probably still prevent accidental key reuse.

jku (Member, Author) commented Jun 2, 2021

Thanks. So to recap:

  • we could check for key content uniqueness
  • it would be a minor usability/sanity-check feature, to prevent accidentally using the same key under two keyids
  • it would not prevent maliciously using the same key in two forms (but if an attacker is able to define the key content, they are typically also able to e.g. change thresholds or add new keys, so this is not a huge issue)
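A small sketch of why the accidental case is worth catching (illustrative data, not the python-tuf API): a role can claim a threshold of 2 while both of its keyids point at identical key content, so a single private key can produce both "distinct" signatures.

```python
keys = {
    "aaaa": {"keytype": "ed25519", "scheme": "ed25519",
             "keyval": {"public": "edcd0a32a840dcc5"}},
    "bbbb": {"keytype": "ed25519", "scheme": "ed25519",
             "keyval": {"public": "edcd0a32a840dcc5"}},
}
role = {"keyids": ["aaaa", "bbbb"], "threshold": 2}

# Count distinct key contents actually backing the role.
unique_contents = {
    (k["keytype"], k["scheme"], k["keyval"]["public"])
    for kid, k in keys.items() if kid in role["keyids"]
}
print(len(unique_contents))  # 1 unique key behind a threshold of 2
```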

mnm678 (Contributor) commented Jun 3, 2021

That sounds right!

I don't think the attack is very serious for the reasons you mention, but I don't want to oversell our protection against duplicate keys.

MVrachev (Collaborator) commented Jun 8, 2021

From the comments, I understand that we cannot enforce key uniqueness based on the public key content, because one key can have different public key values.
The best we can do, when keys contains an exact match of keytype, keyval and scheme, is to ignore the duplicate and warn the user?
We want to warn the user about accidentally duplicated keys, but do we want to stop execution with an exception?

jku (Member, Author) commented Jun 8, 2021

We want to warn the user about accidentally duplicated keys, but do we want to stop execution with an exception?

Maybe yes?

  • it would prevent a repository tool from creating broken metadata
  • it would prevent a client from ending up in a situation where a signature refers to a key that exists in the file but not in the deserialized keys
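A hedged sketch of that second failure mode (the JSON shape and helper are illustrative, not python-tuf's real deserializer): if deserialization silently de-duplicated by content, keyid "bbbb" would vanish from the in-memory keys while roles still reference it. Raising at parse time keeps the file and the in-memory view consistent.

```python
import json

metadata = json.loads("""
{
  "keys": {
    "aaaa": {"keytype": "ed25519", "scheme": "ed25519",
             "keyval": {"public": "edcd0a32a840dcc5"}},
    "bbbb": {"keytype": "ed25519", "scheme": "ed25519",
             "keyval": {"public": "edcd0a32a840dcc5"}}
  },
  "roles": {"root": {"keyids": ["aaaa", "bbbb"], "threshold": 2}}
}
""")

def deserialize_keys(raw: dict) -> dict:
    """Parse a keys dict, refusing duplicate key content instead of
    silently dropping one of the keyids."""
    keys, seen = {}, {}
    for keyid, key in raw.items():
        content = (key["keytype"], key["scheme"], key["keyval"]["public"])
        if content in seen:
            raise ValueError(
                f"duplicate key content under keyids "
                f"{seen[content]!r} and {keyid!r}"
            )
        seen[content] = keyid
        keys[keyid] = key
    return keys

try:
    deserialize_keys(metadata["keys"])
except ValueError as e:
    print("rejected:", e)
```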

MVrachev added the backlog label Jun 9, 2021
joshuagl removed the backlog label Jul 7, 2021
jku (Member, Author) commented Jul 7, 2021

Okay so the core findings are:

  • keyval["public"] as defined in the spec for e.g. RSA keys does not make it possible to enforce uniqueness (it's not hard to create different values for the same private key)
  • The spec does not actually define keyval content for keys in general (only for the three included key formats): "public" is not even a required field.

With this in mind, TUF should not try to enforce uniqueness or look at keyval contents at all:

  • key uniqueness is something the repository must enforce: a client (with access only to public metadata) can only trust that the repository is doing the right thing
  • TUF (the Metadata API) will find out whether a key is supported (or whether its content is valid) when it tries to use the key

So I'm closing this.

jku closed this as completed Jul 7, 2021