Halve Commit Size with Ed25519 and no HSM changes #7892
Comments
Soundness condition: Suppose that the prover aggregates any N purported signatures. [equation lost in extraction]

Proof sketch: First note that [equation lost]. To satisfy the system, the prover must provide [equation lost]. Thus the probability of breaking soundness is [equation lost].

This proof is really a more detailed sketch, and isn't careful with the random distributions (128-bit, vs. random in the field). It should be placed in a game-based definition, where the prover is given oracle access to A_j; however, it seems clear to me that this can be directly translated to the game-based definition.
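The equations in this comment did not survive extraction. A hedged reconstruction of the soundness condition, using the notation defined in the issue body (`G`, `z_i`, `h_i`, `P_i`, `R_i`), is: the prover must provide an aggregate scalar $s^*$ satisfying

$$ s^* G \;=\; \sum_i z_i \left( R_i + h_i P_i \right), $$

which is the batch-verification equation rearranged with $s^* = \sum_i z_i s_i$. Since each $z_i$ is sampled from a set of size $2^{128}$ only after the tuples $(R_i, P_i, m_i)$ are fixed, a prover lacking a valid constituent signature should satisfy this with probability on the order of $2^{-128}$ per attempt; the exact statement and bound from the original comment are not recoverable here.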
Similar to BLS aggregation within txs in a block, this optimization should apply equally to combining Ed25519 / Ristretto-subgroup EdDSA signatures across txs in a block, for a factor of 2 space savings.
Pretty exciting @ValarDragon! So the only thing missing is a detailed write-up of the above proof sketch, or am I missing something else here?
I think we just need to do that, and talk to other people to show we're not missing something.
Then we need to prove that the soundness of this aggregation does in fact reduce to the above. The proof sketch I made earlier combined both of these, but splitting it up should hopefully make it easier to follow. It turns out that the batch PoK of DL is already proven in stronger forms!
I think the presence of the first bullet point really just means that we should sanity-check this with some cryptographers, and can then proceed.
This seems an awful lot like the straw-man scheme in Section 4 of Yao's Gamma Signatures https://eprint.iacr.org/2018/414.pdf. The rogue-key attack makes the Section 4 scheme somewhat less useful for combining transaction signatures, but it is somewhat fine for consensus signatures, where you can require a proof of knowledge of the discrete log when registering.
The reason Section 4 has that attack is that they don't take a random linear combination; they just do a straightforward addition. The random linear combination here is what gives this security. This is reminiscent of the idea of https://crypto.stanford.edu/~dabo/pubs/abstracts/BLSmultisig.html.
This scheme (with derandomization) is the same scheme as proposed in: https://eprint.iacr.org/2021/350.pdf ! So we inherit the proof of security from there =) |
This is a proposal to halve the current Tendermint commit size when using Ed25519, with no changes required to hardware. I expect the verifier time to be notably improved.
The high-level idea is that we use the Ed25519 batch verification algorithm, where the verifier randomness is obtained from hashing the entire commit. Ed25519 signatures are composed of two equal-sized parts `(R, s)`. Using batch verification, the verifier needs `{R_i}_{i in validators}`, but only needs the random linear combination of all the `s_i`'s. Thus, the commit will send all `R_i`, but only the single aggregated `s`. (This still must be proven sound.)

Let `G` be the generator of the Ed25519 group, `z_i` be the `i`-th random coefficient, `P_i` be the `i`-th validator's pubkey, `m_i` be the message which is being signed, `(R_i, s_i)` be the `i`-th Ed25519 signature, and `h_i = H(R_i, P_i, m_i)`. (Capitals are curve points, lower-case letters are scalars.) The Ed25519 batch-verification equation that must be checked by the verifier is:

    (-\sum_i z_i s_i) G + \sum_i z_i R_i + \sum_i z_i h_i P_i = 0

One way to check the equation is to give the verifier `(-\sum_i z_i s_i)` and all of the `R_i`.
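The aggregation idea above can be sketched in code. This is a toy illustration over a small multiplicative Schnorr group rather than the actual Ed25519 curve, with 128-bit `z_i` replaced by scalars mod a tiny prime; all function names and parameters here are illustrative, not Tendermint APIs.

```python
# Toy sketch: batch verification where only the aggregated s is sent.
# Uses a small Schnorr group (multiplicative notation) in place of Ed25519.
import hashlib
import secrets

P = 2039  # small safe prime, P = 2Q + 1 (toy size only; never use in practice)
Q = 1019  # prime order of the subgroup generated by G
G = 4     # generator of the order-Q subgroup

def H(*parts) -> int:
    """Hash arbitrary parts to a scalar mod Q (stand-in for Ed25519's H)."""
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha512(data).digest(), "big") % Q

def keygen():
    x = secrets.randbelow(Q - 1) + 1
    return x, pow(G, x, P)  # secret scalar, public "point"

def sign(x, pub, m):
    r = secrets.randbelow(Q - 1) + 1
    R = pow(G, r, P)
    h = H(R, pub, m)            # key-prefixed challenge h_i = H(R_i, P_i, m_i)
    s = (r + h * x) % Q
    return R, s

def derive_zs(commit_data, n):
    # Verifier randomness z_i derived by hashing the whole commit, so the
    # prover cannot pick signature components after seeing the z_i.
    return [H("z", i, *commit_data) for i in range(n)]

def aggregate(sigs, pubs, msgs):
    """Proposer side: keep all R_i, but send only the single sum_i z_i * s_i."""
    commit_data = [(R, pub, m) for (R, _), pub, m in zip(sigs, pubs, msgs)]
    zs = derive_zs(commit_data, len(sigs))
    s_agg = sum(z * s for z, (_, s) in zip(zs, sigs)) % Q
    return s_agg, [R for R, _ in sigs]

def verify_aggregate(s_agg, Rs, pubs, msgs):
    """Verifier side: check G^(sum z_i s_i) == prod_i (R_i * P_i^h_i)^z_i."""
    commit_data = list(zip(Rs, pubs, msgs))
    zs = derive_zs(commit_data, len(Rs))
    rhs = 1
    for z, (R, pub, m) in zip(zs, commit_data):
        h = H(R, pub, m)
        rhs = rhs * pow(R * pow(pub, h, P) % P, z, P) % P
    return pow(G, s_agg, P) == rhs
```

In multiplicative notation, `G^(sum z_i s_i) = prod (G^(s_i))^(z_i) = prod (R_i * P_i^(h_i))^(z_i)`, which mirrors the additive-notation equation above: the verifier needs every `R_i` but only one aggregated scalar.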
The proposed procedure is: publish the commit as `{(-\sum_i z_i s_i), pseudo-commit}`, where the coefficients `z_i` are derived by hashing the pseudo-commit.

To verify this, as either a full node or as a lite-client: re-derive the `z_i` from the pseudo-commit, and check the batch-verification equation against `{(-\sum_i z_i s_i), pseudo-commit}`.

It remains to show that checking the equation given `(-\sum_i z_i s_i)` instead of all the `s_i` is secure. I do not currently have a proof of this, but I have the impression that it is true. (EDIT: See the following comment.) The reason it feels true is that we are taking a random linear combination of the relevant curve points we care about (which already depend on the public key and the message), and we are essentially asking for the discrete logarithm of that random linear combination. So if you didn't know any constituent signature, it seems unlikely that you could know the resulting discrete logarithm. Random linear combinations of this sort typically have soundness error `1/(randomness sample size)`, and we choose each `z_i` from a set of size `2^128`. Since `h_i` has key-prefixing and is therefore distinct for every pubkey, this builds some confidence against BLS-style rogue public-key attacks.

Additional optimizations:
If the verifier wants to take the optimization where they only verify a particular subset `D` of size (2/3 weight) of the validators who signed the prior block, then we've already assumed a single round of communication from the verifier to their prover, in which they communicated `D`. In this model, the verifier can still achieve this same optimization. We redefine the commit to now be `{(-\sum_i z_i s_i), MT of pseudo-commit}`. The verifier receives `{commit, {R_i, T_i}_{i \in D}, multi-membership proof for {R_i, T_i}_{i \in D} in pseudo-commit, (-\sum_{i \in D} z'_i s_i)}`. The coefficients `z'_i` are all obtained by getting sufficient bytes from `H({R_i, T_i}_{i \in D})`.

Essentially, the verifier receives a commitment to the commit (essentially all signatures in a Merkle tree), and is sent a multi-membership proof for all the signatures of interest. The random-coefficient optimization is the same, but only covers the received signatures, so the size of the leaves is still halved.
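The Merkle-tree side of this optimization can be sketched as follows. This is a minimal illustration, not Tendermint's actual Merkle implementation: leaves hold only `(R_i, T_i)` (no per-validator `s_i`, which is what halves the leaf size), and for simplicity it uses one audit path per leaf rather than the compact multi-membership proof described above.

```python
# Minimal Merkle tree over (R_i, T_i) leaves, with per-leaf audit paths.
# Illustrative only; real Tendermint trees and multi-proofs differ.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf_hash(R: int, T: int) -> bytes:
    # Leaf holds only (R_i, timestamp T_i); the s_i are replaced by one
    # aggregated scalar outside the tree.
    return _h(b"leaf|" + str((R, T)).encode())

def build_tree(leaves):
    """Return the list of levels: levels[0] = leaf hashes, levels[-1] = [root]."""
    levels = [[leaf_hash(R, T) for R, T in leaves]]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        if len(cur) % 2:
            cur = cur + [cur[-1]]          # duplicate last node on odd levels
        levels.append([_h(b"node|" + cur[i] + cur[i + 1])
                       for i in range(0, len(cur), 2)])
    return levels

def audit_path(levels, idx):
    """Sibling hashes from leaf idx up to (but excluding) the root."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]    # same padding rule as build_tree
        path.append(level[idx ^ 1])        # sibling at this level
        idx //= 2
    return path

def verify_leaf(root, R, T, idx, path):
    """Recompute the root from one (R_i, T_i) leaf and its audit path."""
    node = leaf_hash(R, T)
    for sib in path:
        node = _h(b"node|" + node + sib) if idx % 2 == 0 else _h(b"node|" + sib + node)
        idx //= 2
    return node == root
```

A commit would then carry only the aggregated scalar and `levels[-1][0]` (the root); a verifier interested in subset `D` fetches the `(R_i, T_i)` leaves for `i in D` plus their proofs, and runs the random-coefficient check over just those signatures.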