@LoupVaillant suggested doing the following when handling Ed25519:
In my opinion standardizing signatures and public keys is much more
important than worrying about anything related to the private key. And
just at that level you have to grapple with much more fundamental issues
than how to define your private key:
https://hdevalence.ca/blog/2020-10-04-its-25519am
So you have a public key A, and a signature R || S.
A and R are points on the curve, and S is just a number.
Thankfully, the main issues were dealt with from the beginning:
- Points on the curve are compressed as a field element and a sign bit.
- All numbers are encoded in little-endian.
- A, R, and S are each serialised as 32 bytes.
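Concretely, that layout means a 64-byte signature splits cleanly into R and S, and S decodes as a little-endian integer. A minimal sketch in Python (the function name is illustrative, not from any particular library):

```python
def split_signature(sig: bytes):
    """Split a 64-byte Ed25519 signature into its R and S halves.

    R is a compressed curve point (32 bytes: a field element, with the
    sign bit stored in the top bit of the last byte); S is a scalar
    encoded as a 32-byte little-endian integer.
    """
    assert len(sig) == 64
    r_bytes = sig[:32]                      # compressed point R
    s = int.from_bytes(sig[32:], "little")  # scalar S as an integer
    return r_bytes, s

# Example: a signature whose S half encodes the integer 5.
sig = bytes(32) + (5).to_bytes(32, "little")
r, s = split_signature(sig)
```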
But there's still room for variation in the verifier:
- Do we accept S when it exceeds the order of the curve?
- Do we accept A and R when they have low order?
- Do we accept non-canonical encodings of A and R?
- What verification equation do we use exactly?
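For reference, the two candidate equations (a summary, not normative text; notation as in RFC 8032: B is the base point, and k is the SHA-512 hash of R, A, and the message, reduced modulo the group order):

```latex
% Cofactorless ("strict") check:
[S]B = R + [k]A
% Cofactored check -- both sides multiplied by the cofactor 8;
% this is the one compatible with batch verification:
[8][S]B = [8]R + [8][k]A
```

Any signature passing the cofactorless check also passes the cofactored one, but not vice versa, which is exactly where verifiers can disagree.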
When two verifiers disagree on any of the above, maliciously crafted
signatures can end up accepted by some and rejected by others, leading
to problems like network partitions. Worse, the RFC didn't clearly
answer all of those questions, and allowed users to choose which
verification equation they would use. In practice it's difficult to
find two implementations that behave identically. It's a freaking
nightmare.
My choice for Monocypher was to do the same as Zebra:
- Reject any S that equals or exceeds the order of the curve.
- Accept low-order A and R.
- Accept non-canonical A and R.
- Use the batch verification equation (it's the forgiving one).
I reject high S because (i) everyone else does, and (ii) accepting it
would enable malleability attacks. For everything else I chose to be as
permissive as possible. This has the advantage of being backwards
compatible with any other implementation: no signature that was
previously accepted will be rejected.
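The malleability issue is concrete: the verification equation depends on S only modulo the group order L, so if S is accepted unchecked, S + L is a second, distinct encoding of the same valid signature. A sketch of the range check (the constant is the standard Ed25519 group order from RFC 8032; the function name is mine):

```python
# Order of the Ed25519 base-point subgroup (RFC 8032).
GROUP_ORDER = 2**252 + 27742317777372353535851937790883648493

def s_is_canonical(sig: bytes) -> bool:
    """Return True iff the S half of a 64-byte signature is < L."""
    return int.from_bytes(sig[32:], "little") < GROUP_ORDER

# Given one valid (R, S), the pair (R, S + L) satisfies the same
# verification equation -- and S + L still fits in 32 bytes, since
# L < 2**253 -- but only the original passes this check.
```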
The RFC on the other hand made the following choices:
- Reject any S that equals or exceeds the order of the curve.
- Accept low-order A and R.
- Reject non-canonical A and R.
- Leave equation choice to the implementer.
I personally disagree with the last two items. Interoperability with
batch verification (which is twice as fast as regular verification)
should be mandatory, and rejecting non-canonical points makes the code
more complex for no benefit at all.
You'll have to make your own choice too if you want a complete
specification. I personally would recommend you imitate Zebra and
Monocypher, because many implementations can be made compatible with a
bit of pre-processing:
- Reject the signature if S is too big. Almost all implementations
  already do this, however, so you can generally skip this step.
- If both A and R have low order, and S == 0, accept the signature.
  In total, low-order points have 14 different encodings, so you can
  just use a table and compare buffers to do that check.
- Run your implementation of choice. It must use the batch equation.
  - If it accepts the signature, accept it.
  - If it rejects the signature, reject it.
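Put together, that pre-processing can be sketched as a thin wrapper around an existing verifier. Everything here is illustrative: `verify` stands in for any Ed25519 verifier that uses the batch equation, and `low_order_encodings` stands in for the table of the 14 low-order encodings mentioned above (the table itself is not reproduced here and must be filled in from a trusted source):

```python
# Order of the Ed25519 base-point subgroup (RFC 8032).
GROUP_ORDER = 2**252 + 27742317777372353535851937790883648493

def compatible_verify(verify, low_order_encodings, pk, sig, msg):
    """Sketch of Zebra/Monocypher-compatible pre-processing.

    `verify(pk, sig, msg) -> bool` is a hypothetical stand-in for a
    batch-equation verifier; `low_order_encodings` is a set of the 14
    byte encodings of low-order points (omitted here).
    """
    r_bytes = sig[:32]
    s = int.from_bytes(sig[32:], "little")
    if s >= GROUP_ORDER:        # step 1: reject non-canonical S
        return False
    if s == 0 and pk in low_order_encodings and r_bytes in low_order_encodings:
        return True             # step 2: low-order A and R, S == 0
    return verify(pk, sig, msg) # step 3: defer to the batch-eq verifier
```

With this wrapper, an out-of-range S is rejected before the inner implementation ever sees it, and the one class of signatures a stricter implementation might wrongly reject (low-order A and R with S == 0) is accepted up front.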
Ran into this great presentation on normalizing/standardizing Ed25519, "Taming the many EdDSAs"
https://csrc.nist.gov/csrc/media/Presentations/2023/crclub-2023-03-08/images-media/20230308-crypto-club-slides--taming-the-many-EdDSAs.pdf
Also consider the advice in: