ed25519-dalek
Add an optimization for trivial batch sizes.
[Hello, fine Dalek maintainers! This is my first open-source Rust patch.]
When told to validate zero signatures, there's no need to go through any pre-computation to be sure that there are no invalid signatures in the empty set.
When told to validate one signature, a single signature verification call is faster than validating a single signature via the batch-verification mechanism.
On my not-very-reliable desktop, the performance difference is ~99% when there are no signatures, and ~25% when there is one signature.
Why would a programmer use batch verification in this case? Sometimes it makes sense to write code that handles an unknown number of signatures at once: for example, when validating a download that contains multiple objects, between zero and many of which are ed25519-signed objects. Instead of the programmer having to pick which API to use, IMO it's better just to make the "batch" API fast in all cases.
This commit adds cases for 0 and 1-signature batches to the benchmark. It also adds a case for a 2-signature batch, to verify that batch verification is faster for all n>1.
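The dispatch described above can be sketched as follows. This is a simplified illustration of the trivial-size special-casing, not the crate's actual code: the "signatures" are stand-in booleans (valid/invalid), and `verify_one`/`verify_many` are placeholders for the single-signature and full batch-verification paths.

```rust
#[derive(Debug, PartialEq)]
struct BatchError;

// Placeholder for plain single-signature verification.
fn verify_one(valid: bool) -> Result<(), BatchError> {
    if valid { Ok(()) } else { Err(BatchError) }
}

// Placeholder for the full batch path (which in the real crate
// involves random scalars and a multiscalar multiplication).
fn verify_many(sigs: &[bool]) -> Result<(), BatchError> {
    if sigs.iter().all(|&v| v) { Ok(()) } else { Err(BatchError) }
}

fn verify_batch(sigs: &[bool]) -> Result<(), BatchError> {
    match sigs.len() {
        // Empty batch: vacuously valid, skip all precomputation.
        0 => Ok(()),
        // One signature: a single plain verification beats batch setup.
        1 => verify_one(sigs[0]),
        // Two or more: the batch machinery pays off.
        _ => verify_many(sigs),
    }
}
```

The caller keeps using one API for any batch size, which is the point of the patch: the "batch" entry point stays fast in all cases.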
Actually, features cannot be used here, @huitseeker, because they are additive: any crate using "standard" cofactorless verification would silently be switched to cofactored whenever a crate using cofactored gets included.
You could use two different Signature
types for cofactored and cofactorless, like I suggested in https://github.com/dalek-cryptography/ed25519-dalek/pull/117#issuecomment-717729008 or perhaps even some runtime flag:
pub struct Signature { bytes: [u8; SIGNATURE_LENGTH], cofactored: bool }
@burdges I see your point.
I note, however, that we could make compilation blow up with something like
#[cfg(all(feature = "cofactored", feature = "cofactorless"))] // with cofactorless default
compile_error!("only one signature paradigm should be used!");
But a more interesting alternative: since cofactored verification will always accept cofactorless-correct inputs, and the discrepancy in behavior can only be observed on signatures produced outside of any (draft) standard (or outside this library), we could consider feature unification under cofactorless behavior to be a "feature" :smiling_imp:
To make things compatible with unification, we'd hence define the cofactorless verification function (still the default) under
#[cfg(all(not(feature = "cofactored"), feature = "cofactorless"))]
and the corresponding cofactored verification under
#[cfg(feature = "cofactored")]
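Concretely, the gated layout could look like the following sketch. Names and signatures are illustrative, not the crate's actual API; the verification equations referenced in the comments are the standard cofactorless check [S]B = R + [k]A and the cofactored check [8S]B = [8]R + [8k]A.

```rust
// Sketch only: how the two paths could coexist under feature unification.
// Bodies elided.

#[cfg(all(not(feature = "cofactored"), feature = "cofactorless"))]
impl PublicKey {
    /// Default: cofactorless check, [S]B == R + [k]A.
    pub fn verify(&self, msg: &[u8], sig: &Signature) -> Result<(), SignatureError> {
        /* cofactorless equation */
    }
}

#[cfg(feature = "cofactored")]
impl PublicKey {
    /// Opt-in: cofactored check, [8S]B == [8]R + [8k]A, which accepts
    /// every cofactorless-valid signature (and possibly more).
    pub fn verify(&self, msg: &[u8], sig: &Signature) -> Result<(), SignatureError> {
        /* cofactored equation */
    }
}
```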
The real risk occurs across libraries / implementations, where you could maliciously craft signatures so that one implementation disagrees with the other, not if all your code is "downgraded" to cofactorless unbeknownst to you. You would be downgraded, after all, to the recommended standard behavior.
Because the "cofactored" feature would be non-default, the feature approach gives us a meaningful way to ensure, upon cofactored opt-in, that all components of a code base switch to cofactored behavior. That comes at the admitted cost of possibly introducing a discrepancy with another implementation (e.g. ring), but:
- that's why you have the opt-in,
- I am tempted to submit a PR for cofactored verification to BoringSSL (and all the others) and replicate the feature trick in ring.
You could just name the cofactored type Signature
and the cofactorless one SignatureLegacy
or whatever; then you encourage cofactored, and nobody who reads the docs carefully gets surprised. We'd never produce a SignatureLegacy
by signing, of course, only by deserialization.
I see. That works for the purpose of bringing the ecosystem to cofactored verification, but:
- it puts the burden on dalek maintainers to keep an explicit cofactorless signature around forever, as soon as dependencies start using it,
- I'm not sure anybody "reads docs carefully", so I'm worried that it is not opt-in. Most people would be silently "switched" to cofactored, given that their current code uses
Signature
. As a consequence, we get the "reverse" dependency problem: those inattentive Dalek users would be performing cofactored verification while the rest of the ecosystem is still, largely, on cofactorless.
I suspect features cause more harm than good here, but you could have a feature that controls a type alias.
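A feature-controlled alias might look like this sketch. The concrete type names are invented for illustration; the idea is only that downstream code keeps writing `Signature` while the feature picks which verification paradigm that name resolves to.

```rust
// Hypothetical: two distinct signature types, with a feature selecting
// which one the common `Signature` alias points at.
pub struct CofactoredSignature(pub [u8; 64]);
pub struct CofactorlessSignature(pub [u8; 64]);

#[cfg(feature = "cofactored")]
pub type Signature = CofactoredSignature;

// Without the opt-in feature, `Signature` stays cofactorless.
#[cfg(not(feature = "cofactored"))]
pub type Signature = CofactorlessSignature;

// Reports which paradigm this build selected.
pub fn paradigm() -> &'static str {
    if cfg!(feature = "cofactored") { "cofactored" } else { "cofactorless" }
}
```

This keeps the switch explicit in one place, though it inherits the unification hazard discussed above: any crate in the graph enabling the feature flips the alias for everyone.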