circl
kem: Use case for deterministic encapsulation
The kem interface requires deterministic encapsulation. Is there a use case for this functionality?
cc: @chris-wood @bwesterb
If there is no use case, I strongly suggest we remove it.
The Go TLS tests require the code to be deterministic. In the normal TLS code, a rand io.Reader is passed around, which is set to something deterministic during the tests. To keep KEM TLS from breaking the Go TLS tests we need a method of deterministic generation as well.
One way to handle this is to pass a random source to GenerateKey? (We have talked about this in the past; this is another chance to think about it again.)
Yes, by passing a random source or optional seed to both GenerateKey and Encapsulate. I considered that, but I prefer the common interface to be as simple as possible. The user shouldn't have to think about whether to pass nil or crypto/rand.Reader. The downside is a little more work when adding new KEMs, but the upside is that the end user is less likely to make a mistake.
The Go TLS tests require the code to be deterministic.
I really don't think exporting functions for internal tests is a good reason to impact the API. I suggest we investigate the alternative -- passing random sources to the relevant functions -- instead. This seems fairly idiomatic in the Go APIs that I'm aware of, and also doesn't needlessly extend the API.
I really don't think exporting functions for internal tests is a good reason to impact the API.
I'm pretty sure deterministic key generation is useful in some protocols. For instance when you're implicitly authenticating with a KEM key pair derived from a password.
This seems fairly idiomatic in Go APIs that I'm aware of,
It certainly is idiomatic, but I think it's bad design.
- For almost all use cases, using crypto/rand.Reader for randomness is what you want. The user shouldn't have to think about this and certainly shouldn't easily be able to make the mistake of providing the wrong randomness.
- In this case we know how much randomness we need. Providing the randomness as a seed (in case the user does want to provide it) is simpler and more to the point than providing a Reader.
It's not my call, but I want to throw my support behind doing away with the *Deterministic(seed []byte) idiom and passing an io.Reader (typically crypto/rand.Reader) to randomized algorithms.
While I agree with @bwesterb's point that this API is not misuse-resistant, the fact that io.Reader is idiomatic in Go is important, because it means that users are primed to have a certain expectation of how APIs for randomized algorithms work. We should meet those expectations wherever possible. To prevent misuse, we need to encourage users to do the right thing. In particular, examples and test code (except for the test vectors, which use a particular sequence of coin flips) should use crypto/rand.Reader.
To the point about the KEMs in CIRCL having a fixed-length seed as input: this is true of the current set of KEMs, but there are lots of reasons why KEMs might use, potentially, an unbounded number of coins.
Another point in favor of using io.Reader is that crypto/tls expects to be able to specify the randomness used for every operation.
To the point about the KEMs in CIRCL having a fixed-length seed as input: this is true of the current set of KEMs, but there are lots of reasons why KEMs might use, potentially, an unbounded number of coins.
In that case I'm pretty sure that the designers will use SHAKE or another PRNG to derive the randomness from a single seed, as happens, for instance, in Dilithium during signing.
That may be true, but I doubt it will be true for all KEMs that get standardized going forward. I think it's better to be general.
@armfazh how do we want to proceed here? Can we remove the deterministic APIs?
I propose to split the current kem.Scheme interface into two smaller interfaces:
type Scheme interface {
	Name() string
	GenerateKeyPair(rand io.Reader) (PublicKey, PrivateKey, error)
	Encapsulate(rand io.Reader, pk PublicKey) (ct, ss []byte, err error)
	Decapsulate(sk PrivateKey, ct []byte) ([]byte, error)
	UnmarshalBinaryPublicKey([]byte) (PublicKey, error)
	UnmarshalBinaryPrivateKey([]byte) (PrivateKey, error)
	CiphertextSize() int
	SharedKeySize() int
	PrivateKeySize() int
	PublicKeySize() int
}
and
type SchemeWithSeed interface {
	SeedSize() int
	DeriveKeyPair(seed []byte) (PublicKey, PrivateKey)
	EncapsulationSeedSize() int
	EncapsulateDeterministically(pk PublicKey, seed []byte) (ct, ss []byte, err error)
}
Also, it seems to me that SchemeWithSeed can be implemented automatically by mocking the random source passed to the GenerateKeyPair and Encapsulate functions.
type seededReader struct{ seed []byte }

func (s seededReader) Read(p []byte) (n int, err error) { /* use the seed, or SHAKE on the seed */ }
thoughts?
I like the proposed change to Scheme. I'd say there's no need for the new SchemeWithSeed -- it suffices to just pass a seededReader as the io.Reader, as you suggest.
thoughts?
Using a seed instead of an io.Reader is only a small advantage, which on its own I don't think is worth complicating the API for. My primary motivation for having two versions is that one of them doesn't need any randomness input from the user at all; even though requiring one is idiomatic in Go's standard library, it's in my opinion bad design.