Privacy has not yet been taken into account in ARF 1.4
Privacy has not yet been taken into account in ARF 1.4. At a minimum, the following two properties should be included as mandatory objectives of the architecture:
- Full unlinkability
- Everlasting privacy (also known as « unconditional privacy » in the academic literature)
A basic definition of unlinkability is the prevention of tracking through metadata or any other information not willingly revealed in clear by the user, even across subsequent transactions.
Everlasting privacy means that, whatever happens after a transaction has taken place, no one will be able to mathematically retrieve any information about the hidden (meta)data used in the transaction: not if the private key used for the transaction is revealed, not if an adversary has an unlimited amount of computing power, and not if they can use a quantum computer. This property ensures that users feel safe performing transactions, knowing that their privacy will be preserved in the long term.
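To make this property concrete, here is a standard textbook illustration (ours, not something specified in the ARF): a Pedersen commitment hides the committed value information-theoretically, which is exactly the kind of guarantee everlasting privacy asks for.

```latex
% Pedersen commitment: group of prime order q, generators g and h = g^x, with x
% unknown to anyone; commit to a value m using fresh randomness r:
\[
  C = g^{m} h^{r} = g^{\,m + x r}
\]
% Perfect ("everlasting") hiding: for every alternative value m' the randomness
% r' = r + (m - m')\,x^{-1} \bmod q opens C to m' as well, since
\[
  g^{m'} h^{r'} = g^{\,m' + x r'} = g^{\,m + x r} = C,
\]
% so C is consistent with every possible value and leaks nothing about m, even
% to an adversary with unlimited (or quantum) computing power; only the binding
% side of the commitment relies on a computational assumption.
```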
These objectives do not imply that users would be anonymous in all transactions, but ensure that the framework enables appropriate privacy depending on service needs.
Unlinkability shall be ensured:
- Not only across verifiers but also between any number of actors (including verifiers and issuers) and over time
- In all processes, including the revocation process, i.e. whenever the user must prove that their assets ((Q)EAA, wallet, etc.) have not been revoked

From an architecture point of view, the underlying infrastructure shall not add any linkability on top of what the service has decided to share, especially since the user has no knowledge or understanding of the tracking information that might be spread by the underlying infrastructure and affect their privacy (especially when combined with "selective disclosure"). This involves identifying efficient and generic ways of randomizing all the (meta)data that service purposes do not explicitly require in clear. For example, this means that (a toy illustration follows this list):
- The signature provided by the issuer to the holder shall not be presented identically to the verifier but shall be "randomized"
- The same applies to the public key embedded in the signed (Q)EAA
- Metadata such as the expiration date shall be hidden when shared with the verifier
- To verify the validity of a verifiable proof, the verifier shall not have to request a non-randomized identifier from the holder, which would create linkability
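The linkability problem can be made concrete with a deliberately simplified sketch. The code below is a toy model (hypothetical names, no real cryptography or verification): it only shows that a signature presented verbatim is a stable correlation handle across verifiers, whereas a presentation that folds in fresh randomness is not. In a real anonymous-credential scheme such as BBS, the randomized value remains verifiable against the issuer's public key.

```python
"""
Toy illustration (not a real credential protocol): why presenting the issuer's
signature verbatim makes users linkable, and why per-presentation randomization
removes that correlation handle. All names here are hypothetical.
"""
import hashlib
import secrets

# The issuer signs the credential once; the signature bytes never change.
issuer_signature = hashlib.sha256(b"credential-of-alice").hexdigest()

def present_static() -> str:
    # Naive presentation: the holder hands over the signature as-is.
    return issuer_signature

def present_randomized() -> str:
    # Unlinkable-style presentation: fresh randomness is folded in every time,
    # so no two presentations share a value that verifiers could match on.
    nonce = secrets.token_bytes(32)
    return hashlib.sha256(nonce + issuer_signature.encode()).hexdigest()

# Two colluding verifiers compare what they received from the same holder.
seen_by_verifier_a = present_static()
seen_by_verifier_b = present_static()
print("static presentations linkable:", seen_by_verifier_a == seen_by_verifier_b)       # True

seen_by_verifier_a = present_randomized()
seen_by_verifier_b = present_randomized()
print("randomized presentations linkable:", seen_by_verifier_a == seen_by_verifier_b)   # False
```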
In addition, a full implementation of privacy by design for eIDAS should include:
- Blind signature capabilities for the issuers (a textbook sketch follows this list)
- The capability to generate minimized proofs (e.g. "I am over 18" instead of "here is my date of birth, look at it and you will see I am over 18") and, more generally, work on predicates to which the user just answers yes or no
- The capability to authenticate users with self-generated pseudonyms
- These should be accompanied by all necessary proofs, including proof that the pseudonym is generated as it should be and is linked to the legitimate user
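As one concrete, deliberately simplified illustration of the blind-signature building block mentioned above, the following Python sketch implements the classic textbook RSA blind signature. It is our own toy example with tiny, insecure parameters (no padding, no hashing discipline), not a proposal for the ARF; production schemes would use properly parameterized, formally analysed protocols.

```python
"""
Textbook RSA blind signature, for illustration only: the issuer signs a value
without ever seeing it, one building block for issuance that does not let the
issuer trace where the credential is later used.
"""
import hashlib
import math
import secrets

# --- Issuer key (toy primes, far too small for real use) ---------------------
p, q = 1009, 1013
n, e = p * q, 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                          # private exponent (Python 3.8+)

# --- Holder blinds the message ------------------------------------------------
message = int.from_bytes(hashlib.sha256(b"attribute to certify").digest(), "big") % n
while True:
    r = secrets.randbelow(n - 2) + 2         # blinding factor, coprime with n
    if math.gcd(r, n) == 1:
        break
blinded = (message * pow(r, e, n)) % n       # the issuer only ever sees this

# --- Issuer signs the blinded value (learns nothing about `message`) ----------
blind_signature = pow(blinded, d, n)

# --- Holder unblinds; anyone can verify with the public key -------------------
signature = (blind_signature * pow(r, -1, n)) % n
print("blind signature verifies:", pow(signature, e, n) == message)   # True
```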
The framework could also include plausible deniability, i.e. the user being able to credibly claim that they did not perform a transaction which they actually did perform, without anyone being able to prove the contrary. However, plausible deniability is not compatible with audit requirements, so each service shall define whether, and for which transactions, plausible deniability applies.
Furthermore, all of this should be done without impacting security; in particular, the following shall hold true:
- It shall not be technically possible for the WSCD to generate proofs independently, without the user being involved
- (Q)EAA shall be holder-bound in a process leveraging a SOG-IS-listed protocol running on a WSCD
- Formal proofs of security and privacy shall be provided for the protocols used
The solution shall be available to as many people as possible immediately at launch.
On common hardware, the calculations necessary for the transaction shall not take more than 100 ms, so that, taking other latency factors into account (e.g. checking the non-revocation of VCs), the full process takes less than 500 ms.
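A latency requirement of this kind could be checked with a simple harness like the hypothetical sketch below; the 100 ms on-device figure and the 500 ms end-to-end target come from the text above, and `generate_presentation` is a placeholder for whatever proof generation the chosen protocol requires.

```python
"""Hypothetical latency-budget check, illustrative only."""
import time

ON_DEVICE_BUDGET_S = 0.100      # cryptographic computation on the wallet
END_TO_END_BUDGET_S = 0.500     # including network, revocation checks, etc.

def generate_presentation() -> bytes:
    # Placeholder: stands in for credential randomization + proof generation.
    time.sleep(0.020)
    return b"presentation"

start = time.perf_counter()
generate_presentation()
elapsed = time.perf_counter() - start
print(f"on-device time: {elapsed * 1000:.1f} ms "
      f"({'within' if elapsed <= ON_DEVICE_BUDGET_S else 'over'} the 100 ms budget, "
      f"leaving {(END_TO_END_BUDGET_S - elapsed) * 1000:.0f} ms for the rest of the flow)")
```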
For a future-proof framework, the protocols used shall be PQC-resistant, without impacting any of the other properties.
In conclusion, there is no "light touch" way to solve the privacy challenges of the current ARF:
- The privacy strategy and objectives shall be defined in a dedicated chapter and shall at least include full unlinkability and everlasting privacy
- The protocols shall be amended: in their current form, mDL and SD-JWT cannot structurally support unlinkability or everlasting privacy. We recommend relying on anonymous credential protocols, particularly those of the BBS family, which are the most efficient.
We have raised similar concerns previously (here) and are aware that such concerns echo those raised in an open letter published by renowned academics on 23/11/2023, which stated: “Finally, the mobile wallet part of the regulation mentions in multiple places the need for the European Digital Identity Wallet to protect privacy, including data minimization, and prevention of profiling. However, our concerns remain that the draft regulation still enables large-scale tracking of citizens based on government-issued identifiers. Our concern that unobservability (towards the Wallet provider) and unlinkability are not sufficiently assured has not been addressed: this means the technical implementation will decide on core privacy safeguards and that relying parties will choose the Member State with the weakest protection. The current reference architecture does not use the state-of-the-art technologies such as anonymous credentials that have been developed more than 20 years ago. It is clear that, once mobile wallets are rolled out on a large scale, it will become exceedingly difficult to make further changes.” Such concerns remain valid with the current version of the ARF.