user agents should limit to protocols listed in the registry that meet the group's requirements
The spec includes a registry for exchange protocols and has some normative requirements and some non-normative considerations regarding inclusion in that registry.
However, it's possible (in fact, almost certain, given current implementers' stated plans) that protocols that don't meet those requirements will still be accepted by browsers and passed on to wallets.
The spec should add a requirement that browsers only support requests with protocols listed in the registry (and that meet the requirements for inclusion in the registry). Or, if there is some good reason for more flexibility, there could be a SHOULD requirement, with stated acceptable reasons for those kinds of exceptions.
Without such a requirement, the effort of managing a registry of protocols and conducting privacy and security reviews would have no actual effect.
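To make the proposal concrete, here is a minimal sketch of the kind of UA-side check a MUST-level requirement would imply. The names (`REGISTERED_PROTOCOLS`, `vetRequests`) and the registry contents are hypothetical illustrations, not taken from the spec or any implementation:

```ts
// Hypothetical UA-internal logic; names and registry contents are
// illustrative only.
const REGISTERED_PROTOCOLS: ReadonlySet<string> = new Set([
  "openid4vp", // example entry; the real registry's contents may differ
]);

interface DigitalCredentialRequest {
  protocol: string;
  data: unknown;
}

function vetRequests(
  requests: DigitalCredentialRequest[],
): DigitalCredentialRequest[] {
  // Under the proposed requirement, the UA would refuse to forward any
  // request whose protocol is not listed in the registry.
  for (const r of requests) {
    if (!REGISTERED_PROTOCOLS.has(r.protocol)) {
      throw new TypeError(`Unregistered protocol: ${r.protocol}`);
    }
  }
  return requests;
}
```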
@npdoty who enforces this?
Like all other normative requirements in the spec, there would be no direct enforcement mechanism. But normative requirements make it possible to document interoperability, establish expectations for site developers, and write tests.
I think that it is entirely reasonable to insist on a certain bar for inclusion in a standard. But I think that this is covered by #255 to a degree, if not entirely.
Obviously, browsers will implement what they think meets the needs of their users. There's value to saying that the market will decide, but there are very clearly choices that exist in this space that skew toward particular outcomes, some of which are not good. We are not just allowed to have an opinion on that, but I would argue we are obligated to.
Just for clarity
> The spec includes a registry for exchange protocols (and document formats)
The spec explicitly does not deal with document formats / digital credential formats.
> Just for clarity
Thanks! Edited issue to correct.
As I brought up at the GDC session on barriers to global adoption, I would prefer to see the registries removed entirely.
A few points:
Registries
- Act as a barrier to adoption
- Squelch innovation
- Propagate Western technocolonialism
- Don't actually improve the security of the system
- Put the W3C in an inappropriately elevated position
- Are primarily a first-mover advantage that controls and limits adoption rather than advancing it
- Create a novel and unnecessary attack surface
Registries act as a barrier to adoption
It is inevitable that any registry we make will not be available in all the languages in the world. This puts anyone who is not a native English speaker at a disadvantage. This is a known problem for W3C specifications: it's hard to maintain localized versions. Further, if the information to be placed in the registry MUST be in English, that is also a barrier to those developers who work in a different language context. IMO, developers should be free to innovate in their own native tongue and use that technology regardless of whether or not they can convince the W3C to add it to a registry (for whatever reason they might be denied, including using a name that someone else also uses).
Registries squelch innovation
There's a reason there isn't a required registry of websites. In his first publication describing the web as a whole (https://cds.cern.ch/record/369245/files/dd-89-001.pdf), TBL explicitly called out the importance of NOT restricting the nature of the content on the web:
> the method of storage must not place its own restraints on the information.
and
> The system must allow any sort of information to be entered. Another person must be able to find the information, sometimes without knowing what he is looking for.
and
> Non-Centralisation: Information systems start small and grow. They also start isolated and then merge. A new system must allow existing systems to be linked together without requiring any central control or coordination.
and
> The only way in which sufficient flexibility can be incorporated is to separate the information storage software from the information display software, with a well defined interface between them. Given the requirement for network access, it is natural to let this clean interface coincide with the physical division between the user and the remote database machine. This division also is important in order to allow the heterogeneity which is required at CERN (and would be a boon for the world in general).
In short, a registry creates restrictions, which goes against the very intent of the World Wide Web: it was specifically designed to allow massive innovation without centralized authorities vetting each new feature or piece of content. This will limit adoption.
Registries propagate Western technocolonialism
This is a well-known point of historical contention. Registries for the web, specifically DNS and IANA IP blocks, have been widely criticized for enshrining a particularly Western, and even American, position of privilege. For example, the US got to decide that all .gov domains are for the US government, and the initial DNS specification couldn't handle non-Western text. Any system that defers to a single registry run by a Western-biased organization will propagate the kind of technocolonial perspective that places non-Western, non-English-speaking, non-first-moving parties at a specific, structural disadvantage that is not easily remediated. This will limit adoption.
Registries don't actually improve the security of the system
While a registry could attempt to vet any particular proposal, there is no guarantee that (a) the vetting is thorough, correct, and still valid, nor (b) that any given implementation of that protocol realizes the intended security guarantees. At best, it blesses "well-known" and popular technologies at the expense of perhaps more secure, more trustworthy alternatives. The fact that a protocol is listed in a registry is likely to create a false belief that it is of high quality and that any implementations claiming to use that protocol are also, therefore, somehow trustworthy. In reality, implementers considering which protocols to support should investigate the protocols themselves, against their own concerns, to see which best fit their needs. For any given use case, any particular protocol might be grossly inappropriate; IMO, there simply is no way for a central organization to effectively vet all the possible technologies that might be useful. This will create a false sense of security rather than invite a healthy skepticism.
Registries put the W3C in an inappropriately elevated position
Although all of us here are, necessarily, supportive of the W3C as a standards development organization (or we wouldn't be here having this debate), there is no reason to believe that the W3C has the capability and moral authority to actually stay on top of real-time vetting of protocols as imagined by this specification. The W3C is not a cybersecurity response agency. We simply don't have the operational processes in place to maintain an appropriate posture with regard to the security of any given protocol. Even if a protocol were reasonably well vetted by security-minded professionals, there is no infrastructure or process to maintain that assessment over time. The fact is, the W3C is just one of many organizations that have opinions about the quality of different protocols. Placing our own assessments over those of other, often more qualified, organizations would do a substantial disservice to the actual security of DCAPI implementations, by inducing trust because a protocol is listed in the registry despite the W3C's inherent inability to stay on top of emerging cyber threats (at least with its current operational funding and processes). Best case, the W3C can advocate for those protocols its members approve of, but asserting that such a registry is any indicator of deployed security quality would be a gross misrepresentation.
Registries are primarily a first-mover advantage that controls and limits adoption rather than advancing it
Registries fundamentally keep things not in the registry in a relegated position. Whatever rules one must satisfy to get into the registry comprise real and unavoidable barriers to anyone not already in the privileged club. If you look at the misguided DID Method registry, you'll see what I mean. In two separate incidents, the registry maintainers relied on flawed analysis that forced real innovators to adopt lesser alternatives rather than simply calling their DID method what they wanted. In one case, it was an ineffective policy about who actually controls an entry. In another, it was a flawed legal argument that allowed a community member to shout down a legitimate entry without effective due process that actually engaged the rule of law (it was an alleged trademark matter). While both of these were eventually resolved (both in a flawed manner, IMO), they illustrate that the fundamental function of the registry was NOT to embrace innovation, but rather to contain and limit it, not advance it. This will limit adoption and usefulness of the DCAPI.
In fact, any registry that is not a listing of requirements, e.g., protocols that are mandatory to implement, will only be able to exclude unlisted protocols, since inclusion in the registry won't ensure that any implementation actually supports any given protocol. With that lens, the only function of the registry is to limit the set of supported protocols.
Registries create a novel and unnecessary attack surface
Once you rely on registries, those registries become their own attack surface for security considerations. It will be possible:

a. to get a spec listed in the registry inappropriately
b. for entries in the registry to be inappropriately removed
c. for entries in the registry to be inappropriately updated
A strong set of institutional processes can minimize the impact of these attacks, but as long as there is a registry, it, by itself, creates threats that can be entirely avoided simply by designing without a registry.
IMO, all we actually need are mechanisms by which components can self-identify the protocols they support, with the browser matching both ends of the credential exchange as a dumb pipe, supporting whatever can be supported in the architecture. Which is to say, I fully realize that a deployed piece of software is not going to suddenly be able to support a new protocol, but the decision to support any protocol is far less the purview of the W3C than the responsibility of implementers. In fact, I expect there will eventually emerge meta-protocols that allow the equivalent of media codecs to be downloaded on the fly to support additional functionality. Any kind of restrictive registry would make that sort of dynamic flexibility functionally impossible.
There is sometimes an argument raised that without a registry, you can't know what protocol someone is actually referring to. This problem was solved rather elegantly with GUIDs, used by Microsoft to securely identify COM interfaces created by anyone without a centralized registry. Yes, there were directories of a sort which could give you the GUIDs for commonly used interfaces, but the technology worked: it enabled any programmer to define their own COM interface that immediately had interoperable semantics (for identification of the COM interface) so it could be used by any COM-enabled software that knows to look for that particular interface.
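As a rough sketch of that pattern (every identifier and name below is invented for illustration; this is not actual COM code):

```ts
// COM-style self-identification: the interface is named by a GUID minted
// by its author, with no central registry involved.
const IID_IDataFormatter = "b7f9d1a2-4c33-4e8a-9f1e-2d5a6c7b8e90"; // made-up GUID

interface Component {
  // Analogue of COM's QueryInterface: ask whether this component
  // implements the interface identified by the given GUID.
  queryInterface(iid: string): object | null;
}

function supports(component: Component): boolean {
  // Caller and component interoperate purely because both reference the
  // same author-minted identifier; no registry was consulted.
  return component.queryInterface(IID_IDataFormatter) !== null;
}
```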
I would suggest considering ways that we can remove the registries entirely, all of them.
> [@jandrieu] As I brought up at the GDC session on barriers to global adoption, I would prefer to see the registries removed entirely.
I see a lot of reasons you don't like registries. OK.
Do you have any suggestions of replacements for the existing registries of which you are aware, at W3C or elsewhere?
I wonder whether any (probably not all) of your objections might be solvable by adjustments to those registries?
> Do you have any suggestions of replacements for the existing registries of which you are aware, at W3C or elsewhere?
Yes. I'd separate this into two functions.
First, for run-time matching, use GUIDs or DIDs or even URIs (RDF style) to allow components to self-describe using globally unique identifiers (which painlessly refactor to URIs without needing to distinguish nuance between URIs, URNs, and URLs). This includes formats that self-describe, as well as protocols. The developers of components simply publish the globally unique identifier for their component in their specification, developers decide at dev time which components they want to support, and then implementations communicate which components they support at run time.
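A minimal sketch of what that run-time matching could look like, assuming (hypothetically) that wallets advertise the protocol identifiers they support and each request names the identifier the site wants:

```ts
// Registry-free matching sketch: identifiers are minted by protocol
// authors (GUIDs, DIDs, or URIs); all names here are hypothetical.
type ProtocolId = string;

// wallet name -> protocol identifiers that wallet declares it supports
type WalletDirectory = Map<string, Set<ProtocolId>>;

function matchingWallets(
  requested: ProtocolId[],
  wallets: WalletDirectory,
): string[] {
  // The browser acts as a dumb pipe: it only checks whether both ends
  // share a protocol identifier, without judging the protocol itself.
  return [...wallets.entries()]
    .filter(([, supported]) => requested.some((id) => supported.has(id)))
    .map(([name]) => name);
}
```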
Second, for discovery, use directories, search engines, and LLMs. Developers, at dev time, are going to use the web and associated mechanisms to learn about interesting components, just as we do today for nearly all technology we incorporate in our applications. The thing about these kinds of services is that there are no gatekeepers for the meta platform: each directory, search engine, or LLM gets to decide which components it indexes. Reciprocally, developers can use any discovery tool to find the technology they are interested in, without deference to a systemic limitation on the platform. Decentralized discovery is how the Web works. Let's keep using that rather than impose the W3C as a gatekeeper for systems built on its technology.
FWIW, I don't consider automated discovery to be a problem worth solving, as streamlining automated downloads of untrusted algorithms without human evaluation is a known security problem. Manual discovery through the web helps ensure that developers are in the loop when incorporating components. Perhaps more to the point: I don't think the convenience of automated discovery is worth the centralizing costs, especially in light of the increased security risks of adding that capability.
> I wonder whether any (probably not all) of your objections might be solvable by adjustments to those registries?
First, turning them into directories for discovery would be a big help. There's nothing wrong with a spec listing related technologies as an aid to developers. It's calling it a "registry" that leads to conversations like this, where people expect registration to be required and hence it seems reasonable to attach normative requirements to listing in the registry, which is what is proposed in this issue.
Second, remove any requirements other than editorial review for inclusion, making it clear that the list is a curated list maintained by specific editors, and not an official registry that has normative impact. I don't mind if the editors have opinions, but I do find it challenging when rules are imposed on the registry to exclude "bad" listings for some obscure, political, or ideological reason, and hence establish a false notion of approval by the W3C as a whole rather than the independent opinion of respected editors.
@jandrieu do you want to open a separate issue on whether we should either eliminate the registry, or eliminate all requirements for adding to the registry? That seems like a very different issue from what was proposed here.
Personally I believe security and privacy reviews that are imperfect are a large improvement over no security and privacy reviews at all. Removing all semantics and review from what is passed in the protocol would instead seem to make this a generic API to pass arbitrary data to/from other installed apps. There would certainly be many use cases for that, but it would be dangerous and even harder to explain to users.
But if the WG or implementers were to decide based on this issue that they didn't want any limits on what protocols were supported then the registry and review process might become superfluous anyway.
> Personally I believe security and privacy reviews that are imperfect are a large improvement over no security and privacy reviews at all. Removing all semantics and review from what is passed in the protocol would instead seem to make this a generic API to pass arbitrary data to/from other installed apps. There would certainly be many use cases for that, but it would be dangerous and even harder to explain to users.
Continuing with another reflection that emerged at GDC, we could use an approach similar to that of EME, where there is no registry, but there are Implementation Requirements: rather than burdening the review process, the spec states requirements to be met in terms of privacy, security, and other matters.
> @jandrieu do you want to open a separate issue on whether we should either eliminate the registry, or eliminate all requirements for adding to the registry? That seems like a very different issue from what was proposed here.
Happy to move this wherever would be best. I was pointed here in particular, but don't mind making a new issue.
> Personally I believe security and privacy reviews that are imperfect are a large improvement over no security and privacy reviews at all. Removing all semantics and review from what is passed in the protocol would instead seem to make this a generic API to pass arbitrary data to/from other installed apps. There would certainly be many use cases for that, but it would be dangerous and even harder to explain to users.
I'm curious why you think that it's harder to explain than, say, using HTTPS for AJAX requests. The user's involvement, IMO, is largely picking the wallet providers they trust so the API can offer those wallets as potential credential suppliers. Expecting users to actually understand the security implications of a registry seems like the harder problem, the more so the more complicated the registry rules are. Most users will never see the registry nor be exposed to any of its complexities. Either the browser and wallet share a common protocol or they don't. And all that will happen for non-conformant browsers and wallets is that they get new bespoke features that aren't yet available to those who limit their implementations to registered components.
> But if the WG or implementers were to decide based on this issue that they didn't want any limits on what protocols were supported then the registry and review process might become superfluous anyway.
Agreed. That's one of the logistical benefits of avoiding a registry. Some concerns just go away.
IMO, the question is "are we actually addressing the concerns people think we're addressing by creating a registry?" I posit that we are, in fact, limiting innovation (and hence innovative improvements in security and privacy) with a mechanism that won't, itself, improve the security or privacy of any of the protocols.
The registries aim for user agents to support protocols and formats that meet high privacy and security criteria, making them eligible for inclusion in the spec. While user agents are not required to support all or any of these formats, those listed have met the inclusion criteria.
User agents can, of course, support other formats, as W3C specs are merely "recommendations," but the formats in the registry have been vetted by the W3C community and are considered acceptable and in line with the values and the privacy and security expectations of the W3C community.
> EME, where there is no registry, but there are Implementation Requirements: rather than burdening the review process, the spec states requirements to be met in terms of privacy, security, and other matters.
The DRM modules are different because they make "pinky promises" not to do bad things... the formats here are actual standards that can be checked to verify that they actually don't do bad things.
> User agents can, of course, support other formats, as W3C specs are merely "recommendations," but the formats in the registry have been vetted by the W3C community and are considered acceptable and in line with the values and the privacy and security expectations of the W3C community.
Yes, it's true that implementers can do whatever they want, including violating normative requirements in the spec as proposed here. But if we don't normatively require limiting to reviewed protocol specs, interoperability, privacy, and security will all be harmed. In particular, a UA could be compliant with all the requirements in this spec while most of the privacy considerations and expectations documented here wouldn't apply, because sites and UAs could just use an exchange protocol with totally different features that don't satisfy any of our common expectations for enabling privacy or undergoing review.
If UAs plan on just implementing whatever protocol, then it seems hard to justify why volunteers should invest a lot of time doing privacy and security reviews of protocols, or why protocol authors would bother to submit them for review and inclusion in the registry, or why readers should expect the privacy considerations in this spec to be informative.
> If UAs plan on just implementing whatever protocol, then it seems hard to justify why volunteers should invest a lot of time doing privacy and security reviews of protocols, or why protocol authors would bother to submit them for review and inclusion in the registry, or why readers should expect the privacy considerations in this spec to be informative.
I fully agree. I don't see the value of the registry if it doesn't impact UAs at all. We don't have to go as far as blocking unregistered protocols, but maybe UAs should show a warning to users for requests with unregistered protocols? The bottom line is: until we define the impact/value of the registry, it's hard to justify the investment!
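As a sketch of that warn-instead-of-block idea (hypothetical UA logic; the names are invented):

```ts
// Hypothetical UA behavior: forward the request but surface a warning
// when the protocol is not in the registry, rather than blocking it.
function warnIfUnregistered(
  protocol: string,
  registered: ReadonlySet<string>,
): void {
  if (!registered.has(protocol)) {
    console.warn(
      `Protocol "${protocol}" is not in the registry; its privacy and ` +
        "security properties have not been reviewed.",
    );
  }
}
```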