#850: Specification review request for Verifiable Credential Data Integrity
Discussions
Discussed
Jun 12, 2023 (See Github)
we begin looking at the explainer
bumped to next week
Discussed
Jun 19, 2023 (See Github)
bumped
Discussed
Jul 1, 2023 (See Github)
Hadley: are the cryptosuites normative dependencies?
Amy: the cryptosuite specs depend on the Data Integrity spec, but not the other way around
discussion of how RDF canonicalization fits in
Hadley: there's a json canonicalization process for plain json, and rdf canonicalization can be used if you have rdf
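To make the plain-JSON path concrete, here is a minimal TypeScript sketch of the idea behind JSON canonicalization: serialize with member names sorted so that logically equivalent documents produce identical bytes before hashing and signing. This is a simplified illustration, not a conforming JCS (RFC 8785) or RDF canonicalization implementation.

```ts
// Minimal sketch (not full JCS): canonicalize a plain JSON value by
// recursively sorting object member names, so that two logically equivalent
// documents serialize to the same string before hashing/signing.
function canonicalize(value: unknown): string {
  if (value === null || typeof value !== "object") {
    return JSON.stringify(value);
  }
  if (Array.isArray(value)) {
    return "[" + value.map(canonicalize).join(",") + "]";
  }
  const obj = value as Record<string, unknown>;
  const members = Object.keys(obj)
    .sort()
    .map((key) => JSON.stringify(key) + ":" + canonicalize(obj[key]));
  return "{" + members.join(",") + "}";
}

// Both inputs canonicalize to the same string, regardless of key order.
console.log(canonicalize({ b: 1, a: { d: 2, c: 3 } }));
console.log(canonicalize({ a: { c: 3, d: 2 }, b: 1 }));
```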
Amy: looks like they've considered complexity tradeoffs against the use cases they want to solve
looking at how cryptosuite specs fit with the Data Integrity spec... Data Integrity specifies how to write a cryptosuite spec, and any cryptosuite can then be plugged in
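As a rough, non-normative illustration of that plug-in point (property names taken from the Data Integrity draft; consult the spec for normative details), a proof names the cryptosuite that produced it:

```ts
// Non-normative sketch: the rough shape of a Data Integrity proof. The
// `cryptosuite` property is the pluggable part; any suite written against the
// Data Integrity spec (e.g. "eddsa-2022", "ecdsa-2019") can be slotted in.
interface DataIntegrityProof {
  type: "DataIntegrityProof";
  cryptosuite: string;        // e.g. "eddsa-2022"
  created?: string;           // XML Schema dateTime
  verificationMethod: string; // URL identifying the public key material
  proofPurpose: string;       // e.g. "assertionMethod"
  proofValue: string;         // multibase-encoded signature value
}
```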
The following language was deemed to be contentious: The specification MUST provide a link to an interoperability test report to document which implementations are conformant with the cryptographic suite specification.
The Working Group is seeking feedback on whether or not this is desired given the important role that cryptographic suite specifications play in ensuring data integrity.
Interested to hear more about both sides of that argument..
discussion about how this could be used as a general mechanism, and why it might be focussed on VCs (because it's impossible to charter a group for a general data integrity mechanism?). In the spec:
While this specification primarily focuses on Verifiable Credentials, the design of this technology is generalized, such that it can be used for non-Verifiable Credential use cases. In these instances, implementers are expected to perform their own due diligence and expert review as to the applicability of the technology to their use case.
<blockquote> We (@rhiaro and I) reviewed this in our virtual face-to-face this week.
First of all, we'd like to thank you for the clarity and conciseness of your explainer. Thanks!
The architecture which enables use of different cryptosuites depending on needs seems sensible. How does this affect interoperability? Is a verifiable claim from an implementation using one cryptosuite readable by an implementation using another?
We noted the contentious issue around requiring interoperability reports for cryptosuite specifications, and wondered what the different sides of the argument are for that.
We also see that you're not rolling your own crypto in this architecture and want to applaud that. Sensible choice.
Also noting that the specification could be put to general use, rather than being suitable only for VCs. Have you considered how to expand this work for other use cases? And have you thought about preceding work like XML signatures? If so, how does it feed into your thinking now?
</blockquote>
Discussed
Jul 3, 2023 (See Github)
bumped
Comment by @hadleybeeman Aug 3, 2023 (See Github)
Hi, @msporny @dmitrizagidulin @martyr280 @dlongley @brentzundel @Sakurann @iherman @philarcher @peacekeeper @pchampin!
We (@rhiaro and I) reviewed this in our W3C TAG virtual face-to-face this week.
First of all, we'd like to thank you for the clarity and conciseness of your explainer. Thanks!
The architecture which enables use of different cryptosuites depending on needs seems sensible. How does this affect interoperability? Is a verifiable claim from an implementation using one cryptosuite readable by an implementation using another?
We noted the contentious issue around requiring interoperability reports for cryptosuite specifications, and wondered what the different sides of the argument are for that.
We also see that you're not rolling your own crypto in this architecture and want to applaud that. Sensible choice.
Also noting that the specification could be put to general use, rather than being suitable only for VCs. Have you considered how to expand this work for other use cases? And have you thought about preceding work like XML signatures? If so, how does it feed into your thinking now?
Comment by @msporny Aug 5, 2023 (See Github)
> We (@rhiaro and I) reviewed this in our W3C TAG virtual face-to-face this week.
Wonderful, thank you for the quick turnaround on the review and your comments. Responses to your questions below...
> First of all, we'd like to thank you for the clarity and conciseness of your explainer. Thanks!
Good, glad it was helpful. :)
> The architecture which enables use of different cryptosuites depending on needs seems sensible. How does this affect interoperability? Is a verifiable claim from an implementation using one cryptosuite readable by an implementation using another?
There are at least three mechanisms that the ecosystem utilizes to increase interoperability when multiple cryptosuites are in play.
Implement Multiple Cryptosuites
The first mechanism that verifier implementations typically use is implementing more than one cryptosuite, so that they can verify several common cryptosuites. For example, implementations will support both EdDSA and ECDSA cryptosuites. You can see this in action in the pre-Candidate Recommendation test suites today: there are currently three independent implementations that support both the "ecdsa-2019" and the "eddsa-2022" cryptosuites, which use different key types, cryptography, and signature formats.
https://w3c-ccg.github.io/vc-di-eddsa-test-suite/#eddsa-2022%20cryptosuite%20(verifier) https://w3c-ccg.github.io/vc-di-ecdsa-test-suite/#ecdsa-2019%20cryptosuite%20(verifier)
These implementations will be able to verify signatures from either cryptosuite.
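A hedged sketch of that first mechanism in TypeScript: the verifier registers several cryptosuite implementations and dispatches on the proof's cryptosuite identifier. The verify functions below are hypothetical stand-ins, not calls into any particular library.

```ts
// Hypothetical verifier that supports more than one cryptosuite and picks the
// right implementation based on the proof's `cryptosuite` property.
type Proof = { cryptosuite: string; proofValue: string; [key: string]: unknown };
type VerifyFn = (document: object, proof: Proof) => Promise<boolean>;

// Stand-ins for real cryptosuite implementations (assumed, not a real API).
declare function verifyEddsa2022(document: object, proof: Proof): Promise<boolean>;
declare function verifyEcdsa2019(document: object, proof: Proof): Promise<boolean>;

const supportedSuites = new Map<string, VerifyFn>([
  ["eddsa-2022", verifyEddsa2022],
  ["ecdsa-2019", verifyEcdsa2019],
]);

async function verifyProof(document: object, proof: Proof): Promise<boolean> {
  const verify = supportedSuites.get(proof.cryptosuite);
  if (!verify) {
    throw new Error(`Unsupported cryptosuite: ${proof.cryptosuite}`);
  }
  return verify(document, proof);
}
```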
Parallel Signatures Using Multiple Cryptosuites
The second mechanism that issuer implementations can use is digitally signing a single payload with multiple cryptosuites in parallel. This approach is explained in the proof sets section of the Data Integrity specification and elaborated upon in the section on Proof Agility and Layering. Fundamentally, this mechanism increases interoperability by allowing a verifier to select from a variety of cryptosuite signatures on a single payload, increasing the chances that it can use at least one of them to verify the payload. When a verifier requests a payload through a protocol of some kind, it can convey which cryptosuites it supports to the sender, which can then select from the available cryptosuite signatures to ensure that it sends one the verifier can understand. For one mechanism that is used to convey supported cryptosuites, see the section on acceptedCryptosuites in the DID Authentication Query Format of the Verifiable Presentation Request protocol specification.
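For illustration only (values and exact field layout are assumptions, not normative examples), a credential carrying a proof set with two parallel signatures, and a presentation request query advertising which cryptosuites a verifier accepts, might look roughly like this:

```ts
// A single credential with a "proof set": two parallel Data Integrity proofs
// created with different cryptosuites. A verifier that implements either
// suite can verify the credential. Values are placeholders.
const credentialWithProofSet = {
  "@context": ["https://www.w3.org/ns/credentials/v2"],
  type: ["VerifiableCredential"],
  issuer: "did:example:issuer",
  credentialSubject: { id: "did:example:subject" },
  proof: [
    {
      type: "DataIntegrityProof",
      cryptosuite: "eddsa-2022",
      verificationMethod: "did:example:issuer#key-1",
      proofPurpose: "assertionMethod",
      proofValue: "z3FXQ...", // placeholder
    },
    {
      type: "DataIntegrityProof",
      cryptosuite: "ecdsa-2019",
      verificationMethod: "did:example:issuer#key-2",
      proofPurpose: "assertionMethod",
      proofValue: "zQeVb...", // placeholder
    },
  ],
};

// Approximate shape of a Verifiable Presentation Request query that tells the
// sender which cryptosuites the verifier accepts; consult the VPR
// specification for the normative layout.
const didAuthenticationQuery = {
  type: "DIDAuthentication",
  acceptedCryptosuites: [{ cryptosuite: "eddsa-2022" }, { cryptosuite: "ecdsa-2019" }],
};
```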
Reducing Cryptosuite Optionality
The third mechanism, which is utilized by the Data Integrity specification itself, is reducing the optionality that a developer can pick from. Some cryptographic systems expose quite a number of "tunable buttons and knobs" that, while powerful and flexible, can also expose developers without a background in cryptography to combinations of unsafe choices. Cryptosuite specification authors are urged to protect application developers by limiting that optionality and picking sensible defaults for a given cryptosuite version.
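A small sketch of that design stance (field names are illustrative, not taken from the specification): a cryptosuite version pins its primitives rather than exposing them as options.

```ts
// Illustrative only: a cryptosuite version hard-codes its canonicalization,
// digest, and signature algorithm, leaving application developers with no
// cryptographic knobs to misconfigure.
const eddsa2022Suite = Object.freeze({
  name: "eddsa-2022",
  canonicalization: "RDFC-1.0", // fixed by the suite, not configurable
  digest: "SHA-256",            // fixed by the suite, not configurable
  signature: "Ed25519",         // fixed by the suite, not configurable
});
```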
While the three mechanisms described above are designed to increase interoperability, there is always the danger that a large number of cryptosuites will be developed that do not interoperate (or are not widely implemented). The group acknowledges this risk and believes that it's largely addressed by 1) developers picking cryptosuites that have been vetted by standards setting organizations, 2) national standards that require certain types of cryptography to be used, and 3) market forces that gravitate towards a small set of cryptosuites if a larger set were to be implemented.
> We noted the contentious issue around requiring interoperability reports for cryptosuite specifications, and wondered what the different sides of the argument are for that.
The argument for it is: "In order to increase interoperability, implementations need to be tested against a stable baseline that the specification authors and community agree to."
The argument against it was: "We don't want to conflate the people writing the specification with the people writing the tests. There shouldn't be a single test suite or community that is in charge of test suites in case the specification author is negligent or another community is doing a better job at testing."
The latter was certainly the minority opinion in the discussion, which happened a long time ago, so we might try to add that language back into the specification if the TAG were to recommend that we do that.
> We also see that you're not rolling your own crypto in this architecture and want to applaud that. Sensible choice.
Thank you for that observation. :)
> Also noting that the specification could be put to general use, rather than being suitable only for VCs. Have you considered how to expand this work for other use cases?
Yes, the Data Integrity specification is meant to be a generalized data integrity technology and is not only applicable to VCs. This is mentioned briefly in a NOTE at the bottom of the Design Goals and Rationale section. You can apply it, today, to any JSON or JSON-LD payload, and to any RDF syntax payload with some trivial modifications.
When the work was proposed, it was suggested that we should focus on a narrow use case with well defined boundaries. The work was originally slated for a "Linked Data Security WG", but then retargeted to the Verifiable Credentials WG. If the v1.0 work is successful, a better home for the work would probably be a W3C WG that is dedicated to Data Integrity and other security technologies (such as Authorization Capabilities and privacy-preserving cryptosuites).
> And have you thought about preceding work like XML signatures? If so, how does it feed into your thinking now?
Yes, XML Digital Signatures was studied in depth before embarking on the Data Integrity work; we speak to that in the Transformations section (especially in the issue at the very bottom of that section). We have attempted to take great care in avoiding the mistakes made during the XML Digital Signatures work, some of which include:
- DI does not allow stylesheets, DTDs, or schema languages that might modify the data that is signed -- XMLDsig had to deal with things like stylesheets, schemas, DTDs, namespaces, namespaced attributes and a variety of other items that also modified the XML structures being canonicalized and signed. There was no way to tell which was applied, in what order, to get the final canonicalization and signature.
- DI does not allow variations of the information model to be signed -- XMLDsig had to deal with transformations such as XSLT, which affected the construction of the XML tree in ways that were not easy to predict, making it unclear which version/evaluation of the XML tree was (or should have been) signed.
- DI uses simpler signed formats (NQuads or JSON) and DOES NOT modify text value strings. We rely on NQuads (for RDF) or JSON (for JSON) as the final canonicalized format which does not inherit any of the more complex internal text node canonicalization rules that XML DSig had to deal with, such as performing internal XML text node canonicalization for whitespace, line endings, charset encoding, word wrapping and specialized escape sequences.
- DI ensures composability by separating the syntax (JSON) from the data model (JSON or RDF) from canonicalization (JCS or RDFC) and signing/verification (ECDSA, EdDSA); see the sketch below -- a number of XML implementations bundled these steps or the signature mechanisms into the toolchain (because the toolchain processed the data model in a way that impacted the signatures, so bundling was the easiest way to guarantee stable signatures), making it impossible to compose different processing and securing mechanisms.
- DI protects the entire payload -- Variations of XMLDSig also allowed partial-signing of data where the headers were signed, but the content was not, leading to security vulnerabilities.
There are other smaller lessons learned from XMLDSig that impacted the work on Data Integrity, but the ones above are the biggest take-aways that we have considered.
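To make the composability point in the list above concrete, here is a minimal TypeScript sketch (hypothetical function and type names, not a real library API) of a canonicalize-hash-sign pipeline in which each stage can be swapped independently:

```ts
import { createHash } from "node:crypto";

// Each stage is an independent, swappable function: canonicalization
// (e.g. JCS or RDFC-1.0), hashing, and signing (e.g. Ed25519 or ECDSA P-256)
// are composed rather than bundled into a single toolchain.
type Canonicalizer = (document: object) => string;
type Signer = (data: Uint8Array) => Promise<Uint8Array>;

async function createSignature(
  document: object,
  canonicalize: Canonicalizer,
  sign: Signer
): Promise<Uint8Array> {
  const canonicalForm = canonicalize(document); // stage 1: canonical form
  const digest = createHash("sha256")           // stage 2: hash
    .update(canonicalForm)
    .digest();
  return sign(new Uint8Array(digest));          // stage 3: sign
}
```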
Thank you, again, for the review and your time. It is very much appreciated! :)
Discussed
Oct 23, 2023 (See Github)
Hadley: don't see any problems. Minor niggles on their response about use cases. Can easily see this being an opportunity for joining things up further down the line. Having said that, it's hypothetical - no particular overlap they should be working on. Just makes me nervous about the self-imposed vacuum. Not concrete enough to put that back on the WG. We can sign this off.
Discussed
Nov 27, 2023 (See Github)
Yves: only about defining some usable cryptographic capabilities. I think this one should be OK.
Hadley: the spec link they gave us was to a CR snapshot...
Yves: there are CR snaps and CR snapshots... It's been through the CR "gate".
<blockquote> Thanks for your extensive replies, @msporny. We note that your process allowing a "verifier to select from a variety of cryptosuite signatures on a single payload" does improve interop, and we are happy to see that.
We have no further feedback on this particular review.
</blockquote>
Comment by @hadleybeeman Nov 28, 2023 (See Github)
Thanks for your extensive replies, @msporny.
We note that your process allowing a "verifier to select from a variety of cryptosuite signatures on a single payload" does improve interop, and we are happy to see that.
We have no further feedback on this particular review.
Opened May 28, 2023
The Verifiable Credentials Working Group is requesting a TAG review of Verifiable Credential Data Integrity and two Data Integrity Cryptosuite specifications (EdDSA and ECDSA).
These specifications describe mechanisms for ensuring the authenticity and integrity of Verifiable Credentials and similar types of constrained digital documents using cryptography, especially through the use of digital signatures and related mathematical proofs. Cryptographic proofs enable functionality that is useful to implementers of distributed systems. For example, proofs can be used to:
Additionally, many proofs that are based on cryptographic digital signatures provide the benefit of integrity protection, making documents and data tamper-evident. The specifications in this review request enable these features in ways that were included in the W3C Verifiable Credentials Working Group charter.
Further details:
You should also know that...
We'd prefer the TAG provide feedback as:
☂️ open a single issue in our GitHub repo for the entire review