#686: Broadening the user base of WebAuthn

Visit on Github.

Opened Nov 1, 2021

WebAuthn is the Web standard that supports security keys: often physical USB tokens used as a 2nd factor for authentication. WebAuthn already supports being the only factor: browsers can display a list of accounts from a security key and the security key can collect a local PIN or biometric to verify that the correct person is present. But it seems unlikely that broad numbers of people are going to purchase a pair of security keys to use WebAuthn, and manage double-registering on all their sites.
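For context, the browser-side entry point for this single-factor flow is `navigator.credentials.get()` with an empty allow-list and user verification required. A minimal, hypothetical sketch (the challenge and rpId values are placeholders, and server-side verification is omitted):

```ts
// Hypothetical passwordless sign-in with a discoverable credential.
// The empty allowCredentials list lets the authenticator offer its stored
// accounts; userVerification asks it to collect a local PIN or biometric.
async function signInWithPasskey(challengeFromServer: Uint8Array) {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: challengeFromServer,   // random bytes issued by the relying party
      rpId: "example.com",              // placeholder relying-party ID
      allowCredentials: [],             // empty: show the account list from the authenticator
      userVerification: "required",     // collect a PIN or biometric locally
    },
  });
  // The resulting PublicKeyCredential is sent back to the server, which
  // verifies the signature against the public key stored at registration.
  return assertion;
}
```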

Thus, in WebAuthn L3, the WG is considering several changes to make WebAuthn more broadly applicable as a password replacement and we would like to raise this with TAG.

Further details:

We'd prefer the TAG provide feedback as (please delete all but the desired option):

☂️ open a single issue in our GitHub repo for the entire review

Discussions

Comment by @Myridium Nov 21, 2021 (See Github)

Sorry to butt in, but I have been reading a bit about w3c's WebAuthn activities and the approach is puzzling me:

But it seems unlikely that broad numbers of people are going to purchase a pair of security keys to use WebAuthn, and manage double-registering on all their sites.

First, whatever happened to the Asynchronous Remote Key Generation proposal by YubiCo which would allow users to keep a redundant security key backup in safe storage, consistently synced? The user has control over their own data and security this way; there is no need for online services to offer their own syncing mechanisms if the user has, say, 3 security keys to which they assign memorable names. It is easy to imagine this as part of a marketing campaign for a new passwordless experience-- naming your personal, private security keys that will never divulge their secrets to anyone. Call them "Bob, Roger and Frank" if you will. Physically label/engrave them too, if desired. Simple to understand, as well. As part of ARKG, the user's hardware tokens could be configured with the public keys of the other tokens, and easily enrol them as a matter of course in new services. Services can display a list of enrolled devices, and users know to check that "Bob, Roger and Frank" are all enrolled. Easy.
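For readers who haven't seen ARKG: the primary authenticator holds only the backup key's public half, derives fresh per-credential public keys from it, and the offline backup device can later reconstruct the matching private keys. The sketch below is a toy finite-field analogue of that idea, not the actual elliptic-curve ARKG construction (which also involves a MAC and a KDF); all values are illustrative.

```ts
// Toy illustration of the ARKG idea over a finite field. NOT the real scheme;
// it only shows how a primary authenticator can mint a fresh public key on
// behalf of a backup key that stays offline, using only the backup's public half.
import { createHash } from "node:crypto";

const p = 2n ** 127n - 1n;   // toy prime modulus (Mersenne prime M127)
const q = p - 1n;            // exponent group order
const g = 3n;                // toy generator

const modPow = (base: bigint, exp: bigint, mod: bigint): bigint => {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
};

const hashToScalar = (x: bigint): bigint =>
  BigInt("0x" + createHash("sha256").update(x.toString(16)).digest("hex")) % q;

// One-time setup: the backup key's public half is loaded onto the primary authenticator.
const backupPriv = 123456789123456789n % q;   // stays offline with the backup key
const backupPub = modPow(g, backupPriv, p);   // known to the primary authenticator

// Enrolment (backup key not present): the primary derives a fresh credential public key.
const eph = 987654321987654321n % q;          // per-credential ephemeral secret
const ephPub = modPow(g, eph, p);             // stored by the site alongside the credential
const tweak = hashToScalar(modPow(backupPub, eph, p));
const derivedPub = (backupPub * modPow(g, tweak, p)) % p;   // registered with the site

// Recovery (later, on the backup device): reconstruct the matching private key from ephPub.
const tweak2 = hashToScalar(modPow(ephPub, backupPriv, p));
const derivedPriv = (backupPriv + tweak2) % q;
console.log(modPow(g, derivedPriv, p) === derivedPub);      // true
```

In the real scheme the site only ever sees the derived public keys, so the backup key can stay in a drawer until it is actually needed.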

Only issue/improvement would be synchronising credentials across services. For this it seems easiest to use the Hedera network, where there could be a maintained list of keys for the user. For more privacy, the user could maintain their own private list (e.g. Hedera File Service), and issue NFTs to services in order to grant them access. How? The security key would have permission to mint such NFTs. This may sound futuristic and far-fetched and impractical, but it is actually achievable and cost-effective today. We just need trusted hardware key manufacturers like YubiCo to support biometric-protected on-board transaction construction and signing, for a painless experience. Admittedly there is probably a lot of work to be done developing compatible hardware keys. But it can be done.

Second, I think that the passwordless transformation will be more than sufficient to convince users to take that extra security step in managing a few hardware tokens. In fact I would go so far as to speculate that this relieves the user of a psychological burden, as a physical security key system is easy to understand and keep track of. If a private key is stored on my smartphone or PC, then I simply don't know how safe it is. Can it be hacked remotely? Does it expose the private key in RAM? (YES, therefore vulnerable to memory exploits, rootkits etc). If it's on dedicated hardware designed by a reputable brand, then I just needn't worry.

Users may even think of this transformation as returning to simpler times-- a physical key that unlocks accounts. You don't have the key? Then you don't have access. You do have the key? Well, I better change the lock (invalidate the key) but I won't worry too much because I know it's protected by my fingerprint.

I know I'm repeating myself, but I don't understand why it would be considered hard to onboard users to such a system. I am pretty sure my grandparents would have felt quite comfortable using biometric FIDO2 hardware keys. Usernames/passwords? Not so much.

I would urge W3C to conduct real usability studies (with real test subjects) with biometric FIDO2 keys, and usernames/passwords, to determine whether users would autonomously make the switch. This is too important for idle speculation.

If biometric-protected FIDO2 keys supporting ARKG are released, and ARKG is incorporated as a core of the WebAuthn standard, then users will have the ultimate usability experience; the cognitive burden of managing online identity will be eased; users will 'own their identity'; they'll retain privacy (assuming public keys are in some way 'salted' according to the online service; also consider Hedera), and the security of such a system will be quite substantial. I believe that users would understand this authentication system better than the current username/password based one. And I believe that users would autonomously switch due to the great user experience.

A bright future in security and usability awaits, and it's really achievable today I believe. Please don't shoot us in the foot by settling for second best. Incremental innovation stifles adoption. The passwordless revolution is a rare opportunity to get things right, to perform a quantum leap in security and usability, with a high degree of confidence that users will autonomously onboard themselves and mainstream adoption will be achieved.

Discussed Nov 22, 2021 (See Github)

Dan: explainer looks good and detailed and talks about user needs but is quite in depth. Don't seem to have a security and privacy questionnaire response.

Peter: seems more like broad ideas than something that would warrant an s&p

Dan: spend time at the f2f

Comment by @torgo Nov 22, 2021 (See Github)

Hi @agl just wanted to ack this and let you know we are working on it. We plan to spend some time on it at our virtual f2f in 2 weeks' time and we hope to have some useful feedback for you after that.

Discussed Jan 31, 2022 (See Github)

Hadley: reviewing a comment that came in...

[discussion of sync fabrics]

Tess: regarding e.g. iCloud keychain - Apple doesn't have access to contents of the keychain. Only your devices have access because they're in the same ring of trust.

Hadley: from a user perspective - I have to trust that Apple has done that.

Tess: yes you have to trust the company that built it. Different companies have different ways of demonstrating they are worthy of that trust. Apple makes some kinds of devices available to security researchers. People in the industry can review that research.

Hadley: makes me wonder if we should recommend to the working group a checklist - if a key sync fabric provider meets these criteria then they can be said to be compliant with the spec. E.g. letting independent researchers have a look.

Tess: yes it would make sense for the working group to have that .. best practices. And notes for auditing. If you've been charged with evaluating .. then you should look at xyz. A little concerned - hard to have the working group require.

Hadley: problem I'm trying to solve: me as a user - I see the web site I want to log into has implemented this new version of WebAuthn and I want to know if I should trust it. I have nothing beyond the reputation of the browser / key syncing company...

Tess: also true for Yubikey...

Yves: if the key is stored in the browser then you have to make sure you're trusting the browser maker.

Dan: browsers should be able to interoperate with multiple 3p sync fabrics?

Hadley: they are talking about the phones...

Tess: you can still use hardware security keys -- but that won't sync across a sync fabric. It's not a replacement - it's an alternative. Some might say "in order to access this you have to have a physical key" - but for the common case - vast majority don't have a hardware key... A feature like this opens up webauthn to be used in a wider context (so it can displace passwords). We should welcome an effort to reduce that threat.

Dan: [raising issues of requests from governments to providers of key sync fabric]

Tess: we can't address policy issues.

Hadley: potentially opens up users to ... Feels like we need external credentialling.

Tess: For centralizing - sympathetic - we require our browsers to have 3p architecture for these things - then a less scrupulous company implements one - and they use it to snarf up your credentials - now they can impersonate you wherever they want. So there's a concern but remedy might be worse.

Tess: ... or governments.

Hadley: similar issue of using phone OS as credential storage fabric - you still need to trust your phone OS. There are lots of phones out there on old versions of OS or less maintained forks...

Peter: interesting option - you can build a system where you have a hardware key that you use to sign your syncable keys - so the system lets you log in with any key signed by that key. You have your phone and you have your YubiKey - once a month you can re-gen the keys but they expire... Device key extension - might be this.

Dan: we can ask them about key sync fabric providers.. we can ask them about device key extension...

Tess: this is a decent overview video btw: https://developer.apple.com/videos/play/wwdc2021/10106/

Hi @agl. We are discussing this in our W3CTAG breakout today.

This has generated an interesting discussion, and we have a couple of questions you may be able to help with.

1. **How safe are sync fabric providers?** We talked through a number of scenarios in which this could fail to protect users: 

* one where the sync fabric provider has access to the credentials themselves and abuses them, 
* one where authorities could serve warrants on sync fabric providers and gain access to all of a user's accounts, and
* one in which an unscrupulous but widely-used site requires that users use their sync fabric to log in — and then they have access to all of that user's accounts. 
  
Have you and the working group considered these types of scenarios? What sort of mitigations might be added to the spec or the ecosystem to protect users from scenarios like these?

2. **Can you say more about device-key extension?** We were intrigued with the protections that come from pairing syncable credentials in a phone or sync fabric to an automatically-generated, device-bound key pair. Would you imagine those keys expiring regularly (monthly or weekly, etc)? And while this clearly mitigates some of the danger to the user (their older credentials would no longer be available, should they be exposed or exfiltrated) — how much damage could still be done if credentials were leaked before they expire? And are there other ways to protect users in this scenario?


Dan: +1 Peter: +1 Tess: +1 Rossen: +1

Comment by @hadleybeeman Jan 31, 2022 (See Github)

Hi @agl. We are discussing this in our W3CTAG breakout today.

This has generated an interesting discussion, and we have a couple of questions you may be able to help with.

  1. How safe are sync fabric providers? We talked through a number of scenarios in which this could fail to protect users:
  • one where the sync fabric provider has access to the credentials themselves and abuses them,
  • one where authorities could serve warrants on sync fabric providers and gain access to all of a user's accounts, and
  • one in which an unscrupulous but widely-used site requires that users use their sync fabric to log in — and then they have access to all of that user's accounts.

Have you and the working group considered these types of scenarios? What sort of mitigations might be added to the spec or the ecosystem to protect users from scenarios like these?

  2. Can you say more about device-key extension? We were intrigued with the protections that come from pairing syncable credentials in a phone or sync fabric to an automatically-generated, device-bound key pair. Would you imagine those keys expiring regularly (monthly or weekly, etc)? And while this clearly mitigates some of the danger to the user (their older credentials would no longer be available, should they be exposed or exfiltrated) — how much damage could still be done if credentials were leaked before they expire? And are there other ways to protect users in this scenario?

Comment by @agl Feb 1, 2022 (See Github)

How safe are sync fabric providers? … Have you and the working group considered these types of scenarios? What sort of mitigations might be added to the spec or the ecosystem to protect users from scenarios like these?

Starting more broadly, I don't think a W3C spec has much (any?) influence here. I would think of it in the same way as password managers. There are several, and probably people have opinions about their relative security, but I don't think that HTML5 would achieve anything if it, say, mandated particular designs for password managers.

Similarly, Web Authentication currently defines an interface for using security keys and typically those are separate physical devices or embedded secure hardware. But one can also flip a switch in Chromium's DevTools and store credentials in software for testing. Web Authentication does not (and should not) prohibit that. (Although it does include support for attestation that can effectively prevent it in enterprise environments.)

Getting more specific, I can only speak about Google's potential sync system here.

If Google were to build a sync system we would not wish to have access to the credential private keys. You can see public information about how Android Backup implements end-to-end encryption and thus how that might be achieved in this context.

(iOS and macOS can sync WebAuthn credentials via iCloud Keychain, behind a developer flag. You can find information about iCloud Keychain online.)

Can you say more about device-key extension?

Would you imagine those keys expiring regularly (monthly or weekly, etc)?

(Noting, again, that I can only speak about Google's potential implementation of this.)

We do not have a firm opinion on expiration of these keys. At one end, one could think of them like cookies and thus reset them whenever cookies for that domain would be removed. But they are a lot higher friction than cookies since WebAuthn credentials require biometric confirmation (at least with Google's current platform authenticators). So perhaps they are more like saved passwords, which persist. It seems likely that management UI would allow them to be reset, but there is not yet a firm answer on whether that would also happen automatically if Google were to launch support.

how much damage could still be done if credentials were leaked before they expire? And are there other ways to protect users in this scenario?

We can consider a WebAuthn credential, in this model, to consist of either one or two private keys. The primary private key cannot be hardware bound since it is synced. The optional second private key (the device-bound key) hopefully would be hardware bound on a particular device. Thus the greatest risk is to the primary private key, and the obvious way for an attacker to obtain it would be to impersonate the victim to their sync provider and know the victim's PIN/pattern.

The consequences of this are similar to when using a password manager: pretty bad. The attacker would have the user's credentials. Sites opting to additionally use a device-bound key would notice that a new device is being used but the protection afforded by this is up to the site. Similar to obtaining the complete set of a user's passwords, the remediation would be to rotate registered credentials on all sites.

User-agents may be able to aid in doing this should it be needed.

Comment by @plinss Feb 2, 2022 (See Github)

I was imagining a mechanism where a hardware based key could be used to sign synchronized keys (and the web site would rely on being presented with any key signed by the hardware key, basically using the hardware key like CAs do their root keypair but using an intermediate key on a daily basis). This would allow the synchronized keys to expire rapidly and be regenerated offline by using the hardware key and any of the synchronized keystores.

Your response indicated that's not how the device-key extension works. Can you explain more about what device-key extension is meant to be used for then? And have you considered a design like I mentioned?

Comment by @agl Feb 2, 2022 (See Github)

Your response indicated that's not how the device-key extension works. Can you explain more about what device-key extension is meant to be used for then?

The device public-key extension proposes that a WebAuthn credential may have a set of secondary private keys associated with it. When registering or asserting with a WebAuthn credential, if the website requests it, an authenticator may return a public-key and signature from one of these private keys in addition to the signature from the primary private key for the credential. The idea being that if the authenticator is syncing the primary private key onto different physical devices, each physical device can create its own device-bound key for each credential. The device public-key extension thus discloses some information about the set of devices in use to aid with risk decisions.

We do not envision that most websites would use this. But, for some sites that take a risk-based approach to sign-in, this additional signal may be useful. For example, a sign-in attempt coming from a geographical location that is unusual for the account might be rendered less suspicious given proof that a known device for that user is making the request.
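As a rough illustration of how a relying party might consume that extra signal, here is a hypothetical risk check. It assumes Ed25519 keys and that both signatures cover the same challenge-bound message; the actual extension defines its own signed payload and encoding, so treat this purely as a shape, not as the spec.

```ts
// Sketch of a relying party's risk check, with a hypothetical set of previously
// seen device-bound public keys per credential. Names are illustrative.
import { verify, type KeyObject } from "node:crypto";

interface AssertionForRp {
  signedData: Buffer;          // e.g. authenticatorData || SHA-256(clientDataJSON)
  signature: Buffer;           // signature by the synced (primary) credential key
  devicePubKey?: KeyObject;    // extension output: this device's bound public key
  deviceSignature?: Buffer;    // extension output: signature by the device-bound key
}

function assessSignIn(
  a: AssertionForRp,
  credentialPubKey: KeyObject,
  knownDeviceKeys: Set<string>,   // PEM strings of device keys seen before for this credential
): "reject" | "ok-known-device" | "ok-new-device" | "ok-no-device-signal" {
  // The primary (synced) credential key must always verify.
  if (!verify(null, a.signedData, credentialPubKey, a.signature)) return "reject";

  // The extension output is optional; without it there is simply no extra signal.
  if (!a.devicePubKey || !a.deviceSignature) return "ok-no-device-signal";
  if (!verify(null, a.signedData, a.devicePubKey, a.deviceSignature)) return "reject";

  const pem = a.devicePubKey.export({ type: "spki", format: "pem" }).toString();
  return knownDeviceKeys.has(pem) ? "ok-known-device" : "ok-new-device";
}
```

An "ok-new-device" result is the kind of signal that could feed the geolocation-style risk decision described above.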

Discussed Feb 7, 2022 (See Github)

Dan: I'm looking at the response to Peter's question about private keys on devices.. I don't understand why "most websites would not use this" or how what he describes is different from what you said you thought it was..

Peter: what I was thinking was a system where you have a hardware key which is non-extractable and a set of software keys which are syncable and you use the hardware key to countersign the public keys of the syncable keys.

Dan: I think they're thinking about this in terms of keys that are already being synced across some back lane, so the main key is not a hardware-based key, it's a key being synced.. the website that's using it wants to have some additional information about 'you're now authenticated on this device and I know this public key is only associated with a private key that's only for this particular device and I have that in addition to the main synced key' - so that would provide some kind of additional 'we haven't seen you on this device before' type thing. The whole thing is based on the idea that the main key is not tied to a specific device

Peter: looks like their extension is you sign in with two keys, neither of which signs the other, but one is tied to a device and one is synced, so you know it's you and you know which device it's coming from. That looks like what they're talking about.

Dan: I do wonder why they think most sites wouldn't use it

Peter: maybe most people wouldn't bother or care

Hadley: worth asking

Dan: [leaves comment]

Peter: the fundamental problem with syncable keys is creating a vector for the keys to leak. What I was describing was you still have a physical key that you only use once every so often, when your other keys expire, to re-sign all of your syncable keys, but on a daily basis you use your syncable keys. It doesn't protect you from being leaked but lets you rotate keys frequently in case they do get leaked, and sites still know who you are because your identity is based on your hardware key

Hadley: and I was asking about the states where....

Yves: key repudiation? I don't know.. can you inform the site a specific key has been compromised?

Peter: is there a mechanism for revoking key like that? Instead of whatever the sites mechanism is?

Yves: replace it with a new key or a repudiation system?

Peter: AFAIK with sites I've used WebAuthn on so far you just log in and replace your key or tell the site you're no longer using that key. I don't know of a platform way. Something where you tell the browser the key is no longer good and the browser tells sites not to use it?

Yves: that would be something

Hadley: like telling your bank you've lost your credit card

Dan: I also asked about the design Peter mentioned. We talked last week about how, in the larger response from Adam, it seems like he's really pushing back on requirements about safety of the key sync fabric as you suggested Hadley, and I'm wondering what we ought to do about that

Hadley: I'd think at a bare minimum it would not be inappropriate to lay out in the spec that these are dangers. Not uncommon to do. If memory serves they don't yet have a spec for this. If nothing else, being able to share with implementers that we thought through some potential pitfalls or drawbacks - we're sharing with you so you can take steps to minimise them.

Dan: implementers should be cautious of

Hadley: that's different than coming up with a set of normative statements

Dan: My preference would be to leave that comment, get the feedback on whether they've thought about this alternative design, and try to close the issue based on that? We're pointing out the issues but I don't think anything here is a killer

Hadley: sounds sensible

Dan: the other thing I see, which I appreciate, is that this is a working group proposal, multi stakeholder, the contacts are not just the implementers, this is good

Hadley: and early enough to talk about the design, but also mature enough that there's something to work on

Dan: the group chair is one of the contacts, but is there any controversy about this approach in the WG do we know?

Peter: there are a few things in the explainer we haven't really looked at. Conditional UI.. preventing unintended credential overrides..

Dan: we closed conditional ui review last week. I think that's okay.

Peter: thing that lets the website tell the device a key has been deleted... seems useful. Leave feedback about doing that the other way around? Browser could tell sites?

Thanks for the thorough and thoughtful reply, @agl.

While we appreciate that you aren't keen for the spec to list mitigations or privacy protections on the part of the credential sync fabric providers — which we understand — it is useful to brainstorm them. You, as a working group, will have put more thought and energy into threat modelling than anyone else at this point. It's valuable to capture that.

We'd recommend that you include something in the spec, when you get to drafting it, to share some of this thinking with implementers (even something as simple as "users are trusting credential sync fabric providers to keep their keys secure. While the mechanisms of demonstrating that trust or keeping those credentials secure is out of scope for this spec, we are flagging to implementers that they may need to focus on this problem. Without it, the entire feature won't work."). 

While the implementers may choose to address the issue in different ways, it's still helpful to give them that warning/benefit of your advanced thinking. 
Comment by @torgo Feb 7, 2022 (See Github)

Hi @agl - what makes you think that most websites would [not] use this? Also have you discussed the idea about ephemeral synced keys that are signed by a hardware based key as Peter described? Was that design considered?

Comment by @hadleybeeman Feb 7, 2022 (See Github)

Thanks for the thorough and thoughtful reply, @agl.

While we appreciate that you aren't keen for the spec to list mitigations or privacy protections on the part of the credential sync fabric providers — which we understand — it is useful to brainstorm them. You, as a working group, will have put more thought and energy into threat modelling than anyone else at this point. It's valuable to capture that.

We'd recommend that you include something in the spec, when you get to drafting it, to share some of this thinking with implementers (even something as simple as "users are trusting credential sync fabric providers to keep their keys secure. While the mechanisms of demonstrating that trust or keeping those credentials secure is out of scope for this spec, we are flagging to implementers that they may need to focus on this problem. Without it, the entire feature won't work.").

While the implementers may choose to address the issue in different ways, it's still helpful to give them that warning/benefit of your advanced thinking.

Comment by @agl Feb 10, 2022 (See Github)

what makes you think that most websites would [not] use this?

Most sites use a model based on username+password and let people reset passwords with an emailed link. For such sites the device-bound keys are complexity they don't need.

Sites that already have step-up challenges such as SMS OTP when signing in on a new device are those that I think would use the signals that device-bound keys provide.

Also have you discussed the idea about ephemeral synced keys that are signed by a hardware based key as Peter described? Was that design considered?

I don't feel that I understand the proposal enough to comment. An important part of the device-bound keys is that they sign a nonce from the site, proving active possession. Having them sign a synced credential provides very different security properties. I'm also unsure how new devices work in such a scheme.

We'd recommend that you include something in the spec, when you get to drafting it, to share some of this thinking with implementers (even something as simple as "users are trusting credential sync fabric providers to keep their keys secure. While the mechanisms of demonstrating that trust or keeping those credentials secure is out of scope for this spec, we are flagging to implementers that they may need to focus on this problem. Without it, the entire feature won't work.").

Understood. Thanks for the guidance. I can't speak for the whole working group, but I can write pull requests along those lines and hopefully that's enough!

Comment by @plinss Feb 10, 2022 (See Github)

I don't feel that I understand the proposal enough to comment. An important part of the device-bound keys is that they sign a nonce from the site, proving active possession. Having them sign a synced credential provides very different security properties. I'm also unsure how new devices work in such a scheme.

Happy to explain better: this stemmed from a concern about synced keys getting leaked.

My thinking was that the user could have a device-bound key which is used to sign the public key of an ephemeral key pair that gets synced. When registering with a site, the device-bound key would be required to be present (to sign a site-specific ephemeral key that's generated on the fly), and the site would learn the public key of the device-bound key.

However during normal usage, the site would accept a signed nonce from a key that is itself signed by the device-bound key, so the device-bound key need not be present. The model here is how CA's use an intermediate key to sign certificate requests, keeping their root key securely offline (or DNSSEC using zone-signing keys and key-signing keys).

The advantage is that the synced keys could be set to expire frequently. Upon key expiration, the device-bound key would have to be presented to one of the devices that can sync the ephemeral keys and it regenerates all the ephemeral keys that need it. This operation can be done offline. This way should a synced keystore get leaked, the damage is at least limited in time somewhat (presuming the newly generated keys aren't immediately leaked, of course).

This would be an additional mode to requiring direct use of a device-bound key or allowing completely software keys. It's a hybrid approach, somewhat between the two in security while presenting less of a burden on the user than a straight device-bound key, while ensuring that the user at least had access to the device-bound key relatively recently.
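To make the shape of that concrete, here is a minimal sketch under illustrative assumptions (Ed25519 keys via Node's crypto module, a JSON blob standing in for the signed "certificate", and a made-up 30-day expiry). It is not a proposal for a wire format, just the verification chain.

```ts
// Sketch of the hybrid scheme described above. All names, formats and the
// expiry policy are hypothetical.
import { generateKeyPairSync, sign, verify, type KeyObject } from "node:crypto";

// Offline step (e.g. monthly): the device-bound key certifies a fresh synced key pair.
const deviceBound = generateKeyPairSync("ed25519");   // stays on dedicated hardware
const synced = generateKeyPairSync("ed25519");        // goes into the sync fabric

const cert = Buffer.from(JSON.stringify({
  syncedPub: synced.publicKey.export({ type: "spki", format: "pem" }),
  expires: Date.now() + 30 * 24 * 3600 * 1000,        // short-lived on purpose
}));
const certSig = sign(null, cert, deviceBound.privateKey);

// Daily sign-in: only the synced key is exercised; it signs the site's fresh nonce.
const nonce = Buffer.from("site-issued-random-challenge");   // placeholder
const nonceSig = sign(null, nonce, synced.privateKey);

// Site-side verification: check the chain and the expiry.
function verifyChain(devicePub: KeyObject): boolean {
  if (!verify(null, cert, devicePub, certSig)) return false;   // was the synced key really certified?
  const { syncedPub, expires } = JSON.parse(cert.toString());
  if (Date.now() > expires) return false;                      // force periodic offline re-signing
  return verify(null, nonce, syncedPub, nonceSig);             // nonce signed by the certified key
}
console.log(verifyChain(deviceBound.publicKey));   // true
```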

Comment by @equalsJeffH Feb 10, 2022 (See Github)

wrt https://github.com/w3ctag/design-reviews/issues/686#issuecomment-1034371664 (above)

We'd recommend that you include something in the spec ... [e.g.] ... as simple as "users are trusting credential sync fabric providers..."

Understood. Thanks for the guidance. ...

fyi, this is noted here in webauthn issue #1665 "Synced Credentials"

Discussed Feb 14, 2022 (See Github)

Hadley: they responded to a bunch of things. They're opening a bunch of security issues that they don't want to spec out but that they should raise so implementers consider them. They have accepted that and opened an issue of their own on it so I'm happy. But there's other stuff in there.

Rossen: doesn't seem like they've addressed Peter's comment

Peter: more of a suggestion for an alternative approach, not necessarily our job to redesign their features for them

Rossen: is this something we care strongly about to wait for them to get back to us?

Peter: general concerns that if they go to a model where keys are entirely software based, I agree this will make the feature more useful and more people will adopt it, and as soon as there's a major breach the trust will be burned down to the ground.

Dan: worth articulating

Peter: I suggested a hybrid approach... hardware keys would take longer to catch on. My comment isn't a blocker.

Dan: Useful for Hadley to leave the sentence about the thread .. and on this basis happy to close

Hadley: okay

Comment by @hadleybeeman Feb 14, 2022 (See Github)

Thanks, @agl and @equalsJeffH. That sounds like a good way forward on the concerns I mentioned.

On that basis, we're happy to close this issue and wish you well with this. Do come back if we can help with anything further.

Comment by @agl Mar 1, 2022 (See Github)

Sorry, there's something wrong with my GitHub notification emails and I missed a prior update:

My thinking was that the user could have a device-bound key which is used to sign the public key of an ephemeral key pair that gets synced. When registering with a site, the device-bound key would be required to be present (to sign a site-specific ephemeral key that's generated on the fly), and the site would learn the public key of the device-bound key.

However during normal usage, the site would accept a signed nonce from a key that is itself signed by the device-bound key, so the device-bound key need not be present. The model here is how CA's use an intermediate key to sign certificate requests, keeping their root key securely offline (or DNSSEC using zone-signing keys and key-signing keys).

I think the difference here is that the device-bound key wouldn't be involved in "normal usage". We believe that, for the majority of sites, a synced WebAuthn credential has good advantages over a password. But we hear from some sites that they really want a device-bound key as an additional signal. However, if the device-bound key simply signed a synced key then, at sign-in time, there's no difference between a known device signing in, and a leaked synced key being used with a replayed signature by a device-bound key.

In order to prevent replays, you would want the signature by the device-bound key to also sign over a nonce—proving that it was exercised for each sign-in request. At which point you have something very similar to the current proposal.
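To illustrate the replay point: a signature over a static synced key could be captured once and presented with every later assertion, whereas a signature that must cover a freshly issued, single-use challenge proves the device-bound key was exercised for this specific request. A minimal, hypothetical server-side sketch (Ed25519 keys, made-up TTL):

```ts
// Sketch of a server-side challenge store that makes each device-key
// signature single-use. Names and the TTL are illustrative only.
import { randomBytes, verify, type KeyObject } from "node:crypto";

const pending = new Map<string, number>();   // challenge (hex) -> expiry timestamp

function issueChallenge(ttlMs = 60_000): Buffer {
  const c = randomBytes(32);
  pending.set(c.toString("hex"), Date.now() + ttlMs);
  return c;   // sent to the authenticator, which must sign exactly this value
}

function verifyOnce(challenge: Buffer, sig: Buffer, devicePub: KeyObject): boolean {
  const key = challenge.toString("hex");
  const expiry = pending.get(key);
  if (expiry === undefined || Date.now() > expiry) return false;   // unknown or stale
  pending.delete(key);                                             // consume: replays now fail
  return verify(null, challenge, devicePub, sig);                  // signature must cover the nonce
}
```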