#620: WebXR Plane Detection Module
Discussions
2021-04-19
Dan: Proposed feedback...
Hi! I'm just having a look at the ex-*plane*r and coming up with a few questions. First of all, I think it could go into a bit more detail on the user need. Can you give a few examples of where plane detection becomes important in the context of a WebXR application? The answers to the security & privacy questionnaire look good, but they do not seem to be fully reflected in the explainer or the spec itself. For example, the quantization of planes is mentioned in the questionnaire response as a mitigation strategy against privacy attacks, but it is not mentioned in the spec itself. I think this would be a lot stronger (and less prone to fingerprinting) if the quantization and other mitigation strategies were spelled out in the spec. Finally, you mention synchronous hit-test, but it's not clear from the explainer how this fits together with the [hit testing](https://immersive-web.github.io/hit-test/) specification itself. What does this technology provide over and above what WebXR hit testing already offers? We may also have feedback on the API to share - but I wanted to get this out to you quickly.
Dan: posts comment
2021-04-26
Dan: I left feedback, asking how it fits with hit testing and for more examples and use cases. The requester has responded well. They haven't addressed hit testing yet, but they did respond on the additional use cases and the quantization issue. There's still more we're waiting for.
Rossen: since last time we talked, nothing much has changed. The discussion last time was around privacy and what that means for users. Through such an API you can pretty quickly and easily map out somebody's physical environment. What does that mean from a privacy point of view? If I use this, does company X all of a sudden have a map of my office or bedroom? I don't believe this was addressed.
Dan: one way they say they're addressing that is through quantization of planes. It could make it less possible to fingerprint you based on, say, how high your desk is.
Rossen: might make it slightly harder. The actual API set they are bringing in is pretty straightforward.
Dan: note that the word "quantization" still does not appear in the spec. [leaves comment]
Proposed comment:
Hi @bialpio - we're just taking another look at this this week... We're still missing the connection to hit testing and some additional information regarding privacy & security. Do you want to ping us again when you're ready for us to re-review?
Rossen: re-read paragraph 6 in the spec, which is on privacy and security; it still isn't addressing this. It recognises the issue but doesn't propose concrete reasons why this is a non-issue or how it will be mitigated. Instead it just says the general XR concerns apply. See the general XR Anchors module.
Hadley: is it possible to do all of that on device or in the user agent? To keep the information from leaking out of the user's control. If we can keep it within the user agent, as opposed to exposing it through the API such that the application can work with it rather than....
Dan: that's part of the idea of the API, but they don't do enough to explain how that happens. That's part of the issue with the security and privacy section: it doesn't really go into detail on the abuse cases, so it's not clear how the design prevents the information from being misused.
Rossen: we had similar concerns with hit test but they said at the time hit test is completely contained in the user agent and the only thing you get back is the object that you hit tested, not anything identifying about it. This one goes one level further, or more.
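To make the quantization mitigation discussed above more concrete, here is a rough, hypothetical sketch (not anything the spec currently defines): a user agent could snap plane geometry to a coarse grid before exposing it, reducing the precision available for fingerprinting a user's environment. The 5 cm granularity and the `quantizePolygon` helper are illustrative assumptions only.

```typescript
// Hypothetical sketch of plane quantization as a privacy mitigation:
// snap polygon vertices to a coarse grid so the page never sees exact
// room dimensions. The 5 cm granularity is an assumption; any real
// value would be user-agent-defined.
const QUANTUM_M = 0.05;

interface Vertex { x: number; y: number; z: number; }

function quantizePolygon(polygon: Vertex[]): Vertex[] {
  const snap = (v: number) => Math.round(v / QUANTUM_M) * QUANTUM_M;
  return polygon.map((p) => ({ x: snap(p.x), y: snap(p.y), z: snap(p.z) }));
}
```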
2021-05-Arakeen
Dan: there's a PR that addresses our feedback.
Rossen: things demystified... added the required name... good... must not reveal details of people...
Dan: PR is merged. We should close this.
Rossen: [leaving comment]
Dan: this is a success story. Do we need wider review? Propose we close?
Rossen: closed it.
Rossen: [closing with comment]
Opened Mar 24, 2021
Ya ya yawm TAG!
I'm requesting a TAG review of WebXR Plane Detection Module.
The WebXR Plane Detection API allows WebXR-powered experiences to obtain information about planes (flat surfaces) detected in the user's environment. Information about the detected planes' poses (position + orientation) and approximate shapes is surfaced to the application on every XR frame if the feature is enabled on the XR session. This allows applications to provide more immersive experiences, for example by leveraging the returned plane information when computing interactions of virtual objects with the user's surroundings.
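As a minimal sketch of what consuming this per-frame plane data might look like (the `'plane-detection'` feature descriptor, `frame.detectedPlanes`, `plane.planeSpace`, and `plane.polygon` follow the draft explainer and may change, since this is an early review):

```typescript
// Illustrative sketch only, based on the draft explainer; loose typing is
// used because stable WebXR plane-detection typings are not assumed.
async function startPlaneDetection() {
  const session = await (navigator as any).xr.requestSession('immersive-ar', {
    requiredFeatures: ['plane-detection'],
  });
  const refSpace = await session.requestReferenceSpace('local');

  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    // detectedPlanes is refreshed on every XR frame while the feature is enabled.
    for (const plane of frame.detectedPlanes ?? []) {
      const pose = frame.getPose(plane.planeSpace, refSpace); // position + orientation
      const polygon = plane.polygon;                          // approximate outline
      // ...e.g. place or occlude virtual content against the detected surface...
    }
    session.requestAnimationFrame(onFrame);
  });
}
```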
Further details:
You should also know that...
This is an early review request, although we already have a rough specification draft, so I decided that "Specification review" may be more appropriate.
We'd prefer the TAG provide feedback as:
🐛 open issues in our GitHub repo for each point of feedback