
Virtual Keyboard API

Open AlbertoElias opened this issue 6 years ago • 19 comments

There are many many use cases for text input and currently each framework or site is creating its own virtual keyboard implementation. Browsers have their own for 2D websites and URL input, it would be great for WebXR sites to be able to spawn that virtual keyboard.

Made up API:

const keyboard = navigator.getVirtualKeyboard();
keyboard.open();
keyboard.addEventListener('keypress', (event) => {
    custom3DInput.value += event.key;
});
keyboard.close();

AlbertoElias avatar Oct 05 '18 12:10 AlbertoElias

I can see this being convenient for many experiences. Content should still have the option to implement its own keyboard, though. VR input is not as mature as touch screens, and applications might have different needs or UX ideas than the User Agent. A good conversation to have, but maybe this shouldn't block WebXR v1 and can be specced out later?

dmarcos avatar Feb 15 '19 19:02 dmarcos

I agree applications should be able to implement their own thing. I think this API would be especially useful for libraries that currently have to implement their own keyboards; otherwise, applications that pull in multiple libraries end up with many keyboards.

I definitely agree it shouldn't block WebXR v1, but since UAs are shipping their own keyboards anyway, this feels (and I speak without deep knowledge of adding APIs to browsers) like a simpler thing to add: just expose that keyboard to developers.

AlbertoElias avatar Feb 16 '19 12:02 AlbertoElias

Would there be benefit to making such keyboards contextually sensitive? E.g., a date selector, or variants specialized for URL entry or password entry?

kearwood avatar Mar 12 '19 17:03 kearwood

In the case where a user has a physical keyboard, such as a Bluetooth keyboard paired to an AR headset, perhaps that should be surfaced through the API so that content can adapt its UX without having to use a separate API.

kearwood avatar Mar 12 '19 17:03 kearwood

I'm surprised this hasn't yet come up for touchscreen devices in regular (non-immersive) browsing: aren't there cases where you want to surface a keyboard with no <input> involved (perhaps you want to "type" into a canvas)? This might be a useful API for the web in general to have.

Manishearth avatar Feb 06 '20 00:02 Manishearth

After some discussion, some points:

Firstly, we should probably gate this on a user gesture/action, like many other APIs that have spamming concerns.

Secondly, the API should not be "bring up a virtual keyboard" but rather "I would like to receive input, please" (and "I would like to stop"). If a physical keyboard is connected, it's a no-op. In most other cases it would bring up a system virtual keyboard. This can be tailored further at the UA level and can offer accessibility benefits: e.g. if the user wants to use voice input, it can turn on a mic (though there are some privacy concerns there).
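That "request input" shape could be sketched roughly as below. None of these names (`TextInputIntent`, `textinput`, `simulateKey`) exist in any spec; this is a plain-JavaScript mock illustrating the contract, with a simulated key source standing in for whichever input method the UA would actually pick (physical keyboard, virtual keyboard, or voice).

```javascript
// Hypothetical sketch only — not a real browser API.
class TextInputIntent extends EventTarget {
  constructor() {
    super();
    this.active = false;
  }
  // "I would like to receive input": the UA decides how to provide it.
  start() { this.active = true; }
  // "I would like to stop": any system keyboard would be dismissed.
  stop() { this.active = false; }
  // In a real UA, key events would originate from the chosen input
  // method; here we simulate them for illustration.
  simulateKey(key) {
    if (!this.active) return;
    const ev = new Event('textinput');
    ev.key = key;
    this.dispatchEvent(ev);
  }
}

const intent = new TextInputIntent();
let value = '';
intent.addEventListener('textinput', (e) => { value += e.key; });

intent.start();
for (const k of ['h', 'i']) intent.simulateKey(k);
intent.stop();
intent.simulateKey('!'); // ignored: input is no longer requested
console.log(value); // "hi"
```

The point of the shape is that the page never says "show a keyboard"; it only declares interest in text, which sidesteps both the physical-keyboard case and the spamming concern (start() would be gated on a user gesture).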

Manishearth avatar Feb 06 '20 06:02 Manishearth

@simultech and I discussed this and we suspect that no additional API is needed to enable this.

While in non-immersive mode, if an element gets focused, we bring up the appropriate virtual keyboard. During immersive sessions, this does nothing.

What if we change that behavior and bring up the virtual keyboard in VR? The correct keyboard would be brought up and all its input would go to the HTML element. The spec provides the `visible-blurred` visibility state for trusted UI. Maybe it needs to call out what needs to be done for attached devices such as Windows MR or the Oculus Rift?
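As a rough illustration of that data flow (my assumptions, not spec text): the page focuses an element, the UA opens the system keyboard, and keystrokes arrive as ordinary input events on that element. The mock element below stands in for a real DOM input so the flow can be shown outside a browser; only `focus`/`blur` and the `input` event mirror real DOM behavior.

```javascript
// Mock of a focusable text input — stand-in for a real <input>.
function makeMockInput() {
  const listeners = {};
  return {
    value: '',
    focused: false,
    focus() { this.focused = true; },  // in VR: UA would open the system keyboard
    blur() { this.focused = false; },  // ...and dismiss it
    addEventListener(type, fn) { (listeners[type] ||= []).push(fn); },
    // Stand-in for keyboard output reaching the focused element.
    type(text) {
      if (!this.focused) return;
      this.value += text;
      for (const fn of listeners.input || []) fn({ target: this });
    },
  };
}

const input = makeMockInput();
let seen = '';
input.addEventListener('input', (e) => { seen = e.target.value; });

input.focus();        // system keyboard appears (in a real immersive session)
input.type('hello');  // keyboard output arrives as input events
input.blur();         // keyboard dismissed
input.type('x');      // ignored: nothing is focused
console.log(seen); // "hello"
```

The appeal of this approach is that the page code is identical to 2D web code; only the UA's presentation of the keyboard changes.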

cabanier avatar Mar 17 '21 16:03 cabanier

I think that could be fine. There may be some other weird interactions but I'm overall happy to approve text to that effect, pending approval from the group.

Manishearth avatar Mar 17 '21 18:03 Manishearth

That's an interesting approach! It would expand fairly naturally to whatever DOM-in-XR mechanisms we devise down the road and would already be the natural behavior exposed by DOM overlay. I'd be very interested in seeing a proof of concept.

toji avatar Mar 17 '21 19:03 toji

I can't make any promises but if we can get it to work, we'll likely have a setting under chrome://flags so people can experiment.

cabanier avatar Mar 17 '21 19:03 cabanier

The biggest issue I see right now is the interoperation between an immersive session and this keyboard. Mainly two specific ones:

  1. Occlusion with controllers and other things in the session. The current state of things requires applications to render controllers along with the content (with a library of their choice), so how is the visual part of the keyboard going to work with that? It should probably be farther than the controllers but closer than the other content.

I can see how I could take a texture of the keyboard, though, and render it as I prefer.

  2. Interaction with the keyboard itself. Again, it's currently on the application's side to do the hit tests and render rays, so how would that work? Will it switch to the platform implementation while the keyboard is open?

Although if we could fire something like pointermove events to the keyboard with local coordinates, it might work.

saitonakamura avatar Aug 06 '22 11:08 saitonakamura

The biggest issue I see right now is the interoperation between an immersive session and this keyboard. Mainly two specific ones:

  1. Occlusion with controllers and other things in the session. The current state of things requires applications to render controllers along with the content (with a library of their choice), so how is the visual part of the keyboard going to work with that? It should probably be farther than the controllers but closer than the other content.

The session would generate a blur event, which signals to the experience that controllers shouldn't be rendered. The keyboard and native controllers are then drawn by the OS.

I can see how I could take a texture of the keyboard, though, and render it as I prefer.

You can already do that today :-)

  2. Interaction with the keyboard itself. Again, it's currently on the application's side to do the hit tests and render rays, so how would that work? Will it switch to the platform implementation while the keyboard is open?

Yes. The session wouldn't get any controller input while the system keyboard is up.
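That blur signal could be consumed roughly like this. The `visibilitychange` event and `visibilityState` attribute (with the `visible-blurred` value) are part of the WebXR spec; the `renderControllers` flag, the `app` object, and the mock session are my own stand-ins for demonstration.

```javascript
// Hide app-rendered controllers whenever the session loses input focus
// (e.g. while the system keyboard is up and the OS draws controllers).
function watchVisibility(session, app) {
  session.addEventListener('visibilitychange', () => {
    // 'visible-blurred': content is still shown, but input is captured
    // elsewhere, so our own controller models should not be drawn.
    app.renderControllers = session.visibilityState === 'visible';
  });
}

// Demo with a mock session; a real XRSession comes from navigator.xr.
const session = new EventTarget();
session.visibilityState = 'visible';
const app = { renderControllers: true };
watchVisibility(session, app);

session.visibilityState = 'visible-blurred'; // system keyboard opened
session.dispatchEvent(new Event('visibilitychange'));
console.log(app.renderControllers); // false

session.visibilityState = 'visible'; // keyboard dismissed
session.dispatchEvent(new Event('visibilitychange'));
console.log(app.renderControllers); // true
```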

cabanier avatar Aug 08 '22 20:08 cabanier

We are making progress on enabling the native keyboard during an immersive session. As mentioned before, it will be triggered by focusing an element, just like on a 2D page. I'm wondering if we need a flag to indicate that the UA supports this. Most implementations won't have this feature, and there would be no way for an author to know if a keyboard was shown. (Listening to blur events could work but feels hacky.)

/agenda should WebXR have a flag and normative text for keyboard input?

cabanier avatar Nov 04 '22 23:11 cabanier

I was looking at https://developer.oculus.com/documentation/web/webxr-keyboard/ and then searched for isSystemKeyboardSupported in the immersive-web organization, but couldn't find it. So I'm linking it here so other people will be able to find this discussion.
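For anyone landing here from that page: per the linked docs, `isSystemKeyboardSupported` is a boolean the Meta Quest Browser exposes on `XRSession`, not part of the core WebXR spec, so it's worth feature-detecting defensively. A small sketch (the plain-object "sessions" below are mocks for demonstration; a real session comes from `navigator.xr.requestSession`):

```javascript
// Classify what a given session tells us about system keyboard support.
function keyboardAvailability(session) {
  if (typeof session.isSystemKeyboardSupported === 'boolean') {
    return session.isSystemKeyboardSupported ? 'system' : 'unsupported';
  }
  // Browsers without the (non-standard) flag tell us nothing either way.
  return 'unknown';
}

console.log(keyboardAvailability({ isSystemKeyboardSupported: true }));  // "system"
console.log(keyboardAvailability({ isSystemKeyboardSupported: false })); // "unsupported"
console.log(keyboardAvailability({}));                                   // "unknown"
```

In the "unknown" case an experience would fall back to its own in-scene keyboard, which is exactly the ambiguity the flag discussed above is meant to remove.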

De-Panther avatar Feb 11 '23 02:02 De-Panther

And my added 2 cents to this discussion: the current implementation of text input fields already differs slightly between browsers (e.g. Chromium-based and WebKit-based browsers have different behavior when setting a text input to hidden or visible).

I think the issue here is not keyboard API, but a text input API for headsets.

Currently, in an AR experience on a phone/tablet, a developer can use the DOM Overlay and display a visible text input element. Or they can do it the hacky way: keep a text input outside of the view, focus it, and make sure to hide it and call blur.

And there's the experimental Quest Browser support using text inputs.

But in the end, what the developer needs is a way to get text input from a WebGL/WebGPU or WebXR experience using an OS- or UA-consistent mechanism. It might be a 2D virtual keyboard, it might be an overlaid 3D virtual keyboard, and it might be an overlay button for speech-to-text.

De-Panther avatar Feb 11 '23 03:02 De-Panther

Just some feedback about the current implementation. We tried it in our production app with an Oculus Quest 2, and here are some thoughts:

  • We need some kind of API to control the keyboard position. If the virtual input is in the center of the screen, the keyboard opens in front of it, obscuring the user's input
  • Input attributes like type, pattern, or inputMode have no effect on the keyboard's appearance
  • We should be able to edit an existing input's value, not only rewrite the text

Ledzz avatar Apr 11 '23 09:04 Ledzz

Just some feedback about the current implementation. We tried it in our production app with an Oculus Quest 2, and here are some thoughts:

  • We need some kind of API to control the keyboard position. If the virtual input is in the center of the screen, the keyboard opens in front of it, obscuring the user's input
  • Input attributes like type, pattern, or inputMode have no effect on the keyboard's appearance

Thanks for trying it and providing feedback! These first two points are precisely some of the items I have planned next. In fact, I already have a working prototype for positioning; I just need to iron out the API and spec it.

  • We should be able to edit an existing input's value, not only rewrite the text

This one will take a bit longer since there are some internal pieces to work around. There's a feature for the keyboard itself coming down the pipe soon that may help solve this, but I don't want to overpromise too soon.

emmanueljl avatar Apr 17 '23 16:04 emmanueljl

Is there any standardization on how to do this these days? @AdaRoseCannon @cabanier 🙏🏽

richardanaya avatar Jun 29 '24 14:06 richardanaya

Is there any standardization on how to do this these days? @AdaRoseCannon @cabanier 🙏🏽

The standardization is done, AFAIK. Is there a feature that's missing in the Quest's implementation?

cabanier avatar Jul 02 '24 15:07 cabanier