XR Input Toolkit 2020 FAQ

Unity’s XRInputToolkit is currently in Preview mode – i.e. it’s not launched yet, but we have early access and a chance to use it + report bugs before the main launch. This is great! But … the official thread for support / queries is now 5 pages and growing, and I’m finding it very hard to find info I remember was in there … somewhere. So here I’m collating the FAQs I’ve found there + elsewhere.

Subscribe for FAQ Updates

I'll email you when there are updates / new answers to the FAQ, and occasional major news about Unity VR.

Installation / Getting Started

Basics

Advanced

UnityUI / GUI

Input

Interactors & Interactables

Grabbing & XRGrabInteractable

Performance

Detecting & Configuring Headsets

Vendor-specific: OpenVR/Valve

Vendor-specific: Oculus

Vendor-specific: Google/Cardboard

Installation

FAQ: Official Unity mini-site for XRInputToolkit?

https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.9/index.html

Note: if you are building for Oculus headsets, you may also want to download the Oculus-specific extras for developers, that enable some Oculus features not configurable directly using Unity XR: FAQ: How do I install the Oculus add-ons for VR / OVRPlugin / Oculus Integration tools?

FAQ: Pre-launch support forum thread?

https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/

FAQ: I can’t install it from the Package Manager

Unity 2019.3.x: Since it’s in Preview mode, you need to manually select “Show preview packages” or it won’t appear. NB: this is a “feature” 🙂 of Unity: you have to re-enable it every time you open the PackageManager. The setting is (currently, Unity 2019.3+) under the “Advanced” button at the top-center of the PackageManager window.

UPDATE 2020: Unity’s PackageManager team broke this, so that the “Preview packages” button is now deleted from Unity v2020.x. Instead you have to open your ProjectSettings (e.g. from the Menu: Edit > Project Settings), select “Package Manager”, and enable Preview packages there (https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-9#post-5859070)

FAQ: The XR Plugin Management tab is missing after installing the package?

This is due to bugs in Unity’s PackageManager system, which XR relies upon. If there are any build-errors in your project, or any compile-errors, then the PackageManager will stop partway through installing new packages, and leave your Editor in a halfway state where the package is installed but not available.

“any errors in compilation or package import will cause package manager to report the package as installed but the package will not activate till all errors are dealt with.” – https://forum.unity.com/threads/xr-plugin-management-tab-missing-after-installation-of-the-package.884218/#post-5811334

So … check your console log, fix the errors … then quit and restart the Editor. The package should automatically finish installing at that point.

Answers: Basics

FAQ: How do I get an InputDevice?

From the manual, the simplest way if you know which controller / item you want (do you want the headset? the hand? the controller?) is:

var devices = new List<InputDevice>();
InputDevices.GetDevicesWithCharacteristics( 
 InputDeviceCharacteristics.Controller | 
 InputDeviceCharacteristics.TrackedDevice |
 InputDeviceCharacteristics.Left, devices
);

InputDevice device = devices[0]; // assuming it found it

If you just want the device for a controller, and you have a reference to the XRController object (e.g. drag/dropped in the Editor):

InputDevice device = InputDevices.GetDeviceAtXRNode(controller.controllerNode);
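Note: an InputDevice can become invalid at runtime (e.g. the controller disconnects or goes to sleep), so it’s worth re-fetching it whenever isValid goes false. A minimal sketch (the class and field names here are illustrative, not part of the toolkit):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class LeftHandDeviceFinder : MonoBehaviour
{
    private InputDevice leftHand; // struct: isValid is false until a real device is found

    void Update()
    {
        if( !leftHand.isValid )
        {
            var devices = new List<InputDevice>();
            InputDevices.GetDevicesWithCharacteristics(
                InputDeviceCharacteristics.Controller
                | InputDeviceCharacteristics.TrackedDevice
                | InputDeviceCharacteristics.Left, devices );
            if( devices.Count > 0 )
                leftHand = devices[0];
        }
    }
}
```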

FAQ: How do I detect button.IsPressed?

The preferred way appears to be: configure it manually in the UnityEditor. Select your XRInteractor component (e.g. XRRayInteractor), find the listener / event for OnSelectEnter, and hook it up to your component the same way you’d add an onClick listener to any UnityUI button.
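If you’d rather subscribe from code than via the Inspector, the interactor exposes the same events as UnityEvents. A sketch (event names here match the 0.9.x preview and may change before final release):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class SelectLogger : MonoBehaviour
{
    void Start()
    {
        // Assumes this script sits on the same GameObject as the Interactor
        var interactor = GetComponent<XRBaseInteractor>();
        interactor.onSelectEnter.AddListener( OnSelectEnter );
    }

    void OnSelectEnter( XRBaseInteractable interactable )
    {
        Debug.Log( "Selected: " + interactable.name );
    }
}
```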

If you want to do it in code it’s a bit cleaner, but lower-level, so requires quite a lot of code. Minimum is 1 line, but the maximum is about 40 lines. Unity provides a copy/pasteable complete example in the manual (not all of this is needed, but it’s a good place to start) – scroll down to the section “Example for primaryButton”: https://docs.unity3d.com/2019.1/Documentation/Manual/xr_input.html

The 1-line version is:

if( device.TryGetFeatureValue( CommonUsages.triggerButton, out bool isPressed ) && isPressed )
{
    // do your processing here
}

(NB: CommonUsages.trigger is the float usage (0..1 pull amount); the bool “is it pressed” usage is CommonUsages.triggerButton.)

But you need to have a reference to an InputDevice first – see FAQ:-How-do-I-get-an-InputDevice?.
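Note that TryGetFeatureValue is polled, so the 1-line version above is true on every frame while the button is held. To react once per press, track the previous state yourself – a minimal sketch (class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class TriggerPressDetector : MonoBehaviour
{
    private InputDevice device; // assign via one of the methods above
    private bool wasPressed;

    void Update()
    {
        if( device.TryGetFeatureValue( CommonUsages.triggerButton, out bool isPressed ) )
        {
            if( isPressed && !wasPressed )
                Debug.Log( "Trigger pressed this frame" ); // fires once per press

            wasPressed = isPressed;
        }
    }
}
```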

FAQ: How do I handle joystick input?

(This is the same as “How do I detect button.IsPressed?”, but the method you use is slightly different)

The 1-line version is:

if( device.TryGetFeatureValue( CommonUsages.primary2DAxis, out Vector2 joystickValue ) )
{
    // do your processing here, using joystickValue.x / joystickValue.y
}

But you need to have a reference to an InputDevice first – see FAQ:-How-do-I-get-an-InputDevice?.
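Real thumbsticks rarely rest at exactly (0,0), so you’ll usually want a deadzone before acting on the value – a minimal sketch (class name and deadzone value are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class JoystickReader : MonoBehaviour
{
    private InputDevice device; // assign via one of the methods above
    private const float deadzone = 0.15f; // tune per-controller

    void Update()
    {
        if( device.TryGetFeatureValue( CommonUsages.primary2DAxis, out Vector2 axis )
            && axis.magnitude > deadzone )
        {
            // e.g. use axis.x for turning, axis.y for forwards/backwards movement
        }
    }
}
```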

Answers: Advanced

FAQ: Some of Unity’s methods are private/internal – how do I use them?

You can edit the source, but every time you start a new project or upgrade, Unity will delete all your changes.

The Unity team have declared that they plan to remove these artificial blocks (use of the private keyword, use of the internal keyword).

Until then, see this tutorial for info on how to seamlessly access the private methods from your classes – while making sure your changes do NOT get wiped when you upgrade the Editor or the XR packages: http://snapandplug.com/how-to-fix-unitys-private-vr-xr-code-so-it-becomes-public/

FAQ: How do I manually configure the Controllers with TrackedPoseDriver?

NB: for all new projects, you should do this automatically: From the Create menu, select “XR” and e.g. the “Room-Scale Rig” (or one of the other Rigs) – this will automatically create all the objects and setup the connections correctly.

If upgrading an old VR project, or customising a non-VR project with existing player-controller etc: https://forum.unity.com/threads/improving-project-configuration-for-ar-vr.864457/#post-5831380
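If you prefer to do the manual setup from code, it amounts to adding a TrackedPoseDriver per controller and pointing it at the correct pose source. A sketch, assuming the com.unity.xr.legacyinputhelpers package (which provides TrackedPoseDriver in UnityEngine.SpatialTracking) is installed:

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

public class LeftControllerSetup : MonoBehaviour
{
    void Awake()
    {
        // Drive this GameObject's transform from the left controller's tracked pose
        var driver = gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource( TrackedPoseDriver.DeviceType.GenericXRController,
                              TrackedPoseDriver.TrackedPose.LeftPose );
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
        driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
    }
}
```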

FAQ: How do I create a custom XR Plugin / make a new plugin for custom hardware?

Sign up with Unity to get access to their SDK for hardware manufacturers/maintainers here: https://create.unity3d.com/vsp-signup-form?

Unity: “we do not charge any fee to access our XR SDK and once a form has been filled out, you will be taken directly to a page that includes material for download. There is no additional gatekeeping where we accept / reject applications.

Again, our goal is focused on enabling the entire ecosystem by providing access to our engine-level optimizations where developers can benefit from more performant plug-ins developed & maintained by 3rd-party AR/VR hardware and software providers. Or even implemented by folks within our community.” – https://forum.unity.com/threads/xr-plugins-and-subsystems.693463/page-6#post-5833876

Answers: Unity UI / GUI

FAQ: Is there a CurvedCanvas available?

The old CurvedUI asset from the Unity Asset Store (launched 4 years ago) hasn’t been upgraded to the new systems, but apparently still works if you delete the parts that reference old APIs etc.

FAQ: How do I ignore a button-press/trigger/grab on objects when the menu is in the way?

XRInputToolkit is directly integrated with the EventSystem from UnityUI, so you can use the same approach from UnityUI: query the method IsPointerOverGameObject, which should return true when the UI is consuming the click (the method was poorly named, it really means IsPointerOverUnityUIObject!)

https://docs.unity3d.com/2018.2/Documentation/ScriptReference/EventSystems.EventSystem.IsPointerOverGameObject.html
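In practice you call it at the top of whatever script handles your world-space input, and bail out when UnityUI is in the way – a minimal sketch (class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class WorldClickHandler : MonoBehaviour
{
    void Update()
    {
        // If the EventSystem says UnityUI is under the pointer, let the UI consume the input
        if( EventSystem.current != null && EventSystem.current.IsPointerOverGameObject() )
            return;

        // ... handle world-space clicks / triggers / grabs here ...
    }
}
```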

Answers: Input

FAQ: How do I simulate VR inputs using scripts or the keyboard (for testing/debugging)?

(for general headset simulation, see: #FAQ:-How-do-I-test-without-a-headset?-/-Is-there-a-headless-mode?)

(June 2020) This is currently not supported by Unity (it’s a missing feature that we hope they’ll add before they release the final version), but you can find a community-provided solution here: https://forum.unity.com/threads/unity-xr-input-possible-to-simulate-input-events-via-a-fake-inputdevice.905738/#post-6003896

Example usage (after you’ve downloaded the scripts attached to that forum post and followed the setup instructions):

 //Class beginning
 public KeyCode activateUI = KeyCode.E;
 
 //In LateUpdate()
 UpdateXRControllerState("uiPress", activateUI);

Answers: Interactors and Interactables

FAQ: Can I use more than one Interactor on a single Controller/Hand?

May 2020: no. Unity has designed their API so that this is not really possible (you can make it work, e.g. see here for one approach, but it’s quite likely to break in a future release).

(note: in general it is not possible for Unity to do this automatically: there are many difficult questions with unknown answers around “which interactor takes priority? In which situations?” etc)

Unity’s preferred approach seems to be: write your own custom Interactor that combines elements of both the Interactors you wanted.

Answers: Grabbing & XRGrabInteractable

FAQ: Can I force a hand/controller to drop held items?

“toggle m_AllowSelect on the XRGrabInteractables. This will force any held object to be dropped by the hand’s interactor.” – https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-3#post-5382996

FAQ: Grab an object and attach it to the hand using Physics? Attach it using parenting? Attach it using velocity?

There are three modes for XRGrabInteractable, which you configure in the component from the UnityEditor inspector, or in code. The variable is named “movementType” and has these values:

https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.9/api/UnityEngine.XR.Interaction.Toolkit.XRBaseInteractable.MovementType.html

FAQ: How does attachPoint work/how does attachTransform work?

“It should take the transform of the GameObject with the Interactable Monobehaviour, and set it’s position/rotation to the same as the Hand’s Attach Point.” – https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-4#post-5451060

FAQ: How do I temporarily disable a Grabbable (so it can’t be grabbed)?

May 2020: there is no direct support for this in the API. Workarounds:

(SAFEST): Destroy the Grabbable component and recreate it when you want to re-enable the Grabbable:

GameObject.Destroy( GetComponent<XRGrabInteractable>() );

(DANGEROUS BUT EASIER): Disable the Colliders (this will ALSO break ALL PHYSICS on the object: don’t do it if you need the physics for something else). This only works if the Colliders are on child-objects of the Rigidbody (if they’re on the same GameObject, you can’t disable them separately), and you also need to set the Rigidbody to kinematic (otherwise it will fall through the floor, as it now has no collision-detection):

myCollider.gameObject.SetActive( false ); // on the Collider's child GameObject
myBody.isKinematic = true;                // on the Rigidbody
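The collider-based workaround can be wrapped in a small toggle helper – a sketch (names are illustrative; it assumes the Colliders live on a child object, per the caveat above):

```csharp
using UnityEngine;

public class GrabbableToggler : MonoBehaviour
{
    public GameObject colliderChild; // the child GameObject holding the Colliders
    private Rigidbody body;

    void Awake() { body = GetComponent<Rigidbody>(); }

    public void SetGrabbable( bool grabbable )
    {
        colliderChild.SetActive( grabbable );
        // kinematic while colliders are off, so it doesn't fall through the floor
        body.isKinematic = !grabbable;
    }
}
```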

FAQ: How do I start the scene with a specific Grabbable already held by the controller/hand?

XRBaseInteractor (the base class for all Interactors) has a variable for this:

startingSelectedInteractable

which is intended for pre-grabbing at start of a scene / when an Interactor goes live (it’s processed in the Interactor’s Awake() method)
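Usually you assign it by dragging the Interactable onto the Interactor’s field in the Inspector. A hedged sketch of doing it from code instead (assumes startingSelectedInteractable is settable from script, as in the 0.9.x preview; it must be set before the Interactor’s Awake() runs, e.g. by keeping the Interactor’s GameObject inactive until then):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class PreGrab : MonoBehaviour
{
    public XRBaseInteractor handInteractor;      // e.g. the XRDirectInteractor on the hand
    public XRBaseInteractable swordInteractable; // the object to start the scene holding

    void Awake()
    {
        // Must run before the Interactor's Awake() reads startingSelectedInteractable
        handInteractor.startingSelectedInteractable = swordInteractable;
    }
}
```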

Answers: Performance

FAQ: Why is performance better when my headset is connected to a PC?

When you connect a wireless Oculus headset to a PC using a cable (e.g. a Link-cable with a Quest), and run the Oculus desktop software, Oculus combines the power of your desktop GPU with the headset GPU, providing a performance boost.

Oculus has a detailed technical explanation here: https://developer.oculus.com/blog/how-does-oculus-link-work-the-architecture-pipeline-and-aadt-explained/?locale=en_US

This is also why Oculus sometimes allows Quest users to purchase/run Rift-only (desktop-only) games on a Quest: Oculus only allows it while the Quest is connected to a desktop PC with a powerful-enough GPU.

Answers: Detecting & Configuring Headsets

FAQ: How do I prevent VR starting automatically / start VR after the app is already running?

This isn’t linked anywhere in the XR Interaction Toolkit docs, but instead is in the XR Management docs (confusingly, the two systems are highly interdependent, but completely separate).

“If you want to start XR on a per-Scene basis (for example, to start in 2D and transition into VR), follow these steps:

  1. Access the Project Settings window (menu: Edit > Project Settings).
  2. Select the XR Plug-in Management tab on the left.
  3. Disable the Initialize on start option for each platform you support.
  4. At runtime, call the following methods on XRGeneralSettings.Instance.Manager to add/create, remove, and reorder the Loaders from your scripts:”

The methods you need to call are incorrectly documented in the official Unity docs. You actually need to read the XRManagerSettings.loaders variable, change the order of items inside it, and then re-assign back to it (according to Unity staff). Source: https://docs.unity3d.com/Packages/com.unity.xr.management@3.2/manual/EndUser.html

HOWEVER: if you do this in the Editor, the changes will be saved into your project, because Unity currently (incorrectly) stores this data in a ScriptableObject. Bugs have been reported, so this will probably get improved/updated in a future release.
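The start/stop calls themselves come from the XR Management package. A sketch based on the XR Management 3.2 end-user docs (class name is illustrative):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

public class ManualXRControl : MonoBehaviour
{
    // Call via StartCoroutine( StartXR() ) when you want to enter VR
    public IEnumerator StartXR()
    {
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if( XRGeneralSettings.Instance.Manager.activeLoader != null )
            XRGeneralSettings.Instance.Manager.StartSubsystems();
        else
            Debug.LogError( "Initializing XR failed – no usable Loader" );
    }

    // Call to return to 2D mode
    public void StopXR()
    {
        XRGeneralSettings.Instance.Manager.StopSubsystems();
        XRGeneralSettings.Instance.Manager.DeinitializeLoader();
    }
}
```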

FAQ: Can I have multiple VR headsets active in a single Unity app?

Currently: no. “We don’t have support in Unity for multiple headsets in the same application instance” – https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-2#post-5346522

FAQ: Should I install the LegacyInputHelpers in Unity > ProjectSettings > XRInput?

Currently (2020): yes. “The LegacyInputHelpers dependency is to get access to the TrackedPoseDriver, which is also being upgraded. Support for the new Input System is on the short term roadmap.” – https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/#post-5299884

FAQ: How do I test without a headset? / Is there a headless mode?

You can install the “Mock HMD XR Plugin” via PackageManager. Dedicated site/docs here: https://docs.unity3d.com/Packages/com.unity.xr.mock-hmd@1.0/manual/index.html . This provides a render-only fake VR headset (you have to build your own simulation for the controllers, e.g. using mouse/keyboard inputs).

FAQ: How do I trigger a teleport via script/code?

“if you look at the code for the teleport interactables, they submit a request to the teleportation provider. you can also submit your own request via code. QueueTeleportRequest is the function.” – https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-4#post-5413524
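A sketch of submitting such a request yourself (TeleportRequest field names are from the 0.9.x preview – check the struct in your installed version):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class ScriptedTeleport : MonoBehaviour
{
    public TeleportationProvider provider; // drag in from your Rig in the Inspector

    public void TeleportTo( Vector3 worldPosition )
    {
        var request = new TeleportRequest()
        {
            destinationPosition = worldPosition,
            matchOrientation = MatchOrientation.None, // keep the player's current facing
        };
        provider.QueueTeleportRequest( request );
    }
}
```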

FAQ: How do I make grabbed objects jump to my hand if trigger already held-down before hovering?

“That got turned into a setting on the XRRayInteractor. You should now see a Select Action Trigger setting that by default is set to State Change. The original behaviour that you want is State.” – https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-4#post-5421999

FAQ: Can multiple Interactors select the same Interactable?

Currently: it seems “no”. – https://forum.unity.com/threads/multiple-interactors-selecting-the-same-interactable-with-xr-interaction-toolkit.845011/

FAQ: My code using Render to Texture stopped working / My Camera FOV is wrong?

To protect people from accidental changes, Unity now ignores all changes to camera FOV on VR cameras. Currently, this is a Warning (should probably be an Error!).

At the same time: as soon as you enable VR in Unity, all new Cameras are automatically created as VR cameras, even if you are not using them for VR.

Solution: set camera.stereoTargetEye = StereoTargetEyeMask.None; to declare this is a non-VR camera, and/or set camera.targetDisplay to a value greater than 0 to tell Unity not to send this camera’s output to the VR headset.

NB: The second step should not be necessary for manually scripted Cameras, but is probably a good idea to do anyway.
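In code, the two steps look like this (display index 1 is arbitrary – anything other than 0 works):

```csharp
using UnityEngine;

public class NonVRCamera : MonoBehaviour
{
    void Awake()
    {
        Camera cam = GetComponent<Camera>();
        cam.stereoTargetEye = StereoTargetEyeMask.None; // not a VR camera
        cam.targetDisplay = 1;                          // don't send output to the headset
    }
}
```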

FAQ: My Grabbables stop working / can’t pick them up any more?

There are a couple of bugs in XRInteractionManager / XRBaseInteractable which allow the interaction manager to permanently “lose” track of objects in the scene. There are multiple ways you can end up here, but the obvious one is any time you reparent any object that has Colliders attached, or has any Colliders on any children (very common!) – this will break the InteractionManager.

Until Unity fixes these bugs, you need to manually patch the XRInteractionManager, and add a method to update the “registration” data. You cannot safely use the Unregister/Register methods because those are bugged (although they almost handle this, they can/will corrupt other objects in the scene).

https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-7#post-5658190

FAQ: XRGrabInteractable crashes if I destroy the object when dropping it?

This is a bug in XRGrabInteractable. Edit the class and in the OnSelectExit method, wrap the calls to “rigidbody” in an “if( m_RigidBody != null )” clause. e.g.:

if( m_RigidBody != null )
{
    m_RigidBody.isKinematic = m_WasKinematic;
    m_RigidBody.useGravity = m_UsedGravity | m_GravityOnDetach;
}

..and similarly in Detach() in that class. Bug number: 1232882

FAQ: What’s the replacement for XRDevice.model / how do I read the VR device name?

Use the new InputDevice.name API; however, see: #FAQ:-InputDevice.name-gives-incorrect-/-empty-data?

FAQ: InputDevice.name gives incorrect / empty data?

Usually this is because a manufacturer changes their internal description of their devices, and Unity’s XRIT needs to be updated with the new values (can’t do this ourselves, have to wait for Unity to publish an update).

Worth reporting as a bug, although currently (v0.9.3) there’s at least one known bug with Oculus Quest reporting the wrong name.

Vendor-specific: Oculus

FAQ: How do I install the Oculus add-ons for VR / OVRPlugin / Oculus Integration tools?

Many features of Oculus headsets are not available via Unity XR (either because they are proprietary to Oculus, or because Unity’s VR team hasn’t caught up with Oculus’s dev team yet).

Oculus ships a free bonus project that is designed to work with any Unity VR setup (both legacy Unity and modern Unity XR) and adds the missing features. Note that this plugin often has misconfigured or broken scenes – there is a huge amount of sample code, and Oculus staff generally forget/don’t bother to update the older things – but new features are generally demoed well here. Some of the useful things you’ll find in this package:

  • Meshes + textures for all Oculus controllers
  • Hand-tracking support + scripts
  • Example code + scenes for all Oculus features
  • … + much more

This is a free download from the Unity Asset Store:

https://assetstore.unity.com/packages/tools/integration/oculus-integration-82022

There is also an official documentation page for this download from Oculus that contains extra setup / known bugs / etc, and is regularly updated:

https://developer.oculus.com/downloads/package/unity-integration/?_fb_noscript=1

(that page also contains a link to the older versions of the AssetStore package, which Unity will block you from downloading from the AssetStore for no reason at all, but Oculus will allow you to get)

FAQ: What is Oculus Link / Quest Link?

This is specific to the Oculus Quest (not even the Rift, only the Quest): Oculus Link is a special feature of Oculus Quest headsets where you can use your desktop/laptop PC’s more powerful GPU to drive the VR rendering for your less-powerful Quest.

(The Quest has an onboard GPU – it’s basically an Android phone embedded inside the headset)

This enables multiple additional features, including: higher performance/quality rendering (assuming your GPU is relatively expensive, it could be considerably faster than the Oculus Quest one), and live debugging from within the UnityEditor.

It requires a USB-3.0 cable (see: FAQ:-What-is-an-Oculus-Link-cable-and-do-I-need-one-where-do-I-get-one?)

FAQ: How to detect Quest entering/exiting stand-by mode?

Currently: unknown. e.g. https://forum.unity.com/threads/how-to-detect-quest-entering-exiting-stand-by-mode.862000/

FAQ: What is an Oculus Link cable and do I need one / where do I get one?

To use Oculus Link (#FAQ:-What-is-Oculus-Link-Quest-Link?), you need a standard USB-3.0 cable. This is often called a “Link Cable” or “Oculus Link Cable”, but is nothing special, just a standard USB-3.0 cable.

NB: USB-3.0 cables are different to all previous versions of USB – it’s a physically different cable with extra wires inside. A cheap cable that is “compatible with USB 3” is probably a USB1/USB2 cable and won’t work with Link. Most vendors specifically sell cables as “USB3” or “USB3-ready”.

The Oculus community maintains a huge spreadsheet of specific cables from specific manufacturers, and user-reports on whether they worked (plus places you can buy them online)

FAQ: Primary controller / cursor randomly switches from one hand to the other?

Check your battery levels (this tends to happen when the battery still reports 10-20% charge, but actually it’s too low and needs replacing), and check that you’re not in bright sunlight / bright full-spectrum lights.

This seems to be caused by the headset losing track of the main controller briefly, and then automatically switching over to the other controller as a backup. If both controllers have low-battery and/or too bright lighting, it will keep jumping between them.

Vendor-specific: OpenVR/Valve

FAQ: How do I get SteamVR/OpenVR working with XR Interaction Toolkit?

Updated May 2020:

Valve is providing a dedicated plugin for XR, although it is currently incomplete (no skeletal input yet, and no user-rebindable actions). It’s in public beta, full source code + details here: https://github.com/ValveSoftware/steamvr_unity_plugin/tree/UnityXRPlugin

See also Valve’s launch announcement here: https://steamcommunity.com/app/250820/discussions/7/2268069450205612646/

Vendor-specific: Google/Cardboard

FAQ: How do I install the plugin for Google Cardboard?

May 2020: Google has released the official XR Plugin for Cardboard here: https://github.com/googlevr/cardboard-xr-plugin – and the Install guide is here: https://developers.google.com/cardboard/develop/unity/quickstart

There is also a community-provided solution (because Google’s official version took 6 months between announcement and release) here: https://github.com/mobfishgmbh/Cardboard-VR-Unity-SDK


Post Author: adam

2 thoughts on “XR Input Toolkit 2020 FAQ”

    Gerald Foster

    (June 1, 2020 - 8:19 pm)

    Is there anyway to use hand tracking with the xr plugin?

      adam

      (June 23, 2020 - 11:04 am)

      Unity hasn’t included hand-tracking in XR yet – but if your hardware vendor supports it, you can still use their proprietary tracking code. So e.g. for Oculus Quest: you install Unity XR (as per the FAQ), but then you also install the Oculus OVR system (which duplicates a lot of Unity XR, but is proprietary to Oculus), and delete the bits that conflict. Then you are left with Oculus hand-tracking + Unity XR together.

      (I have not tested this myself, it’s on my todo list but I’ve been too busy :(. I will update the FAQ once I’ve verified the above steps!)
