Gestures Mechanism

Telerik UI for Unity XR provides a cross-platform tracking space that keeps the state of the different XR components: head and hand positions, controller button states, and finger nodes. Based on this hierarchy, you can implement the IXRRigNode interface and read the current values from the tracking space, which allows you to implement custom gesture logic.

Hand Gestures

You can handle hand gestures by implementing the Update method of the IXRRigNode interface. Through the rig argument of this method, you can access the tracking property and read the following hand information from the TrackingSpace instance (see the sketch after this list):

  • GetHand—a method providing the Hand instance for the left or right hand.

  • TrackingHandPose—accessible through the pose property of each hand; provides information about its fingers.

  • GetFinger—a method providing the DigitPose information for the specified finger:

    • An open field with values ranging from 0 to 1.
    • A pinch field providing the Pinch information for the specified finger.
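
The following sketch puts these pieces together in a node that reacts to an index-finger curl on the right hand. The XRRig parameter type, the HandType and FingerType enumerations, and the assumption that GetFinger is exposed by TrackingHandPose are illustrative guesses; verify the exact names against the API in your version.

```csharp
using UnityEngine;

// Minimal sketch of a custom gesture node. The XRRig parameter type and the
// HandType/FingerType enumerations are assumptions made for illustration and
// may differ from the actual API.
public class PinchGestureNode : IXRRigNode
{
    // Called by the rig; the rig argument exposes the tracking space.
    public void Update(XRRig rig)
    {
        TrackingSpace tracking = rig.tracking;

        // Get the right hand and its finger pose data.
        Hand rightHand = tracking.GetHand(HandType.Right);   // assumed enum value
        TrackingHandPose pose = rightHand.pose;

        // Read the index finger: the open field ranges from 0 (closed) to 1 (open).
        DigitPose index = pose.GetFinger(FingerType.Index);  // assumed enum value
        bool fingerCurled = index.open < 0.2f;

        if (fingerCurled)
        {
            // Custom gesture logic goes here, for example starting a grab interaction.
            Debug.Log("Index finger curled.");
        }
    }
}
```

The check uses the open value (0 to 1) rather than the pinch field so that the sketch does not depend on the members of the Pinch type.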

Controller Gestures

Controller gestures are implemented in a similar manner. Through the Hand class instance, you can access the controller property, which provides the following information (see the sketch after this list):

  • trigger—a property of type Button providing information for the current state of the trigger button.
  • grip—a property of type Button providing information for the current state of the grip button.
  • button1—a property of type Button providing information for the current state of button1.
  • button2—a property of type Button providing information for the current state of button2.
  • menu—a property of type Button providing information for the current state of the menu button.
  • joystick—a property of type Joystick providing information for the current state of the joystick.
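
The sketch below reads the controller state in the same way. As before, the XRRig and HandType names are assumptions, and so are the pressed and value members used to read the Button and Joystick state; the actual types may expose their state through different members.

```csharp
using UnityEngine;

// Minimal sketch of a controller gesture node. The XRRig and HandType names, and
// the Button.pressed and Joystick.value members used below, are assumptions; the
// actual types may expose their state differently.
public class ControllerGestureNode : IXRRigNode
{
    public void Update(XRRig rig)
    {
        // The controller state is exposed through the Hand instance.
        Hand leftHand = rig.tracking.GetHand(HandType.Left);  // assumed enum value
        var controller = leftHand.controller;

        // Button states: trigger, grip, button1, button2, and menu.
        bool triggerPressed = controller.trigger.pressed;     // assumed member
        bool gripPressed = controller.grip.pressed;           // assumed member

        // Joystick state, assumed here to be a 2D axis.
        Vector2 stick = controller.joystick.value;            // assumed member

        if (triggerPressed && gripPressed && stick.y > 0.5f)
        {
            // Custom gesture logic goes here, for example a
            // "push forward while squeezing" gesture.
            Debug.Log("Squeeze-and-push gesture detected.");
        }
    }
}
```

Combining the trigger and grip states with the joystick axis is just one example of how these values can be composed into a custom gesture.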