Interactions Gestures

The Telerik XR Interactions package provides several gestures that allow you to interact with UI and 3D objects in the scene. This article describes how to use these gestures.

For more information on the cross-platform approach for implementing the gesture mechanism using XR Rig Tracking Space nodes, see the Telerik XR Core documentation.

Touch Surface Volume Gesture

To support close interactions with fingers, Telerik XR Interactions uses the TouchSurfaceVolumeGesture component, which detects touch surfaces located close to the hand. This is achieved by using Unity collision detection and searching nearby colliders for components that implement the ITouchSurface interface. To set up the touch gestures, drag and drop the Touch Surface Volume Gesture prefab as a child of the XR Rig root. Add two separate instances of this prefab, one for each hand, and make sure that the Hand component value specifies which hand is being tracked. For more information on the touch finger interactions, see the Touch Interactions documentation article.

Alternatively, if you have purchased the full Telerik XR Complete solution from the Unity Asset Store, consider using the Gestures prefab from the XR Integration package, as this prefab contains the gesture setup with colliders for both hands and also includes some additional predefined gestures.
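To illustrate the detection approach conceptually, the following sketch searches nearby colliders for an ITouchSurface component by using a Unity overlap query. This is not the shipped implementation of TouchSurfaceVolumeGesture; the NearbyTouchSurfaceFinder class name, the search radius, and the per-frame polling are assumptions made only for this example.

    using UnityEngine;

    // Conceptual sketch: find components implementing ITouchSurface near the hand.
    // The radius and per-frame polling are illustrative assumptions, not the package defaults.
    public class NearbyTouchSurfaceFinder : MonoBehaviour
    {
        [SerializeField] private float searchRadius = 0.1f;

        private readonly Collider[] hits = new Collider[16];

        private void Update()
        {
            // Non-allocating overlap query around this transform (the tracked hand).
            int count = Physics.OverlapSphereNonAlloc(transform.position, searchRadius, hits);
            for (int i = 0; i < count; i++)
            {
                // Look for a touch surface on the collider or one of its parents.
                var surface = hits[i].GetComponentInParent<ITouchSurface>();
                if (surface != null)
                {
                    Debug.Log($"Touch surface found on {hits[i].name}");
                }
            }
        }
    }

In practice you would attach such a component to each tracked hand, which is why the prefab setup described above requires one instance per hand.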

Two Hand Scale Gesture

Telerik XR Interactions provides the TwoHandScaleGesture that you can use to change the scale of a scene object by using hand tracking. To achieve that, the user must pinch with both hands and at the same time move the hands to change the scale and rotation of the target object. To set up the hand gesture, drag and drop the Two Hand Scale Gesture prefab as a child of the XR Rig root, and then initialize the Target field of the TwoHandScaleGesture component with the scene object that you want to be modified. The available API for customizing the behavior of this component is as follows:

  • target—this is the Transform that will be modified during the gesture execution. It is implemented as a public field, allowing you to change it at runtime (see the sketch after this list).
  • leftHandPinch—this is the Transform that visualizes the pinch position of the left hand. It is implemented as a protected field so that it can be accessed by classes that inherit the behavior.
  • rightHandPinch—this is the Transform that visualizes the pinch position of the right hand. It is implemented as a protected field so that it can be accessed by classes that inherit the behavior.
  • lineRenderer—this is the line that is rendered between the pinch points of both hands while the gesture is active. It is implemented as a protected field so that it can be accessed by classes that inherit the behavior.
  • ruler—this is the game object that positions the pinch information in the middle of the pinch line. It is implemented as a protected field so that it can be accessed by classes that inherit the behavior.
  • rulerText—this is the text that is rendered in the middle of the pinch line. By default, it displays the pinch line length. It is implemented as a protected field so that it can be accessed by classes that inherit the behavior.
  • isScaling—this is a read-only property indicating whether a target is currently captured and being modified.
  • UpdateGestureAppearanceWhenPinching—this is a protected virtual method that manages the rendering of the gesture hierarchy and can be overridden by classes that inherit TwoHandScaleGesture.
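As a hedged example of using this API, the sketch below swaps the gesture target at runtime while avoiding a change in the middle of an active scale operation. The ScaleGestureTargetSwitcher class and its SetTarget method are hypothetical helpers created for this example, and the exact casing of the target field and isScaling property may differ in your package version.

    using UnityEngine;

    // Hypothetical helper that re-targets the scale gesture at runtime.
    public class ScaleGestureTargetSwitcher : MonoBehaviour
    {
        [SerializeField] private TwoHandScaleGesture scaleGesture;

        public void SetTarget(Transform newTarget)
        {
            // Do not swap the target while a scale operation is in progress.
            if (scaleGesture != null && !scaleGesture.isScaling)
            {
                scaleGesture.target = newTarget;
            }
        }
    }

You can call SetTarget from a UI event or another gameplay script whenever a different scene object should respond to the two-hand scale gesture.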

For an example of gesture usage, check the Two Hand Scale Demo scene. This gesture works with hand tracking, so you will need to deploy the scene to a device that supports hand tracking. For Oculus Quest, prepare your project as described in these hand tracking requirements before building and deploying the APK file.
