IXRRigNode Interface

To ensure predictable, ordered execution of the different XR features, the Telerik XR Core package provides an IXRRigNode interface. All the methods of this interface receive an XR Rig parameter, allowing you to either read or modify the rig hierarchy. This approach keeps the XR Rig as the medium of communication between different features, which is key to the cross-platform approach for developing features such as hand gestures and controller input.

Implementing a Custom Node

When creating a custom XR Rig node, you must implement the following methods of the IXRRigNode interface:

  • FixedUpdate—receives an XR Rig instance allowing you to read or modify its values during the FixedUpdate call in the Unity cycle.

  • Update—receives an XR Rig instance allowing you to read or modify its values during the Update call in the Unity cycle.

  • OnBeforeRender—receives an XR Rig instance allowing you to read or modify its values during the onBeforeRender Unity event.
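A custom node implementing these three methods can be sketched as follows. This is an illustrative sketch only: the XRRig type name and its TrackingSpace member are assumed from this article, and how nodes get registered with the rig is not shown.

```csharp
using UnityEngine;

// Hypothetical custom node. The XRRig type and rig.TrackingSpace member
// are assumptions based on this article, not verified package APIs.
public class HeightOffsetNode : IXRRigNode
{
    private readonly float heightOffset;

    public HeightOffsetNode(float heightOffset)
    {
        this.heightOffset = heightOffset;
    }

    public void FixedUpdate(XRRig rig)
    {
        // Physics-rate work, e.g. moving rig-driven rigidbodies.
    }

    public void Update(XRRig rig)
    {
        // Per-frame work, e.g. offsetting the tracking space vertically.
        rig.TrackingSpace.localPosition = new Vector3(0f, heightOffset, 0f);
    }

    public void OnBeforeRender(XRRig rig)
    {
        // Last-moment pose adjustments just before the frame is rendered.
    }
}
```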

Order of Execution

The order of execution of all IXRRigNode interface instances is determined by the root XR Rig. During each Update, FixedUpdate, and onBeforeRender call, the root rig traverses its children and calls their corresponding methods. If you want some of the children to be called in a specific order, place them at the same level of the rig hierarchy, arranged in the order you want them to be called.
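The traversal described above can be pictured with a minimal sketch. The actual package code may differ, but Unity's GetComponentsInChildren does return components in depth-first hierarchy order, which is what makes sibling order meaningful:

```csharp
using UnityEngine;

// Hypothetical sketch of the root rig's per-frame dispatch. The XRRig
// class name is taken from this article; the traversal details shown
// here are assumptions for illustration.
public partial class XRRig : MonoBehaviour
{
    private void Update()
    {
        // GetComponentsInChildren returns components in depth-first
        // hierarchy order, so sibling nodes run top to bottom exactly
        // as they are arranged under the rig.
        foreach (IXRRigNode node in GetComponentsInChildren<IXRRigNode>())
        {
            node.Update(this);
        }
    }
}
```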


The following examples of features from Telerik's packages illustrate how the IXRRigNode interface may be used:

  • Input—when handling input information from an XR device, you may implement IXRRigNode and modify the TrackingSpace nodes to match the device input. The Unity Input prefab does this with the cross-platform information provided by Unity, but you may add vendor-specific input on top of that—such as the Oculus Hand Tracking and Controller Poses prefabs available in the Telerik XR Interactions package.

  • Gestures—when handling either controller button input or some hand tracking gesture, you may implement the IXRRigNode interface and read the current values of the TrackingSpace children hierarchy. More information on this approach is available in the Gestures Mechanism documentation article.

  • Pointers—a common approach for interacting with virtual objects is to use XR pointers. The PointersController class implements IXRRigNode to set the active state of its child pointers. Each child pointer may also implement the interface in order to read the current state of the rig hierarchy—finger pointers read the finger states, hand pointers the hand states, and controller pointers the controller state.

  • Tracked device—when a 3D object needs to move with the virtual hand or head, you may use the TrackedDevice class, which implements IXRRigNode to read the corresponding device position. Example usages of the tracked device are the skinned hand model that visualizes the virtual hands and the controller layout that shows tooltips over the XR buttons.
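As a concrete illustration of the tracked-device case, a node can mirror the head pose onto a 3D object during OnBeforeRender, so the model never lags behind the rendered frame. This is a sketch under assumptions: the rig.Head member is a hypothetical name, not a confirmed API.

```csharp
using UnityEngine;

// Hypothetical tracked-device-style node; rig.Head is an assumed member
// name used for illustration only.
public class FollowHeadNode : IXRRigNode
{
    private readonly Transform target;

    public FollowHeadNode(Transform target) => this.target = target;

    public void FixedUpdate(XRRig rig) { }
    public void Update(XRRig rig) { }

    public void OnBeforeRender(XRRig rig)
    {
        // Copy the freshest head pose right before rendering.
        target.SetPositionAndRotation(rig.Head.position, rig.Head.rotation);
    }
}
```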
