Hand Tracking

This guide introduces what NRSDK can and cannot do while tracking hands. The tutorial demonstrates how to add the hand tracking feature to your app and use your hands as an interaction model.

Notice

  • MR Apps using the Hand Tracking feature in NRSDK 1.7.0 can only operate until 12/31/2022.
  • This feature is in the Beta phase; more devices/models will be supported in the next release of NRSDK.
  • This feature is fully tested on the following devices:
    • Nreal DevKit
    • Nreal Enterprise Kit
    • OnePlus: 9R / 7T
    • LG: V60 / V50S ThinQ 5G / V50 ThinQ 5G / G9 (Velvet 5G) / Wing
    • SONY: Xperia 5 II / Xperia 1
    • SAMSUNG: Galaxy Note20 5G / Galaxy S10+ / Galaxy S20+ 5G / Galaxy Z Fold 2 5G / Galaxy Note20 Ultra / Galaxy Note10+ 5G / Galaxy A90 5G
    • ZTE Axon 10 Pro
    • Black Shark 2 Pro
  • Snapdragon 888 / Exynos-based models are not currently supported; support will be added in the next release of NRSDK
  • Behavior on other devices is currently unknown

Introduction

NRSDK’s Hand Tracking capability tracks the position of key points of your hands and recognizes hand poses in real-time. Hand poses are shown in the first-person view and used to interact with virtual objects immersively in-world.

Capabilities

  • NRSDK tracks hands in the world coordinate frame and reports the position and orientation of twenty-three key points;
  • NRSDK currently supports six hand poses from either hand;
  • While tracking hand poses, NRSDK also reports whether any hands are currently being tracked;
  • Left/right hand detection is available in NRSDK;
  • When using hands as input, the hand’s pose drives a laser cursor-pointer that behaves like the standard controller cursor.

Hand Poses

General Gesture

../../../_images/ht1-1.png

Select Gesture

../../../_images/ht1-2.png
  • As long as the index finger and thumb pinch together (regardless of the pose of the other fingers), it is considered a pinch pose.
  • All of the gestures above will be recognized as pinch/select.
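As a sketch of how pinch/select might be queried at runtime, the component below polls the right hand's `HandState` fields described later in this guide. The component name `PinchWatcher` is hypothetical, and the `NRKernal` namespace is assumed to be the one used by NRSDK's Unity plugin; a scene with NRInput configured for hand input is also assumed.

```
using NRKernal;
using UnityEngine;

// Hypothetical example component: polls the right hand's HandState each frame
// and logs when a pinch begins. Assumes NRInput exists in the scene with
// Hands selected as the input source.
public class PinchWatcher : MonoBehaviour
{
    private bool wasPinching;

    void Update()
    {
        HandState handState = NRInput.Hands.GetHandState(HandEnum.RightHand);
        if (handState == null || !handState.isTracked)
            return;

        // isPinching is true while the index finger and thumb pinch together
        if (handState.isPinching && !wasPinching)
        {
            Debug.Log("Pinch started, strength: " + handState.pinchStrength);
        }
        wasPinching = handState.isPinching;
    }
}
```

Because the select gesture ignores the other fingers, `isPinching` is a more reliable trigger than checking a specific `HandGesture` value.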

System Gesture

../../../_images/ht1-3.png
  • Hold this gesture for 1.2 s to invoke the home menu.
  • Both left and right hands are recognized.

Hand Pointer

Similar to a controller, NRSDK provides a pointer for each hand to interact with targets. The pointer pose, and whether it is being tracked, can be obtained from the HandState of each hand.

The pointer pose must meet the following conditions to be correctly tracked:

  • The hand is recognized
  • The palm direction is pointing forward

A basic hand pointer style is included in the NRHand_R and NRHand_L prefabs located in Assets>NRSDK>Prefabs>Hands. You can also customize the style of the hand pointer based on the pointer pose and related data, combined with some of the data in NRPointerRaycaster.
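A minimal custom-pointer sketch, assuming the `NRKernal` namespace of NRSDK's Unity plugin and a `LineRenderer` assigned in the Inspector (the component name `HandPointerRay` is hypothetical):

```
using NRKernal;
using UnityEngine;

// Hypothetical example: draws a simple ray from the hand pointer pose using a
// LineRenderer, shown only while the pointer pose is valid (hand recognized,
// palm pointing forward).
public class HandPointerRay : MonoBehaviour
{
    public LineRenderer line;   // assign in the Inspector, positionCount = 2
    public float rayLength = 3f;

    void Update()
    {
        HandState handState = NRInput.Hands.GetHandState(HandEnum.RightHand);
        bool valid = handState != null && handState.isTracked && handState.pointerPoseValid;
        line.enabled = valid;
        if (!valid)
            return;

        // pointerPose is the start pose of the hand ray
        Pose pointerPose = handState.pointerPose;
        line.SetPosition(0, pointerPose.position);
        line.SetPosition(1, pointerPose.position + pointerPose.rotation * Vector3.forward * rayLength);
    }
}
```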

Joint

The NRSDK hand tracking system identifies 23 key points on each recognized hand and provides the position and orientation of every joint.

Joint Label

Index   API Name
0 Wrist
1 Palm
2 ThumbMetacarpal
3 ThumbProximal
4 ThumbDistal
5 ThumbTip
6 IndexProximal
7 IndexMiddle
8 IndexDistal
9 IndexTip
10 MiddleProximal
11 MiddleMiddle
12 MiddleDistal
13 MiddleTip
14 RingProximal
15 RingMiddle
16 RingDistal
17 RingTip
18 PinkyMetacarpal
19 PinkyProximal
20 PinkyMiddle
21 PinkyDistal
22 PinkyTip
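The joint poses in the table can be read through `HandState.GetJointPose`, as shown in the HandState sample later in this guide. The sketch below places marker spheres on a few joints; the component name `JointMarkers` is hypothetical, and the `NRKernal` namespace is assumed to be the one used by NRSDK's Unity plugin.

```
using NRKernal;
using UnityEngine;

// Hypothetical example: places small marker spheres at a few of the 23 joints
// listed above, using HandState.GetJointPose with HandJointID names from the
// table.
public class JointMarkers : MonoBehaviour
{
    private static readonly HandJointID[] joints =
    {
        HandJointID.Wrist, HandJointID.ThumbTip, HandJointID.IndexTip, HandJointID.PinkyTip
    };
    private Transform[] markers;

    void Start()
    {
        markers = new Transform[joints.Length];
        for (int i = 0; i < joints.Length; i++)
        {
            var sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere).transform;
            sphere.localScale = Vector3.one * 0.01f;   // 1 cm markers
            markers[i] = sphere;
        }
    }

    void Update()
    {
        HandState handState = NRInput.Hands.GetHandState(HandEnum.RightHand);
        if (handState == null || !handState.isTracked)
            return;

        for (int i = 0; i < joints.Length; i++)
        {
            // each joint pose carries both position and orientation
            Pose pose = handState.GetJointPose(joints[i]);
            markers[i].SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```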

Joint Orientation

../../../_images/ht1-4.png

Common Usage Of Hand Tracking

Sample Use Case:

  //returns true if the input source is successfully switched to hand tracking
  bool switchToHandTracking = NRInput.SetInputSource(InputSourceEnum.Hands);

  //returns true if the input source is successfully switched to controller
  bool switchToController = NRInput.SetInputSource(InputSourceEnum.Controller);

  //returns true if hand tracking is running
  bool isRunning = NRInput.Hands.IsRunning;

  //returns the NRHand of the right hand
  NRHand hand = NRInput.Hands.GetHand(HandEnum.RightHand);

  //returns true if the user is currently performing the system gesture
  bool isPerformingSystemGesture = NRInput.Hands.IsPerformingSystemGesture();

Details Of HandState:

  //returns the HandState of the right hand
  HandState handState = NRInput.Hands.GetHandState(HandEnum.RightHand);

  //returns the handedness of this hand
  HandEnum handEnum = handState.handEnum;

  //returns true if this hand is tracked
  bool isTracked = handState.isTracked;

  //returns the start pose of the hand ray pointer
  Pose pointerPose = handState.pointerPose;

  //returns true if the hand ray pointer pose is valid
  bool pointerValid = handState.pointerPoseValid;

  //returns true if this hand is currently pinching
  bool isPinching = handState.isPinching;

  //returns the current pinch strength of the hand, in the range 0 to 1
  float pinchStrength = handState.pinchStrength;

  //returns the current gesture of the hand
  HandGesture handGesture = handState.currentGesture;

  //returns the pose (position and orientation) of the thumb tip joint
  Pose thumbTipPose = handState.GetJointPose(HandJointID.ThumbTip);

Tutorial

Enabling Hand Tracking

  1. Create a new project in Unity with NRSDK. Refer to Quickstart for Android - Unity for more setup instructions.
  2. Delete the Main Camera from the scene hierarchy.
  3. Find the NRCameraRig and NRInput prefabs in Assets>NRSDK>Prefabs>NRCameraRig. Drag them into the scene hierarchy.
  4. Select the NRInput GameObject in the Hierarchy window to open the Inspector window, and choose Hands as the Input Source Type.
  5. Find NRHand_R and NRHand_L in Assets>NRSDK>Prefabs>Hands. Add them as child GameObjects of the Right and Left anchors in NRInput, respectively.
  6. You are now ready for hand tracking. Refer to Building a Project with User Input to add more interactions with objects.

Samples are included in the plugin. Please refer to Assets>NRSDK>Demos>HandTracking for details.
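Once the scene is set up as above, you can also switch between hand tracking and controller input at runtime with `NRInput.SetInputSource`, as in the sample use case earlier. The sketch below is a hypothetical toggle component (assuming the `NRKernal` namespace of NRSDK's Unity plugin), suitable for wiring to a UI button:

```
using NRKernal;
using UnityEngine;

// Hypothetical example: toggles between hand tracking and controller input at
// runtime, using the NRInput.SetInputSource API shown in the sample use case.
public class InputSourceToggle : MonoBehaviour
{
    public void ToggleInputSource()
    {
        bool handsActive = NRInput.Hands.IsRunning;
        InputSourceEnum target = handsActive ? InputSourceEnum.Controller : InputSourceEnum.Hands;

        // SetInputSource returns true if the switch succeeded
        bool ok = NRInput.SetInputSource(target);
        Debug.Log("Switch to " + target + (ok ? " succeeded" : " failed"));
    }
}
```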

Requirements & Limits

  • The hand tracking SDK uses the on-board camera(s) to detect hands, so make sure your hands are visible to the camera.
  • We are improving the accuracy of the model; if your hand is not detected, pay attention to the following situations:

Backgrounds

  • Avoid complicated backgrounds; solid backgrounds are preferred;
  • Avoid backlight, low light, or unbalanced lighting conditions in the camera frame;

Gesture

  • Avoid stacked or interlaced fingers on either hand;
  • Avoid having multiple people’s hands in view;
  • Avoid fast movement;
  • There is a chance that the orientations of the joints are recognized incorrectly. If so, move your hands out of the visible field and then back in again.