Samples

The plugin provides three sample scenes, which serve as good tutorials on how to use the SDK.

  1. The HandModel scene demonstrates how to use the 3D model renderer for detected hands. Only supported on Windows.
  2. The Test_WaveVR scene demonstrates how to add the GestureProvider script to an existing WaveVR prefab. Only supported on Focus.
  3. The Sample scene demonstrates how to use hands to interact with game objects and UIs in the scene. Supported on all platforms.
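
The component setup used in these scenes can also be done from a bootstrap script. The following is a minimal sketch only: it assumes the GestureProvider and HandRenderer components from this plugin are in the project, and that the ViveHandTracking namespace is the plugin's (verify against your plugin version):

```csharp
using UnityEngine;
using ViveHandTracking; // namespace assumed from the plugin

public class HandTrackingBootstrap : MonoBehaviour {
  void Start() {
    // Start detection by attaching GestureProvider to the camera object.
    Camera.main.gameObject.AddComponent<GestureProvider>();

    // Create one renderer per hand; isLeft selects which hand to draw.
    var left = new GameObject("LeftHandRenderer").AddComponent<HandRenderer>();
    left.isLeft = true;
    var right = new GameObject("RightHandRenderer").AddComponent<HandRenderer>();
    right.isLeft = false;
  }
}
```

In the sample scenes this wiring is done in the editor instead; the script form is shown only to make the component relationships explicit.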

For detailed usage of the Vive Hand Tracking SDK, please refer to the scripts in the plugin.

This section walks through HandRenderer.cs in the plugin to explain the basic APIs. The script displays the detection result for one hand (either left or right) as joints and bones. For more details, please refer to the script.

Set up the renderer

In 2D/3D point modes, each hand has only one position, so a single primitive object is created to illustrate the position of the hand. Different colors can optionally be used to indicate gesture types.

In skeleton mode, each hand has 21 detected keypoints, which can be used to draw the skeleton of the hand given the connections between points. However, not all keypoints are always valid: due to occlusion, some keypoints may be missing even though the hand is detected. In that case, the keypoint at the corresponding index holds a placeholder value. Use GestureHelper.IsValidGesturePoint to check whether a keypoint is valid.
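
A minimal sketch of that check (assuming `hand` is a detection result as used later in this script, and that GestureHelper exposes the check both as a static method and as a Vector3 extension, as in the plugin sources):

```csharp
// Skip drawing a keypoint that holds a placeholder value.
Vector3 keypoint = hand.points[i];
bool valid = keypoint.IsValidGesturePoint();
// Equivalent static form:
// bool valid = GestureHelper.IsValidGesturePoint(keypoint);
if (!valid)
  continue; // keypoint is occluded or otherwise missing
```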

In the Start function, the script checks which mode hand detection is currently running in, and creates the required game objects, including points and links.

The main logic is shown below (see Hand Positions (Modes) for the definition of skeleton indices):

HandRenderer.cs
class HandRenderer : MonoBehaviour {

  // Links between keypoints, 2*i & 2*i+1 forms a link.
  // keypoint index: 0: palm, 1-4: thumb, 5-8: index, 9-12: middle, 13-16: ring, 17-20: pinky
  // fingers are counted from bottom to top
  private static int[] Connections = new int[] {
    0, 1, 0, 5, 0, 9, 0, 13, 0, 17, // palm and finger starts
    2, 5, 5, 9, 9, 13, 13, 17, // finger starts
    1, 2, 2, 3, 3, 4, // thumb
    5, 6, 6, 7, 7, 8, // index
    9, 10, 10, 11, 11, 12, // middle
    13, 14, 14, 15, 15, 16, // ring
    17, 18, 18, 19, 19, 20, // pinky
  };

  // list of points created (1 for 3D/2D point, 21 for skeleton)
  private List<GameObject> points = new List<GameObject>();
  // list of links created (only for skeleton)
  private List<GameObject> links = new List<GameObject>();

  IEnumerator Start () {
    // wait until detection is started, so we know what mode we are using
    while (GestureProvider.Status == GestureStatus.NotStarted)
      yield return null;

    // create game objects for points, number of points is determined by mode
    int count = GestureProvider.HaveSkeleton ? 21 : 1;
    for (int i = 0; i < count; i++) {
      var go = GameObject.CreatePrimitive(PrimitiveType.Sphere);
      go.name = "point" + i;
      go.transform.parent = transform;
      go.transform.localScale = Vector3.one * 0.012f;
      go.SetActive(false);
      points.Add(go);
    }

    // create game objects for links between keypoints, only used in skeleton mode
    if (GestureProvider.HaveSkeleton) {
      for (int i = 0; i < Connections.Length; i += 2) {
        var go = GameObject.CreatePrimitive(PrimitiveType.Capsule);
        go.name = "link" + (i / 2);
        go.transform.parent = transform;
        go.transform.localScale = Vector3.one * 0.005f;
        go.SetActive(false);
        links.Add(go);
      }
    }
  }

  ...
}

Display hand result

In the Update function, the positions and rotations of the keypoints and links are updated to match the latest detection result.

The main logic is shown below:

HandRenderer.cs
class HandRenderer : MonoBehaviour {

  // color look-up for different gestures
  private static Color32[] gesture_colors = new Color32[] {
    new Color32(0, 255, 0, 0), new Color32(255, 255, 255, 0), new Color32(0, 0, 255, 0),
    new Color32(0, 255, 255, 0), new Color32(255, 20, 147, 0), new Color32(255, 215, 0, 0),
  };

  // Draw left hand if true, right hand otherwise
  public bool isLeft = false;
  // Show gesture color on points (2D/3D mode) or links (skeleton mode)
  public bool showGestureColor = false;

  void Update () {
    // no need to update since result is same as last frame
    if (!GestureProvider.UpdatedInThisFrame)
      return;

    // hide points and links if no hand is detected
    var hand = isLeft ? GestureProvider.LeftHand : GestureProvider.RightHand;
    if (hand == null) {
      foreach (var p in points)
        p.SetActive(false);
      foreach (var l in links)
        l.SetActive(false);
      return;
    }

    // update points position
    for (int i = 0; i < points.Count; i++) {
      var go = points[i];
      go.transform.position = hand.points[i];
      go.SetActive(go.transform.position.IsValidGesturePoint());
      // update gesture color on points for non skeleton mode
      if (showGestureColor && !GestureProvider.HaveSkeleton)
        go.GetComponent<Renderer>().material.color = gesture_colors[(int)hand.gesture];
    }

    // update links position
    for (int i = 0; i < links.Count; i++) {
      var link = links[i];
      link.SetActive(false);

      int startIndex = Connections[i * 2];
      var pose1 = hand.points[startIndex];
      if (!pose1.IsValidGesturePoint())
        continue;

      var pose2 = hand.points[Connections[i * 2 + 1]];
      if (!pose2.IsValidGesturePoint())
        continue;

      // calculate link position and rotation based on points on both end
      link.SetActive(true);
      link.transform.position = (pose1 + pose2) / 2;
      var direction = pose2 - pose1;
      link.transform.rotation = Quaternion.FromToRotation(Vector3.forward, direction);
      link.transform.localScale = new Vector3(0.005f, 0.005f, direction.magnitude);

      // update gesture color on links for skeleton mode
      if (showGestureColor)
        link.GetComponent<Renderer>().material.color = gesture_colors[(int)hand.gesture];
    }
  }

  ...
}