The Avatar Magna Carta : or, How to Puppeteer a 3D Humanoid with 6DoF Head and Hand Tracking

In this post, we present the workflow required to enable a player to live-puppeteer a rigged, first-person 3D avatar in-game, by:

  1. driving the avatar’s in-game hands in a 1:1 relationship with the player’s actual physical hands, and
  2. animating the avatar’s in-game head orientation to match the precise orientation of the player’s physical real-world head,
  3. via 3D trackers on the player’s head and hands, and the application of a simple inverse kinematics (IK) model.

We spent a long time figuring out this path, so I thought we’d share it with the community. Note: this is an entirely technical workflow-and-pipeline post for readers who are currently developing VR applications; it’s not for the general consumer. This tutorial is specifically crafted for a Unity3D pipeline, and it is specific to the Oculus Rift DK2 HMD and the Razer Hydra hand trackers, powered by Sixense. It should work with other tracking solutions, with modification.


Go ahead, wave hello… heads and hands fully tracked and puppeteered

First, the basic premise:

People want to relate to their own physical avatars in VR. They want to be able to look down at their feet, and see their body. They want to be able to wave their hands in front of their face, and see some representation of their appendages in front of them, superimposed on the virtual scene. In short, they want to feel like they are present in the experience, not just an ethereal viewer.

This problem proved a bit more difficult to solve in practice than one might imagine. So in the interest of fostering community, we are sharing our technical solution with everyone. It isn’t perfect yet; we’ve posted a number of tips and follow-on research topics at the end of the tutorial, places where this needs to go before it’s fully “ready for prime time.” However, the solution presented here IS functional, is leaps beyond the standard “avatar-less” VR being produced today, and should serve as a baseline from which improvements can and will be made.

Now, the actual tutorial:


  1. build and skin your Avatar Model
    1. we create humanoids in Mixamo Fuse
    2. …or use the modeling software of your choice : Maya, Blender, etc.
    3. make sure that your final model is in T-pose
  2. rig the character with bones
    1. this can be done by uploading a T-pose character to mixamo.com
    2. …or manually in your software
    3. use the maximum resolution possible : we use a 65-bone skeleton, which includes articulated fingers
  3. give the character at least one animation
    1. we will use an “Idle” state
    2. you assign this online in Mixamo
  4. get the Avatar into Unity
    1. export from Mixamo to Unity FBX format
    2. import the resulting FBX (approx. 5-20MB) into Unity Assets : gChar folder
    3. this will generate a prefab in Unity along with 4 components in a subfolder:
      1. a mesh
      2. a bone structure
      3. an animation loop file
      4. an avatar object
    4. the prefab will have a name such as “YourModelName@yourAnimName”
  5. Configure the Avatar
    1. click on the prefab in the Assets folder
    2. In the inspector, select the “Rig” tab
      1. make sure that “Humanoid” is selected in the “Animation Type” pull-down
      2. if you selected that manually, hit “Apply”
    3. drag the prefab from the Assets into the Hierarchy
    4. select the prefab avatar in the Hierarchy
      1. In the Inspector:
      2. add an “Animator” component; we will fill in the details later
      3. add the g-IKcontrol.cs C# script; again, we will fill in the details later (a minimal sketch of such a script appears after this tutorial)
      4. you can copy the source of the script from here
  6. Add the latest Oculus SDK (OVR) to the project. 
    1. Download the latest Oculus SDK for Unity
    2. this is usually done by double-clicking “OculusUnityIntegration.unitypackage” in the OS, then accepting the import into your project by clicking “import all”
    3. You should now have a folder within Assets called “OVR”
  7. Add the latest Sixense Hydra / STEM SDK to the project
    1. Download the Hydra plug-in from the Unity Asset Store
    2. Import it into your project.
    3. You should now have a folder within Assets called “SixenseInput”
  8. Create a high level Empty in your hierarchy and name it “PLAYER-ONE”
    1. make your Avatar prefab a child of this parent
    2. Drag the OVRCameraRig prefab from the OVR folder and also make it a child of PLAYER-ONE
  9. Properly position the Oculus camera in-scene
    1. The Oculus camera array should be placed just forward of the avatar’s eyes
    2. we typically reduce the near (forward) clipping plane to around 0.15m
    3. If you’re using the OVRPlayerController, these Character Controller settings work well:
      1. Center Y = -0.84m (standing),
      2. Center Z = -0.1 (prevents from being “inside head”)
      3. Radius = 0.23m
      4. Height = 2m
    4. This will require some trial and error. Make sure that you use the Oculus camera, and not the Oculus Player Controller. Experimentation will be required to bridge the spatial relationship between a seated player and a standing avatar; calibration software needs to be written. Trial and error here means a series of very fast cycles: build, test, make notes, modify, re-build, re-test, make notes, repeat until perfect. There are many gyrations, and you will become an expert at rapidly donning and removing the HMD, headphones, and hand controllers.
  10. Create the IK target for head orientation
    1. Right-click on the CenterEyeAnchor in the Hierarchy, and select “Create 3D Object > Cube”
    2. Name the cube “dristi-target”
    3. move the cube approx 18” (0.5m) directly outward from the avatar’s third eye
    4. This will serve as the IK point that the avatar’s head is “aimed” at, i.e. where they are looking. In yoga, the direction of the gaze is called dristi.
  11. Get the Sixense code into your scene
    1. Open the SixenseDemoScene
    2. copy the HandsController and SixenseInput assets from the Hierarchy
    3. Re-open your scene
    4. paste the HandsController and SixenseInput assets into your Hierarchy
    5. drag both to make them children of OVRCameraRig
  12. Make sure the Sixense hands are correctly wired.
    1. Each hand should have the “SixenseHandAnimator” controller assigned to it
    2. Root Motion should be UNchecked
    3. Each hand should have the SixenseHand Script attached to it
    4. On the pull-down menu for the SixenseHand script, the proper hand should be selected (Left/Right)
  13. Properly position the Sixense hands in-scene
    1. They should be at about the Y-pos height of the belly button
    2. The wrists should be about 12” or 30cm in Z-pos forward of the abdomen
    3. In other words, they should be positioned as if you are sitting with your elbows glued to your sides, forearms extended parallel to the ground.
    4. You will want to adjust, tweak, and perfect this positioning. There is an intrinsic relationship between where you position the hands in the scene and the Sixense trackers’ position in the real world relative to the Oculus camera. Trial and error and clever calibration software solve this. That’s another tutorial.
  14. Make the Sixense hands invisible. 
    1. we do this because they will merely serve as IK targets for the avatar’s hands
    2. do this by drilling down into HandsController : Hand_Right : Hand_MDL and unchecking the “Skinned Mesh Renderer” in the Inspector panel
    3. do the same with the left hand.
    4. this leaves the assets available as IK targets, but removes their rendered polys from the game (a small runtime alternative is sketched after this tutorial)
  15. Create the Animator Controller
    1. create transitions from Entry to New State, and
    2. from New State to Idle (or what you named your created animation)
    3. On the Base Layer, click the gear, and make sure that “IK Pass” is checked.
    4. this is what makes Unity pass the IK data from the Animator Controller down to your IK script (via its OnAnimatorIK callback) each frame
  16. Assign the new Animation Controller to the Avatar
    1. select the avatar in the Hierarchy
    2. assign the controller in the inspector
  17. Map the Avatar with puppet IK targets for hands and head
    1. drag the “Hand – Left” object from the Sixense HandsController parent to the “Left Hand Obj” variable on the g-IKcontrol script
    2. drag the “Hand – Right” object from the Sixense HandsController parent to the “Right Hand Obj” variable
    3. drag the Look-At-Target (the “dristi-target” cube from step 10) from the OVRCameraRig to the “Look Obj” variable
      1. The Look-At-Target is nested deep:
      2. PLAYER-ONE : OVRCameraRig : TrackingSpace : CenterEyeAnchor : Look-at-Target
  18. THAT’S IT!
    1. Build a compiled runtime.
    2. Connect your Rift and Hydra
    3. launch the game
    4. activate the Hydras.
      1. grasp the left controller, aim it at the base, and squeeze the trigger.
      2. grasp the right controller, aim it at the base, and squeeze the trigger.
      3. hit the “Start” button, just south of the joystick on the right controller
    5. When you tilt and rotate your head, the avatar’s head should also tilt and roll. When you move your hands, the avatar’s hands should move in a 1:1 ratio in-scene. Congratulations, you’re an Avatar Master.
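
For reference, here is a minimal sketch of what an IK control script like the g-IKcontrol.cs mentioned in step 5 might contain, built on Unity’s standard Mecanim IK callbacks. The class and field names are illustrative assumptions (they simply mirror the “Left Hand Obj”, “Right Hand Obj”, and “Look Obj” labels from step 17), and your own script may differ; it only works if “IK Pass” is checked as described in step 15.

```csharp
using UnityEngine;

// Illustrative sketch of an IK puppeteering script (names are assumptions, not the
// exact contents of g-IKcontrol.cs). Attach to the avatar, next to its Animator.
[RequireComponent(typeof(Animator))]
public class IKControl : MonoBehaviour
{
    public bool ikActive = true;
    public Transform leftHandObj;   // "Hand - Left" under the Sixense HandsController
    public Transform rightHandObj;  // "Hand - Right" under the Sixense HandsController
    public Transform lookObj;       // the look target under CenterEyeAnchor (step 10)

    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Unity calls this once per frame for each layer with "IK Pass" checked (step 15).
    void OnAnimatorIK(int layerIndex)
    {
        if (!ikActive)
        {
            // Release the IK goals so the underlying idle animation takes over again.
            animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 0f);
            animator.SetIKRotationWeight(AvatarIKGoal.LeftHand, 0f);
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 0f);
            animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 0f);
            animator.SetLookAtWeight(0f);
            return;
        }

        // Aim the head at the look target, i.e. the dristi.
        if (lookObj != null)
        {
            animator.SetLookAtWeight(1f);
            animator.SetLookAtPosition(lookObj.position);
        }

        // Pin each avatar hand to its (invisible) Sixense hand, 1:1.
        if (leftHandObj != null)
        {
            animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 1f);
            animator.SetIKRotationWeight(AvatarIKGoal.LeftHand, 1f);
            animator.SetIKPosition(AvatarIKGoal.LeftHand, leftHandObj.position);
            animator.SetIKRotation(AvatarIKGoal.LeftHand, leftHandObj.rotation);
        }
        if (rightHandObj != null)
        {
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
            animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1f);
            animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandObj.position);
            animator.SetIKRotation(AvatarIKGoal.RightHand, rightHandObj.rotation);
        }
    }
}
```

Because this relies on Unity’s built-in IK solver, the elbows are only approximated (see tip 6 below).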
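
Relatedly, if you would rather not uncheck the Skinned Mesh Renderers by hand every time you re-import (step 14), here is a small optional alternative that hides the Sixense hand meshes at runtime. The component name is ours, and it assumes the script is attached to the HandsController object.

```csharp
using UnityEngine;

// Optional helper (our own addition, not part of the Sixense package): hide the
// Sixense hand meshes at startup while keeping their transforms alive as IK targets.
public class HideSixenseHandMeshes : MonoBehaviour
{
    void Start()
    {
        // Disable every skinned mesh under this object (e.g. Hand_MDL under each hand),
        // which removes the rendered polys without removing the IK target transforms.
        foreach (SkinnedMeshRenderer smr in GetComponentsInChildren<SkinnedMeshRenderer>())
        {
            smr.enabled = false;
        }
    }
}
```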
 

TIPS
and areas for further R&D
 
  1. Ideally, the avatar’s head should not be rendered for the player, yet it should still cast shadows and appear in reflections (one possible approach is sketched after these tips)
  2. the avatar’s head should, of course, still be rendered for other players in multi-player scenarios, as well as for third-person observer cameras.
  3. An in-game shadow is a great way to ground the player to the avatar in a convincing manner. Even when the hands are outside the field of view, seeing the shadows of the head and hands triggers a very powerful sense of presence.
  4. While head rotation and orientation on the skull axis is fairly straightforward, head translation, i.e. significant leaning in or out or to the side, is a bit more abstract in terms of puppeteering choices. You may wish to explore either locomotion animations, or “from the hip” IK solutions to move the torso along with the head.
  5. RL : VR / 1:1 Spatial Calibration is KEY to great experiences.
    1. See 9.4, above : Properly position the Oculus camera
    2. and 13.4 : Properly position the Sixense hands
  6. The built-in Unity IK leaves a lot to be desired when it comes to realistic approximations of elbow positions. We are investigating the FinalIK package and other professional-class solutions.
  7. This solution in its current form disables the ultra-cool “grasping” and “pointing” animations that are built into the Sixense template models. Investigate how to re-enable those animations on the rigged avatar’s Mecanim structure.
  8. You will also want to configure the remainder of the Hydra joysticks and buttons to control all game functions, because it sucks to be reaching and fumbling for a keyboard when you are fully immersed in a VR world.
  9. The majority of this configuration starts in the Unity Input Manager
    1. Edit | Project Settings | Input…
  10. There should be keyboard fallback controls for users who do not own Hydras (see the sketch below)…
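
For tip 1, one possible approach (in Unity 5 and later) is to mark the local player’s head renderer as shadows-only. The sketch below assumes the head is a separate renderer you can assign in the Inspector (headRenderer and isLocalPlayer are placeholder names we invented); it only solves the shadow half of the tip, so reflections and the multi-player case from tip 2 still need their own handling, e.g. per-camera layer culling or a second head mesh.

```csharp
using UnityEngine;
using UnityEngine.Rendering;   // ShadowCastingMode (Unity 5+)

// Sketch for tip 1: skip rendering the local player's head, but keep its shadow.
public class HideHeadForLocalPlayer : MonoBehaviour
{
    public Renderer headRenderer;      // assign the avatar's head mesh in the Inspector
    public bool isLocalPlayer = true;  // only hide the head of the locally-controlled avatar

    void Start()
    {
        if (isLocalPlayer && headRenderer != null)
        {
            // The head is skipped during normal rendering but still casts shadows,
            // preserving the grounded full-body shadow described in tip 3.
            headRenderer.shadowCastingMode = ShadowCastingMode.ShadowsOnly;
        }
    }
}
```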
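
And for tips 8 through 10, here is a sketch of reading movement input from the Hydra with a keyboard fallback through the Input Manager. The SixenseInput members used (GetController, Enabled, JoystickX, JoystickY) reflect the plugin version we imported; check them against SixenseInput.cs in your own project. The speed value is arbitrary, and “Horizontal”/“Vertical” are just the default Input Manager axes.

```csharp
using UnityEngine;

// Sketch: prefer the left Hydra joystick, fall back to keyboard axes when no Hydra is active.
public class MoveInputWithFallback : MonoBehaviour
{
    public float speed = 1.5f;   // movement speed in m/s (placeholder value)

    void Update()
    {
        float x, z;

        SixenseInput.Controller left = SixenseInput.GetController(SixenseHands.LEFT);
        if (left != null && left.Enabled)
        {
            // Hydra present and activated: use its analog joystick.
            x = left.JoystickX;
            z = left.JoystickY;
        }
        else
        {
            // Keyboard fallback via the Input Manager (Edit | Project Settings | Input).
            x = Input.GetAxis("Horizontal");
            z = Input.GetAxis("Vertical");
        }

        // Simple local-space translation; in a real project this would feed your
        // character controller instead.
        transform.Translate(new Vector3(x, 0f, z) * speed * Time.deltaTime);
    }
}
```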
Have you tackled this same challenge? Have you created any solutions to the further investigations we’ve proposed above?
Share in the comments below,
because we’re all in this together! 

 

Avatar : Arrival

Finally, our project has gotten to Phase Zero.

The avatar has arrived.
Witness: 

VR feet - virtual world

My very first look at my feet in the VR world… perfection.

avatar-foreal

comparing to my feet in RL — Real Life… pretty damn good. Same feet, same place. VR goggles on == VR goggles off. Perfect Calibration and Registration. Achievement : Unlocked.

v1 avatar tet

And then, suddenly… the whole Avatar is Manifest. Head, spine, hands… all are 1:1 with RL… and SL is born.


Go ahead, wave hello to the adoring fans…

we made it: your avatar awaits…

Well, it’s been a hard month of headbanging on the issue of inverse kinematics, first-person head cameras, and 1:1 perfect hand calibration.

And today, we made it:

head-and-hands-finally

As with many things, it took going back to square one.

Instead of trying and continually failing to integrate multi-player full avatar control + keyboard + mouse + gamepad + Oculus + Hydra all at once into our working game prototype, we created a barebones scene:

1 avatar, 1 wall, 1 light.

simple avatar solve

And went about tracking heads and hands, and making the avatar move along with the inputs. Finally, late tonight, everything came together, and we got it working. Thank the VR Gods. Now onto more important things. Like eating, drinking, sleeping, and… yes, that.