VR tech 411: 6DoF, XYZ + YPR, position + orientation in 3-space

I’ve spent so many cycles describing this gesturally to so many people, I’m considering getting this tattooed on my chest. To avert that, here is the diagram, liberally adapted, corrected, and upgraded from the Oculus Developer Guide:

We present to you the standard 3-space coordinate system:

[Diagram: dSky/Oculus XYZ + YPR position and orientation]

POSITION is listed as a set of coordinates:

  • X is left / right
  • Y is up / down
  • Z is forward / back

ORIENTATION is represented as a quaternion* (more on that later). Simply:

  • Pitch is leaning forward / back (X axis rotation)
  • Yaw is rotating left / right (Y axis rotation / compass orientation)
  • Roll is spinning clockwise / counterclockwise (Z axis rotation)

Now there, all clear. You’re welcome.
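The two lists above can be sketched in a few lines of code. This is a minimal Python illustration, not any SDK's actual pose format; the names, the +Z-forward convention (as in Unity), and the pitch sign are my assumptions:

```python
import math

# A minimal 6DoF pose sketch: position in metres on X/Y/Z,
# orientation as yaw/pitch/roll in radians.
pose = {
    "position": {"x": 0.0, "y": 1.7, "z": 0.0},  # X left/right, Y up/down, Z fwd/back
    "orientation": {"yaw": math.radians(90), "pitch": 0.0, "roll": 0.0},
}

def forward_vector(yaw, pitch):
    """Direction the head faces: yaw rotates about the Y axis, pitch about X."""
    return (
        math.sin(yaw) * math.cos(pitch),   # X component (left/right)
        -math.sin(pitch),                  # Y component (up/down)
        math.cos(yaw) * math.cos(pitch),   # Z component (forward/back)
    )

fx, fy, fz = forward_vector(pose["orientation"]["yaw"], pose["orientation"]["pitch"])
# A 90-degree yaw swings the +Z "forward" direction around onto the +X axis.
```

Roll is deliberately absent from `forward_vector`: spinning about the Z axis changes which way is "up" for your head, but not which way you are looking.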



Further clarifications:

* A quaternion is a very special (and generally non-human-readable) way of representing a 3-dimensional orientation as a 4-dimensional number (X, Y, Z, W), in order to avoid the strange behaviours (most famously, gimbal lock) encountered when rotating 3D objects.
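To make that footnote concrete, here is one common way to turn yaw/pitch/roll into an (X, Y, Z, W) quaternion. This is a hedged sketch: the function name and the yaw-then-pitch-then-roll composition order are my choices, not the Oculus SDK's routine:

```python
import math

def quat_from_ypr(yaw, pitch, roll):
    """Compose yaw (about Y), pitch (about X), and roll (about Z), all in
    radians, into a single quaternion (X, Y, Z, W), applied yaw-first."""
    cy, sy = math.cos(yaw / 2.0), math.sin(yaw / 2.0)
    cp, sp = math.cos(pitch / 2.0), math.sin(pitch / 2.0)
    cr, sr = math.cos(roll / 2.0), math.sin(roll / 2.0)
    return (
        cy * sp * cr + sy * cp * sr,   # X
        sy * cp * cr - cy * sp * sr,   # Y
        cy * cp * sr - sy * sp * cr,   # Z
        cy * cp * cr + sy * sp * sr,   # W
    )

# Zero rotation gives the identity quaternion (0, 0, 0, 1)...
qx, qy, qz, qw = quat_from_ypr(math.radians(90), 0.0, 0.0)
# ...while 90 degrees of pure yaw lands entirely in the Y and W components.
```

The W component is what makes the representation 4-dimensional: it carries the cosine of the half-angle, and the whole quaternion stays unit-length no matter how the three rotations combine, which is exactly why it sidesteps the degenerate cases Euler angles run into.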

* 6DoF is an acronym for “six degrees of freedom”: three degrees of position plus three of orientation. It is generally used in talking about input devices which allow a user to control position and orientation simultaneously, such as head trackers, PlayStation Moves, Razer Hydras, Sixense STEMs, etc.


we made it: your avatar awaits…

Well, it’s been a hard month of headbanging on the issues of inverse kinematics, first-person head cameras, and 1:1 perfect hand calibration.

And today, we made it:

[Screenshot: head and hands, finally]

As with many things, it took going back to square one.

Instead of trying and continually failing to integrate multiplayer full-avatar control + keyboard + mouse + gamepad + Oculus + Hydra all at once into our working game prototype, we created a barebones scene:

1 avatar, 1 wall, 1 light.

[Screenshot: simple avatar solve]

And went about tracking heads and hands, and making the avatar move along with the inputs. Finally, late tonight, everything came together, and we got it working. Thank the VR Gods. Now onto more important things. Like eating, drinking, sleeping, and… yes, that.

Controllers, Controllerism, Controrgasm?

[Image: virtual Xbox 360 controller, high-fidelity model]

My god. I haven’t installed this many device drivers and intercept layers in… well, in my entire life. VR exploration and 3D creation require whole new methods of interacting with a universe, and though the Apple Magic Mouse — essentially, a crazy-accurate optical mouse with a curved multi-touch surface fused to its carapace — is still my go-to interface after all these years, I find myself asking: what is the best interface for navigation and manipulation?

In the past month, I’ve configured my lovely MacBook Air LightWarrior(TM) to accept ALL the following input streams:

  • Leap Motion — “AirHands” pointing and gesture tracking
  • Xbox 360 wireless game controller — 8 buttons, 2 joysticks, 2 pressure-sensitive triggers, 1 D-pad… all in the palm of your hand
  • 3Dconnexion 3D mouse — an amazing piece of kit that should be standard for all 3D design rigs.
  • Unity xTUIO iPad multi-touch simulator — enables one to use a Bluetooth-connected iPad to send multi-touch signals in real time to the Unity application.

and, of course:

  • Oculus Rift Optical Head Tracking — explore the Multiverse simply by looking around, i.e. moving your torso, head, neck, and eyeballs. CAUTION: may cause a very bizarre form of RSI (repetitive strain injury), a.k.a. a VSN (Very Sore Neck)

Curious to know more about my explorations?

Follow Rift Adventures, updated daily!