Razer Hydra Input in Unity3D: Sixense Input control syntax


We’ve been doing some fairly extensive development with the Razer Hydras in anticipation of the forthcoming Sixense STEM, as well as a bevy of other 6DoF controllers (Perception Neuron, Sony Move, PrioVR, etc.). The Hydra input harness is somewhat convoluted and exists outside of, and parallel to, the standard Unity Input Manager.


I’ve found scant documentation for this on the interwebs, so here is the result of our reverse engineering efforts. If you want to code for Hydra input in your Unity experiences, here are the hooks:

First, we map primary axes and buttons as symbolic representations in the Unity Input Manager (i.e. P1-Horizontal, P1-Vertical, P1-Jump…); those handle basic keyboard, mouse, and standard gamepad input (Xbox, PlayStation). Then, inside our input handler code, we write custom routines to detect the Hydras, read their values, and substitute those values into the aforementioned symbolic variables.
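
Here is a minimal sketch of that substitution pattern. The class name, the choice of which Hydra fields map to which symbolic axes, and the Enabled check are our own conventions, not anything prescribed by the plugin; the SixenseInput calls are the ones documented below.

```csharp
using UnityEngine;

// Hypothetical input router illustrating the pattern described above:
// read the symbolic Input Manager axes by default, and substitute the
// Hydra's values whenever an active controller is detected.
public class PlayerInputRouter : MonoBehaviour
{
    public float Horizontal { get; private set; }
    public float Vertical   { get; private set; }
    public bool  Jump       { get; private set; }

    void Update()
    {
        // Default path: keyboard / mouse / standard gamepad via the Input Manager.
        Horizontal = Input.GetAxis("P1-Horizontal");
        Vertical   = Input.GetAxis("P1-Vertical");
        Jump       = Input.GetButtonDown("P1-Jump");

        // Hydra path: if controller 0 is present and active, sub in its values.
        SixenseInput.Controller hydra = SixenseInput.Controllers[0];
        if (hydra != null && hydra.Enabled)
        {
            Horizontal = hydra.JoystickX;
            Vertical   = hydra.JoystickY;
            Jump       = hydra.GetButtonDown(SixenseButtons.ONE);
        }
    }
}
```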

Our best recommendation is to install the Sixense plug-in from the Unity Asset Store, and to thoroughly examine the SixenseInputTest.cs that comes with it.

The basic streaming vars are:

• SixenseInput.Controllers[i].Position — Vector3 XYZ
• SixenseInput.Controllers[i].Rotation — Quaternion (X, Y, Z, W)
• SixenseInput.Controllers[i].JoystickX — analog float -1.0 to 1.0
• SixenseInput.Controllers[i].JoystickY — analog float -1.0 to 1.0
• SixenseInput.Controllers[i].Trigger — analog float 0.0 to 1.0
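
For instance, a bare-bones polling script could read those values every frame, assuming the plugin's SixenseInput prefab is already in the scene and that each Controller exposes an Enabled flag (as it does in SixenseInputTest.cs); the class name and logging are purely illustrative:

```csharp
using UnityEngine;

// Dump the streaming values from each active Hydra controller, once per frame.
public class HydraStreamReader : MonoBehaviour
{
    void Update()
    {
        // A single Hydra base station drives two wands; bump this if you have more.
        for (int i = 0; i < 2; i++)
        {
            SixenseInput.Controller c = SixenseInput.Controllers[i];
            if (c == null || !c.Enabled) continue;

            Vector3    pos    = c.Position;   // XYZ relative to the base station
            Quaternion rot    = c.Rotation;   // orientation as a quaternion
            float      stickX = c.JoystickX;  // -1.0 to 1.0
            float      stickY = c.JoystickY;  // -1.0 to 1.0
            float      trig   = c.Trigger;    //  0.0 to 1.0

            Debug.Log(string.Format("Hydra {0}: pos {1} rot {2} stick ({3:F2}, {4:F2}) trigger {5:F2}",
                                    i, pos, rot.eulerAngles, stickX, stickY, trig));
        }
    }
}
```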

Obtaining the button states is a bit more obscure; the call looks like this:

• SixenseInput.Controllers[i].GetButton(button)
where “button” is one of the values of the plugin’s SixenseButtons enum:
ONE, TWO, THREE, FOUR, START, BUMPER, JOYSTICK
representing which “switch” is closed on that cycle (see the example below).

It also appears that there are two simpler methods if you want to trap explicit button press and release events:

• SixenseInput.Controllers[i].GetButtonDown(buttonObjectName)
• SixenseInput.Controllers[i].GetButtonUp(buttonObjectName)
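
Putting the three calls together, a small test script might look like the following; the button identifiers live in the plugin's SixenseButtons enum, while the class name and the particular buttons polled here are arbitrary choices of ours:

```csharp
using UnityEngine;

// Poll the held state and the press / release edges of a couple of buttons
// on the first Hydra controller.
public class HydraButtonReader : MonoBehaviour
{
    void Update()
    {
        SixenseInput.Controller c = SixenseInput.Controllers[0];
        if (c == null || !c.Enabled) return;

        // GetButton: true on every frame the "switch" is closed.
        if (c.GetButton(SixenseButtons.BUMPER))
            Debug.Log("BUMPER held");

        // GetButtonDown / GetButtonUp: true only on the frame the state changes.
        if (c.GetButtonDown(SixenseButtons.ONE))
            Debug.Log("ONE pressed");
        if (c.GetButtonUp(SixenseButtons.ONE))
            Debug.Log("ONE released");
    }
}
```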

This sample script has a bevy of (non-optimized?) methods for reading the controllers’ output in real time, from which you can (in code) map all buttons, thumbsticks, and 6DoF XYZ + YPR data into your app. Hopefully the STEM API will be far more tightly integrated into the standard Unity Input Manager framework, and thus work seamlessly in parallel with standard controllers, without the need for custom code.

Have any tips on Hydra input for Unity?
Pop ’em into the comments below.

VR tech 411: 6DoF, XYZ + YPR, position + orientation in 3-space

I’ve spent so many cycles describing this gesturally to so many people that I’m considering getting it tattooed on my chest. To avert that, here is the diagram, liberally adapted, corrected, and upgraded from the Oculus Developer Guide:

We present to you, the standard coordinate 3-space system:

[Diagram: dSky / Oculus XYZ + YPR position and orientation]

POSITION is listed as a set of coordinates:

  • X is left / right
  • Y is up / down
  • Z is forward / back

ORIENTATION is represented as a quaternion* (more on that below). Simply:

  • Pitch is leaning forward / back (X axis rotation)
  • Yaw is rotating left / right (Y axis rotation / compass orientation)
  • Roll is spinning clockwise / counterclockwise (Z axis rotation)

There now, all clear. You’re welcome.
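
For the Unity-inclined, the same convention falls out of the engine's Quaternion.eulerAngles; this little readout script (ours, purely illustrative) prints pitch, yaw, and roll for whatever object it sits on:

```csharp
using UnityEngine;

// Print the pitch / yaw / roll of this object's orientation.
// In Unity's convention, eulerAngles.x is rotation about X (pitch),
// .y is rotation about Y (yaw), and .z is rotation about Z (roll).
public class OrientationReadout : MonoBehaviour
{
    void Update()
    {
        Vector3 ypr = transform.rotation.eulerAngles;
        Debug.Log(string.Format("pitch {0:F1}  yaw {1:F1}  roll {2:F1}", ypr.x, ypr.y, ypr.z));
    }
}
```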


 

Further clarifications:

* A quaternion is a very special (and generally non-human-readable) way of representing a 3-dimensional orientation as a 4-dimensional number (X, Y, Z, W), in order to avoid the strange behaviours (gimbal lock, among others) encountered when rotating 3D objects.
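
In Unity terms, that 4-dimensional number is the engine's Quaternion type; you almost never touch its raw components directly, but instead build one from human-readable angles and let the engine do the math. A trivial peek (class name ours), just to show the (X, Y, Z, W) guts:

```csharp
using UnityEngine;

// Build a quaternion from pitch / yaw / roll and peek at its raw components.
public class QuaternionPeek : MonoBehaviour
{
    void Start()
    {
        Quaternion q = Quaternion.Euler(30f, 90f, 0f); // pitch 30, yaw 90, roll 0
        Debug.Log(string.Format("x {0:F3}  y {1:F3}  z {2:F3}  w {3:F3}", q.x, q.y, q.z, q.w));
    }
}
```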

* 6DoF is an acronym for “six degrees of freedom”. It is generally used when talking about input devices which allow a user to control position and orientation simultaneously, such as head trackers, PlayStation Moves, Razer Hydras, Sixense STEMs, etc.