we made it: your avatar awaits…

Well, it's been a hard month of head-banging on the issues of inverse kinematics, first-person head cameras, and 1:1 hand calibration.

And today, we made it:

head-and-hands-finally

As with many things, it took going back to square one.

Instead of trying (and continually failing) to integrate multi-player full-avatar control + keyboard + mouse + gamepad + Oculus + Hydra all at once into our working game prototype, we created a barebones scene:

1 avatar, 1 wall, 1 light.

simple avatar solve

Then we went about tracking heads and hands, and making the avatar move along with those inputs. Finally, late tonight, everything came together and we got it working. Thank the VR Gods. Now on to more important things. Like eating, drinking, sleeping, and… yes, that.
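For the curious, the shape of the final solve is roughly the snippet below: a minimal Unity sketch that copies already-tracked head and hand poses (supplied elsewhere by the Oculus and Hydra plugins) onto the avatar's bones every frame. The class and field names are illustrative, not our actual code, and the arm IK that fills in elbows and shoulders is left out.

```csharp
using UnityEngine;

// Minimal sketch: copy tracked head/hand poses onto avatar bones each frame.
// The tracked transforms are assumed to be updated by the HMD/Hydra plugins.
public class AvatarPuppet : MonoBehaviour
{
    // Assigned in the Inspector (hypothetical names, not the real rig).
    public Transform trackedHead, trackedLeftHand, trackedRightHand;
    public Transform avatarHead, avatarLeftHand, avatarRightHand;

    void LateUpdate()
    {
        // Apply after animation so the tracked pose overrides any idle animation.
        Apply(trackedHead, avatarHead);
        Apply(trackedLeftHand, avatarLeftHand);
        Apply(trackedRightHand, avatarRightHand);
    }

    static void Apply(Transform source, Transform target)
    {
        if (source == null || target == null) return;
        target.position = source.position;
        target.rotation = source.rotation;
    }
}
```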

The lightsaber… that is all.

killer-best-lightsaber

I realised many months ago that the make-or-break of our VR experience would be the veracity and feel of the lightsaber. To this end, I've developed a wishlist of features:

  1. accurate saber model with metal and buttons
    1. — getting there.
    2. need a really good reflective metal shader
    3. need it to be all solid faces 🙂
  2. perfect glowing blade
    1. — COMPLETE with alpha textures AND point light source
  3. perfect control by hands
    1. — COMPLETE with Razer Hydra
  4. dynamic motion-driven sound FX (see the sound sketch after this list)
    1. halfway there
    2. need some velocity-based flange / distortion
    3. OpenAL or other: transform the blade-tip velocity vector into dynamic sound FX
  5. hands interaction
    1. switch saber power-on/off from keyboard control to Hydra control
  6. multiple on/off cycles (see the blade-respawn sketch after this list)
    1. once the lightsaber is off:
    2. the blade needs Prefab re-instantiation to be tied to its proper place on the base
    3. and finally…
  7. interaction with architecture (see the wall-contact sketch after this list)
    1. calculate the intersection of the blade with solids
    2. leave decals on the wall where the point hits
    3. show fire where the blade is actively intersecting
    4. throw sparks
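The sound sketch: a rough take on item 4 in Unity, with the OpenAL flange/distortion swapped for simple pitch and volume modulation of a looping AudioSource. `bladeTip` and `maxTipSpeed` are made-up placeholders, and a looping hum clip is assumed to be assigned in the Inspector.

```csharp
using UnityEngine;

// Sketch of item 4: drive a looping hum from the blade-tip velocity.
// In place of OpenAL flange/distortion, this just modulates pitch and volume.
[RequireComponent(typeof(AudioSource))]
public class SaberSwingAudio : MonoBehaviour
{
    public Transform bladeTip;        // hypothetical: empty object at the blade tip
    public float maxTipSpeed = 8f;    // m/s treated as a "full swing" (made-up value)

    AudioSource hum;
    Vector3 lastTipPosition;

    void Start()
    {
        hum = GetComponent<AudioSource>();
        hum.loop = true;
        hum.Play();
        lastTipPosition = bladeTip.position;
    }

    void Update()
    {
        if (Time.deltaTime <= 0f) return;

        // Estimate tip speed from frame-to-frame displacement.
        float tipSpeed = (bladeTip.position - lastTipPosition).magnitude / Time.deltaTime;
        lastTipPosition = bladeTip.position;

        // Map speed onto pitch and volume; faster swings sound higher and louder.
        float t = Mathf.Clamp01(tipSpeed / maxTipSpeed);
        hum.pitch = Mathf.Lerp(1f, 1.6f, t);
        hum.volume = Mathf.Lerp(0.4f, 1f, t);
    }
}
```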
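The blade-respawn sketch: a minimal take on item 6, re-instantiating the blade Prefab on each power-on so it snaps back to its proper spot on the base. `bladePrefab` and `emitterSocket` are hypothetical names, not the actual project assets.

```csharp
using UnityEngine;

// Sketch of item 6: respawn the blade Prefab on every power-on so it is
// parented back to the emitter socket at the top of the base.
public class SaberPowerCycle : MonoBehaviour
{
    public GameObject bladePrefab;   // the glowing blade Prefab
    public Transform emitterSocket;  // hypothetical: empty child at the top of the base

    GameObject currentBlade;

    public void PowerOn()
    {
        if (currentBlade != null) return;
        // Spawn the blade at the socket and parent it so it follows the saber.
        currentBlade = (GameObject)Instantiate(bladePrefab, emitterSocket.position, emitterSocket.rotation);
        currentBlade.transform.parent = emitterSocket;
    }

    public void PowerOff()
    {
        if (currentBlade == null) return;
        Destroy(currentBlade);
        currentBlade = null;
    }
}
```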
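And the wall-contact sketch: one plausible approach to item 7 is to raycast from the emitter toward the tip each frame, then drop a scorch decal and throw sparks where the ray hits a solid. The decal and spark assets here are stand-ins, and a real version would rate-limit the decal spawning.

```csharp
using UnityEngine;

// Sketch of item 7: find where the blade meets a wall by casting a ray from
// the emitter toward the tip, then mark and spark the contact point.
public class SaberWallContact : MonoBehaviour
{
    public Transform emitter;            // base of the blade
    public Transform bladeTip;           // tip of the blade
    public GameObject burnDecalPrefab;   // small quad with a scorch texture (stand-in)
    public ParticleSystem sparks;        // spark emitter, played while cutting

    void Update()
    {
        Vector3 bladeVector = bladeTip.position - emitter.position;
        RaycastHit hit;

        // Only count hits within the blade's length.
        if (Physics.Raycast(emitter.position, bladeVector.normalized, out hit, bladeVector.magnitude))
        {
            // Leave a scorch mark just off the surface, facing out from the wall
            // (flip the rotation if your decal mesh faces the other way).
            // In practice, rate-limit this instead of spawning every frame.
            Instantiate(burnDecalPrefab, hit.point + hit.normal * 0.01f,
                        Quaternion.LookRotation(-hit.normal));

            // Throw sparks from the contact point while intersecting.
            sparks.transform.position = hit.point;
            if (!sparks.isPlaying) sparks.Play();
        }
        else if (sparks.isPlaying)
        {
            sparks.Stop();
        }
    }
}
```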

Yeah. That should about do it.

lightsaber-cutting-wall