First person VR lightsaber : the design intent

We’ve helped pioneer first person VR lightsaber control in-Rift with our ScenePlay demo app. This is what happens when you take that vector and extend it towards its logical conclusion: just add photorealistic rendering, VR cinema backplates, AI stormtroopers, laser bolts and explosions… voila.

Consider this advanced pre-viz of the experiences coming down the pipe in the next 3 years. Start practicing your swordplay skills, and enjoy.

What does a lightsaber look like, you might ask? Well, this:

Mark Zuckerberg tests out the new Oculus Touch hand controllers as Brendan Iribe observes

And this:

Testing out the Sony Move hand controllers paired with the Sony Morpheus VR HMD for the PlayStation 4

Or, if you prefer the dark side, go ahead, play Vader:

Switch Hitting : back to Mr. Potter

We’ve had great meetings with the studios in Hollywood, and put some post-demo polish on the Star Wars featurette. Now it may be time to revisit Harry Potter and his most awesome game of Wizard’s Chess in VR. Remember BattleChess? You ain’t seen nuthin yet.

Early early pre-viz:

Harry Potter Wizards Chess VR

scene recreation, v0.03

Harry Potter Wizards Chess

Wizard’s Chess, still from Harry Potter and the Sorcerer’s Stone. © Warner Brothers

Our first task is to do the basic “blocking” of the scene. For this we use simple capsules as stand-ins for the actors. Once all the gross movement is accounted for, we swap in the high-resolution humanoid models, and slowly add in actual gestures / movements / walks / hands / head animations, lip sync, and even eye gaze, aka drishti.

Harry Potter Wizards Chess

character stand-ins for basic scene blocking

Harry Potter Wizards Chess VR

Wizard’s Chess, from Harry Potter and the Sorcerer’s Stone. © Warner Brothers

Onward.

We don’t always use hand control at dSky, but when we do… we choose Hydra

Madame Hydra

Yes, Marvel did have videos long before the Avengers re-start. Oh, good ole G.I. Joe…

With Hydra, we get to use BOTH our hands in VR,
just like Madame Hydra here…

Keeping it simple… because the best part of the Hydras is… you move your hands, and your VR hands… well, they move precisely where they should. Buttons not included.

And here are the Hydras at play:

 

dynamic audio : wind in your hair

When we design spaces, we want our worlds to be *alive*.

A key component of this sense of vitality is dynamic audio. In a nutshell, dynamic audio means physics-based, player-generated audio signals. The two challenges we are working on are:

1. The sound of the lightsaber as the player swooshes it around in the environment. Buzzzzzzz…. hmmmmm… zap! Obviously, this is dynamic, based on the velocity and acceleration of the ‘blade’.


2. The sound of wind in the player’s hair as they fly high above the city… modulated by airspeed, gusts, and hopefully, near-miss objects.
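For the saber case, a first pass can be as simple as mapping the blade tip’s speed to the pitch and gain of a looping hum sample. Here’s a minimal sketch in Python; the scaling constants are made-up tuning values, not parameters of any real audio engine:

```python
import math

def saber_audio_params(tip_velocity, base_pitch=1.0, base_gain=0.4,
                       pitch_scale=0.05, gain_scale=0.1, max_gain=1.0):
    """Map the blade tip's velocity (m/s, as an (x, y, z) tuple) to a
    playback pitch and gain for a looping 'hum' sample.

    Faster swings raise both pitch and volume, which is roughly how the
    classic swoosh behaves. All constants are invented tuning values."""
    speed = math.sqrt(sum(c * c for c in tip_velocity))
    pitch = base_pitch + pitch_scale * speed
    gain = min(max_gain, base_gain + gain_scale * speed)
    return pitch, gain

# A stationary blade hums at the base pitch and volume...
print(saber_audio_params((0.0, 0.0, 0.0)))   # (1.0, 0.4)
# ...and a fast horizontal swing pitches up, with gain clamped at max.
print(saber_audio_params((8.0, 0.0, 0.0)))   # (1.4, 1.0)
```

In a real engine these two numbers would drive the audio source every frame, with the acceleration term layered on top for the “zap” transients.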


Programming Sound with Pure Data

Solution? Hard work, creative sample bases, and sweet code. I’ve found this awesome resource: Programming Sound with Pure Data, by Tony Hillerson.

 

Can’t wait!

Stay tuned…

 

PS – here’s a little extra on the actual components of the original lightsaber sounds, circa 1977: doppler via microphone swinging :)

 

we made it: your avatar awaits…

Well, it’s been a hard month of headbanging on the issues of inverse kinematics, first-person head cameras, and 1:1 perfect hand calibration.

And today, we made it:

head-and-hands-finally

As with many things, it took going back to square one.

Instead of trying and continually failing to integrate multi-player full avatar control + keyboard + mouse + gamepad + oculus + hydra all at once into our working game prototype, we created a barebones scene:

1 avatar, 1 wall, 1 light.

simple avatar solve

And went about tracking heads and hands, and making the avatar move along with the inputs. Finally, late tonight, everything came together, and we got it working. Thank the VR Gods. Now onto more important things. Like eating, drinking, sleeping, and… yes, that.
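In case it helps fellow travelers: once the integration noise is stripped away, the core of the head-and-hands solve is embarrassingly simple. A hedged Python sketch follows; the single positional offset per tracker is my simplification of the calibration step, and a real rig also needs rotations plus IK for the elbows and spine:

```python
class AvatarRig:
    """Minimal sketch of a 1:1 head-and-hands solve.

    Assumption: tracker space and avatar space share axes, so a
    per-tracker offset captured in one calibration pose is enough."""

    def __init__(self):
        self.offsets = {}

    def calibrate(self, tracked, avatar_pose):
        # Offset = where the avatar joint should be, minus where the
        # tracker says it is, captured once in a known pose (T-pose).
        for joint, pos in tracked.items():
            target = avatar_pose[joint]
            self.offsets[joint] = tuple(t - p for t, p in zip(target, pos))

    def solve(self, tracked):
        # Every frame: avatar joint = raw tracker position + offset.
        return {joint: tuple(p + o for p, o in zip(pos, self.offsets[joint]))
                for joint, pos in tracked.items()}

rig = AvatarRig()
rig.calibrate(
    tracked={"head": (0.0, 1.6, 0.1), "l_hand": (-0.4, 1.0, 0.3)},
    avatar_pose={"head": (0.0, 1.7, 0.0), "l_hand": (-0.4, 1.1, 0.2)},
)
print(rig.solve({"head": (0.2, 1.6, 0.1), "l_hand": (-0.3, 1.2, 0.4)}))
```

The barebones scene made this trivial to verify: with one avatar, one wall, and one light, any drift between your real hand and the virtual one is immediately obvious.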

mobile : the final destination

Witness : the future. a truly mobile VR picnic.

Way back in 2005, I was in the business of creating massive multiplayer augmented reality systems. My team created playspaces which could read up to 200 players simultaneously, using high-powered projectors to paint the space with light, complex multi-camera arrays to sense the people and their movements… and very highly tuned algorithms to transform those raw camera feeds into usable structured data, in real time, with near-zero latency. This was called markerless motion capture, or markerless mocap. It was before Microsoft Kinect, before time-of-flight, and was considered one of the holy grails of computer vision.

We were able to package all this equipment: CPU, GPU, projector, cameras… all into a single piece of unified hardware. Our first build weighed in at 110 lbs.

The Gizmo v1 was a massive 89 pound steampunk joy machine

I lugged that beast all around North America, paying steep fees to airlines and hurting my back all the while. Due to the physical stress, I demanded that we bring the weight in under 50 lbs, the airlines’ upper limit for checked baggage, and indeed, with some clever mechanical engineering, we were able to accomplish that goal.

Gizmo v3 weighed in at a svelte, travel-ready 49.9 lbs. including high-lumen projector, camera-array, power supply, CPU, GPU, and fans.

Nonetheless, 49.9 lbs was still a hell of a lot to haul around, especially given my 13th-story walk-up apartment on the Lower East Side of Manhattan, where I was based at the time. The 20th time I climbed those stairs, I swore to the gods above that never again would I haul heavy hardware around the planet.

That promise held true for many years. Until now. Now, in 2015, somehow we find ourselves again in need of high-powered GPUs, with the accompanying massive power supplies and cases. Thank the gods, I was able to engineer this thing to less than 20 lbs this time: the cameras are featherlight, and the projectors are replaced by goggles. Instead of projecting to an outer world, we are creating rich inner worlds. However, it’s still a massive amount of heavy iron.

Which brings us to the key event:
a seminal board meeting of my former company.

Matt, Suzanne, and I were sitting at the massive mahogany conference table, alongside all of our Board of Advisors: brilliant businessmen, financiers, and researchers. We presented our new ultralight 49 lb. unit, the PlayBox Mark IV. My father was in attendance; he played a key role in ushering in the modern era of VR, having launched the military’s SimNet initiative waaay back in 1980. He simply looked at the schematics, and said:

“You do realise, that all that hardware is going to sit inside a cellphone, inside of 5 years?”

At the time, I scoffed:

“A cellphone? That’s ridiculous! Do you realise the graphics supercomputing power we are harnessing to make this a real-time, responsive, computer vision AR system?”

But as the days, months, and years went on, I realised the wisdom of my father’s words. First came the pico-projectors, medium-lumen LED-powered HD projectors the size of a matchbox. Next came the low-powered, high-resolution stereoscopic camera arrays, these the size of a dime. And finally came nVidia’s Tegra line of GPUs, ultra-fast graphics supercomputers purpose-designed for smartphones and tablets.

Before I knew it, all the parts were in place.

Which brings us to the present moment.

Once again, we have built graphics supercomputers to ease our entry into real-time, high-performance VR. We tweak and optimize every component to maintain the 75fps floor required for genuine presence.
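That 75fps floor is really a per-frame time budget: everything (render, physics, audio, game logic) must fit inside roughly 13.3 milliseconds, every single frame. A toy Python illustration; the subsystem split is invented for the example:

```python
TARGET_FPS = 75  # the floor our Oculus Rift demands for genuine presence

# Total wall-clock time available per frame, in milliseconds.
frame_budget_ms = 1000.0 / TARGET_FPS
print(f"total frame budget: {frame_budget_ms:.2f} ms")  # 13.33 ms

# Invented, illustrative split of that budget across subsystems:
budget_ms = {"render": 9.0, "physics": 2.0, "audio": 1.0, "scripts": 1.0}
assert sum(budget_ms.values()) <= frame_budget_ms  # 13.0 ms used, ~0.33 spare
```

Miss the budget on even one subsystem and the compositor drops a frame, which in VR reads as judder and, soon after, nausea.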

the engine of our current VR-PC, the venerable Radeon 7990. 400 watts of energy draw, 4 teraflops of graphics supercomputing horsepower, and 75fps on our Oculus Rift.

And then, I got a Samsung Note4 and the GearVR peripheral, a hardware/software combo lovingly hand-architected by none other than John Carmack, designed to deliver high-performance VR in a truly mobile form factor.

Samsung GearVR : the harbinger of the final form factor of VR : light, wireless, fast, mobile.

The shocker? To date, my GearVR has outperformed all desktop solutions we’ve created.

Let me say that again:

A $900 battery-powered 6-ounce smartphone currently outperforms my $2500, 1-kilowatt, 21-pound desktop beast…

AND, the added element of freedom of physical movement is not even factored in here. The ability to bring your GearVR on a picnic in an Adidas sport bag, as opposed to bringing people into your studio and holding the cords out of their way… that alone justifies the Gear.

In short: as soon as possible, dSky will be focusing all efforts on mobile as our lead platform. No worries, Oculus, Sony, and HTC: our apps and experiences will still perform insanely wonderfully on your platforms. It’s just, as with the world:

dSky is Mobile First.

Unity 5 port complete

Well, the port to Unity 5 took a bit longer than expected. Then again, what port doesn’t? Overall, we’re very happy with the more robust namespace support in code, and the physically based shader model. It took quite some time to re-tool all our custom shaders into a PBR model, but once done, the results are spectacular, no pun intended.


R2D2 with the new PBR in Unity5. We’re loving that blue-alloy metal look!

And, we finally solved the mascara issue with all our character models, which we created in Mixamo’s excellent Fuse product. For those techies / artists out there: the trick is to duplicate the existing Legacy/Diffuse-Bump shader for each character, keep the textures and normals, and set the shader model to “Standard / Specular / Fade” with a smoothness of 1.0. Do the same with the eyes, and you’ll have that beautiful “twinkle in the eyes” that all pseudo-living avatars should properly exhibit.

Luke finally drops the mascara and gets real eyebrows — and a spark

In other news, our friends at Magic Leap released their first actual concept video. Just single-player for now, but fun stuff nonetheless.

That’s all for today.

Onward.

The lightsaber… that is all.


I realised many months ago that the make-or-break of our VR experience will be the fidelity and feel of the lightsaber. To this end, I’ve developed a wishlist of features:

  1. accurate saber model with metal and buttons
    1. — getting there.
    2. need a really good reflective metal shader
    3. need it to be all solid faces :)
  2. perfect glowing blade
    1. — COMPLETE with alpha textures AND point light source
  3. perfect control by hands
    1. — COMPLETE with Razer Hydra
  4. dynamic motion-driven sound FX
    1. halfway there
    2. need some velocity-based flange / distortion
    3. OpenAL or other: transform the tip velocity vector into dynamic sFX
  5. hands interaction
    1. switch saber power-on/off from keyboard control to Hydra control
  6. multiple on/off cycles
    1. once the lightsaber is off, the blade needs Prefab re-instantiation to be tied to the proper place on the base
  7. and finally… interaction with architecture
    1. calculate the intersection of the blade with solids
    2. leave decals on the wall where the point hits
    3. show a fire where it is actively intersecting
    4. throw sparks
Yeah. That should about do it.
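For item 7, the geometric core is a segment-plane intersection: find where the blade pierces the wall, and that hit point is where the decal, fire, and sparks spawn. A minimal Python sketch, with infinite planes standing in for walls (a real engine would use its physics raycasts instead):

```python
def blade_plane_hit(base, tip, plane_point, plane_normal):
    """Where does the blade segment (base -> tip) pierce a wall plane?

    Returns the 3D hit point, or None if the blade misses the plane
    (parallel to it, or too short to reach it)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))

    direction = sub(tip, base)
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:            # blade parallel to the wall
        return None
    t = dot(sub(plane_point, base), plane_normal) / denom
    if not 0.0 <= t <= 1.0:          # plane lies beyond the blade's extent
        return None
    return tuple(b + t * d for b, d in zip(base, direction))

# Blade from (0,1,0) to (0,1,2) poking through a wall at z = 1:
print(blade_plane_hit((0, 1, 0), (0, 1, 2), (0, 0, 1), (0, 0, 1)))
# -> (0.0, 1.0, 1.0)
```

Run that test every frame while the blade is lit, and items 7.2 through 7.4 become a matter of spawning effects at the returned point.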
