If it performs wonderfully on GearVR,
then it will be a dream on the Oculus Rift,
and will take little or no effort to port, other than the input harness.*
If it performs adequately on the Rift + PC,
it may or may not work well on GearVR.
In fact, it may take both a massive refactoring and a
total rethinking of graphics, models, textures, and code.
Be smart: Mobile First.
Develop for the Gear. Port to the Rift. WIN.
* A few words on input harnesses:
Designing input mechanisms for the GearVR touchpad is a tricky business… it’s a VERY limited input surface, and we tend to use both gestures AND a lot of “gaze detection” combined with taps…
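The gaze-plus-tap pattern above can be sketched in a few lines. This is a minimal, hypothetical illustration (the `GazeSelector` class and its names are mine, not from any Oculus SDK): each frame you record what the gaze ray hits, and a touchpad tap only activates the target after a short dwell, which filters out stray taps while the user is sweeping their gaze around.

```python
class GazeSelector:
    """Illustrative gaze + tap selection: track the object the gaze ray
    currently hits; a touchpad tap activates it after a short dwell."""

    def __init__(self, dwell_threshold=0.25):
        self.dwell_threshold = dwell_threshold  # seconds gaze must rest on target
        self.target = None
        self.dwell = 0.0

    def update(self, gazed_object, dt):
        # Call once per frame with whatever the forward gaze ray hit (or None).
        if gazed_object is self.target:
            self.dwell += dt
        else:
            self.target = gazed_object  # gaze moved: restart the dwell timer
            self.dwell = 0.0

    def on_tap(self):
        # Touchpad tap: only fires on a target the user has dwelt on,
        # so accidental taps mid-sweep do nothing.
        if self.target is not None and self.dwell >= self.dwell_threshold:
            return self.target
        return None
```

In practice the dwell threshold is a tuning knob: too short and mid-sweep taps misfire, too long and the interface feels sluggish.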
For the Rift, we often take the easy way out: keyboard input. At dSky, we are especially fond of the following keys, which are easy to “find” for “blind” VR users (users in a headset can’t see the keyboard):
space bar (easiest to find… big, central, and on an edge)
cursor arrows at lower right (second easiest to find blind)
ESC key (far upper left, also “feelable”)
Truth be told, we should ALL be designing for
a) gamepad, and
b) natural hand tracking devices,
c) with keyboard as a “fallback”
d) oh, and did I fail to mention the venerable mouse? Oops!
as the long-term goal for natural and intuitive VR input streams.
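That priority ordering can be made concrete with a tiny device-selection sketch. This is a hypothetical illustration (the device names and `pick_input_source` function are mine): prefer gamepad, then hand tracking, fall back to keyboard, and finally the mouse.

```python
# Preference order from the list above: gamepad first, hand tracking next,
# keyboard as the fallback, mouse as the last resort.
PRIORITY = ["gamepad", "hand_tracking", "keyboard", "mouse"]

def pick_input_source(connected):
    """Return the best available input source, given the set of
    currently connected devices, or None if nothing usable is attached."""
    for device in PRIORITY:
        if device in connected:
            return device
    return None
```

A real input harness would poll device connect/disconnect events each frame and re-run this selection, so the app gracefully degrades when, say, the gamepad battery dies mid-session.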
We’ve helped pioneer first person VR lightsaber control in-Rift with our ScenePlay demo app. This is what happens when you take that vector and extend it towards its logical conclusion: just add photorealistic rendering, VR cinema backplates, AI stormtroopers, laser bolts and explosions… voila.
Consider this an advanced pre-viz of the experiences coming down the pipe in the next 3 years. Start practicing your swordplay skills, and enjoy.
What does a lightsaber look like, you might ask? Well, this:
Mark Zuckerberg tests out the new Oculus touch hand controllers as Brendan Iribe observes
Testing out the Sony Move hand controllers & Sony Morpheus VR HMD for the PlayStation 4
Or, if you prefer the dark side, go ahead, play Vader:
Well, it’s been a hard month of headbanging on the issues of inverse kinematics, first-person head cameras, and 1:1 perfect hand calibration.
And today, we made it:
As with many things, it took going back to square one.
Instead of trying and continually failing to integrate multi-player full avatar control + keyboard + mouse + gamepad + Oculus + Hydra all at once into our working game prototype, we created a barebones scene:
1 avatar, 1 wall, 1 light.
And went about tracking heads and hands, and making the avatar move along with the inputs. Finally, late tonight, everything came together, and we got it working. Thank the VR Gods. Now onto more important things. Like eating, drinking, sleeping, and… yes, that.
So last night, for the first time, I got my calibrated hands into the VR world. Previously, I had only viewed them on a 2D screen, which was a sort of detached amusement… it felt like operating a robot through telemetry.
Seeing your properly calibrated hands appear in VR is a consciousness-expanding experience.
Last night, though, I got them in real 3D space, in VR, with perfect calibration… so that I perceived my hands to be precisely where they were in real space, in a virtual environment. I brought my hands right up to my face, and there they were… The effect was similar to my first dive into modern VR… utterly profound.
All night I dreamed about hands, and virtual hands, and virtual keyboards and virtual iPhones. Today, I place the hands into the worlds I’ve been building for the past year, and we see what we can do…