Cinema Meets Videogames: Strange Bedfellows, or Match Made in Heaven?

We’re fresh back from the epic Digital Hollywood conference / rolling party in LA, and chock-full of ideas about how to integrate classic cinema techniques with our native videogame tropes.

See, 80+% of the participants at the conference were from traditional media backgrounds: music, TV, film. And while VR was absolutely the hot topic of the show (as it was at CES, GDC, and NAB), there was as much confusion as there was excitement about the commercial and artistic promise of this brand-spanking-new medium.

One of our key takeaways was a genuine need to integrate cinema techniques (linearity, composition, color, and storytelling) into our hyper-interactive realm of videogame design. Thus began our investigations. What exactly does it take to make full HD, 3D, 360° captures of real-world environments?

We’ll get into more detail later, but for now I want to spell it out, if only for the technical humor of it. It takes a:

  • 12-camera
  • 4k
  • 360°
  • 3D stereoscopic
  • live capture stream
  • …stitched to form an:
  • equirectangular projection
  • over / under
  • 1440p
  • panoramic video

Say that 12 times fast. Oh, and be ready to handle the roughly 200GB/minute data stream that these rigs generate. Thank god for GPUs.
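
To put that figure in perspective, here’s a quick back-of-the-envelope conversion. This is a minimal Python sketch; the 200GB/minute number is the one quoted above, and the 30-minute shoot length is purely an illustrative assumption.

```python
# Back-of-the-envelope data logistics for the rig described above.
GB_PER_MINUTE = 200       # aggregate capture stream quoted above (~200GB/minute)
SHOOT_MINUTES = 30        # hypothetical shoot length, purely illustrative

throughput_gb_per_s = GB_PER_MINUTE / 60
storage_tb = GB_PER_MINUTE * SHOOT_MINUTES / 1000

print(f"Sustained write speed needed: {throughput_gb_per_s:.2f} GB/s")     # ~3.33 GB/s
print(f"Storage for a {SHOOT_MINUTES}-minute shoot: {storage_tb:.1f} TB")  # 6.0 TB
```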

What does that look like in practice?
Like this:

A still frame from a 360 stereoscopic over/under video. Playback software feeds a warped portion of each image to each of the viewer’s eyes.
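
For those wondering what that warp actually involves: below is a minimal Python sketch (not any particular player’s code) of mapping a view direction to a texture coordinate in an over/under equirectangular frame. The axis convention and which eye occupies the top half are assumptions made for illustration.

```python
import math

def direction_to_equirect_uv(x, y, z):
    """Map a unit view direction to (u, v) in a single equirectangular image.
    Axis convention (an assumption): +x right, +y up, +z forward."""
    lon = math.atan2(x, z)                    # longitude, -pi..pi around the vertical axis
    lat = math.asin(max(-1.0, min(1.0, y)))   # latitude, -pi/2..pi/2
    u = 0.5 + lon / (2 * math.pi)             # 0..1, left to right
    v = 0.5 - lat / math.pi                   # 0..1, top to bottom
    return u, v

def over_under_uv(x, y, z, eye="left"):
    """Squeeze the per-eye coordinate into the top half (left eye, assumed)
    or bottom half (right eye) of the over/under frame."""
    u, v = direction_to_equirect_uv(x, y, z)
    return u, v * 0.5 + (0.0 if eye == "left" else 0.5)

# Looking straight ahead, the left eye samples the middle of the top half:
print(over_under_uv(0.0, 0.0, 1.0, "left"))   # (0.5, 0.25)
```

A real player does this per pixel on the GPU, going the other direction (screen pixel to view ray to texture coordinate), but the geometry is the same.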

And how do you capture it? With something like this:

12-camera GoPro 360Hero stereo 360 rig

Or, if you’re really high-budget, this:

NextVR’s Red Dragon 6K 360 3D stereoscopic capture rig: an array of 10 RED digital cinema cameras (photo not showing the top and bottom camera pairs)

Though personally, we really prefer the sci-fi aesthetic:

An early 3D 360 capture prototype by JauntVR

Then there’s the original 360 aerial drone capture device, circa 1980

Empire Viper probe droid, The Empire Strikes Back, c. 1980, Lucasfilm

Then there’s the ever-so-slightly more sinister (and more agile) version, circa 1999…

Sentinel Drone, The Matrix, 1999, via the Wachowski brothers

What do you think? Is the realism of live capture worth the trouble? Would you prefer “passive” VR experiences that transport you to hard-to-reach real-world places and events, “interactive” experiences more akin to Xbox and PlayStation games, or some combination of the two?

Join the conversation below:

From here forward… Optimization

We recently migrated primary build development from my trusty MacBook Air to our high-performance Windows PC rig. With the build churning along at around 15fps on the MBA, I figured we’d easily be hitting the 75fps required for presence on the PC. Lo and behold, that was simply not to be the case… yet.

Shockingly, the frame rate on the PC is about the same, and the stutters are even worse. This makes me angry. That PC has what was, a year or so ago, the best graphics card that money could buy: a monster Sapphire Radeon R9 (HD 7900-series) with 3GB of fast GDDR5 graphics RAM, pulling 400W of juice. How could it be performing on par with my humble 2012 MacBook Air and its Intel HD 4000 graphics on a tiny chip soldered to the motherboard?

Time to run some benchmarks. I used the beautiful Heaven and Valley benchmarks, both free from Unigine (not to be confused with Unreal and/or Unity, ha!). My PC rig scored a respectable(?) 901 on Heaven, with an average of 36fps, a max of 66, and a low of 9. The same settings on the Mac drove a humbling score of 87: average 3.4fps, max 5.3, low 2.4. That implies a theoretical ~10x performance advantage for the PC… if we were only talking about pure graphics. Good… at least the benchmarks showed as much. Conclusion: the raw horsepower is there.
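
A quick sanity check on that ~10x figure, using the Heaven numbers quoted above (a trivial sketch, nothing more):

```python
# Heaven results quoted above: score, average fps, minimum fps, maximum fps.
pc  = {"score": 901, "avg": 36.0, "min": 9.0, "max": 66.0}
mac = {"score": 87,  "avg": 3.4,  "min": 2.4, "max": 5.3}

for key in ("score", "avg", "min", "max"):
    print(f"{key}: PC is {pc[key] / mac[key]:.1f}x the MacBook Air")
# score: 10.4x, avg: 10.6x, min: 3.8x, max: 12.5x
```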

[Screenshot: Unigine Heaven benchmark on the MacBook Air, 1680x1050, 4x AA, score 87]

Now I just have to do the hard work of getting Unity, Oculus, and Sixense all playing nicely together, and then optimize, optimize, optimize until we arrive at that fabled 75fps required for solid presence.
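
To put the 75fps target in per-frame terms, here’s a back-of-the-envelope sketch; the 36fps figure is the Heaven average quoted above, used only as a rough proxy for the app’s frame time.

```python
TARGET_FPS = 75.0    # presence target mentioned above
CURRENT_FPS = 36.0   # Heaven average on the PC, a rough proxy only

budget_ms = 1000.0 / TARGET_FPS     # ~13.3 ms to do *everything* each frame
current_ms = 1000.0 / CURRENT_FPS   # ~27.8 ms per frame right now

print(f"Frame budget at {TARGET_FPS:.0f}fps: {budget_ms:.1f} ms")
print(f"Current frame time: {current_ms:.1f} ms")
print(f"Needed speedup: {current_ms / budget_ms:.1f}x")   # ~2.1x
```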

May the force be with me.


UPDATE: BRAINS!

It appears that my CPU was totally the bottleneck. Since the machine was repurposed from its past life as a cryptocurrency miner, we invested heavily in the GPU and scantily in the CPU, currently an AMD Sempron 145 @ 2.6GHz. After sorting through the so-often-misinformed subreddits on this matter…

OctopusRift has a most excellent resource to build the ultimate VR PC:
http://www.octopusrift.com/building-a-vr-pc/

Tom’s Hardware also appears to be a great resource, specifically:
  • http://www.tomshardware.com/charts/
  • Tom’s Gaming CPU Hierarchy Chart, March 2015


UPDATE 2: UPGRADES

We’ve gone ahead and purchased a new brain for Anakin, moving up from the li’l Sempron. Here are the new specs (and yes, 8GB of RAM comes next):

CPU : AMD FX-8350 8-Core Black
GPU : Sapphire Radeon R9 280X Dual-X 3GB GDDR5
SSD : 500GB Samsung 850 EVO
RAM : 4GB Kingston (2GB x2)
MBO : ASRock 970 Extreme4 (full ATX, socket AM3+)
PSU : 1200w Cooler Master Silent Pro Gold BEAST
Stay tuned for the new benchmark results. 

Controllers, Controllerism, Controrgasm?

Virtual Xbox 360 controller, high-fidelity model

My god. I haven’t installed this many device drivers and intercept layers in… well, in my entire life. VR exploration and 3D creation require whole new methods of interacting with a universe, and though the Apple Magic Mouse (essentially a crazy-accurate optical mouse with a curved multi-touch surface fused to its carapace) is still my go-to interface after all these years, I find myself asking: what is the best interface for navigation and manipulation?

In the past month, I’ve configured my lovely MacBook Air LightWarrior(TM) to accept ALL the following input streams:

  • LeapMotion — “AirHands” pointing and gesture tracking
  • Xbox 360 wireless game controller — 8 buttons, 2 joysticks, 2 pressure-sensitive triggers, 1 D-pad… all in the palm of your hand
  • 3Dconnexion 3D mouse — an amazing piece of kit that should be standard for all 3D design rigs.
  • Unity xTUIO iPad multi-touch simulator — enables one to use a Bluetooth-connected iPad to send multi-touch signals in real time to the Unity application (see the sketch after this list for what that stream looks like on the wire).

and, of course:

  • Oculus Rift Optical Head Tracking — explore the Multiverse simply by looking around, i.e. moving your torso, head, neck, and eyeballs. CAUTION: may cause a very bizarre form of RSI (repetitive strain injury), a.k.a. a VSN (Very Sore Neck)
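
As promised above, here is roughly what that iPad-to-Unity touch stream looks like from the listening side. TUIO rides on OSC over UDP, and this minimal Python sketch just prints incoming cursor messages. It uses the python-osc package rather than anything Unity-specific, and it assumes the xTUIO app sticks to the TUIO defaults (UDP port 3333, the /tuio/2Dcur profile).

```python
# pip install python-osc
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_tuio_cursor(address, *args):
    """Print raw TUIO 2D cursor messages (alive / set / fseq)."""
    if args and args[0] == "set":
        # "set" messages carry: session id, x, y, x-velocity, y-velocity, acceleration
        session_id, x, y = args[1], args[2], args[3]
        print(f"touch {session_id}: x={x:.3f} y={y:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dcur", on_tuio_cursor)   # default TUIO 2D cursor profile

# 3333 is the conventional TUIO port; listen on all interfaces.
server = BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher)
print("Listening for TUIO on UDP 3333…")
server.serve_forever()
```

Pointing a listener like this at the iPad is a quick way to confirm that touches are actually leaving the device before debugging anything on the Unity side.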

Curious to know more about my explorations?

Follow Rift Adventures, updated daily!