Arrival of a Train… and a new medium is born

Ever since watching Saschka Unseld’s talk at Oculus Connect last year, which prominently featured “Arrival of a Train”, one of the very first silent films, we’ve been noodling on a little side project: a VR capture of a train locomotive in action.

After many trials and errors, much reverse-engineering of train schedules, and no small amount of inclement weather, we’ve finally captured a nice little rough-draft piece of footage. We’ve posted the 360 footage, captured with our Ricoh Theta, to YouTube for your enjoyment.

Here are a few stills. Scroll down for the complete video. Be sure to a) fullscreen it using the little [__] icon in the lower right of the frame, and b) click and drag around with your mouse (or look around if you’re in goggles).

Oh, and we’ve included the original “Arrival of a Train” for your viewing reference. The reason it’s so famous? Reportedly, crowds fled the theatres waaaaay back in the day, convinced that the 2D image on the screen was in fact a train headed directly for them. That story is often used to communicate how audiences viscerally experience a brand new medium: half is the content, half is the novelty of the experience.

gTrain-inbound-01

gTrain-midtrain-02

gTrain-freight-03

I was quite concerned that the train would knock over the tripod; it passed within inches of the cars. Thankfully, the rig survived, and we even got a friendly wave from the engineers.

And, finally, the original to which we pay homage:
Arrival of a Train at La Ciotat (The Lumière Brothers, 1895)

Capturing Virtual Worlds to Virtual Cinema : How To

We’ve just read, once, twice, three times, this most excellent tutorial / thought piece by D Coetzee, entitled “Capturing Virtual Worlds: A Method for Taking 360 Virtual Photos and Videos”.

The article gets into the dirty details of how we might transform a high-powered, highly interactive VR experience into a compact* file sharable with all our friends on a multitude of platforms (Cardboard, MergeVR, Oculus, WebVR, etc.).

Having spent a great deal of time figuring out these strategies ourselves, it’s good to see someone articulate the challenge, the process, and future development paths so well.

360 3d panorama thin strip stitching

Most accessible present-day solution: a script that renders thin vertical strips with a rotating stereo camera array, then stitches them into the final panorama.
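
To make the thin-strip idea concrete, here’s a toy sketch of the rendering loop (our own illustration, not the article’s actual script): spin a virtual stereo camera pair around the vertical axis, render a narrow vertical strip at each step, and concatenate the strips into a full panorama per eye. `render_strip` is a hypothetical stand-in for whatever narrow-FOV render call your engine provides.

```python
import numpy as np

H, STRIP_W, N_STRIPS = 2048, 8, 512   # 512 strips x 8 px = a 4096-px-wide panorama
IPD = 0.064                           # ~64 mm interpupillary distance, in meters

def render_strip(yaw_deg, eye_offset):
    # Hypothetical stand-in for the engine's narrow-FOV render call. It should
    # return an (H, STRIP_W, 3) image from a camera rotated to yaw_deg and
    # nudged sideways by eye_offset meters; dummy pixels here.
    return np.zeros((H, STRIP_W, 3), dtype=np.uint8)

def stitch_panorama(eye_offset):
    strips = []
    for i in range(N_STRIPS):
        yaw = 360.0 * i / N_STRIPS            # rotate the virtual rig one step
        strips.append(render_strip(yaw, eye_offset))
    return np.concatenate(strips, axis=1)     # side by side -> full 360 panorama

left  = stitch_panorama(-IPD / 2)             # camera nudged left of center
right = stitch_panorama(+IPD / 2)             # camera nudged right of center
over_under = np.concatenate([left, right], axis=0)   # over/under stereo frame
```

The same loop, run once per frame of animation, yields 360 stereo video rather than a still.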

Enjoy.

  * The term “compact” is used here liberally. A typical 5-minute VR experience might weigh in at 500MB for the download. Transforming this into a 120MB movie might be considered lightweight… for VR. Time to beef up your data plans, kiddies. And developers, say it with me now: algorithmic content 🙂

First person VR lightsaber : the design intent

We’ve helped pioneer first-person VR lightsaber control in-Rift with our ScenePlay demo app. This is what happens when you take that vector and extend it towards its logical conclusion: just add photorealistic rendering, VR cinema backplates, AI stormtroopers, laser bolts, and explosions… voila.

Consider this advanced pre-viz of the experiences coming down the pipe in the next 3 years. Start practicing your swordplay skills, and enjoy.

What’s a lightsaber look like, you might ask? Well, this:

Mark Zuckerberg tests out the new Oculus touch hand controllers as Brendan Iribe observes

And this:

Testing out the Sony Move hand controllers & Sony Morpheus VR HMD for the PlayStation 4

Or, if you prefer the dark side, go ahead, play Vader:

Luxo, meet Henry : And the Wheel in the dSky keeps on turnin…

In 1986, Pixar Animation Studios created its first film, dubbed Luxo Jr.

luxo-jr-still-pixar-cgi

It was presented to a sophisticated audience at SIGGRAPH, the annual convention for those in the know in the engineering, design, and creation of the best 3D graphics in the world. At the time, the air was one of hope: Tron had been released to acclaim a few years earlier, in 1982, as the first movie with significant computer-generated visual effects (CG/VFX) sequences, and whispers in the wind predicted that at some point, at some time, a completely 100% CGI (computer-generated imagery) film would be created.

In less than 2 minutes, Pixar proved that seemingly inanimate objects, in this case simple desk lamps and an inflatable toy ball, could exude, even ooze, character, charm, and emotion.

toy-story-key

That one film opened the door to an entirely new era of Hollywood cinema, which arguably exploded into mainstream consciousness a decade later, in 1995, with Toy Story, the first commercially successful, full-length, completely computer-generated film… again by Pixar, the love child of George Lucas, John Lasseter, and Steve Jobs.

Fast forward 30 years. Oculus, the amazing company that launched on Kickstarter and was stunningly acquired by Facebook for $2 billion a mere 2 years later, launches a virtual cinema division dubbed Oculus Story Studio.

And now, here’s Henry.

Meet Henry from Story Studio on Vimeo.

Henry is a clear attempt to move beyond hardcore, sci-fi, robot-loving gamers to a more general, emotional, human populace. Presumably, a populace willing to put on a pair of blackened ski goggles in order to watch (scratch that: in order to experience) genuine story-fed emotion.

Henry is, plain and simple, an attempt, albeit a decent one, to launch a new category, animated cinematic interactive VR… with a best-in-class example.

Come August 28, and more so, come February 2016, the world will decide.

Here’s hoping they succeed.

Switch Hitting : back to Mr. Potter

Great meetings with the studios in Hollywood. We’ve put some post-demo polish on the Star Wars featurette; now it may be time to revisit Harry Potter and his most awesome game of Wizard’s Chess in VR. Remember Battle Chess? You ain’t seen nuthin’ yet.

Early early pre-viz:

Harry Potter Wizards Chess VR

scene recreation, v0.03

Harry Potter Wizards Chess

Wizards Chess, still from Harry Potter and the Sorcerer’s Stone. © Warner Brothers

Our first task is to do the basic “blocking” of the scene. For this we use simple capsules as stand-ins for the actors. Once all the gross movement is accounted for, we swap in the high-resolution humanoid models and slowly layer in actual gestures / movements / walks / hands / head animations, lip sync, and even eye gaze, aka drishti.

Harry Potter Wizards Chess

character stand-ins for basic scene blocking

Harry Potter Wizards Chess VR

Harry Potter Wizards Chess, Harry Potter and the Sorcerer’s Stone © Warner Brothers

Onward.

Approaching Cinema : Lessons Learned in 360 Capture

dsky-wilderness-v011-title-splash-screen

We’ve created some very rapid prototypes in the past 10 days, just to test the waters (no pun intended) of cinematic 360 capture and playback within VR HMDs.

viewing the photosphere in-engine, with polygons. cool.
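
For anyone wondering what “with polygons” means here: the photosphere is just an equirectangular image textured onto an inward-facing sphere around the camera. A minimal sketch of the mapping math (our own illustration, not engine code), where each vertex’s view direction picks a (u, v) coordinate in the photo:

```python
import numpy as np

def equirect_uv(direction):
    # Map a unit view direction (x, y, z) to (u, v) in an equirectangular photo.
    # u sweeps 0..1 around the horizon (longitude); v sweeps 0..1 pole to pole.
    # The in-engine version is this same math baked into the sphere's UVs, with
    # triangles wound to face inward so the texture is visible from the center.
    x, y, z = direction / np.linalg.norm(direction)
    lon = np.arctan2(x, z)              # -pi..pi, 0 = straight ahead
    lat = np.arcsin(y)                  # -pi/2..pi/2, 0 = horizon
    return lon / (2 * np.pi) + 0.5, 0.5 - lat / np.pi

# Looking straight ahead lands in the middle of the photo:
print(equirect_uv(np.array([0.0, 0.0, 1.0])))   # -> (0.5, 0.5)
```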

The tests have been mostly very rewarding.

Our findings are as follows:

  1. Initial 2D 360 still / video capture is easy. We started with the Google Photosphere app, free on Android, which takes about 5 minutes and 40 photos per sphere. We’ve since upgraded to the Ricoh Theta, which captures 360 video with a single button-press.
  2. Consider EVERYTHING in the 360 field of view. It’s all in the shot. There’s no backstage. This concept takes a lot of getting used to if you’re used to working with lights, sound techs, and crews.
  3. Editing is time-consuming. It’s easier to clean up physical reality prior to the actual shot than to paint it out in post.
  4. A base plug at the foot of the shot is a nice touch, both visually and to cover the merge seam.
  5. Similarly, we use a lens flare to simulate the light dynamics of the sun.
  6. Audio engineering is key, time-consuming, fun, AND makes the difference between “just another photosphere” and the feeling of presence. You collect video at a single point; audio should be collected at all the local sound origination points, then placed into proper 3D positions in post, with filters.
  7. Since we’re authoring all this within the game engine, we’re having a lot of fun with the 3D positional audio: placing sounds, even animating them as, say, a bird flies across the forest canopy (see the sketch after this list).
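
For the curious, here’s a toy sketch of the two ingredients the engine’s positional audio is juggling for us, distance attenuation and stereo pan (our simplified illustration; a real engine adds filtering, doppler, occlusion, and proper HRTFs on top):

```python
import numpy as np

def spatialize(block, source_pos, listener_pos, listener_right, ref_dist=1.0):
    # Toy 3D positional audio: attenuate a mono block by distance, then pan it
    # left/right based on where the source sits relative to the listener.
    offset = np.asarray(source_pos, float) - np.asarray(listener_pos, float)
    dist = np.linalg.norm(offset) + 1e-6
    gain = min(1.0, ref_dist / dist)                      # inverse-distance falloff
    pan = np.clip(np.dot(offset / dist, listener_right), -1.0, 1.0)  # -1 = hard left
    return block * gain * (1 - pan) / 2, block * gain * (1 + pan) / 2

# Animate a "bird" flying across the canopy by moving the source each audio block.
bird_chirp = np.zeros(1024)                               # stand-in for real samples
for x in np.linspace(-20, 20, 10):
    left, right = spatialize(bird_chirp, (x, 8.0, 5.0), (0, 0, 0), (1, 0, 0))
```
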
into the wild… virtually.

And finally, there are some things, some of the best parts of nature, which simply aren’t going to be in VR anytime soon: the elements. Wind in your hair and clean running stream water on your bare feet… those will have to wait.

fresh water from the springs… yes please!.. but not in VR.

For a 2D sample of what’s being created and captured in the world, start with YouTube’s shiny new 360 video channel.

Are there places or experiences you’d like to see us model in VR? Do you see yourself capturing and publishing your own 360 experiences?

Continue the conversation in the comments below:

Cinema meets Videogames : Strange bedfellows, or Match made in Heaven?

We’re fresh back from the epic Digital Hollywood conference / rolling party in LA, and chock-full of ideas about how to integrate classic cinema techniques with our native videogame tropes.

See, 80+% of the participants at the conference were from traditional media backgrounds — music, TV, film. And while VR was absolutely the hot topic of the show — as it was for CES, GDC, and NAB — there was as much confusion as there was excitement about the commercial and artistic promise of this brand spanking new medium.

One of the key findings on our part was a genuine need to integrate cinema techniques (linearity, composition, color, and storytelling) into our hyper-interactive realm of videogame design. Thus began our investigations. What exactly does it take to make full, HD, 3D, 360 captures of real-world environments?

We’ll get into more details later, but for now I want to spell it out, if only for technical humor: It takes a:

  • 12-camera
  • 4k
  • 360°
  • 3D stereoscopic
  • live capture stream
  • …stitched to form an:
  • equirectangular projection
  • over / under
  • 1440p
  • panoramic video

Say that 12 times fast. Oh, and be ready to handle the approximately 200 GB/minute data stream that these rigs generate. Thank god for GPUs.
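
Where does a number like that come from? A back-of-envelope sketch, using our own assumptions rather than any rig maker’s spec sheet: twelve 4K cameras at 30 fps is well over 500 GB/minute uncompressed, and even a modest on-camera codec only brings it down into the quoted ballpark.

```python
# Back-of-envelope data rate for a 12-camera 4K rig (our assumptions, not a spec):
CAMS, W, H, BYTES_PER_PX, FPS = 12, 3840, 2160, 3, 30

raw_per_min = CAMS * W * H * BYTES_PER_PX * FPS * 60
print(f"uncompressed: ~{raw_per_min / 1e9:.0f} GB/min")          # ~537 GB/min
print(f"with a ~3:1 codec: ~{raw_per_min / 3 / 1e9:.0f} GB/min")  # ~180 GB/min
```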

What does that look like in practice?
Like this:

A still frame from a 360 stereoscopic over/under video. Playback software feeds a warped portion of each image to each of the viewer’s eyes.
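
Mechanically, the player’s job looks roughly like this (a simplified sketch; we assume the top half of the frame is the left eye, and we skip the per-pixel lens warp a real player applies): split the over/under frame into per-eye halves, then pull out a yaw-centered window of each half for the current gaze direction.

```python
import numpy as np

def eye_views(frame, yaw_deg, h_fov_deg=90):
    # frame: (2H, W, 3) over/under equirectangular video frame,
    # assumed left eye stacked on top of right eye.
    h2, w = frame.shape[0] // 2, frame.shape[1]
    left_eye, right_eye = frame[:h2], frame[h2:]             # over / under split

    win = int(w * h_fov_deg / 360.0)                          # window width in pixels
    start = int((yaw_deg % 360.0) / 360.0 * w) - win // 2
    crop = lambda eye: np.roll(eye, -start, axis=1)[:, :win]  # roll handles the 0/360 seam
    return crop(left_eye), crop(right_eye)

# e.g. a 1440p over/under frame (720 px per eye), looking 45 degrees to the right:
dummy_frame = np.zeros((1440, 2560, 3), dtype=np.uint8)
left_view, right_view = eye_views(dummy_frame, yaw_deg=45)
```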

And how do you capture it? With something like this:

360Hero GoPro stereo 360 rig

12 camera GoPro 360Hero rig

Or, if you’re really high-budget, this:

Red Dragon 6K 360 3D stereoscopic capture rig by NextVR

array of 10 Red digital cinema cameras (photo not showing top and bottom cam pairs)

Though personally, we really prefer the sci-fi aesthetic:

jaunt-sci-fi-rig-header

an early 3d 360 capture prototype by JauntVR

Then there’s the original 360 aerial drone capture device, circa 1980:

Viper probe droid, The Empire Strikes Back, 1980, Lucasfilm

Then, the ever-so-slightly more sinister and agile version, circa 1999…

Sentinel Drone, The Matrix, 1999, via the Wachowski brothers

What do you think? Is the realism of live capture worth the trouble? Would you prefer “passive” VR experiences that transport you to hard-to-get-to real-world places and events, “interactive” experiences more akin to Xbox and PlayStation games, or some combination of the two?

Join the conversation below: