
View Conference 2018: Jan Pinkava explains some VR experiences from Google Spotlight Stories

Do you know Google Spotlight Stories? If not, well, you should. It is a small R&D lab hosted by Google, focused on creating original VR storytelling experiences. At View Conference 2018, I attended the talk by Jan Pinkava, the director of Google Spotlight Stories' experiences (and also the co-director of Ratatouille and an Oscar winner), who detailed the creative process behind their work.

Jan introduced Google Spotlight Stories and told us that so far they have released 16 experiences, all different from one another in graphical style, type of experience, target hardware, and so on. It is really a research effort to evaluate what storytelling in VR can do, exploring all its possible modalities. Having seen some of these experiences myself, I can assure you they are all high quality. If you want to enjoy them on Cardboard, you can find them in the Google Spotlight Stories app on Google Play (Jan joked about this a lot in his talk… he kept asking "who has installed our app now?").

google spotlight stories melies vr storytelling
The 16 experiences developed so far by the Google Spotlight Stories team

During the talk, he focused on three of these experiences: Back To The Moon, Piggy, and Age of Sail.

Back To The Moon

Back To The Moon is the first virtual reality doodle in Google's history. It is dedicated to Georges Méliès, the director who invented various cinematographic techniques and is recognized as the father of special effects. In this short experience, Google wanted to show the most important techniques invented by Méliès in a single 2-minute short movie.

That was, of course, a hard task, but Pinkava and his team did a great job. In the talk, Jan highlighted how this 360° experience was designed, showcasing many of the sketches behind it.

First of all, even if the movie is a 360° one, the user can only look in one direction at a time, and according to statistics, for 70% of the time this is the frontal one. So the team divided the scene into 3 parts:

  • The frontal foreground: it hosts the main action of the story, with the main characters. It is the part where the story unfolds;

    google spotlight stories melies vr storytelling
    At every moment, there is a main area in front of the user where the main action unfolds
  • The frontal background: acting as a contour of the main scene, it adds some details to it and makes it more interesting;

    google spotlight stories melies vr storytelling
    Here there are elements that are nice to see and add more value to the main action
  • The rear: it is composed of things that are not useful for the story, but that may be interesting or funny to see.

    google spotlight stories melies vr storytelling
    In the opposite direction of the main action, there are things that are nice to see, but that are not useful to the story at all

So the story unfolds only in front of the user, but the user can sometimes turn his head to see additional elements reacting to the developments of the main plot. This ensures the user can follow the main plot comfortably without having to continuously rotate his head.

After that, they started sketching the story, deciding what should happen during the video's 2 minutes. This meant deciding not only the main plot but also all the little side stories that unfold at the same time. There is no longer just a timeline, as in traditional filmmaking: there is now a two-dimensional storyline, with both a time component and a spatial component. This is because you have to develop stories for the full 360° of space around the user, not only for a frontal frame. And the main story doesn't happen in a fixed place either: the main characters move within roughly 200° of the main action frustum. So designing the story now requires a 2D table.

google spotlight stories melies vr storytelling
The 360° view of the user has been divided into 8 regions of 45° each
google spotlight stories melies vr storytelling
After that, the team wrote a storyline encompassing all 8 regions around the user. Notice that the 1D timeline has become a 2D table
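The idea of a time × region storyboard can be sketched as a simple 2D lookup: one axis for the time beats, one for the 45° gaze regions. All the event names below are hypothetical examples, not the actual content of the doodle:

```python
# A minimal sketch of a 2D storyboard: time beats on one axis,
# 45° gaze regions on the other. All events are invented examples.
REGIONS = 8  # 360° divided into 8 regions of 45° each

# storyboard[(beat, region)] -> what happens there and then
storyboard = {
    (0, 0): "Melies greets the viewer",     # frontal region: main action
    (0, 4): "A cat sleeps on a chair",      # rear region: a funny detail
    (1, 0): "Melies performs a trick",
    (1, 1): "Background dancers react",
}

def events_at_beat(beat):
    """Return everything happening at a given time beat, keyed by region."""
    return {region: desc for (b, region), desc in storyboard.items() if b == beat}

print(events_at_beat(0))
```

Reading a row of this table gives you one instant of the whole 360° scene; reading a column gives you the mini-story of a single region over time.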

The doodle had to be available on multiple platforms: on some it was presented as a 360° video, while on others it was rendered in real time. Of course, this meant performing a lot of optimizations to make it pleasant on mobile without being computationally heavy.

Jan and the team decided that, on real-time platforms, the main story should unfold only when the user was actually paying attention to the characters. This may seem an easy task (it is just a trigger that makes the characters do stuff, isn't it?), but it is actually difficult because you have to think about what all the characters in the scene should do when you are not looking where you should. And even more difficult: what to do with the music? The music should follow the main story, but since the user can look anywhere, pausing the development of the main story… what should the music become when the main story is paused? How can you score music when you don't know how long it will last? How can you score music synchronized with a story when you don't know how it will evolve?

Well, some tricks are needed here. Jan showed us one: the main music is written in separate pieces that, if the user always looks where intended, simply play one after the other, forming the background music of the main story. If at a certain point the user is looking somewhere else, the system plays the current piece to its end and then starts a looping piano break, which keeps playing until the user looks at the main story again. At that point, the next piece of the main soundtrack starts. Of course, all this music gluing has to stay harmonic whatever the user does, so you need a talented audio artist.

google spotlight stories melies vr storytelling
If the user follows Méliès, the first two tracks play one after the other. Otherwise, when the first ends, the piano break plays until the user watches Méliès again. At that point, the second audio piece starts
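The music-gluing trick boils down to a small state machine evaluated every time an audio segment finishes. Here is a minimal sketch of that logic (the class and piece names are my own, not from the talk):

```python
# Sketch of the adaptive-music logic described in the talk (hypothetical API).
# Main-story pieces play in sequence while the user watches the action;
# otherwise a looping piano break fills the gap until attention returns.

class AdaptiveMusic:
    def __init__(self, story_pieces, break_loop="piano_break"):
        self.pieces = story_pieces    # ordered soundtrack segments
        self.break_loop = break_loop  # filler loop that glues harmonically
        self.next_index = 0

    def on_piece_finished(self, user_watching_story):
        """Called when the current audio segment ends; returns what to play next."""
        if user_watching_story and self.next_index < len(self.pieces):
            piece = self.pieces[self.next_index]
            self.next_index += 1
            return piece
        # User looked away (or pieces exhausted): loop the piano break.
        return self.break_loop

music = AdaptiveMusic(["piece_1", "piece_2", "piece_3"])
print(music.on_piece_finished(True))   # piece_1
print(music.on_piece_finished(False))  # piano_break
print(music.on_piece_finished(True))   # piece_2
```

Note that the story's progress is only advanced when the user is watching, so the soundtrack and the plot stay in sync no matter how long the user spends looking elsewhere.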

In the end, Jan showed us the video, pointing out that while the main action unfolds, a lot of stuff actually happens in all regions of the video (this is easily noticeable if you look at the equirectangular projection of the video).

Piggy

Piggy is one of the most recent works by Google Spotlight Stories and is a very different experiment from Back To The Moon. While in Back To The Moon things happen all around the user, in Piggy there is only a pig, a piece of cake, and you. All the rest is white.

The story is very simple: there is a fat pig that should exercise and not eat the tasty piece of cake next to him. Of course, his plans are completely different: he really wants to eat that cake… and so you have to keep an eye on him to not let him eat it.

This experience is all based on interactivity: on how your gaze can change the story, and on how you can enjoy a storytelling experience even if there is almost nothing in the scene.

And even here a lot of work went into sketching the character accurately (it is the only thing in the scene, so it has to be perfect), and into implementing it so that it looks good without being computationally heavy. Furthermore, everything he does has to show his personality. All of Piggy's animations have been pre-recorded to make the experience lighter on the mobile phone.

google spotlight stories piggy vr storytelling
Sketches of the Piggy character

Then there is a lot of work in developing the story: contrary to Back To The Moon, here the main story does not unfold in a linear way; the story can evolve differently depending on what the user does (where the user looks at every moment). So Jan and his team spent a lot of time developing a graph of possible states, of possible ways the story may unfold. There is a single starting point, three alternate endings, and different possible paths to reach those endings. And all these paths have to appear perfectly glued together, in a coherent way that is pleasant for the user. So the story will be different every time you experience it, depending only on where you look.

google spotlight stories piggy vr storytelling
This is the main graph of Piggy's story, which, as you can see, may evolve in various different ways. Pinkava says that this story is like our lives: you only live one path, but you might have lived many different possible stories
google spotlight stories piggy vr storytelling
A more detailed view of the graph, specifying what happens at every stage
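Such a branching story can be modeled as a state graph where each gaze observation selects the next state. This is a toy sketch with invented state names (the real Piggy graph is far richer, with three endings):

```python
# Toy sketch of a gaze-driven branching story graph (invented states).
# Each state maps a gaze observation to the next state; states with
# no outgoing transitions are endings.

STORY_GRAPH = {
    "start":         {"watch_pig": "pig_exercises", "watch_cake": "pig_sneaks"},
    "pig_exercises": {"watch_pig": "pig_exercises", "watch_cake": "pig_sneaks"},
    "pig_sneaks":    {"watch_pig": "pig_caught", "watch_cake": "pig_eats_cake"},
    "pig_caught":    {},      # one ending
    "pig_eats_cake": {},      # another ending
}

def run_story(gaze_events, state="start"):
    """Walk the graph following a sequence of gaze observations."""
    for gaze in gaze_events:
        transitions = STORY_GRAPH[state]
        if not transitions:   # already reached an ending
            break
        state = transitions[gaze]
    return state

print(run_story(["watch_cake", "watch_pig"]))  # pig_caught
```

The hard creative work Pinkava described is making every path through such a graph feel like one coherent, deliberately written story.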

The story evolves differently depending on how well you control Piggy… he will try to evade your surveillance in every way, and you have to be careful to always keep an eye on the cake, or he will eat it! He is a very smart pig. So your gaze direction, where you look at every moment, decides how the story evolves.

google spotlight stories piggy vr storytelling
The evolution graph alongside the bounding volumes of the elements in the scene, which are used to detect what you are looking at in every moment
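Detecting "what you are looking at" with bounding volumes usually comes down to comparing the headset's forward vector with the direction to each object. A minimal sketch, assuming bounding spheres and a user at the origin (all values and the tolerance are hypothetical):

```python
# Minimal gaze-detection sketch against bounding spheres (hypothetical values).
# The object whose bounding sphere the gaze direction points into (within a
# tolerance cone) is considered "looked at".
import math

def looking_at(gaze_dir, objects, max_angle_deg=15.0):
    """Return the name of the object the gaze points at, or None.

    gaze_dir: normalized (x, y, z) forward vector of the headset.
    objects: {name: (center, radius)} bounding spheres, user at origin.
    """
    best, best_score = None, max_angle_deg
    for name, (center, radius) in objects.items():
        dist = math.sqrt(sum(c * c for c in center))
        if dist == 0:
            continue
        to_obj = tuple(c / dist for c in center)
        dot = max(-1.0, min(1.0, sum(g * o for g, o in zip(gaze_dir, to_obj))))
        angle = math.degrees(math.acos(dot))
        # Widen the tolerance by the sphere's apparent angular size.
        angular_radius = math.degrees(math.atan2(radius, dist))
        score = angle - angular_radius
        if score < best_score:
            best, best_score = name, score
    return best

scene = {"piggy": ((0.0, 0.0, -2.0), 0.5), "cake": ((1.5, 0.0, -1.5), 0.3)}
print(looking_at((0.0, 0.0, -1.0), scene))  # piggy
```

Feeding the result of such a check into the story graph at every frame is what lets Piggy react to your surveillance.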

When Jan showed us a video of one of the possible evolutions of Piggy, I could appreciate how the experience is very simple and very enjoyable at the same time. It is very funny because the team carefully studied the different evolutions of the plot so that, whatever happens, the user finds it amazing. And Piggy himself has been designed to be a funny pig. The fact that you decide the action and that Piggy is aware of you also increases the sense of presence a lot. It is a very nice experience.

Age of Sail

Age of Sail is a still-unreleased experience (it should come out in a few weeks). It has a completely different tone from the previous ones: it is the story of an old sailor, angry because steamboats are becoming popular and sailboats are becoming less useful. He has lost his faith in everything, and at a certain point he has to rescue a little girl who is drowning in the sea. It is a story of redemption.

google spotlight stories age of sail vr storytelling
An angry sailor is one of the main characters of the story

The story is not funny at all; it is dramatic: Pinkava showed us half of the experience and it had various sad moments. The graphic style reflects this drama: it is raw. They looked for something that was beautiful but at the same time low-poly, so they took inspiration from painters who used to paint the sea and sailboats.

This experience was set at sea because the Google Spotlight Stories team asked itself what VR is really good at, and the answer was that VR is special because it can really make you feel inside a particular place. And that's why they chose the open-sea environment: the open sea is a setting where VR adds true value, something impressive to live in VR. In VR you can really feel at sea, on this sailboat, as if you were there with that angry man, and that is something no 2D screen can give you in the same way.

google spotlight stories age of sail vr storytelling
The girl is drowning in the sea… not the funniest moment ever

The open sea also introduced various problems:

  • The sea is enormous and has lots of waves, so they had to heavily optimize everything. For instance, all the wave dynamics, which are physically correct, have been pre-baked into the sea animations.
  • Being on a boat in VR can induce a high degree of motion sickness, so the team had to find a way to mitigate it.
google spotlight stories age of sail vr storytelling
Texture optimization in the old sailor

To make the various objects of the scene optimized and pleasant-looking from all distances, the team at Google employed meta-textures (a kind of real-time procedural texture). Thanks to this, for instance, the sea appears the same way from whatever distance you look at it.

Age of Sail is coming soon… and if you are interested in it, this is the trailer!


I really appreciated Jan Pinkava's talk because it showed me how it is possible to create VR storytelling experiences that are very different from one another but all amazing in the same way. It also showed me how much work is required even to design a 2-minute experience.

And a lot of work is also required to write blog posts… so if you have liked this one, please subscribe to my newsletter! 🙂


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.

3 thoughts on “View Conference 2018: Jan Pinkava explains some VR experiences from Google Spotlight Stories”

  1. Hey Antony! RFC here.

    great article, and great access 👍

    very interesting work coming out of Spotlight Stories. Really love Pearl, Sonaria and Back to the Moon, amongst their other work. Isle of Dogs and Son of Jaguar were awesome experiences. Then of course “Saturnz Bars” from Gorillaz a standout no doubt.

    Clever work getting Pearl to run on 6DOF system like Vive, but also performant on 3DOF system like Daydream.

    Happy XR!

    1. Woooohooo! I am so happy to hear from you again here on my blog, I really missed your comments!

      Google Spotlight Stories is doing great work for sure… it's a pity that I didn't manage to interview Pinkava to get more insights into their work… maybe next time!

Comments are closed.
