News

Meet Nick Whiting, Epic’s Technical Director of VR/AR

It’s been said that it only takes one visionary to champion VR within an organization: that one passionate (and sometimes crazy) person who “gets it” early and then slowly but surely gets the rest of the group on board. As the maker of Unreal Engine 4, Epic Games is at the forefront of developers creating new worlds in VR, and we recently sat down with the man driving its VR efforts: Nick Whiting, Epic’s Technical Director of VR/AR.

What we found most notable about the conversation was unexpected: a good 40% of Epic’s VR efforts lie outside of gaming, and Nick offers plenty of interesting insights and lessons learned that carry between the two spaces.

Read on for the full interview.

VRSCOUT: Who are you and what do you do? 

I oversee all things VR and AR for Epic Games. We make Unreal Engine 4 (UE4), so our goal there is to make all these platforms accessible to everyone and ensure that the content you build for one platform is transferable to all other equivalent platforms. We provide an interface where users can focus on creating content while leaving the technical nitty-gritty to our engineers here. To help do that, I coordinate engineering efforts internal to Epic as well as manage relationships with other companies like Oculus, Valve and Samsung, to make sure that UE4 lets people get the most out of each platform and that it’s easy to use. I’ve also worked on all the VR demos that Epic has done to date.
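What he’s describing is essentially a hardware-abstraction layer: content talks to one interface, and each headset vendor plugs in behind it. Here is a minimal, engine-agnostic sketch of that idea in C++ (the types and names below are hypothetical, not UE4’s actual plugin API):

#include <cstdio>
#include <string>

// Hypothetical sketch: content code talks only to this abstract interface,
// never to a specific vendor SDK. Not UE4's real API.
struct Pose { float position[3]; float orientation[4]; };

class IHeadMountedDisplay {
public:
    virtual ~IHeadMountedDisplay() = default;
    virtual std::string Name() const = 0;
    virtual Pose GetHeadPose() const = 0;  // sampled fresh each frame
};

// A vendor backend implements the interface; content code stays unchanged.
class FakeRiftBackend : public IHeadMountedDisplay {
public:
    std::string Name() const override { return "FakeRift"; }
    Pose GetHeadPose() const override { return Pose{{0.0f, 1.7f, 0.0f}, {0, 0, 0, 1}}; }
};

void RenderFrame(const IHeadMountedDisplay& hmd) {
    Pose p = hmd.GetHeadPose();
    std::printf("%s head at (%.1f, %.1f, %.1f)\n",
                hmd.Name().c_str(), p.position[0], p.position[1], p.position[2]);
}

int main() {
    FakeRiftBackend rift;
    RenderFrame(rift);  // swap in any other backend without touching content code
}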

How did you get here?

In college, I studied a lot of things, including biomedical engineering. The relationship between human and machine always fascinated me; it’s one of those things that always seems a little farther off than I think it actually is. VR makes those sorts of sci-fi dreams a reality today. So I was always fascinated by it, especially by how you can fool human perception into feeling like you’re in another place. When I graduated college in 2005, the technology was pretty limited and relegated to military and really high-end entertainment; it wasn’t something people could get their hands on.

So, once I was working at Epic, one of the things I did during Gears of War 3 was work on getting Scaleform, a UI middleware solution, into the engine. As part of that, I got to meet Nate Mitchell and Brendan Iribe, who later became two of the founders of Oculus. Right before they started Oculus, they called me up one day and said, “Hey, we’ve got this really cool, crazy duct-taped piece of hardware. We don’t really have any software to show on it. If we send you one, would you noodle around with it after hours and see if you can get Unreal Engine running on it?” So, they sent us one of their first kits and I worked with them developing UE4; we ended up bootstrapping the Elemental VR demo. They demonstrated that before the PS4 and Xbox One were out; it won The Verge’s Best of Show at CES 2013 running our demo. After that, every time Oculus released a new piece of hardware, they would send it to me and Nick Donaldson and we’d make small demos for them; Showdown and Bullet Train were two more. But the interesting thing is that even after we won Best of Show, it still took a while for the rest of Epic to see it as something real.

There are kind of two camps in VR: those who can put the rough, duct-taped prototypes on their face and see past the limitations, and those who want to see something really tangible and near-final. The turning point was when our CEO Tim Sweeney went over to Valve and saw their room-scale demo, where they really nailed it with 90 FPS and full-room tracking. That was the “a-ha” moment; after that, they let me hire a VR team and turn it into the real deal. Now we’re growing the team and hiring lots of people; it’s a really exciting time, especially since we’re not stuck with only one hardware platform. We want to work on all of them, and we’re in the middle of it.


Since a large part of the readership on VRScout is interested in cinematic and some of the more utilitarian use cases of VR, can you describe some of your and Epic’s work in areas outside of gaming?

Games are a big challenge in VR and AR. Some of the hardest problems are keeping people from getting sick and making them feel immersed in a game while they’re doing something completely unrealistic. In a game, the action is super fast-paced, there’s so much stuff going on, and you want to be superhuman; those are all the hardest problems to solve in VR. With UE4, especially in the past 2½ years since we moved from a subscription to a free model, we’ve seen a lot of interest in markets outside of gaming: architectural visualization, simulation, film, and those kinds of “other” areas where you don’t necessarily have to solve those hard problems all at once. You can take little bits and pieces of them, so I think one of the fastest growth areas for us is not as a game engine, but as a real-time rendering toolset.

We come from a background where frame rate and good-looking real-time interactive visuals are very important, but we’ve increasingly been talking with people who come from the traditional offline world, where frame times are measured in hours instead of milliseconds, so it’s a very interesting confluence to bridge that gap. They’re markets I really hadn’t had much insight into before, and I especially like working with film companies. It’s cool to see their perspective on things. We worked with Weta Digital on bringing in one of the scenes from The Hobbit and making it real-time. They have such an incredibly deep pipeline for simulating everything down to the finest detail, because they throw it at these giant supercomputer farms, something we’d never even approached with games. On that demo, the simulation took a week before we got to see the results, which was just mind-blowing for us; but when you see the results, you see the fidelity. You just don’t necessarily get things like that in games, so I think we complement each other. We game developers know how to make things render in real time by taking shortcuts and pre-computing things, whereas they know how to do really deep dives on these physiological simulations; just seeing the musculature and the motion of the skin on the characters was absolutely incredible, and we’re very proud we helped them get that to render in real time.

What do you think cinematic folks can learn from you folks in gaming? 

One of the biggest things is that, with film, you have a lot of investment in these huge pipelines, and when something isn’t quite right, you can fix it up in post or paint over it, so to speak. That’s really not something you can afford to do in real-time visualization; games are all about “smoke and mirrors” and fooling the player into believing something that isn’t necessarily 100% accurate. The tips and tricks that are basic to game developers really haven’t had to be used at all in the film industry. A lot of the real-time rendering techniques, like using GPUs to pre-compute the stuff that needs to look really good and then hooking in all the real-time tricks we have for lighting it, making the animations play fast, and so on, can be really beneficial to cinematic producers.
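To make that pre-computation idea concrete: the classic example is baking an expensive lighting calculation offline into a lookup table once, so that runtime lighting becomes a cheap sample. A toy sketch, assuming a 1D lightmap and a made-up lighting function (nothing here is Unreal’s actual pipeline):

#include <array>
#include <cmath>
#include <cstdio>

// Toy "bake" step: evaluate an expensive lighting function once, offline,
// into a small lightmap. Hypothetical example, not Unreal's lightmass.
constexpr int kMapSize = 8;

std::array<float, kMapSize> BakeLightmap() {
    std::array<float, kMapSize> map{};
    for (int i = 0; i < kMapSize; ++i) {
        float x = static_cast<float>(i) / (kMapSize - 1);
        // Imagine this line costing minutes per texel in an offline renderer.
        map[i] = 0.2f + 0.8f * std::exp(-4.0f * (x - 0.5f) * (x - 0.5f));
    }
    return map;
}

// Runtime step: lighting is now a cheap lookup plus linear interpolation.
float SampleLightmap(const std::array<float, kMapSize>& map, float u) {
    float t = u * (kMapSize - 1);
    int i0 = static_cast<int>(t);
    int i1 = (i0 + 1 < kMapSize) ? i0 + 1 : i0;
    float f = t - i0;
    return map[i0] * (1.0f - f) + map[i1] * f;
}

int main() {
    auto map = BakeLightmap();  // offline, once
    for (int i = 0; i <= 4; ++i) {
        float u = i * 0.25f;
        std::printf("u=%.2f light=%.3f\n", u, SampleLightmap(map, u));  // per frame
    }
}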

I think the biggest thing that film hasn’t had to deal with is the interaction and presence of a viewer in its story. With a movie, you’re watching the director’s take on the camera cuts, the perspective and the timing. In games, we haven’t had that luxury; there are a lot of cinematic games where we’ve gotten very, very good at making it feel like the players themselves are pacing the action. This is done with tricks: waiting until they look at something before you kick off the action, really making the player feel like the center of the world during the interaction. We see a lot of very interesting crossover there; just game-design tricks that we’ve been doing for a long time that the film community hasn’t had to think of.
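The “wait until they look at it” trick boils down to a dot-product test between the player’s gaze direction and the direction to the point of interest, held for a brief dwell time before the event fires. A minimal, engine-agnostic sketch (all names and thresholds here are hypothetical):

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 Normalized(Vec3 v) {
    float len = std::sqrt(Dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Fires once the player has kept the target within ~15 degrees of their
// gaze for dwellSeconds. Hypothetical sketch of a gaze-gated trigger.
class GazeTrigger {
public:
    bool Update(Vec3 headPos, Vec3 gazeDir, Vec3 targetPos, float dt) {
        Vec3 toTarget = Normalized({targetPos.x - headPos.x,
                                    targetPos.y - headPos.y,
                                    targetPos.z - headPos.z});
        if (Dot(Normalized(gazeDir), toTarget) > 0.96f)  // cos(~15 degrees)
            heldFor += dt;
        else
            heldFor = 0.0f;  // looked away: reset the dwell timer
        return heldFor >= dwellSeconds;
    }
private:
    float heldFor = 0.0f;
    float dwellSeconds = 0.5f;
};

int main() {
    GazeTrigger trigger;
    Vec3 head{0, 1.7f, 0}, gaze{0, 0, 1}, target{0.1f, 1.7f, 5.0f};
    for (int frame = 0; frame < 60; ++frame)  // ~1 second at 60 FPS
        if (trigger.Update(head, gaze, target, 1.0f / 60.0f)) {
            std::printf("Player looked: kick off the action at frame %d\n", frame);
            break;
        }
}

The dwell timer is the part that matters for pacing: firing on the first frame of eye contact makes a scripted moment feel random rather than authored.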

We’re strong in areas where they’re weak, and they, with over 100 years of experience, are very, very strong in areas that we tend to gloss over, so things that are very simple to us are novel to them and vice versa.

Shifting gears now. Since it was such a popular demo and you mentioned it earlier, we have to ask: any plans for Bullet Train?

Bullet Train was a tech demo created on a short timeframe; it was made in about 10 weeks, starting out with me and one other person for the first 4 weeks and ramping up to a sustained team of 12 people. The focus was really to see how far we could push the visuals, because it came out at a time when Valve (with HTC) and Oculus had locked in their shipping specs and published recommended PC configurations. We set out to see how good we could make something look in a very practical sense, to set a bar. It was also the first time we had motion controllers, so it was the first time we got to interact with the virtual world in a meaningful way. So, the demo was basically a) a test piece so that we could push our rendering pipeline and have a tangible target for the optimizations we were considering, and b) an exploration of the motion controllers. I think we learned a lot on both fronts; Bullet Train had a lot of technical advances that let us take a visual leap over previous demos, and it also taught us a lot about how to use the controllers. Previously, all our demos were passive experiences where the action unfolds around you but the player has no way to act in the virtual world.

Bullet Train was an awesome demo, but moving forward, we’ll be taking the lessons learned rather than necessarily making a full game out of it. Now that the Oculus Rift and the Vive have both been released, we have a fixed target; we know what we’re aiming for, and we’re happy with that.


Last but not least, what are you most excited about for the future?

If you look at the history of narrative, we started with an oral storytelling tradition around the campfire. Then we added the literary element on top of that, which brought not only a consistent record of the stories being told, but also literary technique, expressiveness and imagery. On top of that, we added acting and plays, starting to add the human element. Then cinema came along and, at first, it just emulated plays with static camera angles; they’d literally record plays and that’d be it. But slowly, people started exploiting the opportunities of film and created modern cinematography: cutting the camera, panning and changing color tones. Games then started off by taking cut scenes from movies and dropping them into games, but then they started doing things like interactive cut scenes that allow the player to affect the outcome of the story. So, what I’m really interested in is how people take VR and do things that are completely unique to the VR medium to build upon the narrative context and toolset that we have now. I really want to see people push the edges of surrealism; we’re starting to see this in some of the projects from film studios where they mess with reality and mess with your expectations.

I’m also super excited about digital humans in VR: the expressiveness of feeling another presence. We did a networked multiplayer experience for the DK2 about 2½ years ago called Couch Knights that explored shared presence; players have real-time avatars in a shared space. There were no motion controllers, but we used the tracked motion of the head-mounted displays to fake body motion on the avatars. So, if you simply leaned forward and moved your head around, the other player could see that and vice versa. Even just that was incredibly powerful. I’m really looking forward to seeing how far we can push this.
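The Couch Knights technique he describes, inferring body language purely from head tracking, can be approximated by mapping the tracked head offset from a calibrated rest pose onto a torso lean for the remote avatar. A rough sketch under those assumptions (hypothetical names and constants, not the actual Couch Knights code):

#include <algorithm>
#include <cstdio>

// Rough sketch: derive an avatar torso lean from tracked head motion alone,
// the way Couch Knights faked body language without motion controllers.
// All names and constants here are hypothetical.
struct HeadPose { float forwardOffsetM; float sideOffsetM; };  // vs. calibrated rest pose

struct AvatarLean { float pitchDeg; float rollDeg; };

AvatarLean LeanFromHead(HeadPose head) {
    // Map meters of head travel to degrees of torso lean, clamped so
    // tracking noise or standing up can't fold the avatar in half.
    const float degreesPerMeter = 60.0f;
    const float maxLeanDeg = 25.0f;
    auto clamp = [&](float v) { return std::clamp(v, -maxLeanDeg, maxLeanDeg); };
    return {clamp(head.forwardOffsetM * degreesPerMeter),
            clamp(head.sideOffsetM * degreesPerMeter)};
}

int main() {
    // Player leans 20 cm forward and 5 cm to the left of their rest pose.
    AvatarLean lean = LeanFromHead({0.20f, -0.05f});
    // In a networked experience, this pair would be replicated to the other player.
    std::printf("avatar lean: pitch %.1f deg, roll %.1f deg\n",
                lean.pitchDeg, lean.rollDeg);
}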

About the Scout

Lea Kozin

A champion of applying leading-edge marketing across media, entertainment, sports, and now VR, Lea Kozin is a marketing director, active speaker and blogger. She’s also probably one of the only people who has both been featured on a rap album and medaled in the high school academic decathlon.
