Lighting the Torch for In-App AR Development, with TORCH’s Paul Reynolds

XR for Business | September 04, 2019 | 00:41:58

Show Notes

Game engines like the versatile Unity have long been the go-to for AR development, and for good reason. But Unity's reputation as a video game engine can also be intimidating — especially to those who want to create AR software for the enterprise. That's why Paul Reynolds lit his TORCH: an app he co-founded that lets you design your own AR experiences, right in the palm of your hand. He chats with Alan about his claim to flame.

Alan: Hey, everyone, my name’s Alan Smithson, the host of the XR for Business Podcast. Today’s guest is Paul Reynolds, the CEO of Torch, a really exciting augmented reality platform. It’s a mobile augmented reality development and deployment platform for enterprise. Paul has been a software developer and technology consultant since 1997 – since before the interwebs! In 2013, after 10 years of creating video games, he joined Magic Leap where he was promoted to senior director, overseeing content and SDK teams. At Magic Leap, Paul recognized the lack of accessible tools for non-game developers that was hindering widespread adoption of immersive and spatial computing technologies. In 2016, Paul moved to Portland, Oregon, where he founded Torch to address this very problem. To learn more about Torch, you can visit torch.app. Paul, welcome to the show.

Paul: Thanks for having me.

Alan: It’s such a pleasure. I’ve been looking forward to this episode. Torch is such a cool platform and I keep seeing your posts on LinkedIn of putting stuff around your office and stuff. So tell us, what is Torch, and how did you come up with this crazy idea?

Paul: The easiest way to think about it is, it’s a mobile application — currently for iOS — that lets anyone build interactive spatial scenes. So, you create a project and you’re building it in the camera of your device, which means you’re also walking around the space, or moving around the space and you’re building up interactive experiences visually, without writing any code. We call that the design environment, and that’s the freely available [option] — anyone can jump into it and just start building. What makes it a platform is the capability of taking what you’ve created in Torch, and exporting it and publishing it and integrating it into your existing app, or pushing it out to another platform or tool. What we really wanted to focus on was allowing people to iterate in augmented reality — directly within augmented reality — as opposed to sitting on a desktop computer and trying to figure out how to work a game editor, and get more people able to work productively in 3D. That’s really the heart of it.

Alan: That’s so cool, because if you’re sitting at your office, you’re like, “wow, this AR stuff is hot. It’s amazing.” You know what, go learn Unity and coding and figure out how to actually make it. Six months later, you’re like, “oh, look, I made a portal.”

Paul: [laughs] Right.

Alan: What you guys have built is a simple way to just do it visually.

Paul: Yeah. So, my background was in video games, back in the day when everyone was building their own engine. You really didn't even have time to build a really nice editor on top of that. So when Unity came out — we'll pick on Unity in particular, because it's just such a well-known product — it was the game engine most of the game studios I'd worked for wished they could build. And that was really unique; they had basically taken what would normally mean millions and millions of internal R&D dollars, and turned it into this tool that pretty much anyone can download for free. What's happened over the past few years is, it's become kind of the de facto interactive 3D tool. And it was for me as well; I've been a Unity user forever. But what we learned when we started building a platform at Magic Leap for third-party developers that are not necessarily looking to build video games — so, enterprises and brands — was that the conversation there always started with: first, somebody on your team needs to learn Unity to even get started. Just like you said.

Alan: And this is like a red flag for these companies, because they're like, "what the hell is Unity?"

Paul: Right, yeah.

Alan: And then they ask their tech teams and they’re like, “does anybody know Unity?” And nobody knows what it is. Then they look it up, they’re like, “oh. Yeah, well, that’s for video games.”

Paul: Right! That’s exactly right. And here’s the disconnect, coming from a veteran game developer: if a young kid comes up to me and says, “I want to get into making video games,” I’m gonna say, “download Unity — for free — and take the tutorials, and you’ll learn just as much about real, professional video game development as the professionals actually out there.” So, in our world, Unity — and let’s not just pick on Unity all the time; you know, there’s Unreal, there’s these other engines–

Alan: Improbable.

Paul: Yeah. For us, they’re the easiest way to get in. But then you have to remember — what I saw firsthand — was that most of our target market for Torch, and most of the people I was dealing with at Magic Leap, had never heard of Unity. You’ve got to imagine there’s a much bigger world out there besides ours, where people don’t even know where to get started. I said, “look, the world is going to move towards spatial computing, where computing is going to be in our world, and there’s 3D input, there’s 3D display, and there are cameras involved. And we’ve got to come up with more accessible ways to build software and experiences for these platforms, ways that don’t just come from video game technology.” (All that said, video game technology certainly plays an important role in all of this.) We said, “well, how are people building mobile apps today? How are web apps getting built?” Businesses are run — and new businesses are created — on top of these platforms. How do people build software — in 2018, at the time — what’s their workflow? The workflow is very design-oriented and very visual, at first. Instead of building a piece of functional software and then handing it to a designer and saying, “hey, make this look nice and usable,” the mobile world in particular has invested a lot in flipping that on its head: “let’s make a functional prototype first — something everyone can agree on and say, yeah, this is the thing we want to build” — and then you engineer around it.

That was ultimately how we arrived at the current version of Torch, where we wanted to fit in with that enterprise workflow. And we say, “well, you’ve already used this idea in mobile apps in particular: prototype-first, design-first.” So that’s what we built for AR; anyone can just jump in and start placing 2D assets and video. We help you find 3D models; we have the requisite Sketchfab and Poly integrations. We’ve got Dropbox integration. Even though anyone can download it and start playing with it and working in it, we’ve tuned the integrations and the workflow around your standard UX designer, or a creative person at an agency, or someone that’s already building digital products for the enterprise, and we’re saying, “here’s how you can start to test out, and share, and show some of these early ideas around augmented reality.” And as you get more comfortable with what’s good and bad in mobile augmented reality in particular, we’ll be there to help you get that deployed, help you get that integrated, and turn it into a real product.

Alan: So it’s really a prototyping tool then.

Paul: So… it’s funny; it started strictly as a prototyping tool. Anyone can jump in and build something. The only thing you could really get out of it was a recorded video — which is still very important in the screen-based world we’re in, to be able to share an AR experience through video. But to add a collaborator, or have someone actually view what you’ve built in real time, you had to add them as an editor. We have Google Doc-style, real-time collaborative editing, where you can have as many people as you want in a project, but you’re also giving them permission to edit the project.

Alan: That’s the last thing you want. [laughs]

Paul: Yeah, yeah. We’ve had people ask, “how many people can I add? Technically, how many collaborators can you handle?” And the answer is always, technically way more than you actually want collaborating, anyway.

Alan: No kidding.

Paul: We’ve had upwards of, I think, 50 people on one project before. The way we do the collaborative stuff, it’s not very taxing computationally. So, you can have tons and tons of people in there. After being out for… I would say, two or three months, we started seeing this trend, where people were coming to us and saying, “Torch is great. It’s the first time I’ve been able to actually feel like I’m creating in AR; I can pull in my assets” — and these are all professionals and they’re very limited on time — and they’ve said, “but how do we get this out of Torch? Where does it go?” Keeping in mind, we were patterning ourselves after — back to the mobile app world — where people may prototype in something like Sketch and InVision and Figma — these functional prototyping tools — and Framer. And then, once they decide to build the production version, they usually go to code; they’ll build it in React or whatever. And so originally, our thought was, well, we’ll get the creative people iterating visually in our prototyping tool, and that will inform the production team. So, these people will probably wind up building this stuff in Unity, or Spark, or Lens Studio — whatever the tools are at the time. But there’s this moment of handoff. The interesting thing in hindsight is kind of obvious, which was what I said earlier: a lot of the people that found us, don’t even know what Unity is.

Alan: So you’re like, “oh, wait a second, we’ve built this tool for you. And yeah, you can use it. But then you’ve got to learn… yeah. Oh, oops!”

Paul: Yeah. And on top of that, the things people are building today are fairly simple. Even the people that knew what Unity was, they’re like, “wait, you’re telling me I’m going to rebuild this thing, but it already works in Torch? So why can’t I just get it out?” And that was always on our longer-term roadmap. But we’re like, okay, this is obviously the direction we need to support, because now we have this very accessible, extremely fast workflow, that you can use as just a prototyping, pre-visualization environment. But we started adding more and more functionality to it, that would allow you to build full-blown experiences. It’s just a totally alternative augmented reality workflow, that people could use depending upon their use case.

And the other part of it was, we were like, today — and still, this is true in the moment — right now, today, where is all the engagement, and — quite frankly — the revenue, and where’s the rubber really meeting the road, in particular with mobile augmented reality? And the obvious answer to me was social: Snapchat, and Facebook, and Instagram. They are pumping out a lot of augmented reality content. They’re making real revenue from it. They’re getting huge engagement. People are building lenses and filters that are getting hundreds of millions and billions of views. And brands are trying to figure out how to get into that. So we said, “well, those are pretty simple experiences.” We don’t do anything with the face in Torch; we’re fully concentrated on the world-facing camera, and the world-based experiences on those platforms are fairly simple. And we said, “well, if we gave people a few more authoring capabilities beyond just prototyping — let them hook up some more interactivity and call APIs and things like that — then we could allow them to publish out to these production platforms.” We’re kind of filling a need in a lot of ways; filling a very fragmented ecosystem. Because the other part of this was, we kind of addressed the initial friction — for any enterprise that’s probably listening to this podcast and trying to figure out, “how do we even get started?” — we are trying to be there to say, “we are literally the fastest and easiest way to get started.” And we feel like we’ve fulfilled that promise pretty well.

But then, the other part of it is on the other side of creation: distribution and deployment. And if you look at that, it’s very fragmented. If you want to build for Snapchat, you’ve got to use Lens Studio. If you want to build for any of the Facebook properties, you’ve got to use Spark. If you want to build a fully-dedicated AR app, you’ll probably use Unity. If you want to embed an AR experience in an existing mobile app — which is a very popular request we hear — you’ve either got to write to Google’s Sceneform or Apple’s ARKit, or kind of hack Unity into your mobile app. It’s all very fragmented by tool. So, we’ve eliminated the fragmentation at the early prototyping part, but there are still all these crazy problems on the distribution side. In the New Year, in 2019 — and most recently — we’ve put a lot of effort into letting you publish and export out. The most exciting thing around that came at the time of AWE – Augmented World Expo – when we announced Torch 3, which lets you create a public link to your project.

Alan: Oh, cool.

Paul: Yeah, it opens in the Torch app, as a viewer-only mode. So, you’re not adding people as collaborators. We don’t require the viewer to log-in or register. They do have to download the Torch app. But it’s a very quick, fast way to say, “hey, I built something in AR. I want to tweet it out. I want to put it on LinkedIn,” or, “I want to send it to my client, or my internal team.” And people can very quickly iterate and view it, in real-time interactive 3D.

The other thing we announced is our ability to export a Torch project into another project. Our demonstration of that was, we were able to generate a Spark AR project from Torch. So — for Mother’s Day — we built a little Mother’s Day experience in Torch, but we actually published it through Facebook–

Alan: Cool.

Paul: –through the Spark export. So, that’s where we evolved beyond just prototyping, and [became] kind of a creative tool.

Alan: It’s awesome that you didn’t set out with that intention, but you ended up there.

Paul: We always knew that we would start with designers and grow a platform around that. I think what happened was we mapped to the mobile ecosystem — which is very mature — while the AR ecosystem is still growing and maturing. And if this were a much more established market, we probably could have built a pretty tidy business just being considered the AR design tool. But we saw these bigger opportunities for filling in these gaps in the ecosystem. So, in some ways, it’s where we expected to go. But in other ways, we got there a little quicker than we had originally thought we should or would. And it’s been very well-received.

Alan: You’ve been working on this for a bit. How are businesses using this right now?

Paul: It’s kind of across the board. We’ve got people building Torch projects as internal tests or prototypes. But we have had people — and especially now that we’ve just turned on this ability to publish and export; it’s technically under early access right now — we’ve been giving it out, so people are still getting their heads wrapped around what they can deploy and do with this capability. We’ve seen people use it for wayfinding — a super popular use case for us — because if you build content in the environment using a mobile device, building a wayfinding experience — where you’re actually setting the checkpoints physically in the environment — is just so much more intuitive and fast. For example, we built a couple of wayfinding demos for Torch that have always gotten a huge response online when we post videos. And… it’s really funny, because the first time, I posted a wayfinding demo of how to get from the front door of our office building to the front door of our office–

Alan: I saw that; it was cool.

Paul: It’s funny, because I’ve had people get to our office by saying, “I watched the video,” or “I remembered the video and I just remembered how to get here,” which was kind of fun. But the funny thing was the divide in my super-savvy AR friends, who’ve been in the business as long as I have. They thought we had scanned the building, or done complex measurements, because the wayfinding experience in that case actually goes across two floors — two levels of the building — and they’re like, “what did you do? Did you scan it in, and then bring it into Unity?” I was like, “I literally stood at the front door and placed the welcome checkpoint, and then I walked to the bottom of the stairs and placed that.” We actually priced that out, and if you were to build it with the traditional desktop-based workflow, you’d have a team of people working on it; we estimated it would take roughly $100,000 or more, a team of four people, and probably at least a few weeks to get to something workable and viewable. And I built it in about 45 minutes on a lunch break, and the total cost — including recording the video and buying a couple of 3D models — was just a couple thousand bucks. We’re talking about radically transforming the cost of building these experiences. So, wayfinding’s a great example of the cost efficiencies you get when you actually build AR experiences inside of AR, like we do.
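[For readers curious about the mechanics: below is a minimal sketch of that checkpoint-placement step using plain ARKit, i.e. raycast from a screen tap onto a detected surface and pin an anchor there. It is only a generic illustration of the idea, not Torch's implementation; the function name and marker geometry are hypothetical.]

```swift
import ARKit
import SceneKit

// Hypothetical sketch (not Torch's code): drop a wayfinding checkpoint wherever
// the user taps, by raycasting from the screen point onto a detected surface.
func placeCheckpoint(at screenPoint: CGPoint, in arView: ARSCNView) {
    guard let query = arView.raycastQuery(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .horizontal),
          let hit = arView.session.raycast(query).first else { return }

    // Anchor the checkpoint at the hit pose so tracking keeps it fixed in the room.
    let anchor = ARAnchor(name: "checkpoint", transform: hit.worldTransform)
    arView.session.add(anchor: anchor)

    // Visualize it with a simple marker node at the same position.
    let marker = SCNNode(geometry: SCNSphere(radius: 0.05))
    marker.simdTransform = hit.worldTransform
    arView.scene.rootNode.addChildNode(marker)
}
```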

As far as vertical markets, we’re seeing a lot of interest around eCommerce, obviously — in shopping. But also physical retail. People are really interested in bringing a layer of digital experience into the retail environment. Travel and hospitality has been very engaged. Media companies. One of my favorite examples — I will qualify it by saying they are not using Torch yet — is ABC News Australia. I don’t know if you follow Nathan Bazley on Twitter, but he posts these great little infographics in AR that they’ve built. We actually reached out to him and talked to him about his process. ABC Australia is kind of like the BBC; they’re a government-funded media company, and two or three years ago, they saw AR as an interesting way to present information and to engage. And they actually built an Unreal Engine-based app to publish AR content to go along with their news. It was so expensive and difficult to update the app every time they had a news story! Still, kudos for even getting it out, because that’s a huge leap, and no telling how much they spent.

Alan: Oh my God. In the hundreds of thousands.

Paul: Yeah. At least. Right? But they did see engagement. So, when Facebook came out with Spark, they said, “most of our audience is on Facebook and Instagram, so this is a great distribution channel for us. We could try this again.” And they taught themselves Spark. So now they have a little team of — I think — two to four people that build these little AR experiences to go along with each news story. And Nathan was telling us — and I think I’m quoting this correctly — when they put out a news video on those social channels, it gets around 30,000-50,000 views. And when they put out a world filter, or AR-based experience, through the same channels, they were getting hundreds of thousands of views, highly engaged. What’s exciting for us is that they’re getting this value through this one workflow. So, what we can offer them — as an example — is: how about everyone in your newsroom being able to create AR experiences to go along with their stories, instead of running everything through this really small team that’s taught themselves Spark? That’s what we think Torch can provide: that accessible workflow. And by the way, if you’re already putting together the experience and these assets, and you’re building this AR thing, why not deploy it everywhere AR can be viewed? Why not push it not just to Facebook, but possibly Snapchat? But also your own mobile app that people have installed? And what about wearables? Giving people this flexibility and freedom of publishing is something that we’re seeing resonate with media companies; people in the book industry, and film and television.

Alan: You don’t want to make things twice, that’s for sure.

Paul: Not when it’s this hard and expensive — and experimental — for a lot of people.

Alan: No, you’re right. So you’ve been working on this for a couple years now. You were at Magic Leap before. What are some of the experiences that you’ve seen — either made on Torch or otherwise — that have just kind of blown your mind?

Paul: I mean, obviously, my early days at Magic Leap were really where I saw incredible things I’d never seen before. That was when I became convinced that spatial computing was coming. Some of that stuff is now public, now that they’ve released the device and some of the content. I was a part of the org at that company that built the Dr. G game, where they’re shooting robots.

Alan: Yeah. Cool to see that for the first time.

Paul: And we’re talking 2014? That was pretty cool, but it’s only gotten better. Obviously, most of my craziest experiences have been around that tech in particular, just because we used a bunch of different hardware and stuff. The Magic Leap One that just shipped last year is certainly the most consumer-friendly version of the tech that we’d worked with. But there were some demos and things that I saw that were just very unusual, where people claimed they could feel temperature changes on their hands when a little firefly-type robot flew up to their finger, or they actually had a sense of the weight of an object, because of the way the optics were showing stuff in true 3D. So some of that stuff was pretty mind-blowing.

I’m trying to think of my most recent… Because I’m so in the weeds and I always look at the technical execution of stuff, I’d point to Apple and the Quick Look stuff they’re showing, where they’re generating shadows, light estimation, and reflections in real time. And I don’t know if you’ve seen it, but you can bring in, like, a shiny toaster, and you can wave your hand past the virtual toaster, and you can actually see your hand reflected in it.

Alan: How are they doing that?

Paul: They’re building up an environment map in real time. And so as you’re moving around–

Alan: What, are they using the camera to capture the environment?

Paul: Yep.

Alan: Oh, my God. That’s amazing.
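[The ARKit-level setting behind this effect, which AR Quick Look handles for you, is roughly one configuration flag. A minimal sketch, assuming you already have an ARSession:]

```swift
import ARKit

// Sketch: with automatic environment texturing (iOS 12+), ARKit builds
// environment maps from camera imagery, so reflective virtual objects
// pick up an approximation of the real room around them.
func runWithReflections(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.environmentTexturing = .automatic
    session.run(configuration)
}
```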

Paul: Yeah, it’s really cool. To be fair to everyone else, Apple’s very tightly coupled to the hardware. Everything is super optimized, and they can actually do these things because they’re in control of the hardware. It’s a little more difficult to do in a very cross-platform type of context. I really liked the… I want to say “Wonderscape,” but I think it’s “Wonderscope…” the books–

Alan: Oh yeah. Those are cool.

Paul: Yeah, I like to show those. The other part of it is the experimental side. I follow a lot of the Snapchat lens and Instagram creator community. There’s some folks like Zach Lieberman and Max Weisel — they’re all doing really interesting stuff, and they don’t have a commercial motivation. They’re just seeing what weird things they can do with this technology. And I really think that’s where we’re seeing the most creative stuff come out. Zach Lieberman in particular has an app called… Weird? Is it Weird Type for ARKit? It’s like a toy type of thing, where you can walk around and place words, and have them react to your movements. That’s always pretty fun to show people; it’s really cool.

Alan: Have you tried Babble Rabbit?

Paul: Yeah. That’s Patrick [O’Shaughnessey]’s. His baby, right? That’s running on top of 6D?

Alan: Yeah, exactly.

Paul: Yeah. We’re buddies with the 6D guys.

Alan: Actually, Matt [Miesnieks]’s been a guest on the show.

Paul: Oh, nice. Yeah. I’ve known him for a while. We actually announced that we integrated 6D into Torch as a proof of concept.

I’m pretty excited about both occlusion and persistence. I think persistence really changes the game for a lot of people. Again, I go back to our target market; they’re so new to this world. I’ve shown people in real time; I’ve built a project in Torch on an iPad in front of them, and built a little scene on top of a table. And even then, they still don’t quite get that they can walk around that thing I’ve put out into the world. They think it’s like a static [image]; like, a 2D thing on top of video. They just… people still have not unlocked the spatial part of their mind with this stuff. Which is crazy, because we live in a 3D world, and we’re 3D people. When you do see people get past that initial understanding of what’s going on, then their assumption is, “oh, well, when I build something, or place it in the world, it’ll just stay there, and if somebody else walks in the room, they’ll see it. And if I leave the room and come back tomorrow, it’ll be there tomorrow.”

Alan: It should, really.

Paul: Yeah, “should!” Yeah. They’re absolutely right.

Alan: Let’s be honest. That’s what we expect. And I even expect that! The only technology that’s lived up to its promise of, like, rock-solid persistence has been the HoloLens.

Paul: Yeah. It is a hard technical problem to solve, but there’s so many people working on it, and we’re getting closer now. Microsoft’s got a great product around it. Google kind of shocked me a couple of years ago, when they announced their spatial anchors were cross-platform.

Alan: Apple seems to be the only one that’s playing in their own sandbox, and they don’t play well with others.

Paul: Yeah.

Alan: Like, what is USDZ?

Paul: Yeah.

Alan: Come on. Like, everybody in the world is moving to GLTF — for the people listening, GLTF and USDZ and FBX, they’re all 3D model formats, and the world hasn’t come to a standard. But we were getting close. Everybody was moving towards GLTF, and then Apple decided to invent their own.

Paul: Yeah, yeah. We use GLTF at Torch. We have a pretty sophisticated asset processor, so we actually take 70 different file formats. But on the back end, we always turn them into GLTF, and on our export, we always turn them into GLTF.

Actually, I’ve got a funny little side story around that, as a part of this export and publish thing. Having the industry agree upon a 3D model format: like you said, it’s still not fully agreed upon, but it’s getting pretty close. If you can handle GLTF, FBX, or OBJ, those are pretty well-supported, well-known file formats. But they’re not interactive; there’s no interaction in that. And Torch is all about building interactive scenes: I put an object in a space, and I want it to respond once somebody walks up to it, or when they look at it, or when they tap it. To be able to do this whole publishing and export thing, we had to come up with our own GLTF-equivalent for interactive scenes; a portable file format for saying, here’s the construction of a scene, and here are all the interactions that are connected to it. Apple just announced — at WWDC a few weeks ago — their RealityKit effort, and included in that is… I forget the name. It might just be Reality Files. They call them Reality Files, and they actually have interactivity in them, and they can be generated from their tool chain and shared across tools and all [sorts of] stuff.
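[To make the idea concrete, here is a purely hypothetical sketch of what such a portable "GLTF-equivalent for interactive scenes" might minimally describe: objects, plus trigger/action pairs wired between them. The field names are invented for illustration; this is neither Torch's actual format nor Apple's Reality File format.]

```swift
import Foundation

// Hypothetical, illustrative only: a portable description of an interactive
// AR scene, i.e. what a "GLTF for interactions" might need to carry.
struct InteractiveScene: Codable {
    struct Object: Codable {
        let id: String
        let modelURL: URL        // e.g. a GLTF asset
        let transform: [Float]   // 4x4 placement matrix, 16 values, column-major
    }
    struct Interaction: Codable {
        enum Trigger: String, Codable { case tap, gaze, proximity }
        enum Action: String, Codable { case show, hide, playAnimation, openURL }
        let objectID: String     // which object the trigger is attached to
        let trigger: Trigger
        let action: Action
        let target: String?      // another object id, an animation name, or a URL
    }
    let objects: [Object]
    let interactions: [Interaction]
}
```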

So, I was talking to someone on the market team about it, and I said, “oh, that’s really cool. You know, we’ve had to do our own thing, and we want to learn more about your file format; maybe it’s gonna become a standard.” And I asked, “have you guys thought about cross-platform?” And he said, “we’re already cross-platform.” I’m like, “oh, wow, that’s great!” And he said, “yeah, you can use it across iOS, iPadOS, or macOS.” That was his definition of “cross-platform.” We think a little bit more in terms of: we want you to be able to build and view content on HoloLens, Magic Leap, phones, tablets, a Looking Glass display; anything that is a reasonable place to view AR. We think your interactions should be able to be distributed on those platforms.

Alan: Let’s put on our business hat, here: what are some of the business applications that you’ve envisioned for this? How will people use this?

Paul: So for us, what we’re seeing is people wanting to add AR capabilities to their existing systems. As an example: a company that sells a CRM platform for the heavy equipment industry. When sales reps go out, they’ve got all this literature around the different products — and heavy equipment in particular has all these crazy configuration options. This company built a platform with a mobile component that lets them organize all this information, but it’s all based on, like, PDFs and images and spreadsheets. And so they said, “we see where AR is super helpful, because we could — first of all — show something at scale, on location, to a customer and say, ‘oh, if you want this backhoe, it’s not going to fit in this particular area,’ or show different configurations at scale during the sales process.” Like, people can actually walk around and look at this stuff in detail.

So what we’re offering them is the capability to inject AR into their existing platform, so they can say, “we’ll just use Torch to build the different pieces of interactivity in AR, and then use Torch’s SDK to surface this stuff inside of our own app.” We’re seeing a lot of people thinking like that, which makes a lot of sense, right? As enthusiasts of the industry, you hope people just go whole hog and say, “AR is going to disrupt everything, and you should be thinking about it for not only your marketing, but your internal processes and your retail side and your sales side.” But rationally, these are companies that are placing bets, and they’re dipping their toes into the water. They’re looking at how they can… oh, man, I just now thought of this: they’re looking at how they can augment their current product line, or business, or whatever it is they do. So we’re seeing a lot of that.

People that are — like I said, on the retail side — people that have invested a lot in mobile applications, and they have these really interesting AR ideas for their physical locations. Like, when a customer comes in, you have a personalized experience; you can help guide them to the things they’re looking for, you can make recommendations. All this great, engaging stuff. But they don’t want to put out a separate AR app that somebody has to download and install, totally separate from their primary app that already has millions of installs. That works very similarly to the CRM example, where they just say, “we want an AR capability in our app that we can publish content into, and we want more and more of our team to be able to create the content and publish it.” So, retail has been a big one. And eCommerce, obviously: visualizing a product before you buy it, and making that not just a model that you stick in your room and can only scale and rotate, but something that actually tells you about itself, and that you can actually buy in the moment.
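[For context on the "AR capability inside your existing app" route: the do-it-yourself alternative Paul contrasts this with, writing to Apple's ARKit directly, looks roughly like the sketch below, a plain ARKit view dropped into one screen of an existing iOS app. This is a generic illustration, not Torch's SDK.]

```swift
import UIKit
import ARKit

// Generic sketch: one screen of an existing iOS app that hosts a native
// ARKit view, the "write to ARKit directly" option for embedding AR.
final class ARFeatureViewController: UIViewController {
    private let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking gives 6-DoF device pose; detect horizontal planes
        // so content can be placed on floors and tables.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        arView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        arView.session.pause()
    }
}
```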

This has always been the difficult question for me to answer: what is the addressable market for AR? It’s everything. It really is. The easier thing for me to say is — as you mentioned in my bio — you know, I kind of got into software in the very early days of the web. In those early days, people were saying, “what do we even need a Web site for?”

Alan: [laughs]

Paul: And then there’s the progression of, “well, we should have a Web site, but we’ll just stick our company’s logo on it. And then an About Us page.” And then over time it became, “wait a minute: real value is happening. A real audience exists on this platform. Not only do we have to have a website, but we’ve got to have some form of transaction happening there. We could run our support through it. We could run our business through it.” And then eventually, everything matures to the point where people say, “well, we’re going to build a business entirely on top of this. We aren’t this existing business trying to figure out how to incorporate the web into it–“

Alan: “–the web IS our business.”.

Paul: Right. And then the same exact thing happened with mobile! You had people saying, “why do we need a mobile app?”

Alan: That is… like, honestly? The people that are asking these questions: if you’re listening, and you’re asking, “why do we need an AR app?” Think about what Paul’s saying here. “Why do you need a website?” “Why do you need a mobile app?” These are questions that seem absolutely ridiculous to ask now, because the world is on web, and a larger portion of the world is now on mobile. If Google is all-in on AR, Apple’s all-in, Amazon, Facebook, Walmart — like, every major company in the world gets it. So, it’s coming.

Paul: Yeah. Yeah! Without a doubt. When? Now that’s the trillion-dollar question.

Alan: You know what? You just said the magic number: a trillion dollars. And I’m going to unpack this quick, because I actually predicted that XR will create a trillion dollars in value, in five years.

Paul: Yeah.

Alan: The industry itself — just hardware and software — is going to create somewhere between $400- and $500-billion, right?

Paul: Yeah.

Alan: That’s based on all the different studies that are coming out. So, that’s just the sales. And if you add it up… this year we’ll do probably $20-billion. Last year was $10-billion. Next year will be $40-billion. It compounds. So by 2021, they’re anticipating $100-billion a year. So, forecast out to 2025: we’re looking at… call it half a trillion [dollars] created. Right? And that’s just half a trillion dollars. That’s not factoring in one dollar of value created for engineers, doctors, hospitals, designers, retailers, training, education. If you factor in the value created with this technology, it’s in the multi-trillions.

Paul: Yeah. Yeah! We’re standing on the shoulders of the web and mobile, and we’re introducing totally new forms of interaction that we’ve only begun to understand.

Alan: I know! It feels like I’ve been doing it forever, but it’s only the beginning.

Paul: Yeah, no, it is. It’s still early days–

Alan: Can I retire yet, Paul?

[laughs]

Paul: No. Not yet. Just like me, we’ve gotta hang in there a little bit longer.

Alan: I think we’ve got to grunt it out. So, what is your timeline prediction on this, then? When are we going to see AR, in its different forms, take off?

Paul: I hesitate to do, like, an actual year, but what I have observed is that last year, when we were out in the market, most of the enterprises we were talking to had experimental and proof of concept-type budgets — things that were in the emerging tech group, or some kind of R&D fund.

Alan: “We hired a kid out of high school to work on Unity.”

Paul: Yeah. This year, it’s very much more serious conversations. It’s still early; most still haven’t figured out how it fits in with their business. We spend a lot of our time educating — as you do as well — but we are seeing serious budget considerations around, “this is going to become part of our business.” And so I do see that progression I talked about, where it goes from, “ehh, this is kind of weird and experimental. Maybe we’ll do it, just to kind of stand out from the crowd,” to, “oh, it feels like we really should be on top of this early, because we’ve got the extra money for it, and time.” It seems like it’s coming. And then next year, I think there’ll be a whole lot of people jumping on that bandwagon. I still think we’re probably — I would say — two to three years out from it being as vibrant as those Web 1.0 days.

Alan: It’s interesting, because my prediction — and I don’t say this in public, and I guess this is going to be the first time I’m saying it — my prediction is: in 24 to 36 months, we’re going to see massive growth. Like, exponential growth. We’re going from $10-billion last year, to $20-billion this year, $40-billion, $60-billion, $100-billion. So, we’ll be in a $100-billion market, which is huge. But we’re also going to see a roll-up of the entire ecosystem. I think there are going to be big companies that realize, “oh my God, we need an in-house team for this.” And rather than try to scrounge it together, they’re just going to start acquiring studios.

Paul: Yeah.

Alan: And it’s kind of interesting, because this technology is not just about the platforms, like Torch, but it’s also about the content creators. A lot of investment has gone into platforms and products, but they’ve neglected the fact that these studios are really, really vital. That’s why we actually started XR Ignite: to bring the industry together, and create a community hub and investment arm and accelerator. To take these smaller companies that have great promise, combine them with corporate clients, and bring them to the point where maybe they are acquired, or maybe they’re just selling. It doesn’t matter. But we need to bring them together. And I think over the next three years, we’re going to see an absolute explosion of growth in this industry.

Paul: Yeah, exactly right. I think you’re going to see that people who already have a kind of intuitive understanding of how to execute ideas in this new medium are going to be very valuable, and there are gonna be people like me. I was a graphic designer at a daily newspaper building ads, and I heard the newspaper’s executive staff start to think about, “hey, maybe we need a website.” I’d been online for a few years at that point, and I knew enough HTML to help them out. Not only did they let me help plan it (I was, like, a 23-year-old kid helping these executives plan their online department), but I also got the job as manager once it got set up.

Alan: Yeah, we’re seeing that all over the place. Put it this way: one of the kids that we sponsored when he was 13? He’s 16 now. I think he works for Google now. He worked for Microsoft last year.

Paul: The other thing that excites me about it is: I think we should really rethink how software gets built. AI plays a role in this as well, but that’s one of the reasons why I was pretty proud that we introduced a totally different workflow into AR, because we’re not reusing tools from other industries. We’ve built something from scratch. I think it’ll be interesting. I don’t know if coding… it may not be coding anymore, right? It might be application creation. Experience creation.

Alan: Yeah, you’re absolutely right. I do a talk called The World in 2039, and part of it is, what are the jobs of the future? What happens when, all of a sudden, our education system catches onto coding and says, “we’ve got to teach everybody coding” — which they’re doing now; they’re starting to teach code, which is great. But what happens when code starts to code itself, right?

Paul: Yeah.

Alan: We could go down that rabbit hole for days.

Paul: Yeah. Totally.

Alan: Paul, I want to thank you so much for taking the time to join me on this podcast. I’m really looking forward to digging into Torch and seeing what we can build.

Paul: Thank you for having me. I also wanted to say, you’re a prolific poster on LinkedIn. You’re a huge advocate for our industry. And I know that’s not easy to do. Thank you for all the time you put into that.

Alan: I appreciate it. It’s a labor of love for sure. And my mission in life is to inspire and educate future leaders to think and act in a socially, economically, and environmentally-sustainable way. I believe that this technology is the way we are going to educate in the future. I’m doubling down on our future, and the kids who will create it.

Paul: Absolutely.
