Imagine XR Tomorrow; Build for XR Today, with PTC’s Mike Campbell

XR for Business | November 18, 2019 | 00:39:16

Show Notes

PTC LiveWorx is one of the biggest gatherings of up-and-coming XR tech in the industry. Surrounded by all sorts of amazing future-tech demos, PTC’s Executive VP of Augmented Reality Products Mike Campbell understands why businesses might want to implement the most far-out features of XR technology right away. But he says there’s plenty AR can do perfectly well right now that more industries should take advantage of.

Alan: You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today’s episode is with Mike Campbell, executive vice president of Augmented Reality Products at PTC. Mike leads the Vuforia product team, and is responsible for driving the product and technology strategy of PTC’s leading solutions for the development of augmented reality applications. You can learn more about the work they’re doing at ptc.com.

Mike, welcome to the show.

Mike: Hey, Alan, it’s great to be here. Thanks for having me.

Alan: It’s my absolute pleasure. I’ve been really, really looking forward to this, because I went to LiveWorx in… was it June?

Mike: June, yep.

Alan: My God. I had no idea, first of all, how big PTC was, and how important Vuforia and your augmented reality strategy are going to be — and are becoming — to all sorts of different industries. I got to fix a tractor. I got to work on an ATV. I got to look at retail. Are there any industries that this won’t affect?

Mike: Well, PTC’s focus is really in the industrial enterprise domain. And I would say across all of the verticals there — heavy equipment, automotive, aerospace, medical devices — I mean, all of those places are ripe for transformation, thanks to the power of augmented reality.

Alan: It’s amazing. You even had a boat there.

Mike: We did, we did, yeah. One of our customers is Beneteau.

Alan: Yeah. So let’s start from the beginning here. What is PTC? What do you guys do? Where did it come from? Let’s start there.

Mike: OK. So PTC is a billion-dollar-plus software company headquartered in Boston, Massachusetts. We have been around for a long time; we got our start back in the late 80s and early 90s by revolutionizing the 3D solid modeling industry. Basically, we invented a better mousetrap that allowed companies to create products virtually in 3D much faster and more effectively than ever before. Fast forward from then: we not only have a 3D solid modeling CAD offering, we also have a great offering used in engineering for product lifecycle management. And about 10 years ago, we recognized the trend of the Internet of Things, this explosion of connectivity and ubiquity of sensors, and companies wanting to leverage that information so that they could create products and manufacture products and service products better. We invested pretty heavily at that time. And once we did that, we were thinking a lot about this idea of IoT and products broadcasting information about themselves in the form of digital data. And we were thinking about our 3D heritage, and we recognized that augmented reality was a great way to unlock some of that digital data in the context of the physical world, where you do your work. And that’s really what got us into AR. I have been at PTC — as you said — for a long, long time. And I’ve been involved in our AR journey since the beginning, and it’s been a fantastic ride.

Alan: Vuforia was an acquisition. Was that something that you guys made a decision to– can we build this in-house, or should we just acquire it? How did that come about?

Mike: So we were thinking about AR. We were taking a look at the different technologies that were out there. And, you know, there are certain elements of the AR puzzle that PTC is well-suited to address: an understanding of what goes on in the industrial enterprise, and the 3D digital context understanding, we have all of that. What we didn’t have at the time, though, was a rich, deep understanding of computer vision technology. So basically we went off and we acquired the world’s leader. We approached Qualcomm and were able to figure out a deal that allowed us to acquire that technology and — frankly, more importantly — all of the expertise that was working on that at Qualcomm at the time. And then basically around that built our offerings for industrial enterprises to really unlock the potential of AR in those kinds of settings.

Alan: I have no idea the details about this acquisition, but it seems to me like it was a pretty damn good idea to acquire Vuforia. And it’s really positioned you guys very well for things like see-what-I-see capture technology, so being able to look at something with a tablet or a headset and have somebody looking over your shoulder and being able to annotate on that. But also being able to hold up a tablet — or a HoloLens or one of these devices — and it recognizes the image in 3D and allows you to annotate. Some of these things that were kind of esoteric a few years ago, you guys are really delivering, and you’re delivering at scale now, which is really interesting.

Mike: That’s really the value of the combination: this underlying, industry-leading computer vision technology, plus the knowledge that PTC has because of our heritage, our domain expertise, and our technology in the form of CAD, and PLM, and IoT. It’s really the fusion of all of those things that makes these amazing experiences — like the ones you saw at LiveWorx — possible. And what we’ve found is that it’s that combination that is required in order to unlock this potential in the industrial space. If you show up to a– pick your favorite large industrial customer — and you show up with a great computer vision SDK and Unity and you say, “Listen, we can go build anything,” they say, “OK, that’s great.” And they go build something, but it doesn’t scale. And that’s really the key, if you’re gonna be successful in an industrial enterprise. They need scale. They need re-use. They need these approaches to work across a variety of different use cases and product configurations. And the complexity gets pretty mindblowing. And that’s the experience. That’s the expertise that PTC brings to the equation here. So I think it’s been a great combination so far, and I think we’ve got a bright future ahead, for sure.

Alan: There is so much to unpack with LiveWorx. It was kind of mind-blowing, and it was the first time I really fully understood it, because a lot of people are using Vuforia for non-industrial applications, just making some AR things. I know we made AR business cards four or five years ago, and we used Vuforia as the image recognition and trigger for it. But there’s got to be thousands upon thousands, maybe hundreds of thousands of people using this technology not for the industrial use cases. What is the percentage of Vuforia users in industry, versus marketing — let’s say — or other things?

Mike: Yeah, well, remember, Vuforia is a brand for augmented reality offerings at PTC. And when you say “Vuforia,” I think you may be talking about Vuforia Engine. That’s the computer vision SDK that Qualcomm started and PTC has taken on. There are — you’re right — almost 700,000 developers taking advantage of that technology to build apps for iOS and Android and various pieces of digital eyewear. And the use cases that they’re attacking are all over the place: shopping, gaming, entertainment. Some of them are industrial, as well. Some of them might be product visualization, or other types of industrial apps that they’re building with computer vision.

On top of that library, though, PTC has then taken that technology and we’ve built purpose-built offerings for things like — as you said — you-see-what-I-see, or expert knowledge capture, or an offering that we call Vuforia Studio, which lets you leverage 3D CAD data you already have and present step-by-step instructions with animated 3D to make it very, very clear. I think that’s probably how you were able to replace those brake calipers at LiveWorx, right? You were using augmented 3D instructions. And what’s great about that is we’ve been able to make it super easy for our industrial customers to create these experiences at scale. As far as the answer to your question, it’s probably about 50-50 in terms of customers that are using the Vuforia computer vision SDK to go build all kinds of custom things. And the rest of them are really embracing these industrial enterprise use cases with purpose-built solutions that we’re delivering.

Alan: Let’s talk about what these solutions are enabling your customers to do. So let’s say, for example, we’ll just use John Deere. I was at LiveWorx and I’m walking around and my jaw is literally hanging open the whole time. I’m trying to figure out, what do these guys do? I came to LiveWorx thinking, “Oh, they make AR for industry,” not thinking anything else. And then, of course, I get the crash course and realize, “Oh, they make this CAD-like program where you can build a product.” Let’s say you’re fabricating a product digitally; you can also enter in the information about that product: I need it to be 500 grams or less, I need it to have this type of tensile strength. And it’ll run all sorts of calculations and give you unique build designs of a product in ways you never could possibly think of as a human. And it’s a collection of all of these tools that are serving this customer. So let’s just take John Deere for a second. I put on a RealWear headset and I was able to see a screen in front of my eyes that walked me through, step by step, how to change an air filter. It recognized that I was in front of the tractor, gave me the information, said “climb up the tractor,” pull this door open, pull out the filter, replace the filter, do it up. And within three minutes, I had replaced an air filter on a tractor, and I would have assumed the air filter was on the front of the tractor, not the back. It turned me into an expert instantly. So what are the types of things that customers are doing, then?

Mike: That’s a great example. What you were able to experience is the output of a product that we actually introduced just in May, and that product is called Vuforia Expert Capture. And basically we built this product because there’s a lot of domain knowledge out there in our customers. There are people that have been working in industry for a long, long time, and they’re getting to the point where they’re retiring; right here in North America, a lot of the baby boomers are leaving the workforce. And companies have this challenge that when those people leave the workforce, their knowledge goes with them. So what we did is we built this tool called Vuforia Expert Capture. And basically what it does is it allows an expert to put on either a RealWear device or a HoloLens, and basically just do their job. So what happened in the demo you saw is we had an expert come in and teach us all how to replace the air filter. They went through and they did their job. And when they were done doing their job, we took the device and we plugged it into a computer. And we extracted all of the video, all of the spoken word, all of the bookmarks and pictures, everything that they captured as they did their job. And we prepared that, kind of enhanced it a little bit, structured it, and we published out a procedure. And that procedure is then presented back, either on a RealWear device or on a HoloLens or on a phone or a tablet or, frankly, I mean, you can even dump it out to Word, if you want it on paper. But given everything we’re talking about here today, who wants that?

But what you saw was the result of that, which basically provides procedural guidance. And this is — again — a new product we introduced in May. The market reception to this has been outstanding. I mean, again, this is a real problem that companies are facing every single day, and this is a great solution to that problem, taking advantage of some of the latest technology. That’s just one of the things that we allow you to do. That’s — again — the newest offering, and that’s one demonstration you might have seen. The one where you did the brake repair, the situation was a little bit different there. What we were trying to teach you was not something that somebody had in their head, but sort of an engineered procedure. That was a procedure that somebody in service planning, or maybe manufacturing process planning, would have defined ahead of time, and there would be engineering deliverables, animations, and sequences, and prepared processes for that. So in that case, we got to leverage 3D engineering data and use that to present to you how to get the job done. And what we’ve realized in our AR journey here is that there are different constraints and affordances that a company might have. They might not have 3D. They might have knowledge in people’s heads, and that’s where Expert Capture is important. They might just need to be able to access an expert remotely, and that’s where our you-see-what-I-see offering — we call it Vuforia Chalk — is most relevant. Or they might have a highly engineered set of information that they want to present to somebody. And really, that whole spectrum has to be respected. And we’re trying to embrace that with a true enterprise AR suite.
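
To make the capture-structure-publish flow Mike describes a little more concrete, here is a rough Python sketch. Every name in it (the classes, the fields, the plain-text export step) is invented for illustration; this is not PTC’s Vuforia Expert Capture data model or API, just one way that kind of pipeline could be represented.

```python
# A rough, hypothetical sketch of the "capture -> structure -> publish" flow
# described above. None of these class or field names come from PTC's
# Vuforia Expert Capture product; they are invented purely for illustration.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class CapturedMoment:
    """One bookmark the expert set while doing the job hands-free."""
    timestamp_s: float                # position in the head-mounted video
    spoken_text: str                  # speech-to-text for this segment
    photo_path: Optional[str] = None  # optional still image taken at the bookmark


@dataclass
class ProcedureStep:
    """One editable step of the published procedure."""
    title: str
    instruction: str
    media: List[str] = field(default_factory=list)


def structure_capture(moments: List[CapturedMoment]) -> List[ProcedureStep]:
    """Turn raw bookmarks into ordered, editable procedure steps."""
    steps = []
    for i, moment in enumerate(moments, start=1):
        steps.append(ProcedureStep(
            title=f"Step {i}",
            instruction=moment.spoken_text.strip(),
            media=[moment.photo_path] if moment.photo_path else [],
        ))
    return steps


def publish_as_text(steps: List[ProcedureStep]) -> str:
    """Simplest possible 'publish' target: plain text (the Word/paper case)."""
    lines = []
    for step in steps:
        lines.append(f"{step.title}: {step.instruction}")
        lines.extend(f"  [see: {path}]" for path in step.media)
    return "\n".join(lines)


if __name__ == "__main__":
    capture = [
        CapturedMoment(12.0, "Open the access panel behind the cab", "panel.jpg"),
        CapturedMoment(41.5, "Pull the old air filter straight out"),
        CapturedMoment(63.2, "Seat the new filter and close the panel", "filter.jpg"),
    ]
    print(publish_as_text(structure_capture(capture)))
```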

Alan: What are some of the other ones? I know there’s a couple of different things here within the Vuforia family.

Mike: Yeah. So there are four key offerings today. The first we’ve talked a little bit about, and that’s Vuforia Engine. That is the foundational computer vision SDK that you would use most of the time with Unity. You can use it with other 3D modeling or rendering tools and build custom apps. Then we have three offerings really targeted at the industrial enterprise. The first of those is Vuforia Studio. And the key story there is that it allows you to reuse 3D CAD data you already have. It seamlessly integrates with our ThingWorx platform, which allows you to bring in IoT data — data from frankly any digital source — and then create AR experiences really, really quickly. This isn’t deep programming with computer vision; it’s basically reuse the 3D you already have, add the content you want, sort of decorate your scene, and you’ve got an AR experience, literally in a matter of minutes. So that’s our Vuforia Studio offering.

Alan: Yeah, the first time I saw Vuforia Studio and ThingWorx was actually back at Augmented World Expo in Silicon Valley, maybe three years ago?

Mike: Could have been, yep.

Alan: Three years ago. And you guys were not trying to reinvent the wheel with, like, “Hey, we need to have image recognition that’s precise.” It was, “No, here’s a barcode. Look at the barcode, it’ll recognize it, and then overlay the data.” And I thought that– QR code, not barcode — I thought, sometimes we as developers are overthinking things. A QR code allows you to identify an object really quickly, rather than trying to put it through a database of a thousand machines that all look the same. That was a really easy way to do it. And then once you’ve got that, you can just add annotations, you can bring in CAD data, you can overlay the 3D CAD model on top of the actual physical unit and teach people how to use it. Teach me how to fix it, that sort of thing. And that was three years ago. And what I saw this year was basically the real practical use cases of that technology. I think the demo back then was kind of like a coffee machine. And now it’s expanded to boats, and tractors, and all sorts of things. What’s the craziest thing that you’ve seen somebody work on using AR?

Mike: Well, let me make a really important point, based on the story you just told first. And that is– first off, you’re right. We have evolved from image markers and QR codes to 3D CAD data being used to help us recognize a shape, and then overlay the geometry. I mean, you may have seen in our CEO’s keynote where we were actually originally using CAD data, but basically looking at a table full of parts, and then being able to identify which part is which. So that’s not technology that’s ready for mainstream today, but that’s sort of that idea taken to an extreme. But another important point is, you sort of recognize that three years ago when you saw this technology, we showed up with what I would call a very pragmatic approach.

What we’ve learned is that it’s really important to meet the market where they are. There are all kinds of crazy things that we could potentially do. And a trap that customers often fall into is that they imagine the most outlandish thing that AR computer vision technology *could* do for them. And what we try to encourage them to do is identify things that are practical, that are going to have a business impact, that are going to be valuable and move the needle, and that, frankly, are achievable: let’s go do something bite-size and make an impact, and then build off of that success and go on. So there’s this element of pragmatism. There’s this idea of meeting the market where it is: not showing up and saying, “You’ve got to spend bajillions of dollars,” or “You need the most outlandish high-end hardware,” or whatever the case may be, but just identifying business problems that they have that are well-suited to AR technology that’s available today, and then going off and solving those problems for them. So I’m glad you saw that a few years ago. And that’s a mantra that we really hold dear and continue to drive into our customers.

Alan: One of my previous interviews was with Dr. D.P. Prakash from GlobalFoundries. He was saying that the Vuforia Expert Capture system is decreasing the time it takes to generate standard operating procedure manuals by 10x.

Mike: That’s right.

Alan: I mean, that’s a big number. When you start to combine AR and AI, then you’ve got– the world is suddenly this magical place, where you can manipulate data and then display it in ways that we’ve never really contemplated before.

Mike: Yeah, I mean that– and that’s– I’m glad you had the chance to speak with DP; he’s a great guy, and we’ve had the chance to work together closely, actually. But you’re exactly right. I mean, when you think about applying AI *and* AR, you think about providing people with — really — superpowers. I mean, you give them the ability to understand things — that people can’t understand — through the power of AI, and then visualize that stuff in the context of the physical world where they’re actually doing their work. And that can have profound implications, like tenfold increases in terms of productivity when you’re documenting your SOPs.

Alan: One of the things that you guys have done very well, is being advocates in promoting augmented reality to the industrial workforce. And one of the things that you did was a joint piece with the Harvard Business Review called “A Manager’s Guide To AR,” but it was also AR-enabled. So if you downloaded the app, you could bring the white paper to life, and this factory popped up. What was the genesis of that?

Mike: It’s been an interesting journey, right? I mean, you’ve been working in AR for some time, so maybe this isn’t a surprise to you and your listeners. But five years ago, when we were talking with our customers about the potential impact of AR, they would look at us and say, “What’s AR?” That article had to have examples of what AR was, right? This idea of presenting digital content in the context of the physical world. Now, of course, the good news is that we’re largely beyond that. That article was written several years ago. I think a lot of the key elements are still relevant, but I think a lot more people know what AR is now, and we’ve sort of gone through a journey from “What is AR?” to “How would I ever use that at work?” to now discussions about real value and tenfold increases in productivity and all of those kinds of things. So it’s been an interesting journey over the last five years or so, as we’ve progressed and educated the industrial enterprise market on the true potential here.

Alan: We’re still very early in this technology, and you guys are pushing the limits, so you have more experience than most. But one thing that I found really amazing is that I was at a bicycle show recently, and the Cannondale bikes have a– well, it’s custom, it doesn’t say PTC, but I recognized the shape. They have a QR code on them. How are they using that? I didn’t pull out my phone and make it work, but I saw the ThingWorx tag on the bicycle. Now, is that shipping with every bicycle? What are they doing with that?

Mike: So that’s actually called a ThingMark. It’s a combination QR code and AR marker. So basically it provides unique identification, so our system knows what bike this is. And then we use it also to place the content. It’s the 0,0,0, the origin, for the augmented content. And what Cannondale is doing is they originally wanted to help their technicians in your local bike shop — their dealer network, if you will — understand the new features on their bikes. For some of their bikes — it’s not available on all of them, but some of their higher-end bikes — they built an AR experience. And that AR experience does a couple of things. It teaches the dealer what the important features of the bike are. So what are the new capabilities, and what are the performance specs, and all of those kinds of things? It also provides them service instructions for how to do certain things to the bike, replace the shocks, or whatever the case may be. And then finally, it also provides spare parts identification. So the technician, instead of pulling out a part and trying to find it in a manual or find it online somewhere, can simply look at the bike. And then in AR they see what all the part numbers are, so they can order replacement parts.
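
To illustrate the two jobs the ThingMark is doing here, identifying which product you are looking at and giving the augmented content an origin to hang off of, here is a small hypothetical Python sketch. The payload string, the catalog contents, and the simplified pose handling are placeholders invented for this example; none of it is the actual ThingMark format or a PTC API.

```python
# A hypothetical sketch of the two jobs a combined marker performs: identify
# the product instance, and anchor the content at a known origin. The payload,
# catalog, and simplified pose handling below are placeholders for this
# example only; this is not the ThingMark format or any PTC API.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class MarkerDetection:
    payload: str                        # e.g. a bike model/serial decoded from the mark
    origin: Tuple[float, float, float]  # where the mark sits in world space (rotation omitted)


# Stand-in "experience catalog": which AR content belongs to which product.
EXPERIENCE_CATALOG: Dict[str, Dict[str, List[str]]] = {
    "cannondale-example-bike": {
        "features": ["Suspension overview", "Performance specs"],
        "service": ["Replace rear shock", "Adjust derailleur"],
        "spare_parts": ["Rear shock bolt", "Link bearing kit"],
    },
}


def resolve_experience(detection: MarkerDetection) -> Dict[str, List[str]]:
    """Identification half: map the decoded payload to the right AR content."""
    return EXPERIENCE_CATALOG.get(detection.payload, {})


def place_label(detection: MarkerDetection,
                offset: Tuple[float, float, float]) -> Tuple[float, float, float]:
    """Placement half: position content relative to the marker's origin (0,0,0)."""
    ox, oy, oz = detection.origin
    dx, dy, dz = offset
    return (ox + dx, oy + dy, oz + dz)


if __name__ == "__main__":
    seen = MarkerDetection("cannondale-example-bike", origin=(0.2, 1.1, 0.5))
    print(resolve_experience(seen)["spare_parts"])
    print(place_label(seen, (0.0, 0.15, 0.0)))  # a label 15 cm above the mark
```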

Alan: That’s amazing.

Mike: It’s a very, very cool app. And what they quickly realized– this is due in part to the fact that if you’re a cyclist, you generally like to tinker with your bike anyway.

Alan: [chuckles] Yes, you do.

Mike: Their customers, as well as their dealer network, were interested in this technology. So they’ve seen quite a bit of uptake there, and it’s become a bit of a sales and marketing tool for them as well.

Alan: It’s interesting you say that, because I had Jonathan Moss — head of learning for Sprint — on the show, and they rolled out AR training to about 30,000 staff. And because, of course, you can keep track of how many times it’s used and so on, they kept seeing certain employees using the training 10, 20, 30 times, and they couldn’t figure out why a staff member would take the training so many times. So they went into the stores to figure out how they were using it. And what they were doing is exactly what you’re saying: they were taking their training and using it as a sales tool in the store.

Mike: Oh, that’s awesome.

Alan: [laughs]

Mike: The unintended uses of technology are amazing sometimes. That’s a great story. And there are all kinds of cases like that, especially with technology that’s as compelling as the technology that you talk about on your show. People get so excited by it and they want to share it. They want to evangelize it, almost. And it’s a great thing.

Alan: Some companies are seeing such dramatic improvements in process, from 20x-ing their expert capture, to increasing sales by 20, 30, 40 percent, to decreasing training times by up to 100 percent. It’s kind of one of those unique, rare times in a technology’s life cycle where people are — like you said — evangelizing this technology. And it’s almost to a fault, because if you think about it, this is a direct competitive advantage that companies have right now over companies that aren’t doing this. And so by telling the world about it, they’re kind of saying, “Hey, here’s our competitive advantage,” and letting everybody else know. Which is really wonderful, because I don’t think we’re in a zero-sum game as humanity. There’s enough to go around. There’s business for everyone. So it’s really interesting how, in these early days of this technology, everybody’s rolling up their sleeves to make it and also to evangelize it. And thank you for being on the show as well.

Mike: Yeah, well, it’s my pleasure. And it’s a pretty exciting time to be involved in technology. And I think the key — and part of what your show, I think, is helpful at doing — is distilling what’s real. It’s important to set realistic expectations. It’s important to really understand: where is the value opportunity? Where does this stuff work? Where does it not work? It might work there in the future, but where does it not work today?

Alan: What are some use cases that you guys have worked on, or worked with that were just kind of like, they had a slight improvement, but you’re like “Nah, maybe not useful here”?

Mike: I think that the potential impact of AR across the industrial value chain is deeply profound. It will fundamentally change the way that we interact with the world around us. But then you factor in the reality: where does content come from? How comfortable is digital eyewear? Can I work truly hands-free, with all of the benefits that AR promises, for an entire shift? Some of those things aren’t quite there yet. And then, frankly, the computer vision technology is still maturing. It gets better every quarter. We come out with great new innovations, but it’s not a human eye connected to a human brain, not yet anyway. Those limitations can get in the way of some of the more advanced use cases. But as we’ve talked a little bit about, there’s so much potential impact right now, whether it’s capturing expert knowledge, sharing expert knowledge in real-time, or presenting compelling instructions and other 3D and digital data in the context of the physical world. And what we’re really encouraging our customers and our clients to do is work with us to identify those opportunities, and let’s go drive some real value there.

Alan: That’s like music to my ears. So there have been all sorts of companies using your tools. One of them is Hot Wheels. I mean, I don’t know about you, but I grew up with Hot Wheels. I had little race cars.

Mike: Absolutely.

Alan: How is Mattel using AR?

Mike: Yeah, I would say this for Mattel and many of the other toymakers. A lot of them are Vuforia customers. And what they’re doing is they’re recognizing that the nature of play is changing. You and I grew up with Hot Wheels. You and I did not grow up with iPads. [chuckles] And kids today, they do. So the challenge for some of these toymakers is: how do they bring a digital element into the physical world of the toys that they make, whether we’re talking about Lego, whether we’re talking about Hot Wheels, whether we’re talking about Mattel and a hundred other companies? Augmented reality gives them the ability to do that. It gives them the ability to supplement their physical toys with an experience, whether it be animations or gameplay. All of those kinds of things really resonate with the kids that are playing these games today. So that’s a great space for us. And we’re really lucky to have a lot of great toymaker customers using our AR tools.

Alan: You actually mentioned Lego, and I know Lego has been doing a ton of stuff in AR over the last few years. We had Eden [Chen] from Fishermen Labs on the show, and they’ve done a lot of work with Lego in Denmark, to not only animate the boxes, but– I was just on it, actually: if you go to walmart.com/lego and then you click the “see it in action” button, you can now drop the Lego toy set on your table, and see it animated and see how it plays in front of you. And I mean, that’s all web-based.

Mike: That is very cool.

Alan: Let’s take a look at that. So right now, everything that you’re building is app-based. Are you guys moving towards a web-based offering in the future, or is that something on the roadmap?

Mike: It’s something we’re looking at. As with everything else, as technology proliferates and standards are established and embraced, we really have the opportunity to drive this democratization even further. So that’s something that we’re researching; the advanced research team is looking at that. What I can tell you right now is that app-based is really what the foreseeable future holds for us. Whether that app is a broadly applicable viewer, like we have with Vuforia View, or a custom-tailored app that a toymaker like Lego or an automaker like Mercedes-Benz will make for a particular use case, doesn’t really matter. But in the near term, that’s really where a lot of the focus is.

Alan: One of the things that people have to realize is that companies like RealWear, for example– RealWear is a head-worn display that allows you to move this articulating arm into your field of view and see what looks like maybe a 10-inch iPad, 10 inches from your face, so that you can see stuff. But it’s not really AR. It’s just kind of augmenting, giving you a screen. It is, by all accounts, the lowest possible tech of this. It’s not doing image recognition; it’s just literally showing you PDFs, or videos, or information, and being able to capture that using a camera and project it back. And they just raised $80 million. So I think we– as a collective group, we need to take a step back from trying to invent the future of the future of the future, and say, “Hey, the tools we have right now are driving real ROI value. How do we leverage those the most, so that we can fund the future-of-the-future kind of things?” And I think you guys have done a great job on that.

Mike: We’ve got a great partnership with Andy Lowery, the CEO of RealWear, and all the folks over there. I think they are a great example of that point I made earlier around pragmatism: meeting the market where it is. That device is intrinsically safe. It’s got long battery life. It’s got a hot-swappable battery. Does it provide the deep immersion of some of these other pieces of digital eyewear? No, but for some of the use cases out there, that’s not required. That’s a device that meets that need very, very effectively. And I think that as the Qualcomm XR1 chip, this new augmented and virtual reality chip that’s been built for those kinds of devices, gets adopted more, we’ll see more and more advances in digital eyewear technology. But there’s a ton of value to be realized today. And that’s a great example.

Alan: You really hit it there. And it’s funny, because you look at something like Magic Leap: they raised $3.5 billion, almost $4 billion now. They’ve built this beautiful device that does spatial computing, spatial mapping, spatial audio, but you cannot use it in any industrial use case yet. It’s not IP rated, it doesn’t have safety glass. So you’ve got these two extremes: one where they basically bomb-proofed a display for your eye, and the other where they made the most advanced spatial computing device ever made, but didn’t make it available or useful for the enterprise. It’s an interesting dichotomy.

Mike: Yeah, but like you said earlier, it’s early days. I am quite sure that the folks at Magic Leap are going to recognize how much value there is in the enterprise space, and figure out that they’ve got to have certain characteristics. And I think that there’s a lot to happen in front of us in the realm of digital eyewear. Sometimes I think about 15 years ago, when maybe you had a smartphone with a great camera, and I had one with a huge screen, and my friend Matt had one with a keyboard on it.

Alan: If you’re talking fifteen years ago, I had a belt. And I had a phone on one side, I had my camera on the other side, I had my Palm Pilot, which wasn’t connected to Wi-Fi, which was just literally a calendar on the backside. I looked like Batman, I had a bat belt.

Mike: All right. But what do you have now? I bet you a hundred dollars there’s a black rectangular iPhone sitting on the desk in front of you. Right?

Alan: There is a black device here that has multiple cameras, and it’s got all the things. And every year they’re getting exponentially better and better. OK. So you’re [garbled]– you think about this all day, every day, so I’m going to put you on the spot here: timeline for ubiquitous eyewear in the public?

Mike: Not taking the bait.

Alan: Ahhh, dammit! [laughs]

Mike: [laughs] I do believe that it will come, and certainly our grandchildren will have that luxury. But listen, there’s– again, it’s early days. There’s a long way to go. There are many questions left unanswered at this point. We know a little bit about how Google thinks about this. We’re beginning to see glimpses maybe of how– I don’t know if you saw it just last week, Amazon released the Alexa glasses.

Alan: Yeah.

Mike: We have no idea what Apple will do. Facebook’s got–

Alan: It’s interesting. The Alexa glasses are– they’re just audio. Spatial audio.

Mike: They’re just audio, right.

Alan: But they’re a direct competitor — literally a direct competitor — to the Bose AR offering, the Bose glasses.

Mike: Yep.

Alan: I’m assuming they’re going to be pretty awesome, because I tried them out, and spatial audio, I think, is actually going to drive a lot of value, just with the audio.

Mike: Yeah, I agree. I would say I think that visual cues are gonna be the most compelling. I think we get– scientists will tell you we get something like 85 percent of our input visually, but certainly sound, touch, those are all other important elements.

Alan: Oh, speaking of touch. Have you tried the HaptX gloves?

Mike: I have. I was at EWTS last week in Dallas, and I had a chance to play with a whole bunch of all kinds of gadgets. And it was a blast. There’s some cool stuff coming, no doubt.

Alan: I got to play with the HaptX gloves last week at the Simulation Summit. It’s just–

Mike: It’s mindblowing.

Alan: –mindblowing. [laughs] It really, really is. I remember the first time I tried Ultrahaptics. I was like, “Yeah, I don’t get it.” And then I tried it again, when it was a little bit further along, and I was like, “Oh, I get it now.” You have Expert Capture, you’ve got Chalk, which is being able to look over somebody’s shoulder. So let’s talk about Chalk for a second, because I think it’s a really important one that gets a little bit overlooked, but it’s very, very important.

Mike: It’s an application for remote expert assistance. It basically allows you to see what I see, and it allows both of us — while we’re speaking to each other — to draw on a live video stream of the real world. And what’s interesting about that is that we’re drawing on this live video stream, and what we draw actually sticks to the real world. So this isn’t a case where I take a picture, send it to you, you circle something, and send it back to me. We’re both looking at my view of the real world. And if you see something you want to draw my attention to, you simply draw an arrow, draw a circle, whatever the case may be. And no matter where I look, when I come back to that spot, those annotations will be fixed there in space. So this is a tool that is really key for helping, let’s say, a junior technician, somebody that’s a novice out in the field. This is allowing companies to save money on rolling a second service truck. It’s allowing them to increase their first-time fix rate. And one of the most exciting things about this technology is that it works on a really, really broad collection of devices. The reality is, most technicians out in the field probably have an iPhone, or they probably have something running Android, and this technology works on many, many of those. It doesn’t require — for example — ARKit-enabled devices or anything like that.
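
Conceptually, annotations that stick to the real world like this are usually handled by lifting the 2D drawing to a 3D point once, then re-projecting that point every frame as the camera moves. The sketch below shows that general pattern with a plain pinhole camera model in Python; it is an illustration of the idea under those assumptions, not how Vuforia Chalk is actually implemented.

```python
# A minimal sketch of the general idea behind annotations that "stick" to the
# real world: lift a 2D screen annotation to a 3D world point once, then
# re-project that point every frame with the current camera pose. This is a
# generic AR pattern, not PTC's Vuforia Chalk implementation; the pinhole
# camera model and numbers below are assumptions for illustration only.
import numpy as np

# Assumed pinhole intrinsics: focal lengths and principal point, in pixels.
FX, FY, CX, CY = 800.0, 800.0, 640.0, 360.0


def screen_to_world(u, v, depth_m, R_wc, t_wc):
    """Back-project a pixel (u, v) with known depth into world coordinates.
    R_wc, t_wc describe the camera pose in the world (camera -> world)."""
    x_cam = np.array([(u - CX) * depth_m / FX,
                      (v - CY) * depth_m / FY,
                      depth_m])
    return R_wc @ x_cam + t_wc


def world_to_screen(p_world, R_wc, t_wc):
    """Project a world point back to the current frame's pixel coordinates."""
    x_cam = R_wc.T @ (p_world - t_wc)   # world -> camera
    u = FX * x_cam[0] / x_cam[2] + CX
    v = FY * x_cam[1] / x_cam[2] + CY
    return u, v


if __name__ == "__main__":
    # Frame 1: the expert circles something at pixel (700, 400), 1.5 m away.
    R1, t1 = np.eye(3), np.zeros(3)
    anchor = screen_to_world(700, 400, 1.5, R1, t1)

    # Frame 2: the technician has moved; same anchor, new camera pose.
    yaw = np.deg2rad(5)
    R2 = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    t2 = np.array([0.05, 0.0, 0.0])
    print("annotation re-drawn at pixel:", world_to_screen(anchor, R2, t2))
```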

Alan: Being able to have an expert– like you said with Expert Capture, there are people retiring. But it’s not that they want to retire, so much as that they just don’t want to work in a factory anymore. Or maybe they just want to work a couple hours a day. But that one expert can now look over the shoulder of hundreds of employees, and help them and guide them as they go about their day.

Mike: Yeah, I agree. I think– sometimes I think about these drone operators that sit in the bunker out in Las Vegas. They remotely control the drones all over the world. I think about the remote experts like that sometimes. They’re sitting in some really comfortable environment, they’ve got a bunch of screens around them, and they’re supporting this army of technicians out there with their knowledge. It’s a powerful concept and our customers are getting a lot of value out of it.

Alan: It’s one of those things where I think we’re scratching the surface, but the surface is pretty amazing and it’s driving real ROI now. What problem in the world do you want to see solved using XR technologies?

Mike: What problem do I want to see solved? I think that we– I think the problem that’s most interesting to solve, for me, is mistakes, right? There are so many mistakes made in the world because people can’t get access to the right people, or they don’t have the right information, or they were trained on how to do something a year ago just in case they ever encountered it. And the way that I see XR technology applying to the world we live in is shifting that information delivery from “just in case” to “just in time.” And when it’s delivered just in time, a lot of stuff gets done a lot better.

Alan: I want to say thank you. This has been an eye-opener. PTC’s LiveWorx: if you haven’t been to LiveWorx, you gotta go next year. The stage that you guys set up at that event rivaled EDM stages. It was incredible: a multi-level stage with lights and lasers, and it was just mindblowing. And you really know how to bring people together in the context of industrial applications. You could feel the palpable energy. So visit ptc.com to learn more. Is there anything you want to add, Mike?

Mike: No way, man. I wouldn’t dilute that message if you paid me. Well said, and thanks for the chance to talk on your show, and I can’t wait to see you and all of your listeners at LiveWorx.
