#892 Sundance: ‘Persuasion Machines’ is Architectural Storytelling Against Surveillance Capitalism

There have been a number of documentaries at Sundance over the past three years that have taken a critical lens to the social impacts of technology, including The Cleaners (2018), The Great Hack (2019), The Social Dilemma (2020), and Coded Bias (2020). Karim Amer was the director of The Great Hack, and he returned to Sundance this year to continue exploring the dynamics of surveillance capitalism in a virtual reality piece called Persuasion Machines, co-created with immersive media artist & architect Guvenc Ozel.

Persuasion Machines takes an architectural approach to telling the story of surveillance capitalism by exploring three different layers of space. The first layer is a projection-mapped grid on the floor that gives you a sense of the space you’re about to enter. Then, as you enter into VR, there’s a photorealistic, millennial living room of the future filled with all of the latest gadgets. The final layer is reached by walking through different portals that reveal a visual depiction of the world as it would be seen by the immersive gadgets of the future. You’re metaphorically cutting through the matrix to see abstract representations of data that show what all of these gadgets actually are: “persuasion machines.”

I had a chance to speak with co-creator Ozel about their architectural approach to design, and their strategy of slowly deviating from the affordances of the digital vernacular in order to create a deeper sense of plausibility and environmental presence.

While I absolutely love the deeper intention & experiential design of Persuasion Machines, there was one creative decision that I strongly disagree with. Persuasion Machines is a piece that critiques the tools of surveillance capitalism, yet it also uses some of those very same dark patterns. Persuasion Machines was livestreaming people’s sessions onto YouTube as they went through the VR experience, without making it explicitly clear to the audience. I didn’t find out about it until I was in the middle of my interview with Ozel, and needless to say I wasn’t too happy about it.


Here are the links to the archives of the three days of livestreams of audience members watching Persuasion Machines at Sundance.

The Persuasion Machines team didn’t go out of their way to disclose that the livestream was happening, and in fact were hoping to trick people into providing consent by presenting a long terms of service form of the very sort that they later skewer in their piece. I honestly don’t remember signing a digital release form. It’s entirely possible that I did sign it and just don’t remember because I was in such a rush to see the experience. If I did sign it, then I definitely didn’t do a close reading of the release form, which says,

I acknowledge that The Othrs, LLC might be currently filming, photographing, video and audio taping and telecasting scenes at this location for inclusion in television programs to be initially exhibited on digitally via live stream.

IF YOU DO NOT WISH TO BE FILMED, PHOTOGRAPHED, VIDEOTAPED OR AUDIO TAPED, OR TO APPEAR ON TELEVISION, PLEASE LEAVE THIS LOCATION DURING OUR FILMING, VIDEOTAPING AND TELECASTING.

I trusted that the creators had my best interests in mind, and that they wouldn’t be adopting the very same dark patterns that they’re aiming to critique. I certainly don’t recall anyone verbally disclosing that I was entering a space that would be livestreaming my every move. There are only archives from Sunday, Monday, and Tuesday, so it’s also possible that they weren’t streaming yet on the previous Thursday, and didn’t have the journalists sign the digital release form.

The quality of the stream is indeed very low, and there may be a perceived safety in thinking that all of the data in the videos is anonymous and private. But there are moments when attendees are clearly identifiable. And there are also many surprising ways of re-identifying the data: people recognizing themselves, their friends, family, or acquaintances, as well as techniques like gait detection and other artificial intelligence systems that aggregate data from other sources to re-identify it.

But even if there isn’t any personally identifiable information on these streams, it’s the deeper principle of not providing human-readable, contextually clear, informed consent that is the most bothersome. It’s the same lack of a meaningful consent transaction that the companies themselves are using, and so my overall experience of their execution felt more like they were replicating and amplifying the problem.

If the intention of this was to be provocative and get the audience angry, then it certainly achieved that with me. However, I don’t think the reveal of the stealth livestream was all that clear, as most people I tell this to were not aware of the livestreaming. It sparked a dynamic debate with Ozel during our interview, and I quickly conceded after I couldn’t recall whether I had signed my consent away or not.

Deconstructive Criticism versus Constructive Solutions

If I were to summarize my argument to Persuasion Machines creators Ozel and Amer, I’d say that creators have a moral responsibility to not just replicate dark patterns, but to actively try to construct solutions that help solve some of these issues around privacy. There’s a role for creating cautionary tales and provocative demonstrations within an artistic context in order to make a larger point, but there’s also a role for exploring creative solutions and implementing best practices.

As Helen Nissenbaum explains, there has already been a movement towards “comprehensible privacy policies, more usable choice architectures, opt-in and not opt-out, tiered consent, and more effective notice.” She cites Ryan Calo’s work on this in his papers Code, Nudge, or Notice? and Against Notice Skepticism in Privacy (and Elsewhere).

It wasn’t made clear to me, to many of the other attendees, or even to the organizers of Sundance New Frontier that Persuasion Machines was actively livestreaming attendees without their full, informed consent. What does informed consent look like in the future? Should privacy be treated like a copyright licensing contract that we maintain control over, as Adam Moore argues? Or should privacy be considered a human right that we can’t choose to yield, as Anita Allen argues?

Helen Nissenbaum says that currently there’s an escape clause in all privacy protections: you can do anything you want as long as you get someone’s consent. But operationalizing choice through unread and unreadable terms of service that are too complicated to fully comprehend is not the way to go. Do we need to build an institutional backdrop of contextual integrity that protects privacy in certain contexts? Nissenbaum says there can be room for choice, but that choice shouldn’t be able to trump some of our fundamental rights to privacy and protections. There are still a lot of open questions in the philosophy of privacy, and I have some references down below to dig into more details of these debates.

But another point I should make is that currently our rights to privacy are normative standards that change based upon our collective behaviors. The way in which a “reasonable expectation of privacy” is defined at any moment is determined by what the culture is doing. As more and more companies and artists start to broadcast representations of us onto the Internet, this actually starts to erode our rights to privacy. The third-party doctrine also means that any data collected by a third party has no reasonable expectation of privacy, which means that all broadcast data is available to the US government without needing a warrant.

I’m not sure if Ozel and Amer were aware of these deeper privacy dynamics, but in hindsight I’m glad that I had the experience because it catalyzed a lot of deeper research for me into the philosophical foundations of ethics and privacy with immersive technologies. I went down quite a rabbit hole of references and deeper constructive conversations about this topic, which I’m including down below along with some of my previous work on ethics and privacy in XR.

Resources to Get More Context on Ethics & Privacy in XR

Here are some pointers to previous work that I’ve done on ethics and privacy in mixed reality.

Here are more academic references on the ethics and philosophy of privacy in relation to technology.

Hopefully I’ll be able to synthesize a lot of these thoughts in the course of more interviews and conversations here soon.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR Podcast. So continuing on in my series of looking at Sundance 2020, both the immersive storytelling innovations, technological innovations, as well as the experiential design process for the creators. So on today's episode, I talk about Persuasion Machines with one of the co-directors, Güvenç Ozel. He's a media artist, architect, and VR producer in LA. And so Persuasion Machines is covering different aspects of surveillance capitalism. It puts you into this futuristic living room, and as you walk around you walk into different portals, and there's a bit of this dialectic between the narrative and the story that the technology wants you to have, and then you have this opportunity to resist that existing narrative by walking through these portals and cutting through the matrix and being able to see all the data that's being exchanged, and trying to convey this deeper story of how much information is being tracked by this technology and this larger context of surveillance capitalism and how we're going to be interfacing with technology in the future. And I will say also that there were some specific dark patterns that they were implementing into Persuasion Machines that I was not aware of. And so they're critiquing the dark patterns, but at the same time implementing different aspects of the dark patterns, trying to make a larger point. And so there's a bit of surprise as I learn about what kind of dark patterns they were actually implementing within this piece as well. And I'll be unpacking that a little bit more at the end. But that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Güvenç happened on Sunday, January 26th, 2020 at the Sundance Film Festival in Park City, Utah. So with that, let's go ahead and dive right in.

[00:01:48.705] Guvenc Ozel: My name is Güvenç Ozel and I am a media artist, architect and VR producer based in Los Angeles. And I was trained as an artist doing large-scale media installations and then I did a master's in architecture. After I graduated, I worked for Frank Gehry for several years, and then after that, I wanted to go back to a more multidisciplinary practice and continue to do large-scale immersive installations, create interactive content. Six years ago, I was one of the founders of The Ideas Lab, which is a part of UCLA School of Architecture and Urban Design. And there, what we do is primarily looking at the intersection of technology and architecture to create interactive environments that enhance the experience of people that occupy them. So, I also started the VR lab there around four or five years ago, which I believe is probably the first VR curriculum in any architecture school in the world. And we've been playing around with these tools for some time. We have a robotics lab. We have created our own software to control industrial robotics through virtual reality interfaces, we play around with projection mapping, sensory interfaces and gaming engines play like a big role in that endeavor. So yeah, that's what I do.

[00:03:26.507] Kent Bye: Yeah, and maybe you could also, in his absence, introduce your collaborator because we just learned at the last moment that he was not able to make it for this interview. So maybe just introduce your collaborator on his behalf.

[00:03:36.074] Guvenc Ozel: Sure. Karim Amer is a documentary filmmaker and producer, and I believe that he is one of the strongest voices of his generation. And his most recent film, which is a documentary called The Great Hack, aired on Netflix. And I believe what is so important and poignant about The Great Hack is that it brings the world of technology and documentary filmmaking together, which I believe is a very, very important issue in our time. I believe that technology is very political. It is one of the most politicized subject matters of our culture. So I think, you know, what Karim does is incredibly important and creates many layers of awareness for people to really understand and dig deep into what are the implications of our contemporary technology, and that is actually how my conversation with him started. A very good friend of ours introduced us to each other and we quickly realized that we are actually addressing the same topic, which is approaching technology with a critical lens without rejecting it and actually engaging in it and hopefully providing a perspective where we can do a course correction to technology. So our goal, what I was doing at the time about, I would say, a year ago, was that I was creating this large-scale robotic installation, which was an interactive installation that was a room of projection with a sculpture that is activated by a robot. And the projection was based on surveillance footage of multiple different cities. And through machine learning, we generated a kind of a synthetic artificial city, looking at how machines see our world and also playing around with this notion of what is real and what is virtual and what is actually generated through data. And once we started playing around with those ideas, Karim and I decided to first maybe look into an augmented reality app. 
And then afterwards, one thing led to another and once we got this offer to be at Sundance, we switched gears and we prioritized a virtual reality installation first and we're hoping to do an augmented reality app after.

[00:06:08.352] Kent Bye: Yeah, just the last year, 2019, I had a chance to go to the Architectural Association in London. There was a couple of the creators, Frederic and Lara, they have been teaching like little segments of using VR as a tool, but it's part of a larger curriculum. So it sounds like you've been developing a larger approach to, you know, what VR is going to be able to do to these different aspects of virtual architecture. So having this architectural awareness that I've been gaining and I see this piece, I see that you're creating these impossible spaces where you're walking through portals and you're actually kind of laying out this story as you walk around the space. And so maybe you could talk about from that architectural perspective, how you start to tell this story in a spatial way.

[00:06:50.933] Guvenc Ozel: I mean, the goal, which was also something that Karim and I felt really passionate about, was exploring this intersection of physical and digital, and how we can, in a way, choreograph a person's experience that allows that person to go from a spectrum of realities, so to say. And in that regard, as an architect and designer, I have been generally frustrated with the focus of the architectural discourse because there's so much focus, there's so much effort being dedicated to materials and how materials come together and how physical spaces are designed. But in fact, as a society, we spend so much time, maybe more than half of our time, interfacing with screens, interfacing in a virtual and digital world, and nobody, pretty much nobody in the architectural world cares about that space. And the gamers care about it, the tech industry cares about it. As an architect, I felt like it is very important for us to bring this knowledge of knowing how to design environments and spaces and thinking about architecture as a technological interface. Because also if you consider contemporary buildings, there's so much technology integrated into them that the space that architects see as their kind of window to design is getting narrower and narrower due to the fact that there's much more machines and systems integrated into buildings. So in that regard, you know, the way that we choreographed the experience of Persuasion Machines was precisely that, that we would first allow the audience to enter into a realistic depiction of a physical space because throughout my experiments in VR, the most important thing in the beginning is to actually convince your audience that it is real. Real in the sense that it is, that you're able to experience it. So you cannot actually defy the laws of reality in the very beginning. I think you have to do that gradually. 
And that's why I think there are very interesting parallels between storytelling and that kind of choreography that allows you to create a spectrum of realities. Or the first space, for example, in Persuasion Machines is a kind of a generic, very realistic depiction of a millennial living room. and it is filled with these smart devices and they're not doing anything out of the ordinary. But then as you progress through the experience, the physical architecture in a way starts to dissolve. And then the devices take over, the machines take over. Then as we progress through the experience, we start to experience space as machines see them. So there's also, I would say, an important intellectual objective in the sense that we need to, in a way, not only recalibrate the way that we look at architecture, but also we need to start understanding how the devices we create, the machines we create, see the world.

[00:10:06.019] Kent Bye: Yeah, and this piece maybe should unpack a little bit more of the deeper context and the story of this piece, because we're looking at this situation of a larger context of surveillance capitalism, where a lot of the ways that these major companies like Facebook or Google make their money is through essentially surveilling us and trying to create these psychographic profiles that are trying to evaluate what we value, what we're willing to pay for, which then they can sell advertising to. And yet, with that, there's loss of sovereignty, of our data, of our privacy, all sorts of hijacking our democracy, you know, all these unintended consequences where there's almost like this agreement that we've made to be able to get access to free services by mortgaging our privacy. So, there's a utilitarian exchange where they're actually providing value, but yet at the same time, there's this larger loss of our privacy and our sovereignty that are maybe not exactly clear about what those are until a big major event happens. But it's still, that pain isn't good enough for people to say, yeah, I'm not going to agree to this exchange for my data to have access to the services because the services are so compelling. So it's like this existential dilemma of, and I've gone to places like the Decentralized Web Summit, talked to people like Vint Cerf. And the decentralized web is an alternative. Like, how can we do the decentralized web? And he's like, OK, well, how do you provide universal access to everybody in the world with this information and pay for it? And so there's a completely different decentralized architecture where it's less of the technical aspect. It seems to be more of a cultural aspect, either money coming from the government or as a culture supporting that type of infrastructure that would allow those more peer-to-peer microtransactions. 
It's such a huge shift architecturally from the foundations of the technology that we're still in that pain point of people still making this agreement to be able to make this exchange. And I feel like part of what Persuasion Machines is trying to do is to put you into this imaginal future and have this interface between the digital and real and have these kind of glitched-out, matrix-like experiences where you step through these portals and try to lay out these things that are not seen and try to tell that larger story, which is actually a very difficult story to tell. So maybe you could talk about trying to tell that story.

[00:12:20.088] Guvenc Ozel: Well, I think one of the major objectives was, like I said, approach a technology from a critical lens, meanwhile, actually reveal its potential. So in my mind, when I think about the services that I'm using in the digital platforms, actually, I don't believe that they're free, because there's actually a substantial amount of money and capital being generated through them, and I am actually submitting a large amount of data which has real value in order to get a very basic service. So if you think about the technology of email, for example, and its conception and the way it plays out, it is the most primitive mode of communication. It doesn't really enhance the actual way in which we do business. It actually creates major roadblocks. It's like a letter that nags you. So I don't think that there's actual innovation in the digital world. I feel like there has been a system that is intentionally created to create this illusion of service. But at the end of the day, I believe that human communication is a human right. And the fact that it is completely privatized and we have actually zero political control over it is exactly the reason why we are at a place where we're at. And I believe that working in the media arts space, just thinking about how we use technology also as artists, we are partially responsible for this problem. In the sense that a lot of media artists that I look at right now are in a way kind of fetishizing the graphic quality. of what is capable through either, you know, CGI or gaming engines or what have you. And there's an absence of a critical inquiry about what that technology means and what it represents. And that is one of the big opportunities, I think, in Persuasion Machines, because Karim comes from this journalistic research-based documentary world, and I understand the technology and its experiential and aesthetic possibilities. 
And what we aimed to do was to, in a way, tell a critical story and actually ask those hard questions that are really difficult to ask, especially in the media arts space and the VR space. Because also, when you're operating in that world, you always need the support of these big tech companies, you know. And, you know, they usually are not very, I would say, open to this kind of internal criticism and debate. And in a way, to me, it's a form of hacking that you understand that you kind of implement yourself and immerse yourself in that conversation in the tech world. and then you try to push a critical objective and approach within so that we are not actually kind of looking at that system from like outside in, but in fact we can use the tools that is provided by that larger ecosystem of media and interaction and use it in ways that will create awareness about the ways in which it doesn't operate well. Or injecting a level of political conversation into that experience so that it still remains to be a compelling experience and it still in a way reveals the potential of that medium. but in fact it adds an additional layer of criticality which is immensely important and currently absent in that conversation.

[00:16:02.034] Kent Bye: So in this piece, you walk in, there's this, it's a black room and then you have this projection map with a grid that tells you where to go to start, which I think is very helpful as you're going in. But it's also this aesthetic that gives you this impression that you're about to go into a space, and that grid actually represents the virtual space that you're in. So then there's this parity between your onboarding and offboarding where you see this outlay of this virtual grid and you walk in, you have an experience there and you come out, and there's actually like this weird part of your memory of that virtual space that then is also connected to that grid. And so maybe you could talk about that design of that projection map to onboard and offboard people.

[00:16:41.974] Guvenc Ozel: Well, the main objective was to create, again, this spectrum of reality and virtuality. So when you first are waiting in line to put the headset on, you're seeing the projection as a kind of a media application that is like a veneer, like a layer in the physical world. And then when you put the headset on, that entire thing is duplicated again in the exact same spot. So you went one step further into a virtual world, yet it is aligned directly with what you saw in the physical world. And then afterwards, once the experience starts, it gives you a kind of a realistic depiction of a physical world. And then afterwards, it starts to, in a way, unpack into these fully virtual environments. So again, from a choreography perspective, experiential design perspective, the intent was to constantly force the audience to question notions of reality. And also the grid, the grid and the projection and the start button, so to say, were intended to make allusions to what I call the digital vernacular. It is in a way things that people aesthetically associate directly with the digital world. And then as the experience gets more complex, obviously we're starting to push the boundary of that vernacular. But it remains culturally, I would say, familiar in order to allow the narrative to also be communicated through that story.

[00:18:23.312] Kent Bye: So there's an onboarding and offboarding. And then when you come out, then there's somebody there to walk through the different infographics that you have on the wall that are describing things and have different pamphlets that you can give to people. So it's really to further educate people on some of the things that were happening and then to give them some things that they can do with this call to action. So maybe you could talk about the design of that then offboarding process.

[00:18:45.742] Guvenc Ozel: So for that, we worked with this organization called Tactical Tech. And what they do is that they create these infographics and they also set up these exhibits called The Glass Room, which basically is looking at implications of digital technology. So we wanted to have a moment at the end of the experience so that it is not just about scaring people, about the perils of contemporary technology, social media, data economy, or what have you, but actually allow them to take action in their personal ways. And I think this relates to me, this relates to how we also handle something like the environmental crisis, right? You know, you not only have to take action on an individual level, but you also have to take action on a political level. And they have to go hand in hand. So I think our reaction to surveillance capitalism needs to be the same way as well. Those small changes that you do to your digital habits are only a very small part of that equation, but creating larger awareness about data economy and how it impacts business, how it impacts creativity, how it impacts politics, and actually allowing a political space to flourish to find alternatives to the current way in which it is happening is immensely important. So I think taking personal action is only a small part of that equation, but we also wanted to give people a tool, a hope, so to say, to basically tell them this is partially under your control, in your personal control. There are steps that you can take individually in order to, in a way, protect yourself, but also respond to this larger persuasion machine.

[00:20:41.829] Kent Bye: Yeah, I just did an interview with Trevor Flowers about WebXR, and we were talking about the shift from print to the dynamic responsive web design was such a huge paradigm shift to go from that 2D static to the more dynamic for cross-platform, cross-device, many different contexts, and to still have that same content. So I think people are familiar with those different user interfaces that are used in that 2D realm. And I think as we go from the 2D to the spatial, I think there's going to be an equivalent like quantum leap from that dynamic responsive into the spatial and embodied and sensory design as well. And I think one thing that is striking about this piece of Persuasion Machines is that you're using a lot of the affordances of those 2D interfaces, the terms of service and different interactions where you're clicking on buttons. And so you're trying to invoke these experiences that we all have day to day. We want to use this website. We just sort of scroll through and we click yes. And you're trying to pervert those pattern behaviors because we have this conditioned response where we do the action in order to get the benefit. So we kind of put that into just a rote thing of like, in order to get access to this, I have to do all this kind of legal terms of service, agree to everything. But in some ways, you're trying to interrupt that. So maybe you could talk about that design process of trying to interrupt a lot of these pattern behaviors.

[00:22:00.263] Guvenc Ozel: I mean, a part of it was, in a way, me and Karim working to figure out how we collaborate, because his experience as a documentary filmmaker and as an artist is based on telling a story through the two-dimensional world of film, editing, montage, and also, in a way, pacing a particular mood and atmosphere in a timeline. And my experience in VR is about setting up a certain set of parameters or circumstances, and then you have to give a substantial amount of agency to the person who's experiencing it. And you may or may not hit the certain points that you want to make, but you as a designer need to kind of live with it. So we talked a lot about how we wanted to do this, how we wanted to make sure that certain points that we wanted to make and communicate actually is coming across loud and clear, and those kind of integration of two-dimensional interfaces as well as videos was a vehicle for us to, again, choreograph a person's attention from one subject matter to another. So there are moments in which you are watching videos in a kind of a multi-directional way, or you're interacting with two-dimensional interfaces that are situated in three-dimensional environments. But then in a way the experience uses those moments to tell you a story that is much more spatial, much more three-dimensional. and then we kind of anchor it back into a linear storyline and then we let the experience take over again so that there's always this back and forth in the timeline of the VR piece that allows you to go back and forth between those two modes of, I would say, storytelling and communication.

[00:23:52.483] Kent Bye: Yeah, because you're in some ways stepping through a portal and getting the more subversive counterculture arguments, the critical discourse about everything, and being invited to step into these other realms. And then when you come back, then there's traces of you going into that realm and the consequences of being tracked, you know, not being obedient, and imagining a future where there's other ways in which you are conditioned to just follow the rules and not, you know, subvert. So maybe you could talk about that dimension of how to create this inner conflict between the status quo, the existing culture and this counterculture and this polarity between those two.

[00:24:30.538] Guvenc Ozel: So in order to in a way address that, we wanted to not only ground the experience in the present but also take elements of the experience that we all have in the present and layer them in different kinds of interfaces. Meaning, for example, in the second version of the living room that you enter, once you get out of the first portal, you're seeing the room as a point cloud, which is like the first sign that you're starting to see the room as if it is documented by machines. And then these ads are starting to be served in front of your face based on your interaction with these devices. And that is supposed to be a pop-up ad, but that is a kind of a future projection, in the sense that, what if, like in the near future, when there are augmented reality glasses or maybe contacts or what have you that are constantly negotiating between you and the physical world, and like hyper-commercialization of that world, what would that look like? And it's disruptive, it interrupts your actual interaction with the world, and we wanted to in a way kind of give the audience a glimpse of that. But also, as I said, you know, there are layers of that, right? The first one is like, you know, what you see in your HUD, what you see in your main interface. And then there's a second layer, which is this kind of pop-up layer that gives you information about the devices. And then there's a third layer, which is the architecture. So that, you know, we're in a way allowing a certain amount of depth to happen in the way in which you kind of zoom in and out of different topics or objects that require your attention. Then how do we do that? How do we do that through our eyes? How do we do that through our spatial perception? And how do we turn that into a design element like varying from 2D to 3D?

[00:26:29.839] Kent Bye: Yeah, and that's the thing that I think is really striking about VR as a medium: it allows you to draw out these relationships between these entities in a spatial way that allows you to tell that larger story. And from an architectural perspective, there are certain limitations of gravity and materiality and cost. In VR, you're able to get over a lot of those, although I would say that there's a parallel between the limitations of the cost of materials and the complexity of the geometries and the polygons and the compute that you have, in that you can't just have infinitely fractal, very high polygon experiences. So there are similar constraints and limitations in virtual architecture, but you don't have gravity, and there are all these new things that you can do. And I think there's also the ability to interact and have dynamic architecture that's changing and growing over time. So how are you making sense of these new aspects of virtual architecture that don't have the same constraints of materiality and gravity, while also being able to draw these new pattern relationships, as well as have dynamic interactions and have the architecture change based upon your behaviors?

[00:27:36.997] Guvenc Ozel: I think a big part of that question is actually about our biological makeup and how we perceive the world, how we have a sense of depth, how we look at things, how we zoom in and out of things with our eyes. And obviously the current technology of VR is not fully calibrated with that yet. But one thing that I feel strongly about, working with VR for the last five years, is that you can defy the laws of nature, so to say, such as gravity and physics. But I think if you're going to do that, you have to do it gradually. If you throw the person into a completely unrealistic environment, you basically lose your credibility. On the other hand, you have to fully take advantage of the fact that you're not limited by anything. So there needs to be a very good balance, very well choreographed, in order to create that moment of immersion. Because if it is completely unrealistic, I don't think that you are immersing the audience in a new kind of experience. You can do that, but you need to give enough pointers, enough moments of familiarity, in your virtual interface so that people can actually predict what they need to do next. Because not only has our biological makeup evolved over millennia to respond to the world that we live in, but there are also cultural constructs about the way in which we interface with the world. For example, what I refer to as a digital vernacular: there's a certain kind of consistency in the way in which we interact with two-dimensional interfaces, and that repeats across platforms. And each time you see something that is actually hard to interface with, that is due to the fact that it is not following those moments of familiarity from the history of interfaces, I would say. I mean, obviously, as you know, there are two camps about this.
I would say one of them is the metaphorical camp, and the other one is what I call the experiential camp. Meaning, what we call a folder in a computer, right? It's not really a folder, but it is making reference to a thing where you store information. And again, it's alluding to the management of information, right? That's why things are called folders in the computer. If you look at the older graphic design of these interfaces, it looks like a binder or a folder, right? But now I think we are familiar enough with digital interfaces, and with the ways in which those interfaces store and organize information, that we still call it a folder out of habit, but that thing no longer looks like a folder. That icon doesn't look like a folder, because there are multiple kinds of information being stored, and you basically categorize things according to the type of information. So I think it's also a learned cultural thing that evolves based on the capabilities of the technology. And we have to work with that, not only in order to allow the audience to be immersed in an experience, but also in order to transgress that, in order to push it to the next level.

[00:31:01.988] Kent Bye: Yeah, a big trend that I've seen at these film festivals is the whole installation aspect. You walk into an installation that is priming you before you go into the experience. And so in some ways, when you are entering into Persuasion Machines, you're seeing that grid and you're entering in. The first place you go is this ordinary living room, which people have a lot of experience with; it invokes the context of your home. And then as you walk through those portals, you're getting into those more abstract aspects that you're describing. If you were to just start off in there, that might be disorienting, but you're able to ground people in a sense of presence where they're able to suspend their disbelief, because they have enough of that familiarity, and you have this way of going in and out of it. So I see that as the same type of process of onboarding and offboarding through an installation. This piece happens to have an installation, so you're doing that on all three levels, allowing people to have this grounding and then have those context switches. It's almost like this level of inception, where you have the ground of the context, then you're going into a deeper context, and then a deeper context. And so you were talking about the different depths of being able to look at things, but this is like virtual depth. It's like the 4D, because you're talking about something that has happened in the past, but it isn't in the same virtual space. So it's super interesting to think about as this moves forward: what are the common affordances? You know, the user interface that's been on the web for the last 20, 30 years has evolved to the point where it's pretty standardized, and in some ways boring; it's stabilized. And it's going to take another 20, 30 years for the same to happen with spatial design.
But once we get to that point, then we're going to have these other primary metaphors of knowing that when we see this indication, we're going to have this type of interaction with our body. For me, it's this long 10, 20, 30-year journey to figure all that out, but we have this inception-like aspect of context that I think is also really interesting.

[00:32:52.887] Guvenc Ozel: Yeah, and also I just want to say a few words about what you said about the onboarding experience. What we also wanted to do was look at the experience of VR itself. I think what is interesting about the experience of VR, physically, is that when you put the headset on, you lose contact with the physical world. And then you start losing your inhibitions a little bit, right? Because you cannot see people seeing you, you automatically assume that nobody's seeing you. And that is one of the, I think, interesting absurdities of having VR experiences in public, right? Because people are laughing uncontrollably, or they're amazed, and they behave in ways in which they normally don't behave socially. So we wanted to, in a way, take advantage of that by creating that grid and by shining a light on each user that tracks their position. We wanted to almost turn them into performers and allow the audience to see real-time metrics of their experience. Even the portals in Persuasion Machines are indicated in the grid. So each time you're watching a person go through the experience, you always know exactly which portal they are in; if they're in the living room, it lights up. That projection is also, in a way, documenting and surveilling the person going through the experience, and then projecting and communicating that information to the people that are waiting to go into the experience. But because you have the headset on, you lose track of that. So in a way, it's also a metaphor for how we engage with technology. Because in its essence, we know that people are managing our data, we know that we are being watched, but once we are engaged with the technology, we forget about it. The same way that when you put a VR headset on, you forget about the fact that other people can see you.
So this notion of being watched and watching, surveilling, voyeurism, is also very much embedded into the installation setup, so that it becomes a spectacle. Just the process of waiting to go into the experience becomes a subversive spectacle.

[00:35:14.388] Kent Bye: Yeah, when I was going in to see Persuasion Machines, I saw that there were all these depth cameras there. There are different aspects of being tracked, because there's a social element, so you have people in the same space. And I kind of joked to the person, I was like, well, is there a privacy policy I should be looking at and reading? Are you recording any data? I guess that's sort of a thing to think about as part of the installation: what is the most exalted way of assuring people, as they enter into this experience, that you're not actually going to be recording information? Because as I went into this space, that was actually one of my questions. Are you actually recording information in here? Are you gathering stuff for your own research? And what are you gathering? You know, there are these cameras that can detect my face and different aspects. So there was an element there that was sort of a meta comment. I trusted the experience, but at the same time there was this question of what kind of disclosure needs to happen, and whether there are ways to innovate on how to do that in a way that ensures that there can be deeper levels of trust.

[00:36:10.595] Guvenc Ozel: Well, actually, what we are doing in the experience is we are recording it, and we are broadcasting it in real time on YouTube Live.

[00:36:20.721] Kent Bye: What? Are you serious? That to me is actually really upsetting, because there is no disclosure. So in some ways, you're being very hypocritical. I'm a little livid right now that that has happened, because that's not disclosed to anybody.

[00:36:35.171] Guvenc Ozel: But the idea is that, again, the intent is to make people aware of that. Because in any public space that you are in, you're basically automatically giving consent to the entire surveillance system to watch you. And what we are doing is that in the last portal, in the big reveal, those videos are actually streaming from YouTube Live directly. But there's an anonymity to that, because when you have the VR headset on, nobody can tell.

[00:37:06.799] Kent Bye: Well, there's gait detection. So you can determine someone's identity by how they walk.

[00:37:12.223] Guvenc Ozel: Correct. But again, the question is this: when you're walking on the street, there are CCTV cameras. There are sensors. We are always constantly being watched. On our internal YouTube Live stream, there is nobody watching it. But the very fact that we can put it on and people not know it...

[00:37:33.224] Kent Bye: But you're also feeding it into Google. You're feeding it into the algorithm. You know, I think there are certain aspects of that where, how do you disclose that to people? Because there was actually an experience at DocLab that was similar, where you go into this room and you're playing with this machine, but as you're playing, it's being broadcast outside. So that's a bit of a violation of my trust.

[00:37:55.116] Guvenc Ozel: Well, I don't think so. I disagree with you, because there is actually no trust in the way in which the surveillance mechanism operates. And our objective is to call attention to that by enhancing that aspect of it. But it also happens anonymously, in the sense that through the grid, through the lighting, it is almost impossible to decipher the motion of people, what they're doing, or what have you. So there's a certain kind of anonymity to the way in which it is unfolding. Your face is covered. Everything is actually in this equalizing space, where the bodies in space look the same. So our objective behind that was also to show people how easy it is, and how natural it is, and how ordinary it is, for every kind of experience to be constantly tracked on any platform, whether it is physical or digital.

[00:38:54.425] Kent Bye: Yeah, and I think there's definitely a value, as an artist, in being able to do these different transgressions to make a larger point. I can definitely see that. But at the same time, you have a responsibility to follow best practices. You have my image or likeness, some aspect of my embodiment, the way I walk, different aspects. I mean, you may have thought that certain aspects are anonymized, but imagine 50 years from now, that video still being there. If you get a hold of that data, all this biometric data, you can go back in retrospect. Maybe it's not possible now, but do you think that five or ten years from now it may actually be possible to go back in retrospect and de-anonymize it? So you're sort of making the same argument that surveillance capitalism makes: saying that it's de-identified data, and so if it's not personally identifiable information, then they can do all this stuff. But the problem is that when you aggregate that information together, it becomes this big picture, and when you combine those different aspects, you're able to unlock that anonymity. So I feel like there are certain best practices that need to be developed and cultivated so that people are informed. Because if it were disclosed that you're being broadcast live on YouTube, I would imagine a lot of people would say, well, I'm not going to do it, because it creeps them out that there's a certain level of surveillance happening within it. And I was never told that. You're telling me right now, and this is the first that I'm hearing of it. So there's a bit of anger that I have, a trust that's been violated. And for you to say that it's not is to disagree with my direct experience of that.

[00:40:32.374] Guvenc Ozel: Okay, well, I apologize for that. It's just that our understanding was that that mode of surveillance is constantly happening. And the reason why I feel comfortable talking about this is because it's a part of the overall structure of the experience. The point that we are aiming to make is about creating the awareness that there's actually no escape from that mechanism, and that it is happening at every level. And I think people should be aware of that, and I think that is very clearly depicted and communicated in the piece. So in that regard, I do feel that there is a sense that everything needs to be communicated. But on the other hand, the terms and conditions that you sign without reading are actually the exact same point that we're trying to make: when you sign the terms and conditions every day, all day, in order to get a service, you're making the same mistake over and over and over again. And in many cases, you're doing it willingly. You're actually fully aware of the fact that you don't understand the terms and conditions, and you are, in a way, submitting to that because you want to see an art experience, because you want to get an email service, because you want to use Instagram or what have you. So it is a part of that entire mechanism. And I think that is, conceptually, the basis of why we did this installation: to create a certain level of reaction in people, not only through the visual information that we're communicating, but also through revealing the background mechanics of that system.

[00:42:20.009] Kent Bye: And I'm trying to remember in the beginning, did you have people sign stuff before they go in?

[00:42:24.432] Guvenc Ozel: Yes, yes.

[00:42:25.753] Kent Bye: And what did it say? Because there may have been a part of it that I didn't read. So what does it say in there?

[00:42:31.416] Guvenc Ozel: It says that, you know, you're being broadcast.

Kent Bye: So I guess that's on me.

Guvenc Ozel: Yes, but also, again, nobody is watching our broadcast. That's the other thing, because it's not public. It's going through the YouTube channel, but it is not a public broadcast.

[00:42:47.900] Kent Bye: It's just live. So you're using the same adhesion contract language and everything and giving it to people in the same way that they're probably not reading it and then probably not really sort of realizing it.

[00:42:58.928] Guvenc Ozel: Exactly, exactly. And that's one of the actual points. And that's why the terms and conditions is the first pop-up menu in the experience. And it pops up again, and you click yes again, and then the transmission happens from glitch, and then you reject, and then you enter the portal. So it's, in a way, embedded into the entire narrative of the experience.

[00:43:24.002] Kent Bye: Well, I guess part of the thing that was lost on me, and that we're discovering in this conversation, is: how can there be a reveal of that at the end, in a way that really shows people that that was happening? Maybe they didn't realize it. How do you communicate that, and what would people do instead? Because there's a part of that that's interesting, but how do you visually show it? How do you tell that larger story, that this whole thing we were talking about actually just happened and you may not have even realized it?

[00:43:50.712] Guvenc Ozel: Well, I mean, again, I think the story of the piece communicates it, in my mind, quite clearly. Because you are understanding, through a series of depictions, how your data is being used. What does that data economy look like? What can we do to control it? How can we participate in it in productive ways? But again, going back to the terms and conditions, that is the biggest metaphor in the piece: we're constantly agreeing to the terms and conditions, and in the voiceover it says we need to renegotiate them. So it is, in a way, not only an actual thing that happens in the piece, but also a kind of metaphorical pattern, so to say, that keeps coming up over and over again. I believe three to four times there's an allusion to terms and conditions.

[00:44:44.712] Kent Bye: So what's next for Persuasion Machines? Maybe you could talk about your reaction here and where you go from here.

[00:44:50.658] Guvenc Ozel: Well, I mean, we're hoping to do next installments of this: show it at more festivals, show it in museum spaces, in more cultural institutions; maybe create panels where we can bring in additional experts to create a larger conversation about this topic; and also, hopefully, do an augmented reality version of this so that we can reach a much larger audience through mobile devices, so that people can experience it in their own homes, and just, you know, really spread the message and really communicate this awareness about this topic. I mean, obviously, it has been very prominently represented in the political realm for the last four years. But I still think there's a huge layer of invisibility and ambiguity about the mechanics of this, about the machinations, and about how data is being used for multiple different kinds of purposes. So in that regard, our objective is to continue doing this and create other installments of it so that we can reach a much broader audience. You know, many people with children, actually, that we encountered through the experience were saying, I wish my kids could see this, because they never lived in a world where this notion of privacy existed. And I think maybe my generation, or your generation, you know, I believe that we're around the same age, is probably the last generation that experienced both. And I do feel like we have a sense of responsibility to, in a way, communicate to the younger generation that this is not the only way, and to allow a much broader, cross-generational conversation to flourish, so that we can actually increase the spectrum of choices about how we manage data and privacy.

[00:46:49.644] Kent Bye: Great. And for you, what are some of the either biggest open questions or problems you're trying to solve?

[00:46:56.489] Guvenc Ozel: Well, I'm interested, as a designer and architect, in what it means to interface with the virtual world. To me, that's very important, and, in a way, developing the design parameters of that world is important. I believe that VR is a much more developed platform than two-dimensional devices. A lot of people are afraid of VR. I think VR has kind of a bad rap when it comes to science fiction or what have you, but I do believe that it could be a very productive technological platform to enhance communication between people and allow a much broader set of conversations to happen: different ways of creating art, different ways of doing business, different ways of communicating with each other for any kind of purpose. And I believe that the design parameters of that require the eye and the sensibility of designers, rather than just engineers. So to me, that is a personal project. I really think that we need to turn VR into a much more comfortable platform. And I also think that the way in which we look at two-dimensional interfaces, especially in public, is not great. You know, when I look at people on a train, for example, everybody's looking at their phone. How can we create media environments that are much more social? That is important to me. But while we're doing that, I think we need to also consider the political aspect of it, so that we don't live in a world that is reminiscent of Minority Report.

[00:48:41.789] Kent Bye: Great. And finally, what do you think the ultimate potential of immersive technologies and immersive architecture, immersive storytelling, what the ultimate potential of all those are and what they might be able to enable?

[00:48:54.353] Guvenc Ozel: I mean, I think ultimately everything is about communication and about socializing with each other. I think, you know, one of the reasons why we were so excited by a platform like Facebook, for example, was because it allowed us to communicate with people that we couldn't communicate with directly, due to distance, due to time, all these kinds of different factors. It basically allows you to have a better social interface with other people. So to me, every technology in a way feeds into that objective: it needs to enhance communication between us. And that's why I think we have to always approach new technologies with enthusiasm and caution at the same time. But it's also a matter of control, and we, as designers, as creative people, need to be much more engaged in the way in which these systems are being produced and managed. So I personally aim to continue to engage with them through a more critical lens, so that we, in a way, can have a say in the way in which they are being formulated and managed.

[00:50:13.545] Kent Bye: Awesome. Is there anything else that's left unsaid that you'd like to say to the Immersive community?

[00:50:17.688] Guvenc Ozel: I think that these kinds of podcasts, giving a voice to people who are aiming to do something outside of the box, are very important. I do believe that we need to be much more active politically in the immersive community. To me, game design and VR are, in a way, a new type of medium, and we need to inject a certain level of, I would say, intellectual endeavor into that medium. I see this as similar to, let's say, the birth of photography and film. Each time a new medium is born, first everybody's trying to figure out the basics of it, but then you have an entire spectrum of different artistic expressions flourish within it. And I am happy to start to see that in the world of VR and interactive art. I believe that right now it is really starting to have a real impact and a cultural significance. I am very excited to see that, and I hope everybody else engaged in this medium is too.

[00:51:26.785] Kent Bye: Awesome. Well, I really love the Persuasion Machines experience, the message and the architectural integrations. There are aspects of the content I love. And there was an evolution of my experience in this interview, where I went from feeling betrayed to then feeling like, actually, that was on me, and that it was a deeper layer of the piece that I didn't get at all. So I was happy to have that whole journey as well. I think I'm on the other side now. I still have some questions in terms of the role of designers: can we create alternative frameworks that are not bad user interfaces, or, you know, the sort of user interfaces that are really feeding into surveillance capitalism? Can you actually use design to create something different? I think that as we move into spatial design, we'll get into those new patterns and ways of disclosure, because all of those levels of consent are going to be really important as we move forward, to really make sure that it's clear for people as they're going forward. So I think there's still a lot of work to be done by the larger community, and that hasn't been figured out by anyone just yet. So I think this piece of art was sort of catalyzing that, and there's still more work to do. But thank you so much for joining me and for unpacking it more.

[00:52:32.922] Guvenc Ozel: Thank you for having me.

[00:52:35.092] Kent Bye: So that was Guvenc Ozel. He's one of the co-creators of Persuasion Machines, and he's also an architect, a VR producer in LA, as well as a media artist. So I have a number of different takeaways about this interview. First of all, Guvenc having a background in architecture, I could really see the influence of that background in this experience. As you walk into the experience, first you have the grid that's projection mapped, priming you to know that you're going to be walking into a space that's roughly outlined on the floor. It gives you this sense that you're going to be walking into a virtual space. Then, as you enter the virtual reality experience, you see the fully fleshed-out room, and then you're able to walk into these different portals, where they try to abstract different aspects of the data that's being tracked. They make this juxtaposition between being in this virtual world, where you just feel like you're in a living room, and walking through these different portals to look at the deeper patterns behind surveillance capitalism and all the data that's being tracked. That gives you these moments of contrast where you're actively resisting the narrative that's being put forth by the technology: you go in and get a different perspective in these portals, and then you walk back out, and they know that you've gone off grid for a little bit, so you feel like you're being somewhat transgressive. And as the narrative progresses, it tries more and more to prevent you from resisting and going in and finding out the deeper story about what's happening with surveillance capitalism. So I really appreciated using space to tell the story.
I think what Guvenc was really saying was that with virtual reality, you have to give people a sense of familiarity: walking in and seeing this kind of parity between the virtual world and the real one, giving you the sense that you're actually in this living room. From there, you're able to step in and go into further levels of inception, where your plausibility gets built up, and as you go into those abstractions, you have more believability, because you have a grounding of feeling like you're in a space that is familiar enough to what you expect. We have all these expectations, and with virtual reality, you have the ability to pervert all those expectations and do experiences that are way out there. But I think what Guvenc is saying is to incrementally change what people expect. You have all this existing vernacular within the digital realm, all these user interaction paradigms, and you start to slowly apply those within these virtual spaces. This is probably going to be a long process of figuring out what that full digital vernacular is. But also, as we go into more and more virtual experiences, our expectations of what is possible are going to slowly expand, and then you'll be able to have a different baseline to start from. I think from his perspective, he's starting with a very low-level baseline of trying to replicate different aspects of reality, and then, as time goes on, doing more and more. And so from a design perspective, that was, I think, one of the strategies that worked particularly well within Persuasion Machines. And in terms of the content, you know, this is a very difficult story to tell, with all the different aspects of data that are being tracked. And I think they're actually trying to mimic different things.
So they're not actually tracking them, but making it feel like they're being tracked, and also trying to critique the different dark patterns that are being used by the big technology companies by using some of those exact same dark patterns. So I was trying to go back and remember, as I was entering into this experience. I saw it during press hours. I don't recall signing a big, long release form with all of this information. I may have been handed an iPad and then scrolled down and signed it. I don't actually know. I was trying to ask, did I sign this? I remember having a piece of paper that I signed, but I don't remember having this disclosure. And even if I did, I don't remember if at that moment, given the pressure of you've got to get in and see everything, it was almost like, okay, I'm just going to sign this, whatever. It's the same type of interaction pattern that I have with all websites, where you're like, I don't have time to read all of these terms of service and the privacy policy and all the nuances. And, you know, it would take you a long, long time to read all the different privacy policies for every service that you use. You kind of get into the habit of just signing away your rights to be able to have access to the service. And there was something about this experience that I just really inherently trusted, that they were going to have my best interest at heart. So there was a bit of a shock to realize, in the middle of the interview, that they had, in fact, implemented some of the very same dark patterns that they were critiquing. That was a bit of a surprise to me, just because I wasn't expecting it. And I also didn't feel like, at the end of the experience, that loop had been closed, to say: okay, everything that you were doing was actually being live streamed.
I realize they're trying to critique it, but I feel like they're replicating a lot of it rather than actually critiquing it, especially around the whole aspect of the third-party doctrine. The more the behaviors that we do change the normative standard, the more that makes it okay: if you get a bunch of people to sign things without really fully disclosing to them that you're live streaming, then all of a sudden it becomes normal for everybody to start doing that. And I actually don't really want to live in a world where there's no clear explanation that whatever I'm doing is being live streamed on the internet. Though when I go back and look at the videos, I don't think they were actually live streaming when I went through the experience; they do have videos of other long sessions. And in talking to a number of different people afterwards, asking, you know, did you know that you were being live streamed, there were a lot of people that were really quite shocked that they were, because there's nowhere in this experience where that's being disclosed. So it wasn't made clear to me, or to a lot of people, that this was happening. And so I think the deeper message, the point they were trying to make around it, was probably lost on a lot of folks. But I think there's also this dialectic between critique and constructive alternatives. There's a value in critiquing these different things, and I also think there's a huge amount of value in actually creating constructive alternatives that are not using those dark patterns, but are using best practices that other people can see and then start to implement.
And I think that's a lot of what I try to focus on here on the Voices of VR podcast: trying to come up with constructive solutions, asking what the best practices for consent are, and making sure that people are clear, as they're walking into an experience, that they are going to be live streamed. That wasn't made clear. There are no signs; you sign a whole release form and then all of a sudden you're being live streamed. There's a role for things like Black Mirror, which really tries to project out the worst-case scenarios of the dystopic potential futures. But I think there's also value in trying to figure out the protopian alternatives, to see what best practices we can start to implement and put out there. So as artists, you have to decide whether you're going to replicate some of those dark patterns in order to critique them, or whether you're going to create alternative best-practice solutions that people can use, ones that go against the existing dark patterns. But overall, I really loved this experience, just because it's trying to tell a story that's actually very difficult to fully tell. As we move forward, we are moving into a world where a lot of these companies are going to have access to all this information, all this biometric data, and there are no existing frameworks to ensure that these companies take this data and use it in our best interest.
I know there was a whole movie called The Social Dilemma that followed the trajectory of Tristan Harris and the Center for Humane Technology, as he goes out and tries to tell the larger story of this asymmetrical relationship between these major companies and the data they have on us: what they're able to tell about us, in some ways knowing more about us than we know ourselves, and how that information can be used to subtly shift and change our perspectives and our behaviors. And as we move further along into virtual reality, with more aspects of our eye tracking data, our galvanic skin response, the way that we're moving, and our different actions and behaviors, they'll be able to correlate what we're seeing in virtual reality with our behaviors and have this huge amount of information on us. All the companies doing these immersive technologies are going to have a goldmine of information. What happens to all this biometric data, and how do we actually have some sort of resistance or protection? This is so far ahead of what the regulatory bodies are even considering or have any sort of conceptual frameworks around. And in talking to the philosophical community, there's no comprehensive framework for privacy being put forth by the larger philosophical community. And so it's really up to each of these companies to figure out the boundaries between what should be private and what should be public.
And you know, with the third party doctrine and all this information being collected, it's essentially creating this potential future and roadmap where, if all these companies start to record this biometric data, then we're saying collectively that this is a new normative standard, that we're okay with having all of our biometrics tracked. And then the government all of a sudden can start to track all that information, because the behaviors that we engage in change the normative standard: the more that these companies record on us, the more that makes it okay for the government to start recording that same level of information, for them to have access to all of our eye tracking data and everything else. And so we're essentially creating this whole 1984 surveillance state. Right now we think it's okay that these companies have access to the information, but the more we do that, the more it changes the definition of what should be private, and it allows the government to have access to that same information. Which is part of the argument I would give against what Persuasion Machines was doing in not really telling people that they were live streaming, because it's actually helping to shift the normative standard in the wrong direction on privacy, without really recognizing the full implications of the third party doctrine and what they're actually doing there. So I think there are ways they're sabotaging their own message without fully recognizing it. Anyway, that's a bit of a privacy rant about that. But the deeper message of what they're trying to say is that this is a huge issue and we need to get more people concerned about it, and I totally agree.
And there need to be more discussions about ethics, and a larger discussion about the third party doctrine, which holds that when you send data over the wire and give it to a third party, that data is no longer private. If we want to be able to have access to all this biometric data, then we need to figure out new ways of doing either edge compute or homomorphic encryption, so that some of that data can stay local and not go over to a third party. It can be an encrypted store of information, so that if we do want to start using it for education or medical purposes, or even for gaming, it's not being sent over to these third parties and having this adverse effect on our privacy, which is the current state of the third party doctrine. So either the third party doctrine needs to change, or, if it fundamentally doesn't change, because the principle is that information shared with third parties is inherently no longer private, then the question is how you can still have access to the information yourself within a more decentralized architecture. And that's where the whole decentralized web comes in: edge compute, and having things locally on devices that you control. Even cloud computing is a bit of a question here: if you're sharing data with a cloud provider, even though it's under your control, is that still a third party? There are definitions for how that all gets worked out. And as we move forward, we're trying to figure out the best way to preserve our privacy, and to not walk into this dystopic future where all these corporate-driven surveillance capitalism initiatives bleed over and create a surveillance state in their own right.
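To make the homomorphic encryption idea above a bit more concrete, here's a minimal sketch (my own illustration, not anything discussed in the episode, and nowhere near production-grade) using the textbook Paillier cryptosystem with toy-sized primes. Because Paillier is additively homomorphic, an untrusted server can combine encrypted sensor readings into an encrypted sum without ever being able to read any individual value; only the key holder on the device can decrypt the result.

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic, so a third party
# can sum ciphertexts without ever seeing the plaintext readings.
# WARNING: tiny primes for illustration only; real deployments need
# 2048-bit keys and a vetted cryptography library.

def keygen(p=61, q=53):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    n2 = n * n
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:  # r must be coprime to n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

pub, priv = keygen()
# A device encrypts two biometric readings locally...
c1, c2 = encrypt(pub, 72), encrypt(pub, 85)
# ...and an untrusted server multiplies the ciphertexts,
# which adds the underlying plaintexts.
c_sum = (c1 * c2) % (pub[0] ** 2)
print(decrypt(pub, priv, c_sum))  # 157
```

The key property is that the server only ever handles `c1`, `c2`, and `c_sum`, none of which it can decrypt, which is one way to compute aggregates over biometric data without actually handing the raw data to a third party.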
Again, I think this is a very difficult story to fully tell, but hopefully they're able to give people at least an embodied experience that leaves them a bit more skeptical of some of these different things, including the dark patterns around adhesion contracts and terms of service, where you're essentially mortgaging your privacy in order to get access to these experiences. Just a final shout out: I did do a whole XR Ethics Manifesto going into a lot of these different aspects, trying to come up with a comprehensive framework for privacy as well as for some of the ethical and moral dilemmas that we have within mixed reality, which I've certainly covered on this podcast, but there are a lot of other issues as well. So I recommend that anybody who's interested go to YouTube and search for the XR Ethics Manifesto. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast, and if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. You can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
