Alton Glass: Yeah. Initially, it was just that I got into Google's VR lab and I pitched the project. And it was just supposed to be, you know, funding to explore a new medium format that they had called VR180. But I got really passionate about it after watching that video of Tamir Rice, the young man who was killed by the police for carrying, I think, a toy gun. And I was like, man, what if, you know, there was an opportunity to create intermediary technology that could have gone up to this young man and told him to put this weapon down, and then the police came in? So I was like, you know what? Let me just make this concept as, like, a short, immersive experience.
Alton: But what that opened up was, as we started doing research about artificial intelligence, it opened up sort of a rabbit hole into the different things that were happening in that space with facial recognition technology and surveillance technology. And I started reading about, you know, China and other countries and law enforcement agencies who are actually developing these technologies for real and then deploying drones. And I was like, wow, you know.
Alton: So we made the piece, and then it sort of organically started to grow, because we started to look at and find out about organizations who were actually looking to grow messaging around this issue. And then it became an actual immersive experience that became a community engagement experience. And then we started to get funding for community engagement. That's when I learned about… we did a program with Black Public Media in New York, focused around community engagement to get the messaging out. So it made me critically think about and learn: okay, if you're going to tell a story, how do you create maximum impact around that message? And then that's when it started to grow. And I met individuals like, you know, Elizabeth and Crux to continue to spread this mission and this message. And now it's taken on a life of its own, and we're able to do workshops, and there are components to this that go beyond just the actual VR experience. That's where it incubated, and that's where it started to teach and grow.
Erinn Budd: That's really, really great. And I just wanted to pose the question to the both of you, with the piece that you created and knowing what's going on in all these different spaces, like Ban the Scan: where do you see AI-centered technology sitting within our day-to-day lives, specifically within policing? Is this really what the future is going to be for us? Or is this something that we can change, or make sure that there are people visible in those spaces who aren't necessarily represented right now?
Alton: I can say that for me, it's always been about creating culturally responsive narratives and narratives of empowerment, because, you know, for years we've always asked for permission, right? And we've always been on the other end of the table as, I would say, things are happening to us. I wanted to create stories and experiences that allow us to have a conversation and discourse around this, and then to work with organizations like Elizabeth's, who focus on civic technology and oversight, right? So that communities can educate themselves on the awareness, the terminology, you know, getting the literacy, so they understand: okay, now I know what this is and how I can participate, whether at a community level or, as she talks about with AI and the Black experience, being able to get into the field and create opportunities for myself to make a change as an immersive storyteller, a creator, a coder, an activist, whatever you feel you're passionate about.
Elizabeth M. Adams: And I'll say this: it's been a really interesting experience for me as a technologist, because I love technology. But I'm also a part of a group for whom some technologies don't necessarily work because of algorithmic bias. And I've had to kind of sit and decide for myself what kind of future I want to see and want to live in. Part of what I think is so important, as we bring a new generation on to talk about the possibilities of AI, is that we create multiple paths for them to explore. One of those paths could be purposeful and meaningful AI versus, you know, just chasing bad actors in the space.
Elizabeth: There are many, many different avenues for people to get involved in this, and in the city of Minneapolis, as they are moving towards their 2040 goals, which is more of a smart city, this is, for me, an opportunity for communities of color and vulnerable populations to be a part of those discussions, so that when lifesaving innovations and technologies make their way into the community, we have a voice. We have partnership in that, and that's what civic tech is. It is forming those partnerships between the community and policymakers so that there is shared learning, shared leadership, and shared decision making. So for the future, my hope is to help guide the next generation towards multiple avenues of exploring leadership in their communities, in their organizations, in their cities, their countries, wherever it is, that gives them, like I said, an opportunity to think about purposeful, meaningful AI, as well as combating some of the implicit bias that we see currently in technologies.
Erinn: And just to piggyback off that, do you believe that we can actually remove implicit bias from AI? And if so, what steps can we take now to prevent it in the future?
Elizabeth: That's a good question. I'm not a data scientist, and I'm continuing to explore that. But what we can do is work that I've been doing with certain CEOs and boards, saying: let's unpack your design. Let's start the questioning way up in the process, so you don't have all these people who are excited about the technology, you've got your budget, you've got your leaders, you've got your teams focused on this, only for it to make its way into the marketplace and for the community to decide it doesn't work for us, and for them to join together with policymakers and ban it. Right? You don't want that. There's a lot of wasted energy and space. And so until we have regulation here in the US that drives that kind of fairness, openness, and explainability within the algorithms, we're going to have to depend on these organizations.
Elizabeth: And so I do think it is possible. I would love to work on a pilot where I can actually work through the entire life cycle, versus just pieces of it. Many companies are a little bit shy about opening up their secret sauce. And that's why we still have these conversations out there about ethical AI: because we've got companies who are in charge of it, and they hire people to work on it, but then once you start exposing some of the harms, they want to get shy about it and not really address it. And so, yeah, it's a constant evolution.
Erinn: Amazing. And I just wanted to, you know, check in with Alton specifically about the piece: were the characters inspired by anyone you know personally, or just by a multitude of different situations? I know over the past year we've seen a lot, you know, specifically within the Black Lives Matter lane, but overall, where did you get the inspiration for these characters? And, you know, how were they created and molded into the people that we see in front of us?
Erinn: Oh, I think you're on mute, Alton.
Alton: A combination. It's a combination of experiences I had, you know, just in life in general, and trying to create characters that you don't necessarily judge, right, because it's easy to sometimes judge and not necessarily know what people go through or who they are. So for me, the main character stemmed from the fact that before I was 21, I had three DUIs. And I'll never forget when I went before the judge; I was actually in court on my 21st birthday, and the judge said, you know, you've been getting in trouble, you know, fights, and now you're on your third DUI. You know, I can really… I can lock you up. What would you like? What should we do with you?
Alton: And I talked to him, and he, you know, gave me some leniency, and he said, look, I'll give you another shot, but I don't want to see you back in here again. And that's when I really started taking my career seriously as a filmmaker, a storyteller, and turned my life around. But had that been technology, where I would have just been labeled as data, that algorithm could have said: no, this kid here, we're not gonna give him another chance, let's just go ahead and sentence him. And that could have been a different trajectory for my life. So that's where things like that stem from, and understanding a character like Cassius: he was a talented individual, but he needed an opportunity to do something different and get out.
Alton: And as a result of him sharing and taking this drone into new spaces and teaching it that he was more than just data, the drone was able to come up with its own assessment of who this kid was. I don't know if you all saw it, and I don't want to give a spoiler alert, but it can make its own evaluation decision, because I believe that technology is an extension of us, and it's not artificial. It's an alternative intelligence to who we are as human beings. So that's where those characters come from, at least the main character, and other people that I've met.
Alton: Even police officers, right? All police officers are not bad cops, and I don't want to portray police as bad cops, but understand that they deal with a particular algorithm, or implicit bias. And we all do too. Even as an African American man, there are things I've grown up with in my community, that I've experienced, that I have to be aware of; I have to check my own implicit bias about certain things. So we all carry them to a certain degree. It's just: can we be cognizant of them and then figure out how we can make different decisions, you know, that affect us all as a whole? Because I feel like narratives are no different than algorithms, and they're becoming sort of one and the same.
Elizabeth: And Erinn, can I add to that? One of the things that is happening in the AI ethics world is that people are moving the conversation from just calling it bias to calling it harm, algorithmic harm, because it can appear, when you're talking about bias, that you're talking about a subject you understand; you understand what bias is. But we really have to understand the second- and third-order consequences of algorithmic bias, and it ends up being harm: harm to communities, harm to individuals, as Alton mentioned, especially when you're talking about predictive analytics around bail algorithms and all sorts of things. And so for me, that's an important distinction that I'm starting to make in the engagements I'm a part of: to start calling it what it is, which is harm to communities, so we can begin to address the real issues around it. So I just wanted to add that piece there.
Erinn: I'm definitely glad that you added that piece in there, because again, with us being more dependent on technology than ever over the past year, the algorithms have learned a lot more about our spending habits, who we are as a person, even down to details that you think you wouldn't share on the Internet. It knows you just because of your patterns. And I wanted to ask you two: technology is an amazing thing, but, you know, it has its downfalls. What could technology do to change the course of some things that we have seen happen, like police brutality? How can AI be used for good, to kind of combat those things?
Alton: Oh, I love that one. I would say, when we look at it, you've got two ends of the spectrum. I was talking to a young lady about exactly this. She was talking about how you have the utopian and you have the dystopian, but in the middle, you have the protopian, right? And I love that analogy, because when you think about predictive policing, it's like, well, why are we just making technology that polices the community? What about an AI technology with an algorithm that says: okay, this particular cop has had these types of offenses, and this particular cop might do X, Y, and Z, so we need to potentially put a watch on this person to prevent them from shooting or killing someone in the future, right?
Alton: So why can't we have the same types of oversight and technology that could be predictive to save someone's life on the other side, instead of defunding the police? What about technology that could help us, the community, and strengthen us? Create that algorithm that can help as well, before someone goes off the deep end, before it's too late, because we know that a lot of those guys have also had prior offenses that led up to them harming someone at some point.
Elizabeth: And I'll say that I am quickly becoming part of the aging population, and I think about technology from a very different perspective in terms of the future of technology and what it could do. If I think about my future, retiring soon and wanting to enjoy it, I don't mind living with technology. Like I said, I actually enjoy it, but I do want it to be safe. I want my data to be protected. I don't want someone tracking me or profiling me and making a decision, using technology, that I don't deserve to live in a certain neighborhood, or that I'm most likely going to commit a crime, or that someone uses a facial recognition system while I have a hoodie on, and now I'm already flagged as some sort of menace to society. So I think about the whole different kind of population that, you know, we're moving towards.
Elizabeth: We've got to protect people. We have to remember that we're humans first, and that, for me, is where humane engineering comes in. We've got to think about: do we really need this technology? And if we do, who is it going to help, and who is it going to hurt? Can we use technology to eradicate racism, eliminate bias? Can technology be used to counter some of these things? The field of AI is spectacular; we're in the fourth Industrial Revolution, the digital age, and there are lots of concepts. So I would just encourage anyone who has an interest to start where your curiosity is and move towards something that is more positive for all of us.
Alton: Yeah, and I think that's where the opportunity comes in, when you bring in storytellers and technologists to start to think about creative ways and things you can create to solve problems, like you talked about, whether it's the aging population or connecting with different organizations to create solutions. We just hosted a workshop with Joseph Business School in Chicago, and we did that after they watched POV; they started to think about how they could use AI and build applications to solve certain problems in their community. And I think that's the beauty of just sparking that conversation and understanding where you could fit into creating a great solution to a problem in your community, or something that could be useful long term with this type of technology for good.
Elizabeth: And I'll just add one more thing, Erinn, I'm sorry. I had a conversation with a young woman a few weeks ago, and she asked me about 2050, what AI ethics would look like in 2050. And I said, I hope we don't even have to address AI ethics in 2050. And we began to, as Alton said, imagine a world where technology works seamlessly through our lives so that people can enjoy it. You know, someone has a robot, someone has their own personal drone, and I can connect with that technology and find out where my neighbor is, or it can even order coffee for me, and I can go down the street and get it.
Elizabeth: And I just think we need to start having those conversations so we can explore the possibilities. And that's why I think POV means so much to so many people: it gives people a chance to immerse themselves in the experience of someone else, so that they can design a world and design technology that fits better in our society than many of the technologies that we have today.
Erinn: I think you hit it right on the nose. And one thing I love about POV and the entire project as a whole is that we are innately problem solvers. Everything that we do is to solve a problem, and POV is a creative way to problem-solve something that we see happening. We have the option and the ability at this point in time to kind of nip it in the bud and change its course and direction a bit. And I just wanted to ask you guys: I know we're already thinking ahead to 2050, 2070, but what can we do in 2021 to protect ourselves, even on our mobile devices or, you know, using certain social media websites? What can someone who is an everyday, average user do to combat these things?
Alton: No, that's a good question, because I really enjoy technology as well. I think the first thing is what Elizabeth shared with me on POV. She made me really think about how we put out our projects, and how we share with individuals who participate in our projects what we're going to do with their data. That was a very big step for me, being transparent in that process. Because if we build those practices, other people will start to be cognizant of them, whether you're a creator or a technologist, and that's a very good start. Then, when they're starting to download other things or participate in other things, they can be cognizant of that first step of responsibility and accountability. And I think that right there starts to get a solid protocol going, and awareness, as we continue to put data back into the ecosystem and deal with it in a proper way.
Elizabeth: Yeah, and I will add: one of the things that Alton and I talk about is that data is the new currency. So people are taking your data. You are on all these free, you know, sites: you're on Facebook, Instagram, Twitter, or whatever it is. So be cognizant of how your data is being used, and at every opportunity, protect your data. You can put a VPN on your phone so that when you do searches, you know, it doesn't tie back to your IP address or your location. So, yeah, we've got to start thinking about that. Like I said, the reason why I'm thinking differently, and thinking about the aging population, is because I've been on Facebook since, I think, 2009. So my data is out. I didn't know about all these things. When I download a puzzle, I don't even know who's creating the puzzle. My data is out there, but I can do some things once I've learned about it. Kind of like what they say: once you know better, do better.
Elizabeth: So protect your data, but also start thinking about how to protect the world, and how to protect your communities and your families through the way that data is being used in your cities. One of the things that you can do, if this is an interest of yours, is get involved in your city government. There are usually tons of commissions and boards and civic organizations that you can be a part of, and you can just start having the conversation, because that's exactly what I did three years ago. I joined a committee, I started asking questions, and the next thing I know, I'm meeting with the chief of police talking about body cams and video surveillance, and then other elected officials. So many of us can do things immediately with the technologies we have in our homes, making sure that you have good passwords and your data is protected, but then also engaging with your city, your schools, and your organizations as well.
Erinn: That's great, because I know I consume a lot of things, and I get concerned that by putting all my information out there, sometimes it's too much and it's overwhelming. You can look and see the database of information that they've pulled on you, basically the demographic card and who they think you are. It can be overwhelming when you think you're just buying something on certain websites or just shopping for your groceries. It's a lot of information to take in, but I thank you two for kind of shedding some light on that. Alton, specifically, what do you see next for POV as a project, and even beyond POV? Like, what are other stories that you want to tell with GRX, and how can we support you, or how can we consume that type of media?
Alton: Oh, I want to make POV a video game that really challenges you to think about how these technologies are used and how you can be a part of prototyping the future, in a way. I really, really believe we can do that and design our future through great activities like gamification and bringing immersive technology together. I'm really, really excited about that space. And then also, you know, working with organizations; I really like working on both ends of the spectrum, with technology and also on the other end, which is making sure you tap back into the earth, you know? So I'm really interested in understanding how you tackle food deserts and food justice through technology. That's another thing that we do in Detroit with an organization called Rescue Me Now Nature; we're helping them implement technology into urban farming and entrepreneurship and beekeeping. So I love those two ends of the spectrum. I think they go hand in hand. I'm excited about those two things right now, and any way we can support at GRX, we're always here, and we love the educational process of sharing our knowledge production and hacking the future together. That's what we're here to do, and that's our mission. So those are the projects we're working on right now.
Erinn: That's really great. And just to kind of piggyback on what you've already said, a big part of where we are right now is education. We always have to educate people on these different technologies, because as much as they are developing, we don't know much about them, and then 10 years down the line we understand how they're mining our data. So, even going back to our K-through-12 audience: how do we give the tools to the children, the next group of people coming up, who will develop these technologies? How do we educate them? And, you know, Elizabeth, please chime in here, but do you know of any ways that we can grab onto the youth and really push them and help them understand, one, how they're using technology, how they are consuming technology, but also how to create, and make sure these different implicit biases are not present when they are the generation that's up and creating?
Elizabeth: Yeah, I can say that there are a number of amazing organizations out there doing that work, but of course they need funding. One of them is the AI Education Project, which happens to be co-founded by a woman named Laura Tanner. She's actually a Minnesota native, and she's phenomenal. But there are others: Black Girls Code, and a number of different organizations out there that are actually doing the work. Here in Minneapolis we have something called the North STEM District, and it is a community of churches, community organizations, and schools. They have the STEM district, and they have a mobile bus that goes around; it's really fascinating, and the kids get a chance to go on with their laptops, and then there are different speakers that come in and talk to them about different things.
Elizabeth: It is about exposure, and some of that can be done in a person's home if they have access to a YouTube video. So yeah, there are many, many different ways. I know for me, I would like to work a little bit more with the youth. I think the kids are fun and they have a great way of learning, and I'm very impressed with how quickly they are able to adapt to technology. And they're going to be the ones that save us, right? So we definitely need to invest in their future, and invest time and resources into their growth, into building their minds, because when they become adults, I'm going to be relying on them for the technology that keeps me safe.
Alton: Yeah, I would agree. That's actually how GRX initially continued to grow. When we did our first VR project, Calling On Love, it was a youth-based sort of piece, kind of like a Honey, I Shrunk the Kids in VR, and Verizon had a chance to see it at a festival. They asked us, could we create immersive experiences for youth, for K through 12? Well, actually, from middle school through high school. And then we started to build out immersive experiences that were focused on educating them about STEM awareness.
Alton: And now we have a project coming out with Verizon Learning, and we do workshops with them around immersive technology. We have a program that is like an immersive experience and creative collaboration application that teaches them entrepreneurship, STEM entrepreneurship, and how to leverage an understanding of the arts. And they get digital literacy as it relates to understanding, at an early age: what is artificial intelligence? What are these things, and where are they? So they can identify what augmented reality is through the things that they play with; they're able to understand, okay, I know what Pokemon Go is now, oh, it's an augmented reality app, and it breaks down these things for them. So they see, like, Roblox for building things, and Roblox classrooms for them to go into and understand what's on the backend of Roblox, and how you build this and participate on the technology side rather than just the consumption side.
Erinn: Looks like we have a question here in the chat. And if you guys have any questions, feel free to drop them in there. But we have a question: how do you think about public education and awareness, particularly in communities where there is a recent increase in crime leading residents to push for increased police presence and surveillance? Do you guys have any comments on that?
Alton: I would say…
Elizabeth: I was gonna say can we ask the person to expound a little bit on that?
Erinn: Yadi, if you're comfortable, I would love for you to jump in.
Participant: Hi. Yes, I'm Yadi, here from Pasadena, California, and I work with a local group trying to promote a moratorium on facial recognition and other surveillance tech. And what we're finding is that, probably as a result of the pandemic, there's been an increase in crime in our city. And while my group here locally is trying to do advocacy and awareness about bias in tech and all of the surveillance tech, we're seeing that there are residents who are very concerned for their safety because of increasing crime, which is probably a result of issues stemming from the pandemic.
Participant: But how do you balance that, and try to rally residents to push for greater transparency, accountability, and a moratorium or ban on the surveillance technology, when, at the same time, there are residents, who are most likely the ones affected by this tech, actually asking for greater police presence and installing more cameras in public spaces, without realizing the harms to their own community in doing that?
Elizabeth: Well, you are describing Minneapolis perfectly, because after George Floyd's murder, that's exactly what happened here: there was a lot more crime because our police officers were taking short-term disability, and we also have a situation here where only 8% of our police officers actually live in the city, so they weren't familiar with the community. And so we basically had to face this ourselves: do we continue pushing for a full ban of all the surveillance technologies, or is there one specific thing that we could pinpoint? We had a number of residents who were concerned about the crime and actually wanted more surveillance, to your point as well. And part of the answer, I would say, would be to continue working with your policymakers and your communities. I don't know if you are involved in a coalition, but I would invite you to research our POSTME effort, and we can have a conversation after this.
Elizabeth: But look at the research. It's something that your policymakers are going to have to do, and your community is going to have to do as well, together, and that's what we had to do. That's why we spent so much time educating policymakers. We had a lot of town halls, we had a lot of community meetings, helping people understand that the elected officials pushing for a ban gave us a pause, so that we could find greater transparency in that particular technology. Not all of our technology is banned; we're still pushing for transparency and oversight and accountability. But it is a huge community lift, and you want to make sure that you have everyone involved at the table.
Erinn: Now please, if you have any additional questions, just throw them in the chat; we'll get to them as soon as possible. But, Alton, what is a way that you want to get POV out to the world? Beyond, you know, MozFest, how can we get POV into spaces where it wouldn't otherwise be easily accessible? How can we get this in front of kids? How can we get this into classrooms? How can we stay connected with GRX and the journey of POV? Because, correct me if I'm wrong, I know you have bigger plans for POV. Like, how can we keep this information and this conversation going?
Alton: Yeah, so I'll put, you know, pov at grximmersive dot com in the thread, and then we also have a Mighty Networks community around these topics. If you have that link, Erinn, I'd love to share that out. I believe it's also in the POV Flowcode link, where there are a lot of resources and access to the community that we're developing around this. And I think the goal for me as a storyteller is to continue to, you know, connect with other creators and storytellers and technologists, and then individuals who are out there creating, like, for example, the work that Elizabeth is doing on the advocacy and ethics side, right?
Alton: I think it's a marriage that has to happen, and there are building blocks we have to put together from different aspects, and we can continue to build that ecosystem. We can start to build support: like Elizabeth mentioned, they created the civic technology oversight there, and Yadi here is trying to create something in Pasadena. So how do we take those frameworks, as storytellers bringing awareness, and get that knowledge base together, and then move it over to these other communities that need that particular type of resource and help and need to know where to start? And remember just how powerful storytelling is. When you bring those things together, how do we create that community together? I don't have all the answers, but I want to be able to bring light to certain things as a storyteller.
Erinn: Elizabeth, for you, in your most ideal, picture-perfect world, which, you know, is never necessarily the case: what do you foresee for the next, I don't even want to say 5 to 10, but, you know, 3 to 5 years, when it comes to AI and ethics? What do you want to see happen, even if you think it's not feasible within that time period? What do you think we can actually start putting actions towards, outside of, you know, having conversations? What can we do now that you think is tangible, and if it's not tangible, at least within reach in the distant future?
Elizabeth: Erinn, that is such a great question, and part of my leadership style is never to tell anyone what they should do. I always like for people to explore, because when you start exploring, your own gifts and talents will serve the community, your own space, your own orbit, in the best way. I get a lot of inbox messages asking me how to get started. I can point people to different groups, but ultimately every person has their own gifts and talents, and you start right, you know, where you are.
Elizabeth: For me, leadership comes naturally, and also storytelling, not in the way of an immersive experience, but talking about it from a human experience: telling people what it feels like to do this work. Because I am a technologist, I can tell the story of what it's like being a technologist and also someone who has been falsely accused of being somewhere because of video surveillance. So I like to merge those worlds. But every person that I talk to, from a data scientist to a CEO, they all have stories. Just this week I talked to a CEO about a particular family challenge that they might have been going through, so I talked with them about those things that are important to them and helped them understand: well, imagine being someone who has no ability to affect the technology that makes it into their world. So I would just say, honestly, find something where you can center joy in the work and your own strengths, and start there, and it will lead to somewhere.
Erinn: Well, that was really, really powerful. And I know it wouldn't be proper not to bring COVID into this, given the world that we're living in, because we can't have these physical spaces. So, thinking outside of policing, how do you think implicit bias and AI have affected even the medical industry? Because we have seen a lot of people over the past year, you know, particularly people of color, having issues with access and getting different treatments and things like that. Have you guys thought about, even on the medical side, what that looks like and what steps can be taken from there?
Elizabeth: Yeah, I can tell you, here in Minneapolis, one of the joys that I do find is helping people find vaccinations, because I need to get mine, right? And so I'm a part of this Facebook group that helps people, based on their eligibility, find vaccinations. Here is what's so interesting: in the state of Minnesota, we have most of our population in the Twin Cities, Minneapolis and St. Paul, but the large number of vaccinations are four or five hours away, so we're helping people who are willing to drive four and five hours away. That doesn't make sense to me, and there are certainly some algorithms involved in deciding the pharmacies and where they're located. So you've got 5,000 vaccination appointments available in rural Minnesota, but you don't have that many people there who are eligible to take the vaccine yet. That just doesn't make sense.
Elizabeth: Now, on the city level, that would be different, because the city has prioritized racial equity, and they actually have ordinances that prioritize racial equity in all of their business systems. So they would take that lens, and even when there was testing, they had the proper kinds of protocols around it. But from a state level, that's what we're seeing here, and that is, to me, why your question is so important. It's like, what could we do? They could certainly have involved some technologists from the community to say: this is how we can roll this out. And even with some of the rollouts in the community, it's like Lord of the Flies here, unfortunately. You have some communities that really, really need it, where most of the COVID cases have been, that just don't have it, or they're having events that someone has to tell you about. So there's a lot of work that needs to be done as we move into how we're handling health care as it relates to COVID, as it relates to communities of color, specifically vulnerable populations that are mostly in metropolitan areas across the country.
Alton: I would say, I have not personally gotten into the health aspect yet, but what AI has done for me is, you know, once you learn one thing, it takes you down a path of discovery into other things. And now I'm looking at, you know, how does AI impact the future of digital assets and digital likeness and avatars, and, like, you know, your content as a storyteller, and what they call 3D humans. What does it mean in the future when I have an asset that looks like me, and someone can take it and program AI into it and make it say anything they want? So, what does that look like in the future, with Black bodies or BIPOC, you know, characters, and our digital imprint in the future with AI? What can be done with the power of that? So I'm really, really exploring those areas now and what that looks like with these MetaHumans now coming out.
Erinn: Yeah, definitely, MetaHumans sound like something from an I Am Legend movie, but it's definitely something that is real for us. And for anybody that doesn't necessarily know more about MetaHumans, do you have a resource that you can direct me to for any additional information on that?
Alton: On MetaHumans?
Erinn: Yes.
Alton: Yeah, I learned about it when I did The March. You know, we scanned someone, and then they used artificial intelligence to recreate different aspects of Dr. King's face. And it really blew me away how they were able to create a picture for the cover of Time magazine from the virtual reality experience; you could not tell it wasn't a photograph of Dr. King, but it was a 3D rendition created through artificial intelligence. So now you're talking about taking a deepfake and changing a video to say anything you want, where people have the power to start wars or riots from a video. So it's like, where is the oversight with that as we move into the future, when it comes to, you know, saying someone did something that they didn't do with a video? As for MetaHumans, you can go to Unreal Engine, where you can actually start to learn how to create content in that aspect as well. I'll put a link in the thread, but you can learn more about it there.
Elizabeth: Can I just add, not so much about MetaHumans, but just about capturing the real narrative and the historical pieces of what's happening today? I think that is so important in Alton's work, and he could talk a little bit more about that. And the reason why I say that is because, as an AI ethicist, I look at a lot of different tools out there, and some of them I play with. One of them happens to be a tool that develops a portrait based on you putting in your picture. And there's one out there that takes Black people and creates these portraits, but they look like white people. I put Cicely Tyson in there, and I put Maya Angelou, and I put Shirley Chisholm in there, and they all came out as white women from a Victorian time. That is chilling to me, because these are images that could be used to tell stories, that could be used to educate children about who these people are, and we are erasing a whole population of icons. And not just icons: you know, I put my picture in there, and I came out as a white woman as well.
Elizabeth: And so it just feels like we're constantly chasing these types of things, which is necessary, and which brings me back to my point of why regulation is so important: that organizations begin to explain, begin to be transparent, and begin to ensure that there is integrity in the products they have out there. So when we talk about all these different things that can be done with AI, even in some of the chat spaces, I've signed up for some avatars that do not look like me; they don't take into account my natural hair. So we have a lot of work to do. There is a lot of work in this space, but I am at least hopeful that we'll be able, at some point, you know, Erinn and everyone, to turn the corner as we get the next generation involved.
Erinn: Absolutely, absolutely. Does anybody have any questions, you know, out there? I know we've been kind of chatting for a bit, but does anybody have a direct question for Alton or Elizabeth at this time?
Participant: Can I ask a question? One of my favorite things about the experience of POV is the use of emerging media, or emerging technology, to interrogate technology. And you touched on this a little bit already, Alton, being mindful of how to use the data of people who go through the experience. But can you maybe speak about the challenges with that, and how you can make these processes transparent to the audiences that come through the experience?
Alton: Yeah. So when we were creating the experience, we had a lot of different entry points into it, and, you know, Elizabeth went through everything to make sure we were really intentional about some of the questions we were asking. And letting people know why we were asking those questions was very important. So that was really helpful for me as Crux and I put all this information together, so we knew how to formulate those questions to get the proper results back, and to give people an understanding of why we need this data: so that it can be used to empower them, and to do the research we need so it can be something measurable that creates a real opportunity for impact and change in what we're doing. That was one thing. Two, it allowed me the opportunity, as we piloted this, to figure out what interesting ways we could also use our skill sets in immersive technology to create something.
Alton: So, like the privacy policy: we built that in Unreal Engine, where you now have one of the characters talking to you about coming into this world, saying we want to be transparent about your data, so take a moment, please, to review this, and then come into the experience. I think there are fun ways now to make it not just about having to go read 20 pages, but to make it intentional, you know, as we move through these experiences. So that's what we had to do, and I appreciate them making me think about how to onboard in a way that's meaningful and intentional and empowering at the same time, throughout this immersive experience.
Elizabeth: And I'll add that that is exactly the way to be responsible as an organization and as a leader. One of the things I'll say is that, obviously, Alton and his team have built an agile framework that lets them course-correct and pivot so quickly. And that's ideally what you want in your own organization, or in organizations generally: that when new information becomes available, you can digest it and correct course, so that people can feel safe and comfortable using your experience or your tools.
Erinn: Thank you for that question. We have one last question in the chat: does AI have anything to do with driverless cars, and the decision of what to do in an accident, in choosing who's hit and who's, like, walking by?
Alton: That's a very good question.
Elizabeth: Yes, it does. There was… I'm sorry, Alton, were you going to go ahead?
Alton: No, go ahead.
Elizabeth: One of my early, early experiences, and this will tell you how long I've been in this space; it's only been three years, but I swear it feels like 10. One of my early experiences was doing a learning event on autonomous cars and facial recognition, and what happened is that the Georgia Institute of Technology did a study, I want to say it was back in 2019. Autonomous cars have cameras on top, and they use recognition, and they were having a hard time identifying darker skin tones. And so some of the experiments actually showed that there was a high potential for crashing into a pedestrian with a darker skin tone. And Apple right now has, I believe, the patent for facial recognition scanning on self-driving cars.
Elizabeth: And so I hope that they are addressing that. But this study was so intriguing to me, because again, these are the types of things where we just want to live; we want to be able to go to the store and do all of these things. And yet technology is all around us, and we are not 100% comfortable that it's safe for every single person in society. So to answer the question: yes, there are studies, there's a patent, and hopefully they're doing the right thing by increasing diversity in the data sets used to train the algorithms that the cameras on top of the autonomous cars rely on.
Erinn: Well, I know we have to kind of wrap here soon, and I wanted to thank you guys for this amazing conversation. Directly after this, we do have the Spatial Chat with Alton, and Elizabeth has many things to do, so we will see her at a different time. But for both of you, what's the best way for people to keep in contact with you and follow what's going on with you day to day?
Alton: You can inbox me at pov at grximmersive dot com, and on IG at Alton Glass VR, and LinkedIn is always good too, but, you know, I'm accessible. And I just want to say I hope that you all go out and create, and we can continue to hack the world with really cool problem-solving ideas.
Elizabeth: Yeah, I'd say LinkedIn is the best way to see kind of the things that I am a part of. And as Alton said, I look forward to seeing all the amazing things that everyone who participated in this event today end up doing in this space, whether it's writing an article, creating a product, you know, giving a speech, all of us can play our part in building stronger communities with technology. Thank you.
Alton: Thank you very much. I appreciate you all.
Erinn: Thank you both. It's been, again, like I said, amazing. I've been having a really good conversation with you guys on and off camera, but I just wanted to wrap up by again expressing my gratitude towards you guys for putting this together, dedicating the time and the research. I know you both are super, super busy, but this was an important conversation that needed to be had, in a space where everyone was seen and heard and definitely felt. So if you guys want, I'm gonna cut this session a little early, like two minutes; everybody can go get some water, go to the bathroom, and then, you know, come join us for the Spatial Chat. Thank you guys so much. Elizabeth, you're amazing. Alton, so are you. And hopefully I will see you guys in the Spatial Chat; if not, I will see you guys on the Internet. Make sure that you follow GRX and Elizabeth for any updates that are coming along on either side, and we will definitely connect at a later time. So thank you all so much. I really appreciate it.
Elizabeth: Thank you, Erinn, you were amazing. Bye everyone.
Erinn: Bye guys.
Alton: Another link for that?
Erinn: Yeah, the link… the link is in the chat.
Alton: But I got to sign into the platform, right?
Erinn: Mhm.
Alton: Okay. All right. I see, that's at 12:15.
Elizabeth: Yeah, pretty seamless. It just prompts you at the beginning, and then you're in.
Alton: All right, that's…