The complex questions that arise when algorithms and AI intersect with race

How does a computer discriminate?

OK, not exactly a computer - more like the wild array of technologies that inform what we consume on our computers and phones. Because on this episode, we're looking at how AI and racial bias intersect. Safiya Noble, a professor at UCLA and the author of the book "Algorithms of Oppression," talks us through some of the messy issues that arise when algorithms and tech are used as substitutes for good old-fashioned human brains.


LORI LIZARRAGA, HOST:

Hey everyone. Lori here. Before we get to this week's episode of the show, a little bit of news. CODE SWITCH is coming to Arkansas. We're going to be performing a live show in Little Rock on December 7. We'll be sharing more information about that, including where to get your tickets, really soon. But in the meantime, we need your help. We'll have a portion of the show with our famous Ask Code Switch segment, where we answer listener questions about how to deal with the racial quandaries and queries that come up in your own life. They can be silly, serious, social, personal. We want to hear them all. So please send us your questions, especially if you're from Arkansas, live in Arkansas or just want to know something about racial dynamics in Arkansas.

(SOUNDBITE OF MUSIC)

LIZARRAGA: My co-hosts and I will choose some of those questions to answer live on stage. So if you want to hear our brilliant musings, our reporting and advice, please email us at codeswitch@npr.org with your questions, subject line - ask CODE SWITCH. Again, that's codeswitch@npr.org, subject line - ask CODE SWITCH. OK. On to the show.

You're listening to CODE SWITCH. I'm Lori Lizarraga. Today on the show, we're tackling an issue that's been in the news a lot recently. Familiarly known as AI, we're getting into what artificial intelligence is, what it isn't and its history. So let's start with this. What's the first thing that pops into your head when you hear AI? I don't know about you, but for me, it is an embarrassing storm of "I, Robot"...

(SOUNDBITE OF FILM, "I, ROBOT")

WILL SMITH: (As Del Spooner) Robot write a symphony?

LIZARRAGA: .../"Spy Kids"-type tech that has finally helped us achieve the magic microwaves and makes the checkout line a completely contactless flying-robot experience. I don't know.

(SOUNDBITE OF FILM, "M3GAN")

RONNY CHIENG: (As David) M3gan?

(SOUNDBITE OF FILM, "HER")

AMY ADAMS: (As Amy) You're dating an OS? What is that like?

JOAQUIN PHOENIX: (As Theodore) It's great, actually.

(SOUNDBITE OF FILM, "M3GAN")

CHIENG: (As David) What are you doing?

(SOUNDBITE OF SONG, "INSPECTOR GADGET THEME")

UNIDENTIFIED MUSICAL GROUP: (Singing) Go, gadget, go.

LIZARRAGA: But, you know, whatever. I know that's all ridiculous, sensational TV make-believe. That's the point. But according to one expert in the field of algorithms and artificial intelligence, that vague understanding of AI as the beginning of our dystopian robot-takeover end? It's not just me.

SAFIYA NOBLE: We've got two major industries that have created our imaginary about AI, and that's Hollywood and the tech industry.

LIZARRAGA: That is Safiya Noble.

NOBLE: I'm a professor of gender studies, African American studies and internet studies at UCLA. I study the worst parts of the internet.

LIZARRAGA: And she's the author of the book "Algorithms Of Oppression: How Search Engines Reinforce Racism." She told me there are a lot of mythologies when it comes to AI.

NOBLE: Let's talk about, like, what we think it is and then what it isn't. We think that it is better than human beings. We've been told that it's sentient and that it has a mind of its own, and that mind is more powerful, again, than ours. We have to actually contend with the fact that there's kind of two modes of AI. There's kind of generalized AI, and this is the "Terminator" kind of imaginary that we have coming out - right? - like the killer robots who think and have their own agenda. And then there's narrow AI, which is actually the kind of AI most of us engage with. It's like an app on your phone.

LIZARRAGA: Apps like Duolingo or Google Maps, Alexa, Snapchat, Instagram, even autocorrect - functions and features that we use every day without ever really thinking of them as AI. And because AI makes it possible for machines to learn from experience, that means AI is susceptible to the same biases as the humans it's simulating.

Some of our colleagues at NPR just reported on a good example of this in October, about a social scientist who tried to create a picture of a Black African doctor treating white children using an AI image generator - basically asking the image generator to invert the trope of a white doctor caring for African children. And the software couldn't do it - literally. It just, like, would not compute. Midjourney, the AI program they used, could create images of Black African doctors, and it could create images of white children as patients. But all the data this machine had learned from wasn't enough for it to put those two images together and generate something it had never been shown.

That's not an example that's going to come up in everyday life, right? But it is a downstream effect of a learning technology that is coming of age in a landscape built on racism. And that's a problem when it's encoded in computer programming that impacts everything from elections and housing policies to health care and who goes to jail. Professor Safiya Noble happens to be a pioneer in uncovering this kind of algorithmic bias. She first started wading into the world of algorithms, kind of by accident, all the way back in 2009.

I know that AI, for some of us, is going to be relatively new, but for you, we know AI is not a new study, and it came into your purview some 15 years ago now.

NOBLE: It's - yes, it is about 15 years. So I had spent my first career in advertising and marketing.

LIZARRAGA: She contains multitudes.

NOBLE: Yes. I mean, I know a couple things here. It was interesting to me because this was, like, in the early days. We didn't even have something called search engine optimization as an industry. In the ad industry, what we were doing is, like, hiring some young, white guy to come 'cause that's who the programmers were...

LIZARRAGA: Sure.

NOBLE: ...And we were like, hey, dude, can you come in here? Like, help us figure out how to get our clients on the first page of like, Yahoo! and, you know, oh, there's this new search engine, Google. I mean, this is how - you know, how long ago this was.

LIZARRAGA: That is so awesome.

NOBLE: I know. And then we were like, but you got to make this copy look like it's objective and it came from, like, a journalist or some third party and not an ad. I mean, it was like these were the meetings I would be in.

LIZARRAGA: You were on the inside.

NOBLE: On the inside. And so the recession hits in 2008. I go back to grad school and, you know, I'm listening to people talk about Google and they're like, oh, yeah, you know, it might be - it's, like, the new public library online. And I'm like, whoa, wait, what? No, it's not. And then this amazing book comes out called "The Googlization Of Everything: (And Why We Should Worry)" by Siva Vaidhyanathan. And I'm like, see, that's what I'm talking about right there. I mean, I love this book. I'm reading it as a grad student, and I'm like, yeah, he understands that the search engines, they're really controlling what people see, and that's kind of dangerous to the future of knowledge. But then I felt like, well, let me just add this one little part. You know, the future of knowledge for people of color, especially on issues of, like, race, is going to be determined by these advertising companies. And I want to talk about this.

And so I'm at home and I'm, like, thinking about, like, how to craft a study. You know, at the time, my daughter was a tween, and I was just, like, let's start with Black girls, Black women. I type in Black girls, and the first thing that comes up when I type in Black girls, like, 80% of the front page is porn.

LIZARRAGA: Wow.

NOBLE: And I'm like, whoa. OK. I didn't have to type sex. I don't have to type the word porn or sex or anything. Black girls are just synonymous with pornography. And of course, I look in all these sites and they're women, you know? So this is just, like, a fundamental disconnect that women are coded as girls, and Black girls are coded pornographically, as are Latina girls, as are Asian girls. And so now I'm doing a full-blown study on all kinds of different racial and ethnic and gender combinations. And very much, it mirrors the racial hierarchy of our society, which is to say, if I look up white men, guess what I get? I get, like, white - the color - men's shirts. There's nothing about, like, white men as a racial category. It's, like, white, as a color, fill in the product - OK? - which is not surprising. But again, it's, like, you go from that to Black girls - porn.

LIZARRAGA: But are we to understand that is the search engine program being programmed with racial bias into it, or is it, like, a learning program that is taking in a racial bias from who is using it?

NOBLE: It's a couple of things. I mean, what I really tried to do in the book "Algorithms Of Oppression" is unpack how search engines work.

LIZARRAGA: OK.

NOBLE: Their core business is AdWords, which allows any company or anyone to say, I will pay X amount of money to make sure that every time this keyword is searched on, it is linked to my product or my page or pages. So if AdWords is the fundamental operating system of Google, then this fantasy - and at the time it really was a fantasy - that what you see in the organic search results on one side is totally different from the ads, well, that's just ludicrous.
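
To make that mechanism concrete, here is a minimal sketch, in Python, of the keyword bidding Noble describes. It is an illustration only, not Google's actual AdWords auction, which also weighs ad quality, click-through rates and much else; every advertiser name, bid and matching rule here is invented.

```python
# Toy model of keyword bidding: advertisers pay to attach their page to
# a keyword, and the highest bidder wins the slot next to the results.
# Illustration only -- all names, bids and the matching rule are invented.

bids = {
    "running shoes": [("ShoeCo", 2.50), ("SneakerHut", 3.10)],
    "coffee shop near me": [("BeanTown", 0.75)],
}

def ad_for_query(query: str):
    """Return the (advertiser, bid) of the top bidder whose keyword matches the query."""
    best = None
    for keyword, bidders in bids.items():
        if keyword in query.lower():
            top = max(bidders, key=lambda b: b[1])
            if best is None or top[1] > best[1]:
                best = top
    return best

print(ad_for_query("best running shoes 2023"))  # ('SneakerHut', 3.1)
```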

LIZARRAGA: That really was wild to read about. I think I had a vague understanding, but to be honest, like many other people, I do think that there is this, like, general understanding that algorithms are this truth machine. Like, you type in a keyword into Google and it's just going to give you, like, the most correct answer.

NOBLE: I mean, listen, Lori, part of the reason that that does happen is because, first of all, if you're shopping for something, which is the way a lot of people use search engines, or you are just looking up something extremely specific, it is very likely to give you the right answers, OK? So that's actually part of the banality, like, the boringness of search is that it's like, I don't know, where's the nearest coffee shop? What time does Starbucks close? Like, all these kinds of things, you're going to get the right answer.

LIZARRAGA: Right.

NOBLE: So then when you go and you ask something more nuanced, now you are already primed to believe that what you're getting is true. And what my point in the book was, at least on that one example, is, listen, even if all the Black girls in the country, if we broke all the piggy banks open, we would never have as much money as the porn industry. We'd never be able to recuperate or recover our identities in these spaces. And this - when I made this argument a decade ago, I will tell you that up to that moment, the tech industry itself, as well as everybody else working in computer science that I was working around, would say, Safiya, algorithms that are running search are just mathematical formulas - right? - they're just, like, a model, this, like, narrow AI model. They're just like a...

LIZARRAGA: Like, a perception that they truly are unbiased...

NOBLE: It's just...

LIZARRAGA: ...Neutral...

NOBLE: ...Math. They're like, Sis, it's just math.

LIZARRAGA: And they really believed that.

NOBLE: Yeah. They believed that. And I was like, it's not just math. And prior to that, I would say very few people would concede on that point. And then when the book "Algorithms Of Oppression" came out, everybody was like, well, OK, I guess. All right, that's true. And you see the difference, because now we know that the porn industry still spends the most money online, right? But if you search for Black girls now, you get Black Girls Code. You get all these, like, amazing organizations. You get, like, all these different things. And so we know that, in fact, there are decisions and there are human beings, and it's not just math. And I think reducing the conversation about algorithms and AI to saying it's just math really strips away the social context within which the math is deployed, which has all kinds of politics and meaning attached to it.

The premise of AI is really that we will take a lot of data that is culled from everywhere - the more the better - and we run statistical models on it to look for patterns. So it's taking, let's say, in the case of any one of us, our GPS locations - you know, the data that's being fed from that - the apps we have open, the things we've clicked on, the things we're, you know, engaging with, and then merging that to develop profiles about us. And that, you know, is what we think of as kind of the tasks of narrow AI.

LIZARRAGA: In other words, like, getting to know us.

NOBLE: Kind of getting to know us and then making predictions that are better than, you know, we could predict for ourselves. And this is kind of one of the ways you hear people talk about AI is that AI kind of knows more about us than we know about ourselves, and partly because it's, like, aggregating all of this information. Companies are aggregating and data brokers are selling lots of data that's collected about us that we don't even realize is happening. And it's a, you know, billion-dollar industry. So I think that the most important headline about AI is that it is wholly artificial, because any given data point that's collected about people or, you know, the world doesn't tell the whole story about who we are.
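
As a rough illustration of the profile-building and prediction Noble is describing, here is a minimal Python sketch: it aggregates an invented event stream into a crude profile and makes a naive "refill" guess of the kind that comes up a little later in the conversation. Every record, field name and rule here is made up for the example; real data brokers and recommender systems are vastly more elaborate.

```python
from collections import Counter
from datetime import date, timedelta

# Invented event stream standing in for the GPS pings, clicks and app
# opens Noble describes; every record here is made up for illustration.
events = [
    (date(2023, 10, 1), "purchase", "coffee beans"),
    (date(2023, 10, 15), "purchase", "coffee beans"),
    (date(2023, 10, 29), "purchase", "coffee beans"),
    (date(2023, 10, 30), "click", "espresso machine ad"),
]

def build_profile(events):
    """Aggregate raw events into a crude interest profile."""
    return {
        "purchases": Counter(item for _, kind, item in events if kind == "purchase"),
        "clicks": Counter(item for _, kind, item in events if kind == "click"),
    }

def predict_next_purchase(events, item):
    """Naive 'refill' guess: assume the average gap between past purchases repeats."""
    dates = sorted(d for d, kind, it in events if kind == "purchase" and it == item)
    if len(dates) < 2:
        return None
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    return dates[-1] + timedelta(days=sum(gaps) // len(gaps))

print(build_profile(events)["purchases"])             # Counter({'coffee beans': 3})
print(predict_next_purchase(events, "coffee beans"))  # 2023-11-12
```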

LIZARRAGA: Right.

NOBLE: And it's deeply unintelligent. There's no intelligence happening. What we have is just kind of a series of different kinds of predictive models. And here I have to point to and call out the work of Professor Emily Bender and Dr. Timnit Gebru and Deb Raji and their collaborators who wrote this incredible paper on generative AI. So, you know, generative AI, which is ChatGPT and these different kinds of technologies that we're dealing with now, are really - they're just prediction machines.

LIZARRAGA: Right.

NOBLE: Really, what it's doing is predicting a lot of things that are just absolutely incorrect. And you're already seeing the studies now that are showing that more than 50% of the things ChatGPT might respond with are just factually incorrect or made up. So I think, you know, if I were to say, what is AI? - I would say AI is kind of like a smoke-and-mirrors, you know, fantasy that is, like, sold to the public in order to get us to invest more in it and make, you know, companies richer. And it's novel and it looks like magic, but it's really not. And it's changing, certainly, the way in which we're engaging in industry and education, in lots of different occupations. But it is not a replacement for the genius of the human being.

LIZARRAGA: Which is probably going to feel slightly reassuring. I know that it does to me. But, I mean, it's good to remember that it isn't the most profound thing in the world to be able to guess that you're going to need a refill of the thing you bought 40 times.

NOBLE: This is what I'm saying. It's also kind of like - well, you know, I asked ChatGPT the week that it came out, what is critical race theory? And when you ask these kinds of technologies about questions of race, it's really interesting. The answer it gave me was that critical race theory is a theory about, you know, systems of racism and racial oppression in our society that, you know, deny opportunity or have disparate, like, impact. So it was kind of a very short little paragraph on the textbook definition. And then it had about four paragraphs that were like, critical race theory is very controversial. It's really problematic. I mean, then it - like, it goes on to basically say, like, it's a dangerous ideology. And I thought, well, wow, this is interesting.

LIZARRAGA: Aggregating, like, the perspective.

NOBLE: It had a point of view, and that point of view was more informed by the incredible amount of propaganda and racist, vitriolic comments on the internet about CRT and what politicians and Marjorie Taylor Greene and whoever else - you know, Ron DeSantis has had to say. And guess what? They all are saying so much more about critical race theory than racial justice scholars or activists, right? So just the weight and the volume of the anti-CRT...

LIZARRAGA: Yeah.

NOBLE: ...Discourse is going to overdetermine and skew what the response will be. And this is why it's so dangerous to me to be...

LIZARRAGA: Right.

NOBLE: ...Using these kinds of technologies to answer questions about society.

LIZARRAGA: Right. I mean, it's the group that's the loudest, not the most right. I guess I'm curious: What are the implications of our preconceived notion that this is just, like, a truth machine? What risk does that pose when we have the wrong impression and the wrong information?

NOBLE: The fact that it seems like a human response really seduces people, again, into believing there is a superior intelligence or superintelligence. And that, I think, is so incredibly dangerous...

LIZARRAGA: Yeah.

NOBLE: ...In our society that - the fact that the technology is even made that way, that it's made to look like an expert and presented like expertise. And, of course, for those of us who are experts in many different areas - I mean, one of the first things I did the week ChatGPT opened, also, was I asked it to write a syllabus in an area that I'm expert in. And guess what? Almost all of the citations were made up.

LIZARRAGA: Really?

NOBLE: So if you don't have expertise, you actually can't figure out what is truthful in these systems. And that, to me, also is really important.

LIZARRAGA: So how do these issues get addressed? Because as you've alluded to, things do change, right? A lot of the problems that you first wrote about in your book, "Algorithms Of Oppression," for instance, aren't problems anymore.

NOBLE: Well, I'll put it this way. There are a lot of scholars and journalists and activists who raise attention every day to harmful things and projects that we see underway in the tech industry, and sometimes the tech industry responds in order to crush it so that it's like, well, that's not a thing anymore. I know for sure my book was thrown on some poor programmer's desk and they were like, fix all these things. Like, the dozens and dozens of examples, they're like, fix them all, I'm sure. So yes, that's great.

But on another level, you know, I'm a public state employee. And you think about all of the journalists and the people working in the public interest. It's not our job to find all these harmful products. And the offloading of the detection of harm to the public is really, I think, despicable, to be honest. These companies should be required and held accountable, legally, to pay for and remediate and do the cleanup of all the damage that they do in society. Just like if Exxon has an oil spill here off the coast of California, they got to go in and clean it up and restore the ecosystem. And we should not have an information ecosystem that's overdetermined and controlled by advertising companies when there's danger - especially around things like elections and other kinds of very important public needs - to then just leave it to everybody else to figure out how to fix it and how to clean it up.

LIZARRAGA: That's such an important point. The burden of cleaning up the mess almost always falls on women of color, or on some other marginalized community that didn't create it but ends up fixing it.

NOBLE: Yeah. It really is a - it's not only a misplaced responsibility, but the truth is the most expansive ways that we've ever come to experience more democracy, you know, greater rights, greater sense of justice have come because women of color have been precluded and excluded and fought for our participation and our inclusion. And I think on these issues, it's not a surprise that women of color are at the forefront. But I will also say they're also paying a great price.

(SOUNDBITE OF MUSIC)

LIZARRAGA: Coming up - more with Safiya Noble.

NOBLE: What people in power want to see is us just numb out and doomscroll and buy things. We were born for more than just to be groomed into consumers who don't care about other human beings.

LIZARRAGA: Stay with us.

(SOUNDBITE OF MUSIC)

LIZARRAGA: Lori. Just Lori. CODE SWITCH.

(SOUNDBITE OF MUSIC)

LIZARRAGA: And I'm back talking to Safiya Noble. She's a professor of gender studies, African American studies and internet studies at UCLA. She's also the author of the book "Algorithms Of Oppression: How Search Engines Reinforce Racism."

(SOUNDBITE OF MUSIC)

LIZARRAGA: I really wanted to talk to Safiya about the fact that a lot of people don't know how the technologies that we use really work. One of the classic examples we've seen was back in 2018 when Mark Zuckerberg had to testify in a Senate hearing about data privacy after the Cambridge Analytica leaks. But a lot of the people questioning him there seemed to really not understand how Facebook works.

(SOUNDBITE OF ARCHIVED RECORDING)

ORRIN HATCH: Well, if so, how do you sustain a business model in which users don't pay for your service?

MARK ZUCKERBERG: Senator, we run ads.

HATCH: I see.

LIZARRAGA: So I wanted to know, does a lack of digital literacy keep the public from holding the tech industry accountable?

NOBLE: You don't have to be super literate in data science or, you know, critical data studies or the things that I know to know something's amiss. I think these issues are way far upstream. It's kind of like saying to anyone in any city, hey, listen, you should really know how many forever chemicals or forever plastics are in your water - like, whatever the toxic thing is. I mean, the truth is, you would not be able to do something about it even if you knew, OK?

LIZARRAGA: Yeah.

NOBLE: So - because these are upstream decisions that get made by the people we elect and the people that influence the people we elect. So I put it there and say, yeah, it's great. When you know some things, you can really scare people at a cocktail party, I don't know, or whatever you're going to do. But it's more powerful that we elect people who we can hold accountable who aren't accountable to more powerful people, but are accountable to the people that elect them. It's more powerful that we show up in these school board meetings, in these city council meetings and we put the pressure on and we say, we're not going to put more money into the wrong things, and we demand solutions.

LIZARRAGA: Yeah.

NOBLE: And this is where, to me, it's like, I will never stop doing the education. I'm an educator, but I will never also blame people for not knowing a thing and then hold them responsible. Because even if they knew, I don't think that would necessarily mean that they could directly make personal choices that would change the ecosystem.

LIZARRAGA: Does policy feel like the most effective way to make change?

NOBLE: Well, I will put it this way. My parents - they both grew up - they came of age pre-civil rights legislation, pre-1964 passage of the Civil Rights Act. Their lives were overdetermined by a lack of civil rights legislation. The things my mom or my dad could do in their lifetimes were directly impacted by not having civil rights legislation. So there's no question to me that policy sets the agenda. Let's say it enhances the world of possibilities. It opens up more possibilities, and it creates more protection of our rights. So we cannot ignore that space. And we know this because we're witnessing the rollback of civil rights right now. We're witnessing the rollback of women's rights right now, and that is substantively changing the way in which we can move in the world. So, absolutely, we have to have digital civil rights protections on the books.

LIZARRAGA: Can you talk about what that means to you?

NOBLE: When I think about a digital civil rights agenda, it goes beyond even just thinking about individual, personal protection. It's also about collective rights. And there are such amazing models in other parts of the world. I mean, I - right now, I'm really trying to study and learn from the Maori people in New Zealand who have conceptualized things like digital rights to include seven generations into the future, of proving no harm will come for seven generations. They also hold the earth as a stakeholder. Now, that alone is powerful and transformative because that would mean we actually could not do the kinds of extractive, exploitive economic and environmental practices that all of this hardware and the Internet of Things and all of this connectivity and all of these networked computers are built upon. We would have to think about the disposability and the e-waste at the end of our use of these technologies.

And, you know, I spent the summer, about four years ago now, in Accra, Ghana, witnessing a beautiful, pristine wetland in the heart of Accra that has been really turned into a massive toxic waste site. And it's full of the waste of the West. And, you know, you think about, like, if we conceptualized digital rights beyond just our own personhood, we would have, I think, a really beautiful imaginary for the future, and we would prioritize what we're doing now very differently.

LIZARRAGA: You've talked about the dangers of AI in society for years now, including in our electoral system. How are you seeing that play out?

NOBLE: Well, I think people definitely need to know and remember that around election times, Black people especially, but also Latinos and Asian Americans, are often targeted with anti-voting, anti-democratic-oriented material that comes across their social media feeds. These are well-orchestrated campaigns. They look like they're coming from other Black people. They often are not. Sometimes they get celebrities to join in and participate willingly or unwittingly. There's, like, a predatory nature to it - pushing people toward disengaging and not trusting in our collective power.

LIZARRAGA: Yeah.

NOBLE: So I think we should recognize those as campaigns of racist oppression. And it doesn't matter that in, I think - what is it? - 2030 or 2040, we're going to be a country that's a majority minority, right? So it's like it's going to be a majority people of color. And guess what? Some of us were around. We remember South Africa and apartheid. We know that you can have minority rule - white rule - that is also still racially oppressive. I mean, so we should not take anything for granted. I think, you know, what people in power want to see is us just numb out and doomscroll on Instagram and buy things.

But the truth is, we can and should assert our agency in our lives. And we were born for more than just to be groomed into consumers who don't care about other human beings and don't care about the planet and its inhabitants. That's not what we were put on the Earth for, I don't think. That's my worldview is that we come here with purposes. So this is my admonishment to all of the listeners that, you know, we're in it. We can have these conversations, we can stay aware and we should not be influenced by the seduction of the propaganda that wants us to check out.

LIZARRAGA: How do you not power down and get discouraged, knowing that you're getting involved to the extent that you can and also knowing that you are up against a Goliath?

NOBLE: Well, you know, I definitely have days where I feel discouraged. There's no question. The story of David and Goliath, I mean, you know, it's a very specific kind of religious story, but it is a story about how a small but mighty force can take on a giant and point the light, quite frankly, at its weak spots. There are so many holes in the stories that come to us from Big Tech, and all we have to do is kind of shine the light on them.

And, you know, I'm encouraged by a story my mom told me when I was, like, younger, in my 20s. You know, she'd say, you know, everybody said they marched with Dr. King, but they didn't. And I was like, what are you talking about? She's like, everybody says that they were down with Dr. King, but they weren't. She's like, it was, like, after the Civil Rights Act was passed and after he was, you know, assassinated, then everybody was like, I was there, you know, I was part of it. But they weren't part of it. They were actually on the sidelines, or they were naysayers. She was like, it was probably only 10% of the community that was actually really in the streets. She's like, find your 10%, because look at what that 10% did. And so that's actually how I energize myself - it's, like, I'm just looking for my 10% who want to link up and figure out, like, how can we strategize to make change?

And I hope everybody will then, in the retrospect, say, I was part of the change and be empowered in it. Fine. It's not even about who gets credit. It truly isn't. Who cares? The tech sector is sucking all the resources out and leaving us with just, like, I don't know, a piece of plastic and glass and metal to hold on to. That's not enough. That's not enough. We got to reimagine something far better.

LIZARRAGA: Hopefully this conversation, eventually, Safiya, becomes a relic.

NOBLE: I look at things like the era of Big Cotton, which was predicated upon enslavement, transatlantic slave trade, the reproduction of the, you know, chattel slave system in the United States. People would say things that are just like the things people say now about tech. They'd say, we can't do away with this because our whole economy is predicated upon it.

LIZARRAGA: Yeah.

NOBLE: There's no way. Like, how would we function?

LIZARRAGA: What we - yeah, what we thought we couldn't live without.

NOBLE: Couldn't live without it. Absolutely not. Again, these arguments about, like, the mainstay of our culture. So I believe in paradigm shifting. And I work at that register because we may find, like, wow, remember the time when people just stood around constantly and, like, were at dinner, and they were just constantly on their phones? That is sick. That is, like, horrible. Why did they do that? Like, they - no. You know, so part of this is the culture work, too, that we have to do of saying, like, that's just not cool. Not into it.

(SOUNDBITE OF MUSIC)

LIZARRAGA: My last question was going to be, do you ever worry that it's too little, too late? But I think you just answered that.

NOBLE: I believe in miracles. I believe we're the miracle. Come on.

LIZARRAGA: That's what it might take. Keep working, and then I definitely do, too.

(SOUNDBITE OF MUSIC)

LIZARRAGA: Thank you so much, Dr. Safiya Noble, for this conversation, for your time. We appreciate you.

NOBLE: Appreciate you, too. Thank you.

(SOUNDBITE OF MUSIC)

LIZARRAGA: And that's our show. You can follow us on Instagram at @nprcodeswitch. If email's more your thing, ours is codeswitch@npr.org. You can subscribe to the podcast on the NPR app or wherever you get your podcasts.

(SOUNDBITE OF MUSIC)

LIZARRAGA: And you should definitely check out our newsletter. It drops every week in your inbox. Sign up for that at npr.org/codeswitchnewsletter. And just want to give a quick shout-out to our CODE SWITCH Plus listeners. We appreciate you, and thank you for being a subscriber. Subscribing to CODE SWITCH Plus means getting to listen to all of our episodes without any sponsor breaks, and it also really helps support our show. So if you love our work, please consider signing up at plus.npr.org/codeswitch. This episode was produced by Jess Kung and Courtney Stein. It was edited by Leah Donnella. Our engineer was Josephine Nyounai. And a big shout-out to the rest of the CODE SWITCH massive - Christina Cala, Xavier Lopez, Dalia Mortada, Veralyn Williams, Steve Drummond, Julia Carney, B.A. Parker and Gene Demby. I'm Lori Lizarraga. Call your robot friend?

Copyright © 2023 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.