How Israel is using facial recognition in Gaza : Short Wave After the Hamas attack of Oct. 7 triggered Israel's invasion of the Gaza Strip, hundreds of thousands of Palestinians began fleeing from the north of Gaza to the south. As they fled, many Palestinians reported passing through checkpoints with cameras. Israel had previously used facial recognition software in the West Bank, and some Palestinians reached out to The New York Times reporter Sheera Frenkel to investigate whether the same was happening in Gaza.

Science correspondent Geoff Brumfiel talks to Frenkel about how Israel launched this facial recognition system in Gaza late last year with the help of private companies and Google Photos.

Read Frenkel's full article.

Want to hear us cover more stories about AI? Email us at shortwave@npr.org.


EMILY KWONG, BYLINE: You're listening to SHORT WAVE from NPR.

GEOFF BRUMFIEL, HOST:

Hey, SHORT WAVErs. I'm your host today, science correspondent Geoff Brumfiel. Earlier this year, I went to Israel to cover the ongoing war that began after the Hamas attack of October 7, and I found myself on a grassy hill overlooking the city of Hebron.

(SOUNDBITE OF BIRDS CHIRPING)

BRUMFIEL: Hebron is in the occupied West Bank. It's home to hundreds of thousands of Palestinians, but also living there are some militant Israeli settlers. It's a tense place full of soldiers and checkpoints and high-powered security cameras.

Even here, you can see there's cameras. There's cameras sticking out from the rooftops, sort of peeping out of the corners of houses. And, yeah, I mean, it does feel like you're surveilled pretty much everywhere you go here.

These cameras are doing more than just watching people. They're identifying them thanks to facial recognition. Issa Amro is a Palestinian activist and longtime resident of Hebron. He says the cameras know him.

ISSA AMRO: They have our own data. It's connected to the camera with facial recognition. This is why they say facial recognition. I think it's more than that. It's body. It's eyes. It's your shapes. So it's more than that, OK?

BRUMFIEL: And the cameras tell Israeli soldiers patrolling the city everything about him. Before he even shows them his ID, they've got his life story.

AMRO: I am a human rights defender. I was in jail many times. They tell me about that I'm divorced. How many times I passed a checkpoint - they know that. Where I've been, you know, in certain hours - they - the soldiers tell me all of that.

BRUMFIEL: For a few years now, Hebron has been a laboratory for Israeli security forces and private companies to test out their latest facial recognition software. But then came the Hamas attack and the Israeli invasion of Gaza. Hundreds of thousands of Palestinians began fleeing on foot from the north of Gaza to the south.

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED REPORTER: Under the watch of Israeli soldiers, desperate Gazan civilians - whole families - are fleeing their homes.

BRUMFIEL: And as they fled, many reported passing through checkpoints with cameras. Some of them contacted New York Times tech reporter Sheera Frenkel.

SHEERA FRENKEL: I had previously been a Middle East reporter for 10 years, and so they knew me from my time there, and they knew that I now covered technology. And so they essentially just reached out and said, we think something's going on here. We think this might be a camera system. Maybe it's like the facial recognition program Israel uses in the West Bank. Can you start to look into this?

BRUMFIEL: She decided she was going to try to find out what was going on.

FRENKEL: I went about sort of getting in touch with various contacts in Israel's defense ministry, Israeli soldiers who were serving in the Gaza Strip who had left who could potentially tell me what was going on and just trying to report out whether they had, in fact, launched a facial recognition program in Gaza.

BRUMFIEL: And she discovered that the Israeli military was trying to set up a new facial recognition system with the help of private Israeli companies and, believe it or not, Google Photos.

(SOUNDBITE OF MUSIC)

BRUMFIEL: So today on the show, Sheera and I are going to talk about how Israel's government is using facial recognition and AI to track Palestinians and why people in every single country on Earth should be paying close attention to what they're up to. You're listening to SHORT WAVE, the science podcast from NPR.

(SOUNDBITE OF MUSIC)

BRUMFIEL: So first things first. You're a tech reporter. I'm a science reporter. We should probably talk briefly about how modern facial recognition works because it's quite closely related to the sort of AI revolution we're living in right now, right?

FRENKEL: Yes, very much. It's part of this new wave of technology that's coming out and AI-backed, which is really changing the capabilities of what we knew technology was able to do until now.

BRUMFIEL: Right. And these facial recognition programs are neural networks, right? They are similar to sort of ChatGPT or cousins of those models.

FRENKEL: Exactly. I mean, the way I like to think of it, because I'm actually not a very technical person - they're creating maps of your face. These programs are taking your image, and they're mapping every element of your face and then storing that as sort of data, as ones and zeros. And then when another image is uploaded, it scans that image, and it says, are there matches here? Do we think these two faces actually belong to the same person?

BRUMFIEL: And just like these AI models, they're shown huge databases. And they learn to identify different faces by training on those databases.

FRENKEL: Exactly. And like any AI system, they're only as good as the training that goes into them. So the more data you give them, the more examples you give it of these two people are, in fact, the same, the more they learn to recognize, in fact, what makes up a human face and how they can, with some degree of confidence, say, we believe these two faces belong to the same individual.
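The mapping-and-matching process Frenkel describes can be sketched in a few lines of Python. This is a purely illustrative toy: real systems derive high-dimensional embeddings (typically 128 or more numbers) from a trained neural network, while the vectors, names, and 0.8 threshold here are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(emb1, emb2, threshold=0.8):
    """Declare a match when similarity clears the threshold."""
    return cosine_similarity(emb1, emb2) >= threshold

# Toy 4-dimensional "face embeddings" (invented values).
alice_photo_1 = [0.9, 0.1, 0.3, 0.2]
alice_photo_2 = [0.88, 0.12, 0.28, 0.25]
bob_photo = [0.1, 0.9, 0.2, 0.7]

print(same_person(alice_photo_1, alice_photo_2))  # True: two photos of the same face
print(same_person(alice_photo_1, bob_photo))      # False: different faces
```

The threshold is where the "degree of confidence" Frenkel mentions lives: raise it and the system misses more true matches; lower it and it misidentifies more strangers as the same person.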

BRUMFIEL: So the tech itself has gotten very good, assuming the algorithm is properly trained. But, you know, it's just as good as the system you're plugged into. So when you're at the airport, you know, there are facial recognition technologies being used more and more now. I've come across them recently. And that's a situation - it's a very controlled environment. You have a photo ID. You are staring straight at a camera. You know you're being logged by a facial recognition system. But the Israelis have been using it in different ways in the West Bank and elsewhere, right?

FRENKEL: Sure. I mean, I think the way to think about it is in the airport, you're making a choice to be scanned by facial recognition software. You're deciding to allow that system to scan your face. And, you know, a lot of people prefer it. It makes their time at the airport go faster. They don't have to pull out a passport every time in some airports, at least, that use these systems. You can just have your face scanned. So it's seen as time-saving, and people basically see it as a service. The difference between that and governments, including the Israeli government, that use facial recognition is that in the West Bank, for instance, Palestinians who need to move between two areas because they work in one or because they just need to visit a hospital or get, you know, literally across a road - they're not choosing to be scanned by that facial recognition software. There's no opt-in, opt-out system.

BRUMFIEL: Yeah. I mean, you know, one of the things I heard while I was in Hebron is that they don't always even know when they're being scanned, right? Like, they have high-powered cameras that aren't at the checkpoints that at least people like Issa Amro think can pick up their face just walking through the city.

FRENKEL: Right. And, again, so there's the checkpoints themselves, which - the cameras are very visible. They know they're being scanned. And then there are a number of cameras that are just scattered through cities. And we'll get to this soon, I think, but in Gaza, there are drones that fly overhead. There are entire systems that are in place that nobody in Gaza has an opt-in to. They are just living their lives in Gaza, and their faces are being scanned.

BRUMFIEL: So turning into Gaza, what's going on there? How is it different from the West Bank?

FRENKEL: So Israel has not had a military presence in Gaza since 2005, which is when Israel announced it was going to go through a process called the disengagement. It was going to withdraw its troops from the Gaza Strip. So any kind of surveillance, including facial recognition software that's done in Gaza, is done from afar. It's done literally from the physical border that Gaza shares with Israel. There are a number of cameras that are set up along that fence. It's done via drones, and it's done via online surveillance. The Facebook pages, the Instagram, the YouTube, all the social media of Palestinians who live in Gaza is very closely monitored by Israel.

BRUMFIEL: And this is all being done by this specific part of military intelligence called Unit 8200, or at least they have a central role in what you found out was going on.

FRENKEL: Right. So Unit 8200, which is, in fact, one of the largest units in the Israeli military, does their digital surveillance, their online surveillance. So Unit 8200 had, for a long time - had a very sort of basic facial recognition program running in Gaza. It wasn't nearly as broad as what they had in the West Bank because they didn't have the same system of cameras and soldiers on the ground. But they had been keeping track of people that they believed to be members of Hamas. They had been keeping track of people who they thought were relatives of those who were in Hamas or other, you know, extremist Islamist groups in Gaza. And that included their images.

After October 7, they really, you know, they stepped up that program - is the best way I can think of it. They immediately went through the videos that had been filmed on October 7, and any individual who had taken part in coming across the border they had tried to identify by name and also get multiple images of them, sometimes pulling from their social media. And then they had very quickly said, well, look. We are definitely sending soldiers in. Let's talk about how we're going to launch a broader facial recognition program in the Gaza Strip.

BRUMFIEL: And so they ended up partnering with this Israeli company called Corsight.

FRENKEL: Right. Corsight is one of the largest and most prominent facial recognition companies based in Israel. And this is a company that for years has kind of touted its ability to recognize faces even from photos that only showed a very small part of the face. And so this was very attractive for the Israeli military because they had a lot of images from October 7 in which only, you know, a very small part of a person's face was shown.

BRUMFIEL: And I got to say, when I read your article, that did raise my eyebrows because, like, facial recognition, as we discussed - it's very good when you're standing in front of a camera. But at high angles, with faces partially obscured, maybe with low resolution or at night, like, you know, all of this really degrades the software's ability to identify faces, right?

FRENKEL: Right. And, I mean, I think as anyone who has ever, you know, seen the PR claims of a company knows, there's a big gulf between what a company says it can do and what it can actually do. And Corsight's claims about being able to recognize people's faces that were, like, you know, almost fully covered by masks or bandanas or balaclavas or whatever, they were pretty extraordinary. And I think the Israeli military really saw an opportunity to test that.

BRUMFIEL: What did they find, I guess, when they tried this out?

FRENKEL: Well, they actually discovered that there was quite a high failure rate and error rate of faces being misidentified. This became especially clear to them because it wasn't just being used to identify members of Hamas and other militant groups, but it was also being used to identify Israelis who had been taken hostage and taken into the Gaza Strip. Israel wanted to be able to look through video footage and say, OK, this individual was taken from their home. And can we spot them anywhere else in Gaza?

One thing that became interesting to me as I did this reporting - and I spoke to quite a few members of Unit 8200 - is that they weren't as bothered when they misidentified Palestinians because they kind of had this attitude of, like, oh, well, you know, we'll take the wrong person in for questioning, and that's OK because in questioning, we'll figure out they're the wrong person.

BRUMFIEL: One of the other things I found really wild about your story is, like, because in part it seemed like Corsight's system wasn't working that well, I guess Unit 8200 was also, like, using Google Photos. Can you talk a little about that?

FRENKEL: So what they discovered - and this really came up specifically with the hostages - was that Google Photos, which is the same Google Photos you or I can have on our phone where you upload a bunch of photos, and then you can identify the people in them. In my case, I'll say, OK, this is my child. This is my cat. And then you can search your photos and say, please find me other photos of my child or of my cat. Google Photos is really, really good at this because they're really developing the same kind of AI technology to match faces, to map faces that other companies are.

And the Israeli army discovered that Google's technology on this was actually so good that it was better than the custom-built software that Corsight had made for them. And so they started increasingly using Google Photos to identify specifically hostages that had been taken on that day, and they found that even with only a very small part of the face visible in a photo or in a video, Google's technology was excellent at identifying those faces.

BRUMFIEL: So they were just basically building photo albums, like, with a Google account.

FRENKEL: Exactly. They were just uploading photos into a photo album and then asking Google to find the faces that were the same. I will note here that this is the free, off-the-shelf technology that anybody can use. And so Google didn't know that its technology was being used in this way until I got in touch with them and said I'm hearing from multiple people in Unit 8200, multiple intelligence officers that they're using your technology. And I think they were pretty surprised that they were being used by the Israeli army.
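The album workflow Frenkel describes amounts to grouping uploaded photos whose face embeddings match. Here is a minimal sketch of that idea, using invented toy vectors and a made-up 0.8 threshold; it bears no relation to Google's actual implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def group_by_face(embeddings, threshold=0.8):
    """Greedy grouping: each photo joins the first group whose
    representative (first member) it matches, else starts a new group."""
    groups = []
    for emb in embeddings:
        for group in groups:
            if cosine_similarity(group[0], emb) >= threshold:
                group.append(emb)
                break
        else:
            groups.append([emb])
    return groups

# Three toy embeddings: two near-identical "faces" and one distinct one.
album = [[0.9, 0.1, 0.3], [0.88, 0.12, 0.28], [0.1, 0.9, 0.7]]
print(len(group_by_face(album)))  # 2: one person appears twice, another once
```

Once photos are grouped this way, "find me other photos of this person" is just a lookup of whichever group a tagged photo landed in.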

BRUMFIEL: You know, my phone unlocks with Face ID. Google photo ID, airports - I mean, it's everywhere now, right, Sheera? Like, I just wonder kind of how you're thinking about facial recognition both being in Silicon Valley and seeing it everywhere and then going to Israel and seeing these really extreme cases of how it's being used in a place like Gaza.

FRENKEL: You know, much like AI or social media, which is something else I cover a lot of, I sort of just accept the technology is here. There's no putting the genie back in the bottle. No government that runs facial recognition programs is going to step back that facial recognition program. And I think what we need to think about as individuals is how much of our consent is going into this and how much surveillance are we willing to accept in our everyday lives.

So often, we en masse kind of accept this technology into our lives because it makes our lives so much easier and it really helps us connect to other people, or it helps us go through airport lines faster, or whatever it is. And then it's only years later that we kind of sit up and go, oh, man, I gave up so much of my privacy. I had no idea I was consenting to all that. And so, really much like any other new type of technology, I just think it's a really important moment for people to say, like, OK, well, what kind of consent is being given and how much do I know about what types of privacy I'm giving up when I opt into this?

(SOUNDBITE OF MUSIC)

BRUMFIEL: Well, Sheera, thank you so much for joining us today. This has been a great conversation.

FRENKEL: Thank you so much for having me.

BRUMFIEL: This episode was produced by Rachel Carlson and edited by our showrunner, Rebecca Ramirez. It was fact-checked by me and Rachel. Gilly Moon was the audio engineer. Beth Donovan is our senior director and Collin Campbell is our senior vice president of podcasting strategy. I'm Geoff Brumfiel. Thanks as always for listening to SHORT WAVE, the science podcast from NPR.

(SOUNDBITE OF MUSIC)

Copyright © 2024 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.