Facial Recognition: Last Week Tonight with John Oliver (HBO)

21 minutes 10 seconds


S1

Speaker 1

00:00

♪♪ Our main story tonight concerns facial recognition. The thing that makes sure my iPhone won't open unless it sees my face or the face of any toucan. But that is it. Facial recognition technology has been showcased in TV shows and movies for years.

S1

Speaker 1

00:19

Denzel Washington even discovered a creative use for it in the 2006 action movie, Deja Vu.

S2

Speaker 2

00:24

We have facial recognition software? Yeah. Let's use it on the bag, cross-match it to all the bags on the south side of the city in the 48 hours leading up to the explosion, all right?

S3

Speaker 3

00:37

Don't think it's ever been used this way. Look, same thing.

S1

Speaker 1

00:42

Bingo. Bingo, indeed, Denzel. With smart, believable plot development like that, it's frankly no wonder that Deja Vu received such glowing IMDb reviews as... An Insult to Anybody Who Finished Elementary School, Worst Movie of All Time, More Like Déjà Pu, and my personal favorite, a one-star review that reads, Bruce Greenwood as always is great and so sexy and there's a cat who survives.

S1

Speaker 1

01:04

A review that was clearly written either by Bruce Greenwood or that cat. Now, the technology behind facial recognition has been around for years, but recently, as it's grown more sophisticated, its applications have expanded greatly. For instance, it's no longer just humans who can be the targets.

S4

Speaker 4

01:21

The iPhone sensor scans each fish and uses automatic image processing to uniquely identify each individual. A number of symptoms are recognized, including loser fish.

S1

Speaker 1

01:36

Yes, loser fish, which, by the way, is an actual industry term. Now, that company says it can detect which fish are losers by facial scan, which is important, because can you tell which one of these fish is a loser and which one is a winner? Are you sure about that?

S1

Speaker 1

01:50

Because they're the same fish. This is why you need a computer. But the growth of facial recognition and what it's capable of brings with it a host of privacy and civil liberties issues. Because if you want a sense of just how terrifying this technology could be if it becomes part of everyday life, just watch as a Russian TV presenter demonstrates an app called FindFace.

S5

Speaker 5

02:12

If you find yourself in a cafe with an attractive girl and you don't have the guts to approach her, no problem. All you need is a smartphone and the application FindFace. Find new friends, take a picture, and wait for the result.

S5

Speaker 5

02:32

Now you're already looking at her profile page.

S1

Speaker 1

02:35

Burn it all down. Burn everything down. I realize that this is a sentence that no one involved in creating that app ever once thought, but just imagine that from a woman's perspective.

S1

Speaker 1

02:46

You're going about your day when suddenly, you get a random message from a guy you don't know that says, Hello, I saw you in cafe earlier and used FindFace app to learn your name and contact information. I'll pick you up from your place at 8. Don't worry, I already know where you live. But one of the biggest users of facial recognition is, perhaps unsurprisingly, law enforcement.

S1

Speaker 1

03:06

Since 2011, the FBI has logged more than 390,000 facial recognition searches. And the databases law enforcement are pulling from include over 117 million American adults and incorporate, among other things, driver's license photos from residents of all these states. So roughly one in two of us have had our photos searched this way. And the police will argue that this is all for the best.

S1

Speaker 1

03:30

In fact, here is an official with the London police explaining why they use it there.

S6

Speaker 6

03:34

Here in London, we've had the London Bridge attack, the Westminster Bridge attack. The suspects involved in that, the people who are guilty of those offenses, were often known by the authorities. Had they been on some database, had they been picked up by cameras beforehand, we may have been able to prevent those atrocities, and that would definitely be a price worth paying.

S1

Speaker 1

03:51

Okay, look, it's hard to come out against the prevention of atrocities. This show is, and always has been, anti-atrocity. But the key question there is, what's the trade-off?

S1

Speaker 1

04:01

If the police could guarantee that they could prevent all robberies, but the only way to do that is by having an officer stationed in every bathroom watching you every time you take a shit, I'm not sure everyone would agree that it's worth it. And the people who do might want that for reasons other than preventing crime. And now is actually a very good time to be looking at this issue, because there are currently serious concerns that facial recognition is being used to identify Black Lives Matter protesters. And if that's true, it wouldn't actually be the first time, as this senior scientist at Google, Timnit Gebru, will tell you.

S7

Speaker 7

04:34

There was an example with the Baltimore police, in the Freddie Gray marches, where they used face recognition to identify protesters. And then they tried to link them up with their social media profiles and target them for arrest. So right now, a lot of people are actually urging people not to put images of protesters on social media, because there are people out there whose job is just to look up these people and target them.

S1

Speaker 1

05:03

It's true. During the Freddie Gray protests, police officers used facial recognition technology to look for people with outstanding warrants and arrest them, which is a pretty sinister way to undermine the right to assemble. So tonight, let's take a look at facial recognition.

S1

Speaker 1

05:19

Let's start with the fact that even as big companies like Microsoft, Amazon, and IBM have been developing it, and governments all over the world have been happily rolling it out, there haven't been many rules or frameworks in place for how it is used. In Britain, they've actually been experimenting with facial recognition zones, even putting up signs alerting you to the fact that you're about to enter one, which seems polite, but watch what happens when one man decided he didn't actually want his face scanned.

S6

Speaker 6

05:46

This man didn't want to be caught by the police cameras so he covered his face. Police stopped him. They photographed him anyway.

S6

Speaker 6

05:53

An argument followed.

S8

Speaker 8

05:55

What's your suspicion?

S6

Speaker 6

05:57

The fact that he walked past clearly marked facial recognition, that he covered his face.

S10

Speaker 10

06:00

I would do the same.

S8

Speaker 8

06:01

I would do the same.

S6

Speaker 6

06:02

So I walk past like that. It's a cold day as well. I've done that, and the police officer asked me to come to him.

S6

Speaker 6

06:09

So I've got me back up. I said to him, and now I've got a 90 pound fine. There you go. Look at that.

S6

Speaker 6

06:16

Thanks, lads. 90 pound. Well done.

S1

Speaker 1

06:18

Yeah, that Guy Ritchie character was rightly mad about that. And incidentally, if you are not British, and you're looking at that man, then at me, and wondering how we both came from the same island, let me quickly explain. British people come in two variations.

S1

Speaker 1

06:30

So emotionally stunted that they're practically comatose, and cheerfully telling large groups of policemen to fuck off and do one if you're gonna take a photo of me face! There's absolutely nothing in between the two. And the UK's by no means alone in building out a system. Australia is investing heavily in a national facial biometric system called The Capability.

S1

Speaker 1

06:50

Which sounds like the name of a Netflix original movie, although that's actually perfect if you want people to notice it, think, huh, that seems interesting, and then forget it ever existed. And you don't have to imagine what this technology would look like in the hands of an authoritarian government, because China is unsurprisingly embracing it in a big way.

S5

Speaker 5

07:09

We can match every face with an ID card and trace all your movements back one week in time. We can match your face with your car, match you with your relatives, and the people you're in touch with. With enough cameras, we can know who you frequently meet.

S1

Speaker 1

07:24

That is a terrifying level of surveillance. Imagine the eye of Sauron, but instead of scouring Middle-earth for the One Ring, he was just really into knowing where all his orcs like to go to dinner. And some state-funded developers in China seem weirdly oblivious to just how sinister their projects sound.

S11

Speaker 11

07:40

Skynet. What is that?

S3

Speaker 3

07:42

The Terminator is the favorite film of our founder. So, they use the same name, but they want to put something good into this system.

S11

Speaker 11

07:53

So, okay, in The Terminator, Skynet is evil, rains down death from the sky. But in China, Skynet is good.

S3

Speaker 3

08:01

Yeah, that's the difference.

S1

Speaker 1

08:02

Oh, that's the difference, is it? You know, it's not exactly reassuring that you called your massive, all-encompassing AI network, Skynet, but a good version. Because it'd be like if the Today Show built a robot journalist and called it, Matt Lauer, but good.

S1

Speaker 1

08:16

Oh, yeah, this one's completely different. Sure, he does also have a button under his office desk, but all it does is release lilac air freshener. This is the good version. The point is, this technology raises troubling philosophical questions about personal freedom.

S1

Speaker 1

08:30

And right now, there are also some very immediate practical issues. Because even though it is currently being used, this technology is still very much a work in progress. And its error rate is particularly high when it comes to matching faces in real time. In fact, in the U.K., when human rights researchers watched police put one such system to the test, they found that only 8 out of 42 matches were verifiably correct.

S1

Speaker 1

08:54

And that's even before we get into the fact that these systems can have some worrying blind spots, as one MIT researcher found out when testing out numerous algorithms, including Amazon's own Rekognition system.

S8

Speaker 8

09:05

At first glance, MIT researcher Joy Buolamwini says the overall accuracy rate was high, even though all companies better detected and identified men's faces than women's. But the error rate grew as she dug deeper.

S10

Speaker 10

09:18

Lighter male faces were the easiest to guess the gender on, and darker female faces were the hardest.

S8

Speaker 8

09:26

One system couldn't even detect if she had a face, and the others misidentified her gender. White guy, no problem.

S1

Speaker 1

09:33

Yeah, white guy, no problem, which, yes, is the unofficial motto of history, but it's not like what we needed right now was for computers to somehow find a way to exacerbate the problem. And it gets worse. In one test, Amazon's system even failed on the face of Oprah Winfrey, someone so recognizable, her magazine only had to type the first letter of her name, and your brain auto-completed the rest.

S1

Speaker 1

09:56

And that's not all. A federal study of more than 100 facial recognition algorithms found that Asian and African-American people were up to 100 times more likely to be misidentified than white men. So that is clearly concerning. And on top of all of this, some law enforcement agencies have been using these systems in ways they weren't exactly designed to be used.

S10

Speaker 10

10:17

In 2017, police were looking for this beer thief. The surveillance image wasn't clear enough for facial recognition software to identify him. So instead, police used a picture of a look-alike, which happened to be actor Woody Harrelson.

S10

Speaker 10

10:31

That produced names of several possible suspects and led to an arrest.

S1

Speaker 1

10:36

Yeah, they used a photo of Woody Harrelson to catch a beer thief. And how dare you drag Woody Harrelson into this? This is the man that once got drunk at Wimbledon in this magnificent hat, made this facial expression in the stands, and in doing so, accidentally made tennis interesting for a day.

S1

Speaker 1

10:51

He doesn't deserve prison for that. He deserves the Wimbledon trophy. And there have been multiple instances where investigators have had such confidence in a match, they've made disastrous mistakes. A few years back, Sri Lankan authorities mistakenly targeted this Brown University student as a suspect in a heinous crime, which made for a pretty awful finals week.

S11

Speaker 11

11:11

On the morning of April 25th, in the midst of finals season, I woke up in my dorm room to 35 missed calls, all frantically informing me that I had been falsely identified as one of the terrorists involved in the recent Easter attacks in my beloved motherland, Sri Lanka.

S1

Speaker 1

11:29

That's terrible. Finals week is already bad enough, what with staying up all night, alternating shots of 5-hour Energy and Java Monster Mean Bean, while trying to push your brain to remember the differences between Baroque and Rococo architecture, without waking up to find out that you've also been accused of terrorism because a computer sucks at faces. Now, on the one hand, these technical issues could get smoothed out over time.

S1

Speaker 1

11:49

But, even if this technology eventually becomes perfect, we should really be asking ourselves how much we're comfortable with it being used. By police, by governments, by companies, or indeed, by anyone. And we should be asking that right now, because we're about to cross a major line. For years, many tech companies approached facial recognition with caution.

S1

Speaker 1

12:09

In fact, in 2011, the then chairman of Google said it was the one technology the company had held back, because it could be used in a very bad way. And think about that. It was too Pandora's boxy for Silicon Valley, the world's most enthusiastic Pandora's box openers. And even some of the big companies that have developed facial recognition algorithms have designed them for use on limited data sets, like mug shots or driver's license photos.

S1

Speaker 1

12:34

But now, something important has changed, and it is because of this guy, Hoan Ton-That, and his company, Clearview AI. And I'll let him describe what it does.

S13

Speaker 13

12:45

Quite simply, Clearview is basically a search engine for faces. So anyone in law enforcement can upload a face to the system and it finds any other publicly available material that matches that particular face.

S1

Speaker 1

12:58

Okay, so the key phrase there is publicly available material, because Clearview says it's collected a database of 3 billion images. That is larger than any other facial recognition database in the world, and it's done that by scraping them from public-facing social media like Facebook, LinkedIn, Twitter, and Instagram. So, for instance, Clearview's system would theoretically include this publicly available photo of Ton-That at what appears to be Burning Man, or this one of him wearing a suit from the exclusive Santa Claus After Dark collection at Men's Wearhouse.

S1

Speaker 1

13:27

And this very real photo of him shirtless and lighting a cigarette with blood-covered hands, which, by the way, is his profile photo on Tidal, because yes, of course, he's also a musician. I can only assume that that's the cover of an album called Automatic Skip, if this ever comes up on a Pandora station. And Ton-That's willingness to do what others have not been willing to do, and that is scrape the whole internet for photos, has made his company a genuine game changer, in the worst possible way. Just watch as he impresses a journalist by running a sample search.

S9

Speaker 9

13:57

So, here's the photo you uploaded of me. Mm-hmm. A headshot from CNN.

S13

Speaker 13

14:02

Mm-hmm.

S9

Speaker 9

14:03

So, the first few images it's found, it's found a few different versions of that picture. But now, as we scroll down, we're starting to see pictures of me that are not from that original image. Oh, wow. Oh my god. So this photograph is from my local newspaper, where I lived in Ireland, and this photo would have been taken when I was like 16.

S9

Speaker 9

14:29

Wow. That's crazy.

S1

Speaker 1

14:32

Yeah, it is. Although here is some advice. If there is an embarrassing photo of you from when you were a teenager, don't run away from it.

S1

Speaker 1

14:39

Make it the center of your television show's promotional campaign and own it. Use the fact that your teenage years were a hormonal Stalingrad. Harness the pain. But the notion that someone can take your picture and immediately find out everything about you is alarming enough, even before you discover that over 600 law enforcement agencies have been using Clearview's service.

S1

Speaker 1

14:59

And you're probably in that database even if you don't know it. If a photo of you has been uploaded to the internet, there is a decent chance that Clearview has it, even if someone uploaded it without your consent. Even if you untagged yourself, or later set your account to private. And if you're thinking, hold on, isn't this against the terms of service for internet companies?

S1

Speaker 1

15:18

You should know, Clearview actually received cease and desist orders from Twitter, YouTube, and Facebook earlier this year, but it has refused to stop, arguing that it has a First Amendment right to harvest data from social media, which is just not at all how the First Amendment works. You might as well argue that you have an Eighth Amendment right to dress up rabbits like John Lennon. That amendment does not cover what I think you think it does. And yet, Ton-That insists that this was all inevitable, so we should all frankly be glad that he's the one who did it.

S13

Speaker 13

15:49

I think the choice now is not between, like, no facial recognition and facial recognition, it's between, you know, bad facial recognition and responsible facial recognition. And we want to be in the responsible category.

S1

Speaker 1

16:00

Well, sure, you want to be, but are you? Because there are a lot of red flags here. For starters, apps he developed before this included one called Trump Hair, which would just add Trump's hair to a user's photo, and another called ViddyHo, that phished its own users, tricking them into sharing access to their Gmail account and then spamming all their contacts.

S1

Speaker 1

16:19

So, I'm not sure that I would want to trust my privacy to this guy. If, however, I was looking for someone to build an app that let me put Ron Swanson's mustache on my face as my checking account was quietly drained, sure, then he'd be at the top of my list. And despite Clearview's repeated reassurances that its product is intended only for law enforcement, as if that is inherently a good thing, he's already put it in a lot of other people's hands. Because in addition to users like the DEA and the FBI, he's also made it available to employees at Kohl's, Walmart, and Macy's, which alone has completed more than 6,000 facial searches. And it gets worse, because they've also reportedly tried to pitch their service to congressional candidate and white supremacist Paul Nehlen, suggesting that they could help him use unconventional databases for extreme opposition research, which is a terrifying series of words to share a sentence with white supremacists.

S1

Speaker 1

17:13

Now, Clearview says that that offer was unauthorized, but when questioned about who else he might be willing to work with, Ton-That's answer hasn't been reassuring.

S13

Speaker 13

17:22

There's some countries we would never sell to that are very adverse to the U.S. For example? Like China and Russia, Iran, North Korea.

S13

Speaker 13

17:30

So those are the things that are definitely off the table. And...

S5

Speaker 5

17:33

What about countries that think that being gay should be illegal, that it's a crime?

S13

Speaker 13

17:39

So, like I said, you know, we want to make sure that we do everything correctly, mainly focus on the U.S. and Canada. And the interest has been overwhelming, to be honest.

S13

Speaker 13

17:47

Just so much interest that, you know, we're taking it one day at a time.

S1

Speaker 1

17:51

Yeah, that's not terribly comforting. When you ask a farmer if he'd let foxes into the hen house, the answer you hope for is no, not, the interest from foxes has been overwhelming, to be honest. Just so much interest.

S1

Speaker 1

18:03

So, you know, we're taking it one day at a time. And unsurprisingly, reporters for BuzzFeed have found that Clearview has quietly offered its services to entities in Saudi Arabia and the United Arab Emirates, countries that view human rights laws with the same level of respect that Clearview seems to have for Facebook's terms of service. So facial recognition technology is already here. The question is, what can we do about it?

S1

Speaker 1

18:25

Well, some are trying to find ways to thwart the cameras themselves.

S10

Speaker 10

18:29

Hi, guys, it's me, Jillian, again, with a new makeup tutorial. Today's topic is how to hide from cameras.

S1

Speaker 1

18:36

Okay, first, that's probably not a scalable solution. And second, I'm not sure if that makes you less identifiable or the most identifiable person on Earth. Officers are on the lookout for a young woman, dark hair, medium build, looks like a mime who went through a shredder.

S1

Speaker 1

18:50

Look, clearly, what we really need to do is put limits on how this technology can be used. And some locations have laws in place already. San Francisco banned facial recognition last year. But the scope of that is limited to city law enforcement.

S1

Speaker 1

19:04

It doesn't affect state and federal use or private companies. Meanwhile, Illinois has a law requiring companies to obtain written permission before collecting a person's fingerprints, facial scans, or other identifying biological characteristics. And that is good. But we also need a comprehensive nationwide policy, and we need it right now.

S1

Speaker 1

19:24

Because again, there are worries that it is being used in the protests that we are seeing now. And the good news is that just this week, thanks to those protests and to years of work by activists, some companies did pull back from facial recognition. For instance, IBM says they'll no longer develop facial recognition; meanwhile, Amazon said it was putting a one-year hold on working with law enforcement. And Microsoft said it wouldn't sell its technology to police without federal regulation.

S1

Speaker 1

19:50

But there is nothing to stop those companies from changing their minds if people's outrage dies down. And for the record, while Clearview says it's canceling its private contracts, it's also said it will keep working with the police, just as it will keep harvesting your photos from the internet. So, if Clearview is gonna keep grabbing our photos, at the very least, there may be a way to let them know what you think about that. So, the next time you feel the need to upload a photo, maybe throw in an extra one for them to collect.

S1

Speaker 1

20:20

Maybe hold up a sign that says, these photos were taken unwillingly and I'd rather you not be looking at them. Or if that feels too complicated, just, fuck Clearview. That really does get the message across. And remember, these photos are often being searched by law enforcement, so you may want to take this opportunity to talk to the investigators looking through your photos.

S1

Speaker 1

20:38

Maybe something like, I don't look like Woody Harrelson, but while I have your attention, defund the police. Really, whatever you feel is most important to tell them, you should put on a sign. That's our show. Thank you so much for watching.

S1

Speaker 1

20:51

We'll see you.