Clearview AI is redefining our privacy. The New York-based tech company is working to identify and compile the faces of every human being on the planet. Clearview AI claims that the database will serve as a force for good, helping to solve crimes and prevent espionage. But the risks it carries are immense. FRANCE 24’s Jessica Le Masurier and Romeo Langlois have this special report. In collaboration with ARIJ (Arab Reporters for Investigative Journalism).
Clearview AI says it's aiming to collect 100 billion images – that's 14 for every person on the planet – with the help of Artificial Intelligence (AI). How would you feel if photos of your own face – photos you didn’t even know existed – appeared in this growing database? What if Clearview AI’s powerful facial recognition software, which could potentially be used for mass surveillance and profiling, fell into the wrong hands? What if it already has? This is the untold story of Clearview AI, the story the firm did not want us to tell. On June 1, 2024, FRANCE 24’s Jessica Le Masurier and Romeo Langlois were awarded France's prestigious FIGRA human rights prize for their report 'Your face is ours'.
Visit our website:
http://www.france24.com
Like us on Facebook:
https://www.facebook.com/FRANCE24.English
Follow us on Twitter:
https://twitter.com/France24_en
Category: News

Transcript
00:00 [MUSIC]
00:10 [BLANK_AUDIO]
00:20 [MUSIC]
00:30 >> Facial recognition, it's the future.
00:40 [MUSIC]
00:42 >> QSL both from like two and two, three.
00:44 >> It's like everything else with technology.
00:46 It makes our job a lot easier.
00:48 [BLANK_AUDIO]
00:50 >> 4221 Bravo.
00:53 >> When I came on, we didn't have body worn cameras, now we do.
00:56 We didn't have RTCC, which is our surveillance cameras in the city.
01:00 Before, if nobody saw anything, if no one knew anything,
01:04 we couldn't get a description of vehicle, description of a guy or
01:09 gal who's doing the crime.
01:11 [BLANK_AUDIO]
01:18 I really like to call it the nerve center of the police department.
01:21 That's where we monitor our public safety cameras.
01:26 We run our facial recognition, license plate readers,
01:30 everything that aids our patrol officers in doing their jobs.
01:34 [MUSIC]
01:46 We monitor right now 575 cameras, that number continues to grow.
01:51 [MUSIC]
01:53 The camera that you're gonna see now is on top of the Intercontinental Hotel.
01:58 Now I want you to see just how far this camera can zoom in.
02:02 [MUSIC]
02:05 >> And look at this one, same in the Overtown,
02:07 right next to a little convenience store.
02:09 They're just selling drugs like if it was candy.
02:11 Narcotics.
02:15 And they're just doing it there.
02:16 And remember how our camera can zoom in,
02:17 we can see everything that's happening there.
02:19 You'll see the baggies, you'll see what he's selling.
02:22 So we'll capture so much things.
02:25 [MUSIC]
02:33 >> Yesterday there was a carjacking.
02:35 The offender who stole the vehicle ran over the victim and
02:39 the owner of the vehicle as they were fleeing the scene and
02:41 the owner of the car died on the scene.
02:45 >> It's in the RTCC, should be under documents, right?
02:48 This is the murder from yesterday.
02:53 Okay, I didn't know we'd identified her on Clearview.
02:55 >> Clearview AI: 'AI' for artificial intelligence,
03:00 software that allows you to identify someone almost instantly from a photo.
03:05 It searches through a database of billions of images.
03:09 >> And then we pull it up right here.
03:13 >> Just very simple facial recognition, we get a picture of anybody, somebody.
03:16 We run it through the computer and
03:18 the computer gives us matches that are similar to that picture.
03:22 >> Something that may take weeks to identify somebody.
03:25 We could find them in a matter of minutes or days with facial recognition.
03:30 Because maybe they took a picture on social media.
03:33 >> Instagram, Facebook, LinkedIn,
03:38 many of the images are scraped from social media.
03:40 >> And why is there a line?
03:42 Do you respond to that very sometimes?
03:44 That's not what you're doing?
03:46 >> This particular program, Clearview AI, is the second most accurate face
03:51 recognition platform in the world.
03:52 The first being a program that's run by the Chinese Communist Party.
03:56 We don't want that program.
03:57 [MUSIC]
04:03 >> China uses facial recognition for real time mass surveillance in order to
04:08 monitor its population and suppress dissent.
04:11 [MUSIC]
04:19 Miami police say they use Clearview on all sorts of cases from murders to
04:23 shoplifting, but only after a crime is committed.
04:28 They pay an annual fee of $47,000 for access to Clearview.
04:32 Its strength lies in its colossal database.
04:36 There are almost no laws around the use of facial recognition by police in
04:41 the United States.
04:42 [MUSIC]
04:46 Clearview AI grew out of a company called SmartChecker,
04:51 founded in New York City in 2017.
04:54 On its website, Clearview claims it has the biggest facial
04:58 recognition database in the world.
05:01 The firm says it has plans to collect 100 billion images equivalent to 14 photos
05:07 for every person on the planet.
05:09 [MUSIC]
05:11 Clearview's principal clients are US law enforcement agencies.
05:15 [MUSIC]
05:22 Shortly after Russia invaded Ukraine,
05:24 Clearview gave Kyiv free access to its technology,
05:28 a move that expanded its use from law enforcement to the military.
05:33 Clearview's being used there to identify the dead and to catch Russian spies.
05:38 >> There was a problem with Russian soldiers dressing up in Ukrainian
05:42 uniforms.
05:43 And so what the Ukrainians now do is they take out their phone and
05:47 they look at them through the phone and they say, no, you're not Yuri from Lvov,
05:51 you're Dmitry from Vladivostok.
05:54 They know who they are by using facial recognition.
05:57 And artificial intelligence.
05:59 [MUSIC]
06:01 >> Richard A.
06:02 Clarke has expertise in this field.
06:04 He was the United States counterterrorism czar.
06:07 He served under three presidents before switching to the private sector.
06:12 He's on Clearview's advisory board.
06:14 >> Well, one of the things I do on the advisory board is make sure they don't do
06:17 anything unethical or immoral.
06:19 And we're very, very good about that.
06:23 We won't sell to certain countries.
06:25 We won't allow it to be used for certain purposes.
06:28 [MUSIC]
06:30 >> Clearview's CEO, Hoan Ton-That, is Australian with Vietnamese roots.
06:34 He claims he's descended from royalty.
06:37 He moved to San Francisco when he was 19.
06:39 Now, Hoan lives in New York.
06:43 We first meet Hoan for an interview in the summer of 2021.
06:47 >> So I've always loved computer programming since I was a kid.
06:50 I would stay at home from school to watch videos of MIT professors
06:55 teaching programming.
06:58 And so I ended up moving to Silicon Valley when I was young to,
07:01 I call it the Hollywood for computer nerds.
07:03 That's where all the best programmers want to go.
07:06 [MUSIC]
07:08 We're gonna do the demo of your face.
07:10 Is it okay if I run it?
07:11 >> Yes. >> Okay.
07:13 Let's just smile.
07:14 [MUSIC]
07:19 And as it's searching, it's going through all these photos online.
07:25 >> Can you find something?
07:26 Well, where's that from?
07:27 >> This one here?
07:28 >> Yeah.
07:29 >> So that is a group photo on Instagram.
07:34 >> That is amazing.
07:35 >> So that's you.
07:37 >> I've never seen that photo before.
07:38 >> One that comes up with your group photo.
07:40 >> I have never posted a photo on Instagram, but
07:42 Clearview AI finds one uploaded by a total stranger.
07:45 >> It works with a neural network.
07:48 Neural network's a type of artificial intelligence that is trained on millions
07:52 of face examples.
07:53 So older algorithms would be measuring the distance between the eyes and
07:57 the nose and the mouth.
07:58 But a neural network is trained on so many different types of photos
08:01 that it becomes very robust to change.
08:03 So people can upload photos from different angles, different lighting,
08:07 at different ages, sometimes with glasses or a beard.
08:09 So it's searching anything that's available on the open public Internet.
08:13 So it could be news websites, it could be mug shot websites,
08:16 it could be social media, anything that's publicly available or posted.
08:20 There's nothing private.
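The embedding-based matching Ton-That describes above can be illustrated with a minimal sketch: a neural network maps each face photo to a numeric vector, and identification reduces to finding the stored vector most similar to the query. Everything below – the toy 4-dimensional "embeddings", the names, and the 0.8 similarity threshold – is invented for illustration and is not Clearview's actual model or data.

```python
import math

def cosine_similarity(a, b):
    # Compare two face embeddings; 1.0 means identical direction,
    # values near 0 mean the faces are unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(query, database, threshold=0.8):
    # Return the name whose stored embedding is most similar to the
    # query embedding, or None if no candidate clears the threshold.
    name, score = max(
        ((n, cosine_similarity(query, emb)) for n, emb in database.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None

# Toy 4-dimensional "embeddings" stand in for the high-dimensional
# vectors a real face-recognition network would produce.
db = {
    "person_a": [0.9, 0.1, 0.2, 0.1],
    "person_b": [0.1, 0.8, 0.3, 0.2],
}
probe = [0.85, 0.15, 0.25, 0.1]
print(best_match(probe, db))  # person_a
```

This is also why, as Ton-That says, the approach is robust to angle, lighting, glasses or a beard: the network is trained so that photos of the same person land near each other in embedding space, rather than relying on fixed measurements between facial features.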
08:21 >> Under US law, scraping publicly accessible photographs is legal.
08:25 But the public were not aware that Clearview was collecting their images
08:29 until a tech researcher brought it to their attention.
08:32 >> My name is Freddy Martinez.
08:33 I'm a senior researcher at the Project on Government Oversight.
08:36 So I do a lot of research, particularly around police surveillance technologies
08:41 and how they're used.
08:44 As part of that project, I started doing a lot of public records requests
08:47 on facial recognition technology.
08:49 And that's how we came to learn about the existence of Clearview.
08:52 >> Freddy got back public records from over 100 police departments.
08:57 Clearview stood out from other facial recognition firms because it was
09:00 scraping social media for photographs.
09:03 >> This document was prepared by Clearview lawyers.
09:07 And it basically stated that the public has no right to their images.
09:13 And that the company can just collect them and sell them.
09:17 In a sense, it's creating dossiers of people online over time.
09:23 And that's when we reached out to the New York Times.
09:25 And that became the basis for their story.
09:29 >> Kashmir Hill wrote an expose for the New York Times in January 2020.
09:36 >> Well, the thing that Clearview has done that no other company in the United
09:38 States that I'm aware of has been willing to do, because it's so
09:42 controversial and so abusive, is to scrape the internet for billions of
09:46 photos of people from social media, from local news websites, from people's
09:51 employers, from anywhere else, and amass them into a gigantic face
09:56 recognition database.
09:58 That database is collected without anybody knowing that their faces are in
10:03 it, without getting people's consent for sure.
10:07 >> The ACLU took Clearview to court in Illinois for breaking the state's
10:11 biometric privacy law.
10:13 [MUSIC]
10:18 >> That's the photo of you.
10:19 >> Hoan went on US news networks to defend his tech.
10:23 >> Do you understand why people find us creepy?
10:27 >> This all concerns an American company called Clearview AI.
10:31 >> [FOREIGN]
10:35 >> They're using these photos to help their clients learn everything about
10:39 you.
10:40 >> The Toronto Police Service has made a disturbing admission.
10:43 Officers use secret facial recognition technology.
10:45 >> It emerged that Clearview had offered its tech to police departments
10:48 outside the US and to private firms.
10:51 >> Verizon, Best Buy, casinos in Vegas, banks.
10:57 >> Clearview was hit by yet another scandal when Luke O'Brien wrote a story
11:01 for the Huffington Post that revealed the firm's ties to the far right.
11:06 [MUSIC]
11:13 >> Charles Chuck Johnson is an entrepreneur with a keen interest in
11:16 biometrics.
11:18 His critics consider him a far right activist.
11:21 [MUSIC]
11:26 >> He's keen to show off his French.
11:28 >> [FOREIGN]
11:36 [MUSIC]
11:39 >> Chuck claims he played a crucial role in Clearview's beginnings.
11:43 In March 2023, he filed a lawsuit against Clearview AI for breach of contract.
11:49 The suit claims he co-founded the original firm.
11:53 >> [FOREIGN]
11:58 [MUSIC]
12:01 >> Chuck's Twitter account was suspended in 2015 after he posted a series of
12:05 racist tweets.
12:07 [MUSIC]
12:17 [MUSIC]
12:23 >> Chuck has boasted about his connections to the rich and powerful,
12:26 particularly those to the right of the political spectrum.
12:29 [MUSIC]
12:32 >> Although now he claims he's changed and supports Joe Biden.
12:36 [MUSIC]
12:40 >> Here he is in 2016 with Trump's former chief strategist, Steve Bannon.
12:45 [MUSIC]
12:55 >> Chuck worked for Bannon's news site, Breitbart.
12:58 [MUSIC]
13:08 [MUSIC]
13:15 >> Chuck believes European countries should also be using Clearview.
13:18 [MUSIC]
13:23 >> Well, so you have this problem, right?
13:25 Like, you know, there's lots of, you know, migrants and all that from Syria that
13:29 have come, and there'll be more.
13:31 >> They can't all come to France or to Germany or to the United States, for
13:34 that matter.
13:35 And so we're going to have to figure out a solution to this problem.
13:38 And I think one of the things that can help on that is facial recognition.
13:42 >> In 2017, Chuck wrote on Facebook that he was building algorithms to ID
13:46 illegal immigrants for the deportation squads.
13:49 >> Because many of these people have iPhones.
13:51 They're not that destitute.
13:53 I mean, they're poor, relatively speaking, of course.
13:56 And so they have identities, and you can check who's who.
13:59 [MUSIC]
14:05 Chuck is a very menacing figure, extremely vitriolic.
14:09 And he does this in part by trying to be as terrifying and menacing as possible,
14:14 especially to people whom he considers lesser than himself, such as women,
14:19 people of color, people who have Jewish heritage, anyone who is not a
14:24 cis, white, heterosexual male.
14:28 I worked with Chuck for a period of four to four and a half years.
14:33 He just was constantly trying to impress upon people that he would be powerful
14:38 and he knew powerful people.
14:40 [MUSIC]
14:44 Artemis used to be a far-right activist.
14:48 We'll use a pseudonym for her, and we've altered her voice.
14:52 She's in hiding and fears for her safety.
14:57 I'm afraid to show my identity because I received many menacing threats
15:04 once I began to leave the far-right.
15:11 When we asked Chuck about his Facebook post, he said he'd just been joking.
15:17 Artemis explains how members of the far-right often use this defense.
15:24 It kind of, almost in a way for people just looking in, disarms the threat,
15:29 which is something that you need to be very aware of whenever you deal with them.
15:33 It's like, oh, they're posting memes and stuff like, you know, Pepe the Frog
15:38 and saying "Henlo, friends," and their bizarre catchphrases while they're
15:41 talking about exterminating Jewish people.
15:47 These people all speak the same old-world language, discussing racism, you know,
15:52 1920s white supremacy, and they combine that with the new far-right
15:58 Internet terrorism as well.
16:03 I do not know how Chuck became so wealthy.
16:07 I know that he has connections to powerful billionaires.
16:11 All right, shall we go?
16:14 In his lawsuit, Chuck claims he procured the initial funding for Clearview AI.
16:21 I introduced Juan to a lot of my contacts, who all basically thought he was
16:26 a very bright technologist but not really a business person.
16:28 I was like, don't worry, I'll handle that part of it.
16:31 And so anyway, so we raised a bunch of money for the company.
16:33 The company started having successes.
16:35 It was sort of a very triumphant kind of period.
16:39 My father and our next president, Donald J. Trump.
16:49 Clearview AI is a mass surveillance firm born out of the Trump era.
16:55 In the summer of 2016, Chuck Johnson and Hoan Ton-That cheered Trump on
17:00 together at the Republican National Convention in Cleveland, Ohio.
17:05 I humbly and gratefully accept your nomination for the presidency of the
17:13 United States.
17:15 Chuck says this is where he introduced Hoan to one of the biggest names in
17:19 tech.
17:21 The co-founder of PayPal and first investor in Facebook, entrepreneur
17:25 Peter Thiel.
17:28 Good evening.
17:30 I'm Peter Thiel.
17:32 I'm not a politician.
17:34 But neither is Donald Trump.
17:36 He is a builder.
17:38 And it's time to rebuild America.
17:44 Thiel gave the seed money for Clearview AI.
17:50 Hoan, you know, really wanted to meet Peter, wanted to talk to him.
17:54 And I was like, well, I know him, like, we'll set it up.
17:57 And so, you know, Hoan's working on some interesting stuff, like, you
18:00 know, on facial recognition.
18:02 He's working on these things.
18:03 Peter's like, oh, that's really cool.
18:05 We talked a lot about, like, political correctness and communism.
18:08 I am proud to be a Republican.
18:11 But most of all, I am proud to be an American.
18:20 Peter Thiel is one of the founders of PayPal, along with Elon Musk.
18:25 Back in the '90s, he created this system of payments that we all know.
18:31 We know about Musk's career, but Peter Thiel is more discreet.
18:36 Holy fuck!
18:37 Oh, my God!
18:38 Oh, my God!
18:40 The 9/11 attacks ushered in an era of mass surveillance.
18:44 It was the start of Bush's War on Terror.
18:48 Six weeks after the attacks, Congress passed the Patriot Act,
18:52 making it easier for the US government, and in particular the NSA,
18:56 to spy on ordinary Americans.
19:00 Thiel played a major role in the mass surveillance ecosystem.
19:06 In 2003, he founded his big data analytics firm, Palantir.
19:12 Among its customers: US intelligence agencies.
19:20 The reason Palantir was founded was because the NSA launched
19:24 a huge mass surveillance program across the world.
19:28 The problem is that such a program collects such a massive amount of data
19:32 that in order to make sense of it, you need a new type of software, Palantir.
19:38 Palantir is big brother.
19:41 When Clearview's facial recognition technology was pitched to Thiel,
19:45 he took the chance to invest.
19:49 Clearview was very appealing to Peter Thiel.
19:52 It fit perfectly in this ecosystem of the highest-performing
19:55 surveillance technology on the planet.
19:59 Thiel is part of Silicon Valley's right wing,
20:02 and he doesn't think highly of people who vote.
20:05 You could basically measure people's IQ very simply by how optimistic
20:09 they were about politics.
20:11 The dumb people tended to think you should really all vote.
20:15 Thiel appears to put algorithms over politics.
20:19 Technology is this incredible alternative to politics.
20:23 "I no longer believe that freedom and democracy are compatible," he wrote.
20:27 Peter Thiel's vision for governance is one in which the sheep are herded by dogs.
20:37 A society in which the sheepdogs at the top of the pyramid control the sheep.
20:43 That's essentially his vision for society.
20:49 Thiel did not respond to our interview request.
20:55 For a lot of us, it's like our Woodstock, in a way.
20:59 We have to win.
21:01 In November 2016, Donald Trump is elected president.
21:06 Under Trumpism, Clearview took off.
21:13 European countries also took an interest in Clearview AI,
21:16 as the firm had scraped photos of their citizens.
21:20 Under the EU's General Data Protection Regulation, or GDPR,
21:24 it's illegal to collect people's biometric data, including their face prints,
21:28 without their explicit consent.
21:33 I heard about Clearview AI for the first time in January 2020,
21:37 in a New York Times article.
21:40 The company really caught my attention because it was the first time
21:43 I had heard of a facial recognition firm which was collecting its own database.
21:53 French lawyer Zoé Villain was convinced that the way the firm had built its database
21:58 by scraping photos from the web without consent was illegal in Europe.
22:09 Scraping is an automated tool that is let loose on all the websites of the world,
22:15 and that allows you to collect images on a massive scale.
22:24 All the photos that you've put on your Facebook profiles, on Twitter, on Instagram,
22:28 Clearview went and collected them en masse and saved them.
22:34 You might have since deleted those photos on your Instagram or Twitter accounts,
22:39 but they've already made a copy, so that's what they've done, but on a colossal scale.
22:47 And that's when I started asking myself if the personal data,
22:52 photos of French people or Europeans were also being collected by Clearview AI.
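The kind of scraper she describes can be sketched with just the Python standard library: parse a fetched page's HTML and collect every image URL for later download (the network fetch itself is omitted here). The page markup and URLs below are invented examples, not any real site.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageLinkParser(HTMLParser):
    # Collect the src attribute of every <img> tag on a page,
    # resolved against the page's own URL.
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src" and value:
                    self.image_urls.append(urljoin(self.base_url, value))

# Invented example markup standing in for a fetched profile page.
page = '<html><body><img src="/photos/face1.jpg"><img src="face2.png"></body></html>'
parser = ImageLinkParser("https://example.com/profile")
parser.feed(page)
print(parser.image_urls)
```

Run across millions of pages and paired with a crawler that follows links, this simple pattern is enough to copy publicly posted photos en masse – which is why, as she notes, deleting a photo from your own account does nothing to the copy already saved in someone else's database.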
23:01 In order to find that out, Zoé decided to use her own case.
23:05 European laws grant her the right to request her data.
23:09 She wrote to Clearview.
23:11 The company sent her four photos.
23:17 The second photo is an internet memory.
23:20 It's a photo of me that is no longer online, but they saved it in their database.
23:25 I realized that they can find photos of me in news articles,
23:29 photos that I didn't even know existed.
23:32 So that's when I decided to inform France's privacy watchdog.
23:37 The CNIL is France's National Data Protection Commission.
23:41 It's the country's privacy watchdog.
23:43 The independent organization is in charge of ensuring that
23:46 Europe's data protection laws are respected.
23:51 Biometric data have a special statute because it's sensitive data.
23:56 It's linked to our most private information, our body, our face.
24:00 And so it's treated in very specific conditions.
24:03 So basically it's not legal to use someone's biometric data
24:07 unless under certain exceptions.
24:11 After examining the case, CNIL agents concluded that Clearview had violated GDPR
24:17 because the firm had scraped photos of European citizens from the internet
24:20 without their consent.
24:23 The CNIL told Clearview to delete the images, but got no response.
24:28 Clearview doesn't believe GDPR rules apply to it because the firm is not based in Europe
24:33 and because it doesn't sell its software to European entities.
24:37 We've not managed to have any constructive or fruitful talks with them.
24:43 The CNIL fined Clearview AI 20 million euros for illegally collecting
24:47 and processing face prints of French citizens.
24:50 In May 2023, it imposed a further penalty of 5.2 million euros for failing to comply.
24:56 France, Italy, Greece, the UK, Canada and Australia
24:59 have all launched legal proceedings against the firm.
25:02 Our legal team is handling those complaints,
25:04 but what I can say is that it's only publicly available information.
25:09 With all of these social media accounts, your account can be public or private.
25:14 And if you elect to have your account public, then it's public,
25:20 and then people can see it.
25:22 It's a bit like walking down the street.
25:25 If you walk down the street in public, people can take your picture, and they can use it.
25:32 [music]
25:58 Hey there, are you interested in learning about facial recognition technology?
26:02 Have a nice day.
26:04 Advocates for free speech and privacy are calling for a ban on facial recognition
26:09 until proper safeguards can be put in place to prevent its abuse.
26:14 There's one over there. If you look right under that building there,
26:17 there's four right there. One, two, three, four, five, six, seven.
26:22 It's pretty wild. And if you look at the tops of buildings, there are more.
26:26 Amnesty International mapped 25,500 NYPD cameras across the city.
26:32 We know that those use facial recognition technology.
26:35 That's just the ones that Amnesty International was able to map.
26:37 Another danger posed by facial recognition is false identification.
26:41 The tech has proven to be less accurate when it comes to identifying black people.
26:46 It's likely to misidentify people, and so people are getting arrested who didn't do anything.
26:51 Are there cases where people are arrested just solely on facial recognition?
26:56 Yeah, there have been multiple.
26:58 We saw a man in New Jersey held for weeks because of facial recognition.
27:04 He was nowhere near the crime scene.
27:06 We saw a man in Detroit being arrested because of facial recognition.
27:11 And it was just because of a false match.
27:13 [police sirens]
27:15 In all these cases with arrests, they're using older facial recognition technology.
27:19 On top of that, there's been very few wrongful arrests compared to the overall number of cases that have been cracked.
27:25 These activists say police use of facial recognition suppresses civil liberties.
27:32 It's the perfect tool for authoritarianism because the thing in a democracy,
27:36 if you let the government track where everyone goes at all times,
27:40 then suddenly free speech, freedom of association, freedom of religion,
27:44 all of those things start to go out the window.
27:47 [sirens]
27:51 [protesters chanting]
27:54 At least six federal agencies used facial recognition technology to surveil protesters
28:00 during civil unrest following the death of George Floyd in 2020,
28:05 according to the Government Accountability Office, which is part of the US legislative branch.
28:11 [sirens]
28:18 Clearview's technology is so powerful and so prone to abuse,
28:23 it shouldn't be a tool available to police at all,
28:27 and certainly not to use to identify people who are taking part in a protest,
28:33 even if there's an allegation of some illegal activity there.
28:37 If protesters think that they're at risk of being instantaneously identified and tracked by police,
28:42 that's a huge disincentive to exercise our constitutional rights to go out on the streets.
28:47 [sirens]
28:54 Protesters in New York also accused the police of targeting them with facial recognition.
29:00 Black Lives Matter protester Derek Ingram cannot prove that Clearview's software was used against him,
29:06 but the NYPD did admit that it used facial recognition on him.
29:13 I do think Clearview AI was utilized in my case, and I think it was weaponized.
29:19 I was a huge leader in a lot of the activism work that was happening here in New York City,
29:24 and I think they were strategically trying to silence me.
29:27 [sirens]
29:31 He participated in demonstrations right here in Times Square.
29:36 [sirens]
29:40 Police accused him of yelling in an officer's ear with a loudspeaker.
29:45 [sirens]
29:50 On August 7, 2020, heavily armed police officers turned up at his door.
29:56 [sirens]
29:58 This started at 7 a.m.
30:01 There were approximately 30 officers surrounding my apartment.
30:06 They had battering rams outside of my door.
30:09 There were drones outside of my apartment with cameras in the windows.
30:13 There were sharpshooters on the top of buildings.
30:16 I believe I could possibly be killed if I left my apartment.
30:20 You even locked up one person illegally.
30:22 Get the fuck out of here.
30:23 This is what people pay taxes for.
30:25 For one protester.
30:28 For one protester.
30:30 The police did not have a warrant.
30:33 Derek alerted activists on social media, and they came down to support him.
30:38 [shouting]
30:40 Where's the warrant?
30:42 Where's the warrant?
30:44 Where's the warrant?
30:46 [shouting]
30:48 Under pressure, the police retreated.
30:50 [shouting]
30:53 All of the charges were dropped within a year of the incident,
30:56 and now I have a counter-lawsuit against the NYPD as well.
31:00 [music]
31:11 A still frame taken from footage of the siege filmed by Freedom News TV
31:16 showed an officer holding a document that said "Facial Identification Section."
31:21 On it, there was a photo from Derek's Instagram.
31:24 [music]
31:36 I think they utilized the CCTV cameras to then scan my social media
31:41 and other social media to match me with the video footage from that day.
31:46 And once they were able to do that, they were able to find out
31:49 exactly where I was located and where I live.
31:52 The NYPD confirmed in this email that it had used facial recognition technology
31:57 in the case of Derek Ingram.
32:00 [music]
32:02 Derek suspects Clearview was used, but a lack of transparency from the NYPD
32:07 makes this impossible to verify.
32:10 [music]
32:15 The NYPD is tight-lipped about its use of Clearview AI.
32:19 The Legal Aid Society filed a public records request for emails
32:23 exchanged between the NYPD and Clearview AI in 2020.
32:28 The documents prove the NYPD was using Clearview AI.
32:33 Clearview employees give advice on how to use the software.
32:37 Many of the emails are signed by Hoan Ton-That.
32:40 [music]
32:43 Other emails are signed by Clearview co-founder Richard Schwartz.
32:48 Unlike Ton-That, Richard Schwartz avoids the limelight.
32:52 [sirens]
32:59 He has left very little trace on the Internet, despite the fact that he played
33:02 an important public role in New York City.
33:05 [music]
33:08 Schwartz was former New York Mayor Rudy Giuliani's top aide,
33:12 a key operative in the New York Republican sphere.
33:15 Schwartz has close ties to law enforcement.
33:18 [music]
33:21 Giuliani was best known as the mayor who pushed a zero-tolerance approach
33:25 to fighting crime.
33:27 [music]
33:29 [car horn]
33:31 [applause]
33:33 Richard, welcome.
33:34 Thanks, Daphne.
33:35 What was a day in the life of Richard Schwartz in the Giuliani administration?
33:39 It was terrific, I have to say.
33:42 Clearview's PR agent told us Schwartz does not give interviews.
33:47 [ambient street noise]
33:52 Chuck Johnson says he brought Richard Schwartz into the firm.
33:56 [ambient street noise]
33:59 Schwartz is very plugged in with sort of right-wing circles in New York.
34:03 You know, nothing really happens in New York unless you have
34:05 like an old Jewish guy helping you.
34:07 [ambient street noise]
34:09 Chuck claims that there were three founders of Clearview AI.
34:13 Himself, Juan Tontat, and Richard Schwartz.
34:17 [ambient street noise]
34:20 But what became clearer and clearer to me was that he wanted to basically
34:23 just take the money that I'd raised and sort of throw me by the wayside.
34:26 I mean, my sense is that they wanted to get rid of me.
34:28 I was a problem for them.
34:30 [ambient street noise]
34:33 That's not Hoan's version of the story.
34:36 [ambient street noise]
34:38 What do you have to say to those claims? Is it true?
34:40 It's not true. It's like the company was founded by myself and my partner,
34:44 Richard Schwartz, who's a Jewish person.
34:46 And we started this company to help really help law enforcement.
34:51 We were developing the technology first.
34:53 But all those things are just not true.
34:55 So did you have any dealings with Charles Johnson?
34:58 Charles Johnson is someone I know, but, you know, he's not a co-founder
35:02 of the company. He's not an employee. He's not a board member.
35:05 He's not an investor, and he never was.
35:07 And that's all I have to say on the topic.
35:10 [ambient street noise]
35:12 Yeah, he's lying, unfortunately. It's sad.
35:14 It's sad to see somebody lie to try to make money.
35:16 Very pathetic. But very American, too, in a way.
35:20 Maybe he's assimilating, you know?
35:22 Chuck shared two videos with us.
35:25 So let me show you how to proper, you know, stance, OK?
35:30 This one from a shooting range in California.
35:33 Stabilize it.
35:35 [gunshot]
35:37 What do you think?
35:38 Gun's rock.
35:39 [chuckle]
35:41 Cut him.
35:42 ♪ His name was Chuck ♪
35:45 ♪ He sent a tweet ♪
35:47 Here, Hoan is in the company of far-right activists
35:50 singing a song mocking Chuck's Twitter ban.
35:53 ♪ But now he's banned on every social media platform ♪
35:58 ♪ It sucks ♪
36:00 ♪ Poor Chuck, poor Chuck ♪
36:03 ♪ Oh, wait a minute, it's like an endless shower ♪
36:05 The tone is anti-Semitic, racist, and misogynistic.
36:10 He starts screaming like Hitler's speeches and everything.
36:13 We so need to go up to her.
36:15 A pro-Trump art party in 2016 in New York.
36:19 Ali Alexander, the founder of Stop the Steal,
36:22 far-right provocateur Milo Yiannopoulos,
36:25 Proud Boys founder Gavin McInnes,
36:27 and Hoan Ton-That are all at the event.
36:33 This video is no longer online.
36:35 Do you share their political ideas?
36:37 Absolutely not.
36:38 I'm not a white nationalist or a white supremacist.
36:41 I'm not even white, right?
36:42 And Clearview AI is not a politically driven company.
36:47 We're good, I think, right?
36:49 Yep.
36:51 A few days later, we received an email from Hoan's PR agent
36:55 saying that if we plan to air the far-right part of our interview,
36:59 we will need to get our legal counsel to speak to Clearview's.
37:05 I met Hoan Ton-That a couple of times
37:07 in the presence of white supremacists.
37:09 I had never encountered him outside the presence of white supremacists,
37:13 even though I understand he is trying to launder
37:15 his reputation into the mainstream.
37:19 Hoan's roots, his extremist roots, are not acknowledged at all.
37:23 And it's particularly disturbing considering that he does contract out
37:27 to at least 600 different police departments
37:29 who are also largely unaccountable in America.
37:36 [Music]
37:40 Jack Poulson runs a non-profit called Tech Inquiry,
37:43 which promotes transparency in federal and law enforcement contracts
37:47 with private firms.
37:51 In July of 2021, Clearview AI was reported as being valued at $130 million.
37:57 It would be difficult to say what their current valuation would be,
38:00 but probably not more than $250 million.
38:04 [Music]
38:09 Jack filed public records requests for Clearview's contracts
38:12 with the FBI, Department of Defense, and Homeland Security.
38:17 [Music]
38:29 To some degree, what you see in government records
38:32 is just the tip of the iceberg.
38:34 So obviously a huge amount of Clearview AI's effort has been
38:38 establishing relationships and selling directly to police
38:41 all around the country.
38:43 Frankly, that's just a lot harder to monitor.
38:47 Clearview AI lobbies hard with law enforcement.
38:54 Its site lists many of its customers and advertises events
38:57 where Clearview will be present.
39:01 Clearview is attending the Sheriff's Winter Conference.
39:04 The event is taking place in a hotel in Washington, D.C.
39:09 It's closed to the media.
39:16 So this is Jack Poulson.
39:18 I'm at the National Sheriffs Association Winter Conference.
39:24 One of the main topics of conversation is on policing
39:27 the United States' southern border with Mexico.
39:32 It's a message. Do not cross into our territory.
39:35 We control this cartel. This syndicate controls this area.
39:41 Jack is trying to pass incognito.
39:44 He heads to the private vendors' hall.
39:47 Next to Clearview AI, the people I spoke to earlier.
39:54 Clearview's there next to Axon, a company that makes tasers
39:57 and body cams.
40:00 We continue to work with the U.S. government.
40:05 Clearview's employees scan Jack's face and unmask him.
40:13 They pretty quickly used facial recognition on me
40:16 and then told me who I was.
40:19 I mean, it's disturbing because if someone can that easily identify you.
40:24 I actually would prefer if you didn't record me.
40:27 Oh, you guys took my picture and ran facial recognition on it.
40:32 Is it OK to run facial recognition on me but not for me to have your picture?
40:39 Frankly, I was at a policing conference, you know,
40:42 not particularly wanting anyone to know who I was.
40:45 And so to some degree, you know, I felt exposed just talking to them.
40:55 Clearview has a $1.5 million contract with Immigration and Customs Enforcement.
41:01 ICE has now run thousands of searches,
41:03 many of them conducted by agents who arrest and deport undocumented immigrants.
41:10 If Clearview AI tries to scrape all social media from all around the world
41:15 and there is a picture of an immigrant to the United States,
41:18 then ICE could try to find out who that person is
41:22 and then if they're someone that's determined not to be a citizen, then deport them.
41:28 On a contract Jack obtained by filing a public records request,
41:32 an ICE agent signing off on a deal has left a note.
41:37 The sheep resent the sheepdog until they are in the midst of wolves.
41:50 So that terminology in terms of sheep and sheepdogs and wolves
41:54 is common in nationalistic circles in the United States.
42:01 The argument of the phrase is that you may criticize what ICE is doing
42:05 or what Clearview AI is doing until a wolf or a criminal attacks you
42:12 and then you start to respect the sheepdogs.
42:15 [Music]
42:19 [Music]
42:23 [Music]
42:50 Growing up undocumented is constantly trying to figure out
42:55 how to survive, how to make ends meet, how to really, you know, go to college,
43:02 how to get a job, how to just have a life here.
43:07 My name is Reyna Maldonado. I'm 29 years old. I'm a business owner in California.
43:14 Do you want spicy salsa or no spicy?
43:17 Reyna Maldonado moved from Mexico to the US with her mother when she was 6 years old.
43:22 She runs a restaurant in California.
43:25 She's a plaintiff in a lawsuit against Clearview AI.
43:28 Reyna has been an activist fighting for immigrant rights since her teens.
43:36 Shoot down ICE! Shoot down ICE! Shoot down ICE! Shoot down ICE!
43:41 [Spanish]
43:51 Despite the fact that California law bans most police use of facial recognition,
43:57 federal agencies like ICE can still use it.
44:00 I stopped using Facebook. I'm very limited to the information that I share on Instagram
44:07 as well as I'm no longer active on Twitter.
44:11 There's so many ways in which my life changed once I started to understand Clearview AI.
44:19 The lawsuit Reyna's a part of alleges that Clearview's surveillance tech violates privacy rights
44:27 and facilitates government monitoring of protesters, immigrants and communities of color.
44:33 This chilling effect on activists is a very strong way to control and manipulate
44:40 the ways in which society can function.
44:42 They want to intimidate us. They want to scare us.
44:47 They want to sell our data to people who have that control that can use that against us.
44:54 [Siren]
45:04 ISC East.
45:06 A trade conference for surveillance, especially video surveillance and facial recognition
45:13 at New York's Javits Center.
45:15 [Music]
45:23 She's looking at the wine. She picks up a wine bottle and there she goes.
45:27 She puts it right in her purse. So we just solved this problem, this case, in seconds.
45:32 Okay?
45:33 [Music]
45:42 Clearview AI is here.
45:44 Hi. How are you doing?
45:47 How's business going?
45:48 Well, thank you very much.
45:49 How many images do you now have in your database?
45:52 It's 30 billion now.
45:53 30 billion?
45:54 Yeah, it's public.
45:55 And what about your plans to capture the faces of everybody in the world?
46:00 You know, every photo is a clue that helps solve a crime.
46:03 But again, we're not doing any interviews here, so just enjoying the trade shows.
46:07 But what's the reason? Why are you so annoyed?
46:09 You can turn off the camera.
46:10 No, but I want to ask you why you're so annoyed,
46:12 because I've been trying to get an interview with you.
46:14 What about giving your technology to Ukraine? How has that worked?
46:17 Hoan.
46:21 Hoan, don't run away. Don't run away. Just talk to me.
46:24 Why not? You guys have scraped the faces of everyone in the world,
46:29 and I can't take a video of you. Why not?
46:32 Clearview has several foreign clients, including Middle Eastern dictatorships.
46:40 Our investigation reveals that Clearview AI has sold its technology
46:44 to Dubai in the United Arab Emirates.
46:47 The firm's lawyer admitted this in court in London in 2022.
46:51 The United Arab Emirates is a tiny Gulf state with 10 million inhabitants.
46:57 Only one million of them are Emirati. The others are foreigners.
47:01 The UAE is a monarchy ruled by Sheikh Mohammed bin Zayed Al Nahyan.
47:06 One of the wealthiest countries in the Middle East, it has close ties to the West.
47:11 The government is autocratic. Internal dissent is forbidden.
47:16 People tend to say that the UAE is some sort of massive,
47:20 really repressive government. They haven't been there, if they say that.
47:25 It's just not a democracy. And I come back to that.
47:29 Not every country has to be a democracy.
47:33 The UAE spends tens of millions of dollars every year in the US
47:40 to whitewash their reputation.
47:42 They do not want us to think about human rights abuses there.
47:45 They do not want us to think about the illegal detention of foreign academics there.
47:50 They want us to only think about fly Emirates
47:53 and all the positive things about the UAE.
47:56 I don't think the United Arab Emirates is a force for anything
48:00 except stability, peace and progress.
48:09 Bolstered by oil and gas revenues, the UAE has spent its wealth on weapons,
48:14 many of them high-tech.
48:16 IDEX, an international arms fair in Abu Dhabi,
48:23 a window into a world of high-tech weaponry.
48:34 Amongst the vendors, numerous Israeli firms.
48:37 In 2020, the UAE signed the Abraham Accords,
48:40 a deal to normalise relations with Israel.
48:43 We are very proud to be here under the Abraham Accords peace agreements.
48:53 And we feel very safe here. The hospitality is very nice.
48:58 The UAE is developing its own weapons industry.
49:02 They are pioneers in facial recognition technology.
49:05 We can find you from anywhere.
49:13 In facial detection, you can find anybody that has anything.
49:16 Joining ADIB is now very simple.
49:21 You only need to sign a contract with the UAE
49:24 and your Emirates ID and passport, and yes, your face.
49:28 Your identity is verified and protected...
49:31 Facial recognition is increasingly becoming part of daily life in the UAE.
49:35 ..and officially accredited.
49:37 Then just answer a few simple questions...
49:41 The UAE is maybe the big brother.
49:47 They try to control everything.
49:50 I can call it a golden cage.
49:52 Those who dare criticise the regime end up in prison or in exile.
49:59 That's what happened to human rights defender Hamad Al Shamsi.
50:03 The UAE has cameras everywhere in the UAE, like on every street,
50:09 and they track everyone in the UAE.
50:11 So they spread a kind of intimidation.
50:14 They try to make you feel like you are a terrorist.
50:17 They have a system called Falcon Eye.
50:20 They use that to track everyone in the UAE.
50:23 Surveillance cameras installed on roads and buildings across the UAE
50:30 allow the authorities to track people at all times.
50:33 This surveillance system is now used by the UAE's security forces.
50:37 It's a very simple system.
50:39 You can see the security cameras on the road.
50:45 This surveillance system is called Falcon Eye.
50:49 So when it comes to monitoring offences by drivers,
50:56 it's all being automated, and you get almost an immediate message on your phone.
51:00 They're saying, "You're going too fast, you don't have this, you're talking on your phone."
51:04 It's not there to police the community.
51:08 It's more to facilitate and make everybody's life easy.
51:14 As this propaganda video illustrates, Dubai police are keen on high-tech gadgets.
51:19 We tried to contact them to ask about their use of facial recognition technology,
51:26 but got no response.
51:28 That would be fine with me, if Clearview AI sold to Dubai.
51:42 If it's used by the Dubai police the way it's used by police here in the United States,
51:47 then I think that's fine.
51:49 But Richard Clarke is not entirely objective.
51:57 He's close to the Emirati authorities and has been accused of lobbying for them,
52:02 something he denies.
51:57 Richard Clarke is the founder of Good Harbor Consulting,
52:11 which does a significant amount of work in the UAE,
52:13 including helping to build the surveillance state there.
52:16 We think they're a key part of UAE influence operations here in the US,
52:20 and possibly abroad.
52:22 Clarke says Good Harbor is not a lobbying organisation,
52:30 but rather a cyber security consulting company,
52:33 working only with private sector clients, and only in the US.
52:38 Good Harbor may not be a lobbying organisation,
52:41 however in 2019 Reuters revealed that from 2008 to 2010,
52:46 Clarke helped the UAE build a secret spying unit called DREAD.
52:51 It was touted as an effort to counter terrorism,
52:54 but ultimately ended up being used to target human rights defenders,
52:58 journalists and dissidents.
53:03 Richard Clarke is the architect of the NSA's post-9/11 technology.
53:08 And from 2005 he thought that he might make more money in the private sector,
53:13 so he went to work for the Emiratis and built them the same thing.
53:17 Tools of mass surveillance once believed to be widespread only in China,
53:27 have made their way to the United Arab Emirates.
53:30 Mohammed Bin Zayed is a fan of artificial intelligence.
53:34 Since the Arab Spring, the Emirati cyber army is no longer really focused
53:50 on tackling terrorism, but rather on crushing democracy in the Arab world,
53:55 and crushing dissent at home and abroad.
53:59 A tweet from Hoan Ton-That, unearthed by investigative journalist Luke O'Brien.
54:05 Democracy won't work, only dictatorship will.
54:09 In Western democracies there's increasing unease over potential misuse
54:16 of artificial intelligence.
54:18 We're not going to stay quiet about this.
54:21 We're not going to stay quiet about this.
54:24 We're not going to stop here.
54:26 Our community is not going to stay quiet and allow them to continue
54:29 to sell our biometrics and database.
54:32 Christian nationalism is on the rise in the United States.
54:38 As reproductive rights come under threat,
54:43 some women fear facial recognition could be used to track them.
54:52 Facial recognition and surveillance cameras could be used for,
54:56 you know, controlling women.
54:58 Women at the clinic are scared they might be profiled.
55:04 Even if you have a mask on or a burka covering,
55:06 they can still identify you, and it's just really scary
55:09 how fast this technology is improving.
55:11 Have you read 1984?
55:16 I have read 35 pages of the book.
55:19 But we live in a very different world than 1984 today, I think.
55:23 In March, the ACLU released documents that revealed new details
55:27 about an FBI and Defence Department project to develop facial recognition
55:31 software that could be used to identify and track people in real time
55:35 using video surveillance and drones.
55:38 We have lots of concerns about lots of uses of face recognition technology,
55:42 but this is really the nightmare one that starts to look like
55:45 what totalitarian societies in some places in the world today
55:48 have been rolling out.
55:50 People that are doing the right thing, that are not doing criminal activity,
55:53 have nothing to worry about that.
55:55 Times are tough for Clearview AI.
55:59 The firm settled its suit with the ACLU and is now banned
56:02 from selling its services to most private companies.
56:05 It continues to face legal battles in Europe and in the US.
56:09 Places like New Jersey and some other jurisdictions
56:12 have barred police from using Clearview in particular,
56:15 so those debates need to continue.
56:17 Hoan says he might open an office in Ukraine.
56:23 Chuck claims he's distanced himself from the far right.
56:32 You look at, like, the angle of the nose, right?
56:36 I think this is aquiline.
56:38 What I'm very interested in is the fact that,
56:40 in the future, there will be a use of genetics to predict the face,
56:43 and we are way, way, way, way far away from that now.
56:46 As Chuck awaits the outcome of his legal battle with Clearview,
56:50 he has other biometric projects on the go.
56:53 I have a DNA company, actually several DNA companies
56:56 that I've invested in, and we're building a very large genetics database.
57:00 And I think eventually the US, most major governments,
57:03 will have everybody's DNA on file.
57:05 The CIA has always been very interested in biometrics.
57:09 (BEEPING)
57:11 Many of Clearview's dealings remain secret,
57:18 including any it might have had with the CIA.
57:21 We contacted them.
57:25 Shortly afterwards, we got a call.
57:35 We cannot help you in an unclassified way on this topic.
57:39 But the CIA does use facial recognition technology?
57:43 I'm not saying that.
57:46 I'm just saying that this isn't a topic that we can help you with.
57:49 Thank you again for reaching out.
57:52 Bye-bye.
57:54 (SUSPENSEFUL MUSIC)
57:57 (MUFFLED VOICES)
58:33 [Music]