Clearview: the app that helps strangers find your name and personal info from a single photo

Clearview has assembled the world’s largest identification database by scraping publicly accessible photos from the internet, and it’s matching faces to names like there’s no tomorrow

The Clearview app combs your online profiles for your personal information. 

The apparent transformation of China into a surveillance society has been watched with interest. The facial recognition systems being rolled out in Chinese schools, shopping centres and public transport hubs are ranked as the most invasive in the world. But there's growing concern that citizens of all nations are sleepwalking into a privacy nightmare of their own. Earlier this month it was revealed that an American firm, Clearview, has assembled the world's largest identification database by scraping publicly accessible photos from the internet. It's been selling access to that database to law enforcement agencies. And it's matching faces to names like there's no tomorrow.

How? Firstly, by combining the database with an accurate recognition system. Feed any photo in, and it’ll match it with other photos of the same person with a reported 75 per cent accuracy. And the identification part? The public has already done all the hard work. We’ve helpfully uploaded photos and videos to the internet with all kinds of information – our names, our friends’ names, our hobbies and interests, the places where we live, work and socialise. Clearview spits out all the links to those pages. Law enforcement does the rest.

It's the only technology Google has built [where] after looking at it we decided to stop

Journalist Yashar Ali had two demonstrations of Clearview. "The results were frightening/stunning in their accuracy," he tweeted. "Both demos involved me giving blurry screenshots of a video, and Clearview was able to identify both people even though they barely have a presence online." Police forces from Florida to New Jersey have praised the efficacy of the system, and perhaps that's not surprising. The FBI database has 411 million photos; Clearview's has 3 billion. And it doesn't contain only mugshots; the images are taken from Facebook, YouTube, Twitter, LinkedIn and countless other services. Clearview has gathered them up, made them searchable and turned them into a product.

It's legally questionable whether all those pictures are there for the taking, but given that it's technologically possible, it seems extraordinary that it hasn't already been done. In truth, plenty of companies have had the capability, but have refrained from doing so on moral grounds. "It's the only technology Google has built [where] after looking at it we decided to stop," said former Google chief executive Eric Schmidt back in 2011.

A report by the Federal Trade Commission the following year echoed the dangers of the technology to society: "Companies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer's affirmative express consent," it read. But it's unwise to rely on the moral fortitude of ambitious entrepreneurs. Hoan Ton-That, the Australian founder of Clearview, has let the genie out of the bottle. And now the database exists, there's no going back.

Last week Twitter sent a cease-and-desist letter to Clearview, explaining that scraping of images is against its terms of service and ordering the company to delete any data it has collected. But when LinkedIn, which is owned by Microsoft, sued a company last year for scraping data and selling it to third parties, it lost the case. Clearview may have seen that judgment as carte blanche to proceed. "A lot of people are doing it," Ton-That told the New York Times. "Facebook knows."

Privacy campaigners have warned of the creep of surveillance for years, but the law is always slow to catch up with technological progress. Few parts of the world have regulations covering facial recognition, and Clearview has taken advantage of that, along with the public's enthusiasm for uploading pictures for the world to see. Those with an allergy to social media have expressed disbelief that the public could be so stupid ("How can people expect privacy when they are baring themselves publicly to one and all?" asks a commenter on one online forum) but it's not only selfie addicts and budding influencers who appear in Clearview's database. It could be anyone whose picture and name happen to appear next to each other online.

It has also been pointed out by computer security expert Bruce Schneier that to focus on people's lax attitudes to photo privacy is to misunderstand the problem. "People can be identified at a distance by their heartbeat or by their gait," he writes. "Cameras are so good that they can read fingerprints and iris patterns from metres away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers." We are generating this data. We're aware it's being stored. But Clearview has prompted the urgent question of who should be allowed to gather, analyse and sell data that's publicly available.

One Clearview investor, David Scalzo, made his feelings on the subject clear to the New York Times. "I've come to the conclusion that because information constantly increases, there's never going to be privacy," he said. "Laws have to determine what's legal, but you can't ban technology. Sure, that might lead to a dystopian future or something, but you can't ban it." But there are cases of facial recognition technology being banned. The California cities of San Francisco and Oakland have taken the step. The EU is considering a ban in public areas. Activists are calling for action worldwide.

Technology doesn't erode privacy in an instant; it's worn away over months or years. It's long been said that if a service is free to use, that's because our data is being used to generate revenue instead. The ways it has been used – mainly for online advertising – seemed rather benign, and that led us to stumble into the so-called privacy paradox: we dislike our privacy being compromised, but we'll sacrifice some of it in return for convenience. What we rarely consider is what might happen when the technology becomes more powerful.

Today, facial recognition systems smooth our passage through airports. Billboards can show us advertisements targeted to our rough demographic. Tomorrow, Clearview may be paired with augmented reality glasses, giving the wearer instant ID capability. Its database is growing, as it accumulates images uploaded by law enforcement agencies. There's the ever-present risk of misidentification, and a likelihood that such databases encourage discrimination. And there's the inevitability of the system being misused, or – as some Clearview investors predict – that it will be sold to the general public. We take our freedom to be anonymous in public spaces for granted. It would seem that Clearview has put a price on that freedom.