The Londoners using face paint to trick facial recognition
It’s a rainy night in Deptford, a fast-gentrifying area of south-east London. Sheltering from the drizzle in the lobby of a theatre, I find a group of young people daubing blue, black and red paint on their faces in strange, geometric shapes.
“Get some paint on, and we’ll be ready to leave at 7,” someone says, handing me a mirror, a brush and a sticky black cake of paint.
This is London’s Dazzle Club. They meet once a month to walk through the city for an hour, wearing face paint to defy dozens of public and private security cameras.
Founder Emily Roderick, 24, paints an attendee’s face with blue and black oblongs. “I’m painting his face in a way that would stop facial recognition algorithms from recognizing there’s a face there,” she tells me. “Your face is very symmetrical; it has specific areas of light and dark — and I’m trying to use the paint to reverse that.”
London is the second-most monitored city in the world after Beijing, with more than 600,000 surveillance cameras in operation, and the average Londoner filmed around 300 times a day.
In 2016, London’s Metropolitan police began trialing facial recognition technology to monitor and apprehend people outside Tube stations, at the annual Notting Hill Carnival and a Remembrance Sunday parade in commemoration of World War I.
According to a review of the Carnival trial by privacy campaigners Big Brother Watch, the technology wrongly flagged innocent people 98% of the time.
On Friday, the Metropolitan Police announced they were rolling out the controversial technology across the city. “I believe that we have a duty to use new technologies to keep people safe in London,” the Metropolitan Police’s Assistant Commissioner Nick Ephgrave said in a statement. “Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people’s privacy and human rights,” he added. The Met said the technology will be on the look-out for a “bespoke watch list” of wanted individuals.
“No other European country has a face surveillance epidemic like the UK, aligning us with the likes of China rather than our democratic counterparts,” said Silkie Carlo, director of Big Brother Watch.
“There is an element of tricking the AI,” says Roderick. The group is using the ‘CV dazzle’ method, created by the Brooklyn artist Adam Harvey. The technique dupes facial recognition cameras by blotting out the nose, deforming the eyebrows and warping the eyeline.
Roderick adds a blue dot to her subject’s face. “These systems are still quite stupid. It’s mad that you can just paint your face in this way and suddenly the technology can’t function anymore,” she says.
I wasn’t planning to join in, but suddenly I’m swishing paint onto my own face in what I’d call an “expressionist” style, going for black, orange and red in garish combination. The result is clownish; not exactly the ethereal look the rest of the group have managed to achieve.
As we swipe paint across our features, we test the results using our smartphones. We use face detection filters, like Snapchat’s, to see if the camera can still recognize us as a human. It’s a satisfying moment when it can’t.
Roderick, alongside her co-founders Georgina Rowlands, 23, Anna Hart, 51, and Evie Price, 22, began practicing this method three years ago at London’s Central Saint Martins art school. “We just sat in the studio trying to paint our faces and make it function,” she said. “It was quite fun just looking at Snapchat and trying to get it to stop working.”
Roderick and Rowlands founded the Dazzle Club so that other people could learn, too. The designs are simple, clean, and decidedly strange. “I look like a tube map!” says one of the attendees excitedly as he examines himself.
“We’re trying to find a way to do it that’s not so intimidating and garish — I wouldn’t want to go to work like this,” Roderick says.
Roderick and Rowlands are now so experienced at this that they can sometimes trick the cameras simply with a few dots and a swipe of paint.
In September, it emerged that King’s Cross Central, a privately owned development, had been employing facial recognition technology near King’s Cross station. After a public outcry over the revelations, the development company released a statement saying the technology was being used to aid the police to “prevent and detect crime in the neighborhood and ultimately to help ensure public safety.”
“Tens of millions of people will now have been scanned by facial recognition cameras in this country, yet very few of us even know about it,” said Carlo of Big Brother Watch.
It’s a thought preoccupying me as we gather outside the theatre, wearing our new, anonymous face paint. We walk in silence through the dark London streets, past council estates and new developments, train stations and docks, all watched over by cameras.
We’re an eerie sight, walking in a long line. One local follows us for a while. “What’s going on?” he asks, falling in line with the procession.
Cameras peer at us, but there’s a feeling of defiance — almost mischief — knowing we have perhaps evaded detection.
This article has been updated to include the Metropolitan Police’s facial recognition rollout announcement.