Raconteur

Fraud & Privacy 2020

Issue link: https://raconteur.uberflip.com/i/1271462


SURVEILLANCE

Meet the people fooling facial recognition

Activists are finding ways to get around facial recognition software as the debate around ethical surveillance rages on

Sam Haddad

Image captions: The Dazzle Club uses a facepaint technique developed by artist and activist Adam Harvey to trick facial recognition software (Cocoa Laney). Right: The IRpair glasses by Reflectacles are designed to block 3D infrared facial scanning (Reflectacles).

"It's about making that invisible tech visible... especially as the Met Police are starting to deploy these cameras in the city"

PUBLIC ATTITUDES TOWARDS FACIAL RECOGNITION
Share of US adults who find the use of facial recognition acceptable/not acceptable in the following situations

                                                             Acceptable   Not acceptable   Not sure
Law enforcement assessing security threats in public spaces  59%          15%              13%
Companies automatically tracking employee attendance         30%          41%              15%
Advertisers seeing how people respond to public ad displays  15%          54%              16%

Source: Pew Research Center 2020

A decade ago, it was possible to attend a protest in relative anonymity. Unless a person was on a police database or famous enough to be identified in a photograph doing something dramatic, there would be little to link them to the event.

That's no longer the case. Thanks to a proliferation of street cameras and rapid advances in facial recognition technology, private companies and the police have amassed face data, or faceprints, of millions of people worldwide. According to Big Brother Watch, a UK-based civil liberties campaign group, this facial biometric data is as sensitive as a fingerprint and has been largely harvested without public consent or knowledge.

In response, designers and privacy activists have sought to make clothing and accessories that can thwart facial recognition technology. According to Garfield Benjamin, a post-doctoral researcher at Solent University who specialises in online privacy, they rely on two main techniques. "Either they disrupt the shape of the face so that recognition software can't recognise a face is there or can't identify the specific face," he says. "Or they confuse the algorithm with different patterns that make it seem like there are either hundreds or no faces present."

At the University of Maryland, Tom Goldstein, associate professor in the Department of Computer Science, is working on the second technique. He's created a so-called invisibility cloak, though in reality it looks more like an incredibly garish hoodie. The cloak, a research tool which is also sold online, works by fooling facial recognition software into thinking there isn't a face above it.

In 2015, when Scott Urban, founder of Chicago-based privacy eyewear brand Reflectacles, saw facial recognition becoming "more popular and intrusive", he set out to make glasses that would "allow the wearer to opt out of these systems". He created a model designed to block 3D infrared facial scanning, used by many security cameras, by turning the lenses black, while another model reflects light to make it harder to identify a user's face data from a phone picture.

Other anti-surveillance designs include a wearable face projector, which superimposes another face over that of the person wearing the device; a transparent mask with a series of curves that attempts to block facial recognition software while still showing the user's facial expressions; balaclavas with a magnified pixel design; and scarves covered in a mash-up of faces.

Benjamin says the problem with all these techniques is that the companies making the facial recognition technology are always trying to improve their systems and overcome the tricks, often boasting in their promotional literature about the anti-spoofing mechanisms they are working on. "They want to show they're thwarting the 'rebels' or 'hackers' and this has led to further developments in the technologies," he says.

This was the case with CV Dazzle, which uses face paint to trick or dazzle the computer vision by disrupting the expected contours, symmetry and dimensions of a face. The technique was invented by the American artist and activist Adam Harvey in the early 2010s and it proved effective at confusing the software that was emerging at the time, though its creator has noted it doesn't always fool present-day tech. Yet, it does still disrupt the facial
