The latest challenge to facial recognition technology is “anti-surveillance clothing,” aimed at confusing facial recognition algorithms as a way of preserving privacy. The clothing, covered with ghostly face-like designs intended specifically to trigger face-detection algorithms, is a backlash against the looming possibility of facial recognition being used in retail environments and for other commercial purposes.

Increasingly common facial recognition technology

It’s another possible obstacle to the use of a technology that has faced many challenges already. Successful implementations of facial recognition technology have been elusive over the years, especially when it comes to identifying “faces in a crowd.” Variables such as partial obstruction, movement and lighting have been a challenge to the algorithms tasked with identifying faces. The most successful face recognition systems have been those that control the positioning of each face in order to maximise recognition. People walking single-file through a door, for example, present faces one by one, making them easier to recognise.

Despite the challenges, face recognition technology is becoming more common. The ability to recognise faces (and identities) is central to new marketing technologies such as automated customer service or signage that targets an individual’s buying habits. For example, face recognition is an element in Amazon Go’s recently announced automated convenience store concept.

Infringement on privacy?

Clothing or other textiles with face-like patterns are being positioned as a way for consumers to fight back against any perceived invasions of privacy. Berlin-based artist and technologist Adam Harvey is designing the clothing to confuse face recognition software algorithms. In effect, the clothing “overloads an algorithm with what it wants, oversaturating an area with faces to divert the gaze of the computer vision algorithm,” Harvey told a recent hacking conference in Hamburg. One pattern would reportedly give a computer more than 1,200 possible facial detections.
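The flooding effect Harvey describes can be sketched in a few lines of code. The toy detector below is hypothetical (real systems such as OpenCV’s cascade classifiers slide a window over the image and score each region); here a scene is simply a grid of cells, each carrying an assumed “face-likeness” score, so the point is only to show how a garment saturated with face-like patterns multiplies the detector’s candidate list.

```python
# A minimal sketch of detection flooding, under a toy model: a scene is a
# grid of cells, each with a hypothetical "face-likeness" score in [0, 1].
# A real detector scans image windows; the counting logic is analogous.

def detect_faces(scene, threshold=0.5):
    """Return coordinates of every cell whose score crosses the threshold."""
    return [(r, c)
            for r, row in enumerate(scene)
            for c, score in enumerate(row)
            if score >= threshold]

# An ordinary scene: one real face among low-scoring background cells.
ordinary = [
    [0.1, 0.2, 0.1],
    [0.1, 0.9, 0.2],   # the single real face
    [0.2, 0.1, 0.1],
]

# The same scene wearing an adversarial garment: every cell mimics a face,
# so the real face becomes one candidate among many.
patterned = [
    [0.7, 0.8, 0.7],
    [0.8, 0.9, 0.8],
    [0.7, 0.8, 0.7],
]

print(len(detect_faces(ordinary)))   # 1 candidate
print(len(detect_faces(patterned)))  # 9 candidates: the detector is flooded
```

On this toy model the patterned scene yields nine candidates instead of one; scaled up to a full garment, the same effect is what reportedly produces more than 1,200 possible detections from a single pattern.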

Concerns about face recognition algorithms extend beyond the ability to identify a person. Algorithms can also identify traits such as “calm” or “kind,” or demographics such as age and gender. Using such small bits of data to drive marketing efforts is problematic for many. Harvey points to the capabilities as another way that facial recognition is changing our expectations of privacy – and even more reason to fight back.

Harvey’s other anti-surveillance work

It’s not the first time Harvey has been involved in trying to foil facial recognition systems; he previously proposed using makeup and hairstyling to prevent machines from detecting a face.

There are also some “anti-drone stealth wear” fashions on the market, made of silver-plated fabric that reflects thermal radiation to enable the wearer to avert overhead thermal surveillance.

Privacy is an ongoing concern in the physical security and video surveillance market, and public opinion is evolving. In the online world, many of us are willing to give up a level of privacy if we perceive a benefit tradeoff. But what about the physical world?

Face detection technology is a common feature of today’s video surveillance cameras, and facial recognition algorithms are becoming more sophisticated.

Norms of privacy in the physical world are still unsettled, and it’s unclear what benefits (if any) consumers gain by allowing machines to invade their privacy. Until those benefits become apparent, it’s not surprising there would be a backlash.

However, anti-surveillance clothing would have to really catch on to make any real difference, wouldn’t it?