Drop your adversarial hats, eyeglasses, and tee shirts at the second-hand store. The latest fashion statement is adversarial makeup.
What’s new: Researchers at Ben-Gurion University and NEC developed a system for applying natural-looking makeup that makes people unrecognizable to face recognition models.
How it works: Working with 20 volunteers, the researchers used FaceNet, which learns a mapping from face images to a compact Euclidean space, to produce heat maps that showed which face regions were most important for identification.
- They used the consumer-grade virtual makeover app YouCam Makeup to adapt the heat maps into digital makeup patterns overlaid on each volunteer’s image.
- They fed iterations of these digitally done-up face shots to FaceNet until the subject was unrecognizable.
- Then a makeup artist physically applied the patterns to actual faces in neutral tones.
- The volunteers walked down a hallway, first without and then with makeup, while being filmed by a pair of cameras that streamed their output to the ArcFace face recognizer.
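The iteration described above — find the face regions an embedding model is most sensitive to, then perturb them until the match score falls below the verification threshold — can be sketched with a toy stand-in model. This is a minimal illustration, not the authors' method: the random linear "embedder," the 0.5 cosine cutoff, and the gradient step size are all assumptions, with a fixed random projection standing in for FaceNet.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for FaceNet (illustrative assumption): a fixed random linear
# map from a 64x64 grayscale face to a unit-norm 128-d embedding.
W = rng.standard_normal((128, 64 * 64))

def embed(face):
    v = W @ face.ravel()
    return v / np.linalg.norm(v)

enrolled = rng.random((64, 64))                    # reference photo on file
probe = enrolled + rng.normal(0, 0.01, (64, 64))   # fresh capture, same person

ref = embed(enrolled)
THRESHOLD = 0.5   # assumed verification cutoff on cosine similarity

# "Heat map": per-pixel sensitivity of the match score. For this linear toy
# model it is simply |W^T ref|; the real system derived it from FaceNet.
heatmap = np.abs(W.T @ ref).reshape(64, 64)

# Iterate a small perturbation ("makeup") along the most sensitive direction
# until the probe no longer matches the enrolled embedding.
grad = (W.T @ ref).reshape(64, 64)
makeup = np.zeros((64, 64))
for _ in range(1000):
    if float(embed(probe + makeup) @ ref) < THRESHOLD:
        break
    makeup -= 0.001 * grad

score = float(embed(probe + makeup) @ ref)
print(f"match score with makeup: {score:.3f}")  # below the 0.5 cutoff
```

The loop mirrors the paper's inner feedback cycle in spirit only: each pass nudges the overlay where the embedding is most sensitive and re-checks the match score, stopping once the subject would be rejected.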
Results: ArcFace recognized participants wearing no makeup in 47.6 percent of video frames and those wearing random makeup patterns in 33.7 percent of frames. It recognized those wearing adversarial makeup in just 1.2 percent of frames.
Why it matters: This new technique requires only ordinary, unobtrusive makeup, doing away with accessories that might raise security officers’ suspicions. It offers perhaps the easiest way yet for ordinary people to thwart face recognition — at least until the algorithms catch on.
We’re thinking: You can’t make up this stuff. Or can you?