Marketers are using computer vision to sort customers by skin color and other perceived racial characteristics.

What’s new: A number of companies are pitching race classification as a way for businesses to understand the buying habits of different groups, according to the Wall Street Journal. This capability is distinct from face recognition, which seeks to identify individuals. Similar systems classify physical or demographic characteristics such as age, gender, and even attractiveness.
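
To make the distinction concrete, here is a minimal sketch of how the two tasks differ: face recognition compares an embedding against a gallery of known individuals to answer "who is this?", while attribute classification maps the same embedding to demographic labels to answer "what group?". The backbone, label set, and threshold below are hypothetical placeholders, not any vendor's actual pipeline.

```python
# Minimal sketch contrasting face recognition with attribute classification.
# FaceBackbone and AttributeHead are hypothetical placeholder models.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FaceBackbone(nn.Module):
    """Toy CNN mapping a 112x112 face crop to a unit-length 128-d embedding."""
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return F.normalize(self.proj(h), dim=1)

def identify(probe: torch.Tensor, gallery: torch.Tensor, threshold: float = 0.6):
    """Face recognition: return the index of the closest enrolled identity,
    or None if no gallery embedding is similar enough."""
    sims = probe @ gallery.T              # cosine similarity (embeddings are normalized)
    score, idx = sims.max(dim=1)
    return idx.item() if score.item() >= threshold else None

class AttributeHead(nn.Module):
    """Attribute classification: a linear head over the same embedding that
    predicts a demographic category; no identity is involved."""
    def __init__(self, embed_dim: int = 128, num_classes: int = 5):
        super().__init__()
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, emb: torch.Tensor) -> torch.Tensor:
        return self.fc(emb).softmax(dim=1)  # probability per class

if __name__ == "__main__":
    backbone = FaceBackbone().eval()
    face = torch.randn(1, 3, 112, 112)                   # stand-in for a detected face crop
    emb = backbone(face)
    gallery = F.normalize(torch.randn(10, 128), dim=1)   # 10 enrolled identities
    print("identity match:", identify(emb, gallery))     # recognition: who is this?
    print("attribute probs:", AttributeHead()(emb))      # classification: what group?
```

The point of the sketch is that both tasks can share the same underlying machinery; only the head and the question being asked differ, which is part of why the distinction matters in practice.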

What they found: The report identified more than a dozen companies marketing race classification for commercial use. Among them:

  • Face++, one of the world’s biggest face detection companies, offers race classification for tracking consumer behavior and targeting ads.
  • Spectrico said that billboard companies use its software along with gaze-tracking models to learn which demographic groups look at their ads. Dating apps also use the technology to check that users are labeled accurately by race.
  • Cognitec Systems offers race, age, and gender classification for retailers hoping to collect data about their visitors. None of its customers, which include law enforcement, has used its race classification, the company said.
  • British AI company Facewatch installs face recognition cameras inside retail stores to spot suspected thieves on a watch list. It recently stopped tracking the race, gender, and age of faces deemed suspicious.

Yes, but: Experts worry that this capability could be used to discriminate against particular groups. For instance, a retailer might charge certain people higher prices. More troubling, there are signs that such systems are being used by oppressive regimes to target specific ethnicities.

Why it matters: Machine learning can be a valuable tool for identifying and analyzing demographic trends. But these tools risk invasions of privacy, discrimination both accidental and deliberate, and misuse by authorities.

We’re thinking: We can imagine a system that effectively helps detect and avoid racial bias in, say, law enforcement, yielding a net social benefit. Still, the practice of sorting people by their perceived race has a largely odious and sometimes murderous history. Machine learning engineers working in this field should tread very carefully.
