The Hammer Drops: Government Fines Company for Misusing Child Data

The U.S. government punished an app vendor for building an algorithm based on ill-gotten data.

What’s new: The Federal Trade Commission (FTC), the U.S. agency in charge of consumer protection, ruled that an app developed by WW International (formerly Weight Watchers) violated data-collection laws. In a settlement, the company agreed to pay a fine, destroy data, and deactivate the app, the tech-news website Protocol reported.

How it works: The FTC is empowered to take action against companies that engage in deceptive business practices. Combining that authority with laws that protect specific classes of people, in this case children, the agency moved to combat misuse of data.

  • WW International launched Kurbo in 2019 in a bid to help children between ages 8 and 17 develop healthy eating habits.
  • The app collected personal information such as age, gender, height, weight, and lifestyle choices. Upon registering, users were asked to identify themselves either as adults or as children signing up with an adult’s permission. However, the app didn’t verify this input (a minimal sketch of such a check appears after this list).
  • The lack of verification violated the Children’s Online Privacy Protection Act (COPPA), a 1998 law that restricts collecting data from children younger than 13 without permission from a parent or guardian.
  • The app had already drawn criticism from parents and healthcare professionals who decried its potential to encourage eating disorders.
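For illustration, here is a minimal sketch in Python of the kind of gate Kurbo reportedly lacked: a check that blocks collection of personal data from self-reported under-13 users until verifiable parental consent is on record, as COPPA requires. The names and fields below are hypothetical and are not Kurbo’s actual code.

    from dataclasses import dataclass

    COPPA_MIN_AGE = 13  # COPPA covers children younger than 13

    @dataclass
    class Registration:
        reported_age: int                        # self-reported at sign-up
        parental_consent_verified: bool = False  # e.g., a signed consent form on file

    def may_collect_personal_data(reg: Registration) -> bool:
        """Allow data collection only when COPPA would permit it.

        Users 13 and older may consent for themselves; younger users
        need verifiable parental consent first. Kurbo reportedly
        accepted the self-reported answer with no verification step.
        """
        if reg.reported_age >= COPPA_MIN_AGE:
            return True
        return reg.parental_consent_verified

    # A self-reported 10-year-old is blocked until consent is verified.
    assert not may_collect_personal_data(Registration(reported_age=10))
    assert may_collect_personal_data(
        Registration(reported_age=10, parental_consent_verified=True))

In practice, a gate like this must be paired with a genuine verification step, such as a signed consent form or a small credit-card transaction; a self-declared checkbox alone does not qualify as verifiable parental consent.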

Behind the news: The FTC has punished companies for using improperly collected data twice before. In 2021, it forced the developer of photo-sharing app Everalbum to destroy models it developed using images uploaded by users who hadn’t consented to face recognition. Two years earlier, it demanded that Cambridge Analytica, a UK political consultancy, destroy data it had collected illegally from Facebook users.

Why it matters: The U.S. lacks comprehensive national privacy laws that protect consumer data, but that doesn’t mean it won’t act against companies that abuse personal data. The FTC can prosecute algorithmic abuse based on several interrelated laws, and lately it has done so with increasing frequency.

We’re thinking: If the public is to trust the AI community, it’s necessary to respect privacy and obtain permission for any data that goes into building a model. If the FTC’s willingness to prosecute developers of unruly algorithms provides further incentive, so be it.
