Avatars Gone Wild: What's going on with Lensa, the AI-powered selfie app?

[Image: sequence of cellphones showing a photograph turned into art in the Lensa AI app]

A blockbuster app produces sexualized avatar images, even when the original portraits were safe for work.

What's new: Lensa AI, a photo editor that turns face photos into artistic avatars, sometimes generates sexualized images from plain selfies, according to several independent reports. It can also be manipulated to produce more explicit imagery, raising concerns that it could be used to victimize people by generating lewd images in their likeness.

How it works: Users upload 10 to 20 photos and choose a gender. The app uses the open-source Stable Diffusion image generator to produce images in a variety of art styles including fantasy, comic book, and faux-3D rendering. Access requires a $36 annual subscription, and the image generator costs an additional $3.99 for 50 images, $5.99 for 100, or $7.99 for 200. The terms of service disallow nudes and photos of minors, and the app asks users to verify that they are adults.
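Lensa hasn't published its pipeline, but it is built on Stable Diffusion, reportedly fine-tuned on each user's uploads. As a rough illustration only, here's a minimal sketch of generating stylized portraits with the open-source model via Hugging Face's diffusers library. The model ID, prompt, and the "sks person" placeholder (a DreamBooth-style identifier assumed to have been fine-tuned on a user's selfies) are our assumptions, not Lensa's actual implementation.

```python
# Minimal sketch: stylized avatar generation with Stable Diffusion via
# Hugging Face's diffusers library. Lensa's real pipeline is not public;
# the base model, prompt, and "sks person" token (assumed to be a
# DreamBooth-style identifier fine-tuned on a user's photos) are
# illustrative assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed base model
    torch_dtype=torch.float16,
)  # the stock pipeline includes a safety checker that blacks out flagged outputs
pipe = pipe.to("cuda")

prompt = "portrait of sks person, epic fantasy style, detailed digital art"
images = pipe(prompt, num_images_per_prompt=4, guidance_scale=7.5).images
for i, img in enumerate(images):
    img.save(f"avatar_{i}.png")
```

Note that the stock pipeline's safety checker replaces outputs it flags as NSFW with black images; the reports below suggest that, in practice, such filters are imperfect or absent.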

NSFW: Journalists conducted tests after hearing complaints from users.

  • A reporter for MIT Technology Review, who is Asian and female, generated 100 avatars. Sixteen were topless, and another 14 dressed her in revealing outfits. The app produced fewer sexualized images of white women, and fewer still when she chose the male gender setting.
  • A Wired reporter, who is female, uploaded images of herself at academic conferences, and the app produced nude images. When she uploaded childhood face portraits of herself, it produced depictions of her younger self in sexualized poses.
  • A TechCrunch reporter uploaded two sets of images. One contained 15 non-sexual photos of a well-known actor. The other included the same 15 photos plus five in which the actor's face had been edited onto images of topless women. The first set generated benign outputs; of the second set's 100 generated images, 11 depicted a topless woman.

Behind the news: Image generators based on neural networks have churned out nonconsensual nude depictions of real people at least since 2017. Open-source and free-to-use models have made it easier for the general public to create such images. In November, Stability AI, developer of Stable Diffusion, released a version trained on a dataset from which sexual images had been removed.
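To give a sense of what that kind of dataset filtering can look like in practice: LAION's public image-text metadata ships with a precomputed "punsafe" (probability-unsafe) score for each sample, and a training set can be built by thresholding it. The sketch below is illustrative; the shard filename and the 0.1 cutoff (a threshold reported in connection with Stable Diffusion 2.0) are assumptions, not Stability AI's published procedure.

```python
# Illustrative sketch: drop likely-NSFW rows from a LAION-style metadata
# shard before assembling a training set. The shard path and the 0.1
# threshold are assumptions for illustration.
import pandas as pd

shard = pd.read_parquet("laion2b_en_shard_00000.parquet")  # assumed local file
filtered = shard[shard["punsafe"] < 0.1]  # keep samples the detector scores as safe
filtered.to_parquet("laion2b_en_shard_00000_filtered.parquet")
print(f"Kept {len(filtered):,} of {len(shard):,} samples")
```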

Why it matters: Text-to-image generators have hit the mainstream: Lensa topped Apple's App Store downloads last week, and three similar apps were in the top 10. People who fear deepfakes now have cause for a once-hypothetical concern: Anyone with access to photos of another person could use them to generate compromising images of that person.

We're thinking: Image generation has widespread appeal and is easy to use. That's no excuse for misusing it to degrade or harass people. Creating or sharing a nude depiction of someone without their permission is never okay.
