Examples of high-resolution versions of low-resolution images.
Bias

Image Resolution in Black and White

A new model designed to sharpen images tends to render some dark-skinned faces as white, igniting fresh furor over bias in machine learning. Photo Upsampling via Latent Space Exploration (Pulse) generates high-resolution versions of low-resolution images.
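
As a rough illustration of the latent-space-exploration idea behind upsamplers like Pulse (not the paper's actual implementation, which searches a pretrained StyleGAN under additional constraints), the sketch below optimizes a latent vector so that a toy generator's output, once downscaled, matches a low-resolution input. The generator, the downscale helper, and every setting here are hypothetical stand-ins.

```python
# Minimal sketch of latent-space exploration for upsampling, assuming a toy,
# untrained "generator" in place of a pretrained face model such as StyleGAN.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical stand-in generator: maps a 64-d latent vector to a 1x32x32 image.
generator = torch.nn.Sequential(
    torch.nn.Linear(64, 256), torch.nn.ReLU(),
    torch.nn.Linear(256, 32 * 32), torch.nn.Sigmoid(),
)
for p in generator.parameters():
    p.requires_grad_(False)  # freeze the generator; only the latent code moves

def downscale(img):
    # 32x32 -> 8x8 average pooling, a stand-in for the assumed degradation model
    return F.avg_pool2d(img.view(1, 1, 32, 32), kernel_size=4)

low_res = torch.rand(1, 1, 8, 8)            # placeholder low-resolution input
z = torch.randn(1, 64, requires_grad=True)  # latent code being searched
optimizer = torch.optim.Adam([z], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    high_res = generator(z)                          # candidate high-res image
    loss = F.mse_loss(downscale(high_res), low_res)  # must downscale to the input
    loss.backward()
    optimizer.step()

print("final reconstruction loss:", float(loss))
```

Because the search is guided only by how well the candidate downscales to the input, whatever biases the generator absorbed from its training data flow straight into the "restored" face, which helps explain the failure mode at issue here.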
Face recognition system in a supermarket
Bias

Tech Giants Face Off With Police

Three of the biggest AI vendors pledged to stop providing face recognition services to police — but other companies continue to serve the law-enforcement market.
Partnership on AI, Amazon, Baidu, Google, Facebook, IBM, Microsoft logos
Bias

Baidu Leaves Partnership on AI

Baidu backed out of a U.S.-led effort to promote ethics in AI, leaving the project without a Chinese presence. The Beijing-based search giant withdrew from the Partnership on AI, a consortium that promotes cooperation on issues like digital privacy and algorithmic bias.
Illustration of a doctor and a nurse
Bias

Gender Bender

AI learns human biases: In word vector space, “man is to computer programmer as woman is to homemaker,” as one paper put it. New research helps language models unlearn such prejudices.
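
As a rough sketch of what that analogy means in practice, the snippet below performs the standard vector arithmetic (programmer - man + woman, nearest neighbor by cosine similarity) on a handful of made-up toy vectors, then applies a simple projection-based neutralization in the spirit of debiasing work such as Bolukbasi et al. The vectors and numbers are purely illustrative, not real embeddings.

```python
# Word-vector analogy arithmetic and a simple "neutralize" step on toy vectors.
import numpy as np

vectors = {  # hypothetical 3-d "embeddings", for illustration only
    "man":        np.array([ 1.0, 0.2, 0.1]),
    "woman":      np.array([-1.0, 0.2, 0.1]),
    "programmer": np.array([ 0.8, 0.9, 0.3]),
    "homemaker":  np.array([-0.9, 0.1, 0.8]),
    "doctor":     np.array([ 0.3, 0.8, 0.7]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Analogy arithmetic: programmer - man + woman, then nearest neighbor.
query = vectors["programmer"] - vectors["man"] + vectors["woman"]
best = max((w for w in vectors if w not in {"programmer", "man", "woman"}),
           key=lambda w: cosine(query, vectors[w]))
print("programmer - man + woman ->", best)

# Neutralize: remove the component along a gender direction so that
# occupation words no longer lean toward either end of it.
gender_dir = vectors["man"] - vectors["woman"]
gender_dir = gender_dir / np.linalg.norm(gender_dir)

def neutralize(v, direction):
    return v - (v @ direction) * direction

debiased = neutralize(vectors["programmer"], gender_dir)
print("gender component before:", vectors["programmer"] @ gender_dir)
print("gender component after: ", debiased @ gender_dir)
```

Full debiasing pipelines estimate the bias direction from many definitional word pairs and equalize pairs like he/she afterward; projecting out a single direction, as above, is only the first step.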
Data related to methods for curating news feeds
Bias

Algorithms Choose the News

Machines took another step toward doing the work of journalists. Microsoft laid off dozens of human editors who select articles for the MSN news service and app. Going forward, AI will do the job.
Angry emoji over dozens of Facebook like buttons
Bias

Facebook Likes Extreme Content

Facebook’s leadership has thwarted changes in its algorithms aimed at making the site less polarizing, according to the Wall Street Journal. The social network’s own researchers determined that its AI software promotes divisive content.
Road sign with the word "trust"
Bias

Toward AI We Can Count On

A consortium of top AI experts proposed concrete steps to help machine learning engineers secure the public’s trust. Dozens of researchers and technologists recommended actions to counter public skepticism toward artificial intelligence, fueled by issues like data privacy.
Women in AI in academia and industry chart
Bias

AI’s Gender Imbalance

Women continue to be severely underrepresented in AI. A meta-analysis of research conducted by Synced Review for Women’s History Month found that female participation in various aspects of AI typically hovers between 10 and 20 percent.
Text "You only live once. #YOLO" written over an orange background
Bias

Code No Evil

A prominent AI researcher has turned his back on computer vision over ethical issues. The co-creator of the popular object-recognition network You Only Look Once (YOLO) said he no longer works on computer vision because the technology has “almost no upside and enormous downside risk.”
Heart shape made with two hands
Bias

That Swipe-Right Look

In an online dating profile, the photo that highlights your physical beauty may not be the one that makes you look smart or honest — also important traits in a significant other. A new neural network helps pick the most appealing shots.
Person on an online job interview
Bias

Limits on AI Job Interviews

As employers turn to AI to evaluate job applicants, a U.S. state imposed limits on how such tools can be used. The Illinois legislature passed the Artificial Intelligence Video Interview Act, which gives candidates a measure of control over how hiring managers collect and store video interviews.
Excerpt of a video showing how HireVue works
Bias

HR’s Robot Helper

For some college graduates, landing a first job means making a good impression on a chatbot. University guidance counselors around the U.S. are preparing students for interviews with AI-powered screening algorithms.
ImageNet face recognition labels on a picture
Bias

ImageNet Gets a Makeover

Computer scientists are struggling to purge bias from one of AI's most important datasets. ImageNet's 14 million photos are a go-to collection for training computer-vision systems, yet their descriptive labels have been rife with derogatory stereotypes of race, gender, and sexuality.
Zhi-Hua Zhou
Bias

Zhi-Hua Zhou: Fresh Methods, Clear Guidelines

I have three hopes for 2020.
Oren Etzioni
Bias

Oren Etzioni: Tools For Equality

In 2020, I hope the AI community will grapple with issues of fairness in ways that tangibly and directly benefit disadvantaged populations.
