Spotlight on Unreproducible Results
Papers Without Code collects unreproducible AI research.

A new website calls out AI research that may not lend itself to being reproduced.

What’s new: Papers Without Code maintains a directory of AI systems that researchers tried but failed to reproduce. The site, whose name is a play on the indispensable Papers With Code, aims to save researchers time wasted trying to replicate results published with insufficient technical detail.

How it works: Users can submit a link to a paper, a link to their attempt to reproduce it, and an explanation of why their effort failed.

  • After reviewing the links, the site’s administrators contact the original authors and request data, code, and pointers necessary to reproduce their work. If the authors don’t reply or provide insufficient information, the administrators add the paper to a public list.
  • To date, the website has received more than 10 submissions, six of which have been posted. Two authors have uploaded their code. Once a paper passes muster, its author is encouraged to post it to Papers With Code, which documents 40,000 replicated computer science studies.
  • The researcher behind Papers Without Code, who goes by the username ContributionSecure14 on Reddit, started the website after wasting a week trying to replicate a machine learning study.
    They advise authors who can’t release their code, data, or infrastructure for proprietary reasons to work directly with others trying to replicate their efforts. “There’s no point in publishing the paper in the public domain if others cannot build off it,” they told TechTalks.

Behind the news: Google engineer Pete Warden proclaimed a “machine learning reproducibility crisis” in 2018. Since then, the issue has emerged as a widespread concern.

  • Last year, 31 researchers criticized the lack of technical detail in a Google paper describing a cancer-detection system that purportedly outperformed human doctors.
  • One of those critics, Joelle Pineau of McGill University and Facebook, worked with NeurIPS to ensure that papers submitted to the conference come with working code and data. She also published a Machine Learning Reproducibility Checklist.
  • ReScience C is a peer-reviewed journal that publishes replications of computer science papers.

Why it matters: Reproducibility is an essential part of science, and AI is one of many fields facing a so-called replication crisis brought on by growing numbers of papers that report unreliable results.

We’re thinking: While we applaud the spirit of this effort, without a transparent review process and a public list of reviewers, it could be used to demean researchers unfairly. We urge other research venues and institutions to take up the cause.
