Generate Articles, Publish Errors
CNET pauses its practice of writing news articles with AI.

Illustration: an open water tap pouring out sentences instead of water.

A prominent tech-news website generated controversy (and mistakes) by publishing articles written by AI.

What’s new: CNET suspended its practice of publishing articles produced by a text-generation model following news reports that exposed the articles’ authorship, The Verge reported.

What happened: Beginning in November 2022 or earlier, CNET’s editors used an unnamed, proprietary model built by its parent company Red Ventures to produce articles on personal finance. The editors, who either published the model’s output in full or wove excerpts into material written by humans, were responsible for ensuring the results were factual.

Nonetheless, they published numerous errors and instances of possible plagiarism.

  • Insiders said that CNET had generated the articles specifically to attract search-engine traffic and earn revenue from affiliate links.
  • CNET initially published generated articles under the byline “CNET Money Staff.” The linked author page said, “This article was generated using automation technology and thoroughly edited and fact-checked by an editor on our editorial staff.” After Futurism, a competing news outlet, revealed CNET’s use of AI, the site updated the bylines earlier this month to clarify authorship and name the responsible editor.
  • Futurism determined that several generated articles contained factual errors. For instance, an article that explained interest payments repeatedly misstated how much interest an example loan would accrue. Moreover, many included passages or headlines that were nearly identical to those in articles previously published by other sites.
  • CNET published 78 generated articles before halting the program. Red Ventures said it had also used the model to produce articles for other sites it owns, including Bankrate and CreditCards.com.

Behind the news: CNET isn’t the first newsroom to adopt text generation for routine coverage. The Wall Street Journal uses natural language generation from Narrativa to publish rote financial news. Associated Press uses Automated Insights’ Wordsmith to write financial and sports stories without human oversight.

Why it matters: Text generation can automate rote reporting and free writers and editors to focus on more nuanced or creative assignments. However, these tools are well known to produce falsehoods, biases, and other problems. Publications that distribute generated content without sufficient editorial oversight risk degrading their reputations and polluting the infosphere.

We’re thinking: Programmers who use AI coding tools and drivers behind the wheels of self-driving cars often overestimate the capabilities of their respective systems. Human editors who use automated writing tools apparently suffer from the same syndrome.
