The new remote administrative assistant is a little too perky, hardworking, and efficient. Is it because he’s a bot?
The fear: Virtual employees are infiltrating the distributed office. Outfitted with programmed personalities and generated smiles, they’re increasingly difficult to tell from flesh and blood. Managers, pleased by the productivity boost, will stop caring which is which, leaving you surrounded by colleagues who cheerfully work 24/7, never make a mistake, and decline invitations to meet up for happy hour.
Horror stories: What started in the middle of the last decade with programs like Clara — who schedules meetings via emails so cordial they might fool the uninitiated — has evolved into human-like agents dressed up with names, faces, and fake resumes.
- WorkFusion offers a line of virtual teammates in six specialized roles, including customer service coordinator, insurance underwriter, and transaction screening analyst. Each digital worker has a persona portrayed by a human actor.
- Synthesia uses generative adversarial networks to synthesize videos featuring photorealistic talking heads that read scripts aloud in 34 languages. Customers use the service to generate training and sales videos without hiring human actors.
- Marketing companies LIA (short for LinkedIn Lead Generation Assistant) and Renova Digital offer avatars that enable real salespeople to close multiple deals at once. Stanford researchers discovered more than 1,000 LinkedIn profiles, many of them in marketing, that turned out to be fake personas whose profile photos were generated by generative adversarial networks.
Fraudulent friends: White-collar bots pose threats more serious than a proliferation of workplaces with addresses in the uncanny valley. In 2020, fraudsters used a generative audio model to clone the voice of a company director and convinced a Hong Kong bank to fork over some $35 million. Con artists running a similar ploy stole $243,000 from a UK energy firm in 2019.
Facing the fear: Ceaselessly cheerful, perpetually productive automatons might leave their human colleagues feeling demoralized. If you’re going to anthropomorphize your algorithms, at least program them to be late for a meeting once in a while.