HEALTH

Why You Hated Clippy, That Annoying Microsoft Paperclip

Nobody likes anthropomorphized digital assistants. Death to Clippy!

Illustration: R. A. Di Ieso
May 21, 2016 at 10:25 AM ET

Remember Clippy? The know-it-all animated paperclip, officially named “Clippit,” haunted unfinished Microsoft Office documents across the country and quickly became the most despised computer-generated helper around. Although Clippy was formally discontinued in 2007, disparaging memes live on across the internet, often mocking the paperclip’s infuriating catchphrase: “Would you like some help with that?” (We never did.)

Now, a new study in the Journal of Consumer Research attempts to figure out why we harbor such disdain for Clippy and other human-like digital assistants. The researchers conclude that Clippy Hate may be a symptom of a much larger problem, especially for web developers intent on providing helpful hints with cutesy characters. The takeaway? We hate them. All of them.

“Across six experiments, including one pilot study and five main studies, the present research showed that anthropomorphism can decrease game enjoyment,” the authors write. Why? Because when digital assistants seem too human, we start hating them just like we hate human strangers who try to help us! In other words: “we propose that computerized helpers with humanlike features can undermine individuals’ autonomy when they provide assistance.” 


Digital assistants have come a long way since Clippy (and the unsung hero of the ’90s desktop, Bonzi Buddy). Siri and Cortana are effectively modern, mobile incarnations of the original desktop assistants; Google just announced Allo, a messaging app built around its own digital helper; and most video games guide users through quests with hints delivered by human-like “helpers,” with their predictable monotones and friendly faces. Except…that face isn’t always perceived as friendly.

In fact, according to this new study, we believe that Clippy is judging us even as he begrudgingly doles out hints. And this isn’t the first study to suggest that we hate human-like digital assistants precisely because we assign nefarious, human motivations to their software. In 2015, The New Yorker reported that the software team behind Clippy once held a focus group to figure out why people hated the paperclip so much. They found that “most of the women thought the characters were too male and that they were leering at them.”

This new research suggests that the same problem exists for virtually all anthropomorphic digital assistants. For the study, scientists asked several hundred students to play a series of simple computer-based puzzle games. When participants got stuck, they were encouraged to click on a help icon that was randomized to show either a smiling, helpful computer character or a faceless interface that delivered the hint. Researchers found that participants enjoyed the game less when the advice was delivered by the happy, anthropomorphic computer than when it came through the faceless interface.

Interestingly, players also said they felt less in control of the gameplay when they received advice from the human-like interface, even though the other group received the exact same advice from the faceless icon and seemed to feel entirely in control. “After receiving help from an anthropomorphized (vs. non-anthropomorphized) helper, participants were less likely to feel that the game outcome was determined by their own actions,” the authors write.

Broadly, the findings suggest that anthropomorphic digital assistants like Clippy make playing computer games (and, of course, writing letters) less enjoyable, and challenge our autonomy. But why?

“We offer a novel mechanism,” the authors write. “That the presence of an anthropomorphized helper can undermine individuals’ perceived autonomy during a computer game…[because] individuals treat non-human entities as human beings when they are anthropomorphized.”

That is, when we see a face on a digital character, we subconsciously assign it basic human qualities. And, as with humans, we aren’t always grateful for the help we receive. Prior studies have shown that “individuals often avoid seeking help from others due to the associated psychological costs of seeking help, such as perceived dependence on others,” the authors write. “Dependence on others implies inferiority and a lack of autonomy.”

So Clippy—perhaps due to his leering eyes—makes us feel inferior, and simply renders gameplay and office work a lot less pleasant than it could be if he’d just lose the judgmental face. And this may explain why Siri has seen far more success than Clippy ever dreamed of. As the authors put it, taking advice from a faceless digital assistant like Siri or Cortana makes us feel less like we’re dependent on another human being and more like we’re “using a tool.”

Nonetheless, if Siri ever tries to help us write an essay—we’ll pass.