“Sympathy sockpuppets” are people who fake their identity when participating in online communities in order to gain care or sympathy from other community members. They cause real and lasting harm to individual members and the online community as a whole, according to new research co-authored by SC&I Assistant Professor of Library and Information Science Kaitlin L. Costello and Devon Greyson of the University of British Columbia in Vancouver, Canada.
Significantly, this research also shows that sympathy sockpuppets are particularly harmful to online communities that provide support for people with chronic illnesses or members of marginalized populations, such as LGBTQ+ youth, where members’ online relationships are often as important as, or even more important than, their offline relationships.
Their 2021 paper, “‘Emotional Strip-Mining’: Sympathy Sockpuppets in Online Communities,” was published in the journal New Media & Society.
The aim of their research, Costello said, was to explore and characterize the phenomenon of sympathy sockpuppetry and to provide guidance for moderators, administrators, and informal leaders of communities that encounter this form of online deception.
Their research adds valuable new insights, Costello said, because sympathy sockpuppetry has been understudied compared to other online behaviors that have a primary objective of material gain, such as catfishing or trolling.
In addition, Costello said, for the first time their study focuses “on the impact that deception has on its targets and online communities’ responses, rather than on the individual motivations and goals of deceivers.”
In their qualitative study, Costello said they “asked three research questions: (1) How are sympathy sockpuppets discovered? (2) How do communities respond? (3) What guidance do we have for communities and community moderators who discover sympathy sockpuppets in their group? To answer these questions, we interviewed 7 participants from 5 different online communities where a sympathy sockpuppet had been discovered. Their experiences occurred in all types of groups with a variety of foci, including 1990s message boards, mid-2000s comment sections, and contemporary multiplatform groups. We used thematic content analysis, focused coding, and theoretical coding to analyze our data.”
A few examples of the sympathy sockpuppets Costello and Greyson said they came across during their research include a high school student pretending to be a college student to gain entrance into an adult-only online community; a childless woman pretending to have children in an online community for mothers; and a never-married man pretending to have a deceased wife to elicit sympathy and help from an online community providing support to grieving people.
Costello said they found that the emotional consequences for the members of online communities who were unknowingly fooled and manipulated by the sympathy sockpuppets in their midst were devastating.
“Participants felt ‘emotionally strip-mined’ by sympathy sockpuppets, who extracted emotional labor and support from community members,” Costello said. “The more extensive the deception was, the more care work the person was able to extract, and the more harmful they were. We found that communities experienced a loss of innocence following discovery, and rarely was meaningful closure for the group achieved. Instead, people became more private and shared less, and in some cases, communities dissolved in the aftermath of discovery. In others, communities reaffirmed their norms and values and grew stronger, often in part by making the story of the sympathy sockpuppet into a community in-joke.”
Costello said their findings also indicate that technological solutions, such as moderator tools, did not seem to be associated with whether a community would remain resilient after discovery, become less active, or dissolve altogether. As a result, they don’t recommend that online groups rely on technology to verify that members are who they say they are. Because this is a social problem and not a technical one, Costello recommends that communities focus on prosocial and reparative solutions.
“For example,” Costello said, “communities in this study that had policies about deception often focused on punishment and consequences in those policies. Instead, we suggest that communities focus on repairing the community when deception is discovered by acknowledging the incident, facilitating communication between and among members, and clearly conveying shared values around trust, identity, and disclosure.”