Newswise — More than half of the U.S. population will, at some point, grieve a loved one who has died by suicide, and nearly three quarters of the bereaved will turn to the internet for support.

But do online grief support platforms like Reddit and Facebook help or harm?

New University of Colorado Boulder research suggests they can do both.

“All grief is hard, but suicide is often sudden, traumatic and has a lot of social stigma around it. No one knows what to say, so you can feel really isolated,” said Dylan Thomas Doyle, a Ph.D. candidate in the Department of Information Science at CU Boulder. “It’s comforting to go to these spaces and have people say, ‘I’ve been through that. I know what you’re feeling.’ But right now, it’s sort of a free-for-all.”

As Doyle reports in two new studies, such spaces, while supportive in many respects, can also expose emotionally vulnerable people, including children, to graphic stories, unhelpful comments and other potentially re-traumatizing content.

The studies were published in the Proceedings of the ACM on Human-Computer Interaction.

Doyle and his co-authors examined nearly 2,600 posts and 16,502 comments in the r/SuicideBereavement subreddit on Reddit.

The team used natural language processing (NLP), a form of AI, to gain insight into the emotional state of users and to identify different kinds of posts, from lengthy stories to short questions or requests for resources.

Nearly half of the content posted was narrative storytelling, and many of those stories were extremely graphic.

When the team noticed a large subset of users were writing letters to the deceased, they launched a companion study in which they read through 189 such posts and 652 comments.

The posts were anonymized and the research team took steps to care for their own mental health.

“Even as researchers, we struggled to read some of these,” said Doyle, a former hospital chaplain and Unitarian Universalist minister who has, himself, lost loved ones to suicide.

Some letter-writers shared how they had found out and how it affected them. Others asked for explanations or sought forgiveness for not doing enough. One shared fond memories about a final trip they and the deceased had taken. Many commenters responded with comfort, gratitude and offers of support outside the platform.

But some shared detailed descriptions of how they had found their loved ones or how the death had been carried out. Some expressed rage and hatred.

The team was heartened to find almost no deliberately abusive comments, but they did find some they deemed “unsupportive,” in which commenters replied with graphic stories.

“Some people come there just seeking resources or asking factual questions, and don’t expect to find people sharing narratives of really tough images,” Doyle said. “For people already in a vulnerable emotional state, it can be damaging.”

Doyle stressed that he is not specifically critiquing Reddit, but rather raising questions about how to more effectively support people using social media platforms for suicide bereavement support. 

He imagines a day when, using the AI tool his team developed, narrative posts (which can be therapeutic to write) could be categorized and users, when logging on, could opt in or out of seeing them.

He also suggests that moderators get training around grief support and users have an opportunity to customize what they want to see at the top of their feed.

Journal Link: Proceedings of the ACM on Human-Computer Interaction