Social Media

How Instagram Users Share Their #Depression Stories

The platform seems to be a surprisingly supportive place to share mental health struggles

Illustration: Tara Jacoby
Feb 20, 2017 at 12:42 PM ET

Most people tend to think of Instagram as a carefully curated feed where strategic crops, flattering filters, and emoji-laden captions help ensure users are seen in the best possible light. But for some users with depression, the photo-sharing site can serve a different purpose entirely.

A new study, which will soon be presented at an Association for Computing Machinery conference, looks at the usage of the hashtag “#depression” across the platform. The data was collected in 2014 using Instagram’s API (which has since been made less publicly accessible). Overall, the researchers found 95,046 unique photos uploaded with this hashtag in one month, going on to sample 788 for analysis.

They found that these posts covered a variety of topics, though most related to self-harm, eating disorders, and anxiety or other mental health disorders. The photos included selfies, images of self-harm, and even suicide threats.

“It was very difficult and emotionally challenging work,” Nazanin Andalibi, one of the study’s lead doctoral researchers, said of combing through the posts. “The process was emotionally taxing.”

In addition to analyzing the captions, the researchers also analyzed the comments that were left on these posts. In total, they looked at 1,949 comments on 444 posts.

Contrary to the pervasive rhetoric surrounding largely anonymous online communities built around mental illness and the often-harmful practices symptomatic of them, the researchers actually found that the majority of the responses were positive and supportive.

Andalibi says this surprised her, since, unlike Facebook, Instagram can be an anonymous platform where anyone can post under any username.

“There’s this kind of double-edged sword about being anonymous and not having to use your real name,” she said. “The popular narrative around anonymity has been that people will troll each other and everything will just be really abusive…but opportunities for anonymity are really central to disclosing things that are sensitive for some people and to give and provide support. It just so happens that in this particular platform people are finding each other and being supportive of each other.”

While the team didn’t catalogue account usernames, she noted that anecdotally, many of these Instagram users’ bios included the fact that it was a secondary account and that usernames did not appear to include real names.

Of course, not every reaction was so positive. When looking at posts that addressed self-harm specifically, the researchers found that a number of responses contained language supportive of the harmful behavior.

“There’s a lot of nuance in how people are responding to disclosures and I think that was maybe the most concerning [finding],” she said.

It’s not the first time a connection has been drawn between Instagram use and depression. Another recent study found that a photo’s hue (shaped by both natural lighting and filter choice) can, when analyzed by an algorithm, indicate whether an Instagram user is depressed.

Aware of the ways that users struggling with depression and other forms of mental illness have been using the platform, Instagram last year adopted two features developed in conjunction with the National Eating Disorders Association and the National Suicide Prevention Lifeline. One allows users to anonymously flag a photo when its contents lead them to believe the original poster may need help; the original poster then receives a message from Instagram offering options to get help. Additionally, when users search Instagram for “depression” and certain other terms related to self-harm and mental illness, a popup asking “Can we help?” states that “posts with words or tags you’re searching for often encourage behavior that can cause harm and even lead to death.” It then gives users the option to “get support” or “view posts anyways.” Users who select the “get support” option are prompted to talk to a friend, consult a helpline, or get tips and support.

This kind of deliberate effort to keep graphic and potentially triggering content off the platform follows a long history of platforms like Xanga, Tumblr, and YouTube being accused of not doing enough to address the problem. Rather than banning all potentially sensitive posts using these keywords, Instagram’s updates aim to ensure that those who use the app to give and receive emotional support are not kept from doing so. But, Andalibi says, it may be an imperfect solution.

While she’s supportive of the fact that Instagram is not banning depression-related hashtags altogether (while banning others that imply support for harmful behaviors), she is concerned that the warning’s strong wording could deter users seeking support.

“[Some] people are really simply making sense of their identities, seeking support, and providing support to each other,” she said. “What I’d love to see is for the platform to engage in fostering the supportive interactions that are already happening.”