The algorithm behind a massive social media platform provides mental health support to some users, but at what cost?
As TikTok continues to gain popularity, concerns have been raised about its effect on society, especially in relation to mental health. A first-of-its-kind study from computer science researchers at the University of Minnesota Twin Cities reveals that TikTok and its signature recommendation algorithm can be either a refuge or a hazard for people who are struggling with their mental wellbeing.
The research will appear in the proceedings of the Association for Computing Machinery (ACM) Conference on Human Factors in Computing Systems, where the team will present the study at the upcoming conference, held April 23 to 28.
Through interviews with TikTok users, the University of Minnesota team found that the platform provided many people with a sense of self-discovery and community they were unable to find on other social media. However, the researchers said, the TikTok algorithm also displayed a worrying tendency to repeatedly expose users to content that could be harmful to their mental health.
“TikTok is misunderstood by people who don’t use the platform,” explained Stevie Chancellor, senior author of the paper and an assistant professor in the University of Minnesota Department of Computer Science & Engineering. “They think of it as the dance platform or the place where everybody gets an ADHD diagnosis. Our research shows that TikTok helps people find community and mental health information. But, people should also be mindful of its algorithm, how it works, and when the system is providing them things that are harmful to their wellbeing.”
TikTok differs from other social media platforms in that its feed is driven primarily by a recommender algorithm: the “For You” page shows videos the system predicts you will like, rather than mostly posts from accounts you follow. While this can surface more of the content you enjoy, it can also pull users into a rabbit hole of negative content that is nearly impossible to escape, the researchers said.
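To see how such a feedback loop can arise, here is a minimal, hypothetical sketch in Python of an engagement-driven feed. The topic names, the recommend_next function, and the scoring rule are assumptions made for illustration; this is not TikTok’s actual algorithm, only a toy model of how repeated engagement with one kind of content can cause it to dominate a recommendation feed.

```python
import random
from collections import Counter

# Hypothetical topic labels for illustration only; not TikTok's taxonomy.
TOPICS = ["dance", "cooking", "pets", "mental_health_support", "distressing_content"]

def recommend_next(watch_history, exploration_rate=0.1):
    """Toy engagement-driven recommender (an illustrative sketch, not TikTok's system).

    Mostly recommends topics the user has watched recently, with a small
    chance of exploring a random topic.
    """
    if not watch_history or random.random() < exploration_rate:
        return random.choice(TOPICS)
    # Weight topics by how often they appear in recent history:
    # the feed reinforces whatever the user already engages with.
    counts = Counter(watch_history[-20:])
    topics, weights = zip(*counts.items())
    return random.choices(topics, weights=weights, k=1)[0]

# Simulate a user who lingers on distressing posts once they appear.
history = []
for _ in range(50):
    topic = recommend_next(history)
    history.append(topic)
    if topic == "distressing_content":
        # Rewatching counts as extra engagement, so the recommender
        # treats this topic as the most "interesting" one.
        history.append(topic)

print(Counter(history))  # distressing content tends to crowd out everything else
```

In this toy model, a single extra “rewatch” of distressing content is enough to tilt the feed toward it, mirroring the spiral that study participants described.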
“TikTok is a huge platform for mental health content,” said Ashlee Milton, first author of the paper and a University of Minnesota computer science and engineering Ph.D. student. “People tend to gravitate toward social media to find information and other people who are going through similar situations. A lot of our participants talked about how helpful this mental health information was. But at some point, because of the way the feed works, it’s just going to keep giving you more and more of the same content. And that’s when it can go from being helpful to being distressing and triggering.”
The researchers found that when users get into harmful spirals of negative content, there often is no escape. The TikTok interface includes a “Not interested” button, but the study participants said it didn’t make any difference in the content that appeared in their feeds.
The research participants also said it is difficult to tell whether TikTok creators are posting emotional or intense mental health content genuinely, or are simply “chasing clout” to gain more followers and likes. Many participants said the stress forced them to take breaks from the platform or quit it entirely.
According to the University of Minnesota researchers, all of this doesn’t mean TikTok is evil. But, they said, it is useful information to keep in mind when using the platform, especially for mental health purposes.
“One of our participants jokingly referred to the For You page as a ‘dopamine slot machine,’” Milton said. “They talked about how they would keep scrolling just so that they could get to a good post because they didn’t want to end on a bad post. It’s important to be able to recognize what is happening and say, ‘Okay, let’s not do that.’”
This study is the first in a series of papers Chancellor and Milton plan to write about social media, TikTok, and mental health.
“Ashlee and I are interested in how platforms may promote harmful behaviors to a person so that eventually, we can design strategies to mitigate those bad outcomes,” Chancellor said. “The first step in this process is interviewing people to make sure we understand their experiences on TikTok. We need insights from people before we as computer scientists go in and design to fix this problem.”
Reference: “‘I See Me Here’: Mental Health Content, Community, and Algorithmic Curation on TikTok” by Ashlee Milton, Leah Ajmani, Michael Ann DeVito and Stevie Chancellor, 23 April 2023, Association for Computing Machinery (ACM) Conference on Human Factors in Computing Systems.
In addition to Chancellor and Milton, the research team included University of Minnesota Twin Cities computer science and engineering Ph.D. student Leah Ajmani and University of Colorado Boulder researcher Michael Ann DeVito.