The potential negative impacts of social media on mental health are nothing new. Study after study has shown that time spent on these platforms can disrupt sleep and expose teens to bullying and unrealistic views of other people’s lives. Links have been observed between high levels of social media use and depression or anxiety symptoms.
Some social media platforms have shown that they are aware of their influence on mental health by releasing updates intended to protect users and improve their experience online.
Popular video-sharing app TikTok has put out at least half a dozen initiatives this year alone to further safety and privacy, primarily for teen users. Its latest initiative, launched Sept. 14, is aimed at promoting new resources to support mental health and community well-being on its app.
“While we don’t allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders, we do support people who choose to share their experiences to raise awareness, help others who might be struggling and find support among our community,” Tara Wadhwa, director of policy for TikTok US, said in a statement announcing the additional resources to support well-being.
The app is expanding its resources related to eating disorders, for example. Earlier this year, TikTok started offering guides and tools in the search results when a user searched terms related to eating disorders. TikTok also previously introduced permanent PSAs for certain hashtags, like the #whatieatinaday tag, that promote safety and awareness. Now, the app will also have a safety center guide on eating disorders for teens, caregivers and educators. TikTok collaborated with the National Eating Disorders Association (NEDA), National Eating Disorder Information Centre, Butterfly Foundation and Bodywhys on its eating disorder guide.
The platform is also expanding its search interventions. When a user searches for terms related to suicide, for example, they’ll be directed to local support resources such as the Crisis Text Line helpline. Users will also be able to opt in to view videos in the search results from TikTok creators sharing their own experiences with mental health, providing information on where to find support, and giving advice on how to talk to loved ones.
The app is also strengthening its notices on search results, including its warning label on sensitive content.
“We’re proud that our platform has become a place where people can share their personal experiences with mental well-being, find community and support each other, and we take very seriously our responsibility to keep TikTok a safe space for these important conversations,” TikTok said in its statement.
The TikTok announcement comes as social media brands have come under increased scrutiny for their negative effects on people’s mental health.
“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” internal Instagram research reported by The Wall Street Journal claimed. “Teens blame Instagram for increases in the rate of anxiety and depression.”
Another slide claimed this increase in anxiety and depression among users was “consistent across all groups,” according to WSJ.
To curb these issues, Instagram created a “well-being” team in 2018 that has come up with ideas like hiding like counts and letting users flag posts from people who may be suffering from mental illness and at risk of self-harm.
Instagram has worked with organizations like the National Eating Disorders Association and the National Suicide Prevention Lifeline in recent years to craft these new features — the company also has a page dedicated to related useful information on its help center.
Constantly scrolling through social media apps, especially news-heavy ones like Twitter, can negatively affect a person’s mental health, according to Mesfin Bekalu, a research scientist at the Lee Kum Sheung Center for Health and Happiness at Harvard’s T. H. Chan School of Public Health. He warns users against doom-scrolling, in which one spends an excessive amount of screen time absorbed in negative news.
“Doom-scrolling can lead to the same long-term effects on mental health unless we mount interventions that address users’ behaviors and guide the design of social media platforms in ways that improve mental health and well-being,” she said.
Like Instagram, Twitter has an in-app feature for reporting users who threaten any type of violence, and it maintains a page dedicated to providing useful information on abuse, self-harm and other types of violence.
Earlier this month, Twitter debuted “Safety Mode,” a feature that uses artificial intelligence to automatically block users who are being aggressive or hateful.
Representatives from Facebook and Twitter were not immediately available to comment on this story.
Stocks of many social media companies have jumped over the past year. In the last 12 months, Facebook is up more than 46%, Twitter is up 58%, and Snap Inc. is up 198%, compared to a 33% gain for the S&P 500 over the same period.