
How Social Media Is Adjusting To Ensure the Emotional Well-being of Its Users

Challenging as the task is, social media platforms are starting to take steps to protect the emotional well-being of their users. Here are some of the ways they're doing so.

By Carolina Batista

Solutions Architect Carolina Batista helps BairesDev clients solve technical challenges through creative approaches to deliver high-quality software.



Social media is under fire for a lot of things these days, from privacy matters to content ownership concerns. But perhaps the biggest debate about social platforms revolves around mental health. There's a large body of research on social media and its impact on users, much of it finding that these platforms can be highly detrimental to people's mental health.

In that light, it's not surprising that people are demanding action from social media platforms to protect them from bullying, gaslighting, depression, and isolation, among other issues. While tackling all of those problems is hard, social media platforms have started stepping up to the challenge, introducing new features and measures to safeguard the emotional well-being of their users.

Here are some of the most notable ones.

Expert-backed Resources

Probably the most common measure social media platforms have taken to fight mental health issues is providing users with expert resources filled with important information and actionable tips. Virtually all the major platforms have put forward this kind of resource hub, especially since the pandemic started.

That's precisely what happened with Instagram, often singled out as one of the most harmful social media platforms for its users. Though long silent about the body image issues that plague the platform, Instagram launched a set of expert-backed resources for people struggling with negative body image or eating disorders. This hub provides hotlines, advice, and in-depth information about these topics, appearing especially when users search for terms related to eating disorders or share content that addresses them.

Facebook also introduced a set of mental health resources this year, expanding the content already available in its Emotional Health Resource Center. The additions brought new global mental health guides and tips backed by the World Health Organization, along with resources developed by organizations around the world to help battle anxiety, depression, loss, and stress.

Pinterest had already introduced a similar mental health resource hub in the U.S. back in 2019, which it later extended to other countries throughout 2020. The feature is called Compassionate Search, a name that speaks for itself: when a user searches for terms and phrases related to stress, anxiety, depression, or other difficult emotions, the feature surfaces well-being resources.

TikTok and Snapchat have implemented their own sets of resources as well. TikTok launched new features that automatically provide access to organizations fighting negative body image and eating disorders. Snapchat, for its part, announced Here for You, a set of tools that promises proactive in-app support to users who may be going through a mental health crisis.

More Than Just Information

Users demand more than just access to information from other organizations, though. Since the platforms bear responsibility for what happens to their users, it's only natural for people to expect stricter rules and more control features to help prevent the most common problems on social media. Fortunately, platforms are introducing some interesting new measures.

For instance, TikTok introduced a screen time management system back in 2020. Its main goal is to display a reminder in users' feeds whenever they've spent a long stretch scrolling through the app. Such a reminder is a rare feature among social media platforms, which are usually keen to keep people on their sites for as long as possible.

Snapchat also improved its in-app reporting tools, making it easier for users to alert the platform whenever they feel another user might be at risk of self-harm. The alert prompts the platform to contact that person and offer the help available within Snapchat.

Rules and enforcement are a huge part of fighting mental health issues online. That's why Instagram has implemented stricter penalties for users who send abusive messages or use hate speech, up to deactivating the offending accounts. Instagram also gives users better tools to filter comments and messages and to manage DMs, preventing bullies or unwanted users from reaching out.

Facebook and Snapchat have each taken a different approach to reaching people dealing with mental health issues: developing their own video series targeted at audiences struggling with emotional distress. Facebook released Peace of Mind with Taraji, a Facebook Watch talk show hosted by Taraji P. Henson that focuses on a range of mental health topics. Snapchat, for its part, did something similar with Mind Yourself, a docuseries hosted by Kevin Hart.

Finally, social media platforms are also partnering with mental health organizations to provide them with much-needed funding. Pinterest has committed $10 million in funding for next year to help mental health organizations, with the nonprofit #HalfTheStory, which raises awareness about the impact of social media on mental health, as the first beneficiary. Twitter, seemingly the least active platform on this front, has collaborated with the Cross-Government Working Group on Anti-Muslim Hatred to better understand how to fight hate speech on its platform.

A Long Way to Go

The pandemic seems to have shaken the social media landscape and pushed companies to do more about mental health issues. And while the measures and features discussed here are a good start, they aren't enough to tackle the widespread problems social media causes.

We still have a long way to go to really limit the impact of social media on mental health, so we need to stay engaged and keep demanding these changes from the platforms we use every day. That's the only way to make sure social media evolves into a safe space for everyone to share and come together.


