Instagram, Facebook become even stricter for teens

  • Meta has imposed new restrictions on the content that young people, especially teens, can gain access to on apps like Facebook and Instagram.
  • Now, Meta says, teens will automatically be restricted from gaining access to certain posts on these apps, especially in terms of self-harm, suicide, and eating disorders.
  • In the past, former employees and independent studies have shown how Meta’s apps can have negative effects on the mental well-being of the young people who use them.

Amid seemingly endless waves of pressure from parents and regulatory bodies across the US and Europe, Meta says it is making content on its social media platforms more restrictive for its younger users, including on Facebook and Instagram.

“Today, we’re announcing additional protections that are focused on the types of content teens see on Instagram and Facebook,” explains the “social metaverse” firm in a blog post published Tuesday.

“We’re automatically placing teens into the most restrictive content control setting on Instagram and Facebook. We already apply this setting for new teens when they join Instagram and Facebook, and are now expanding it to teens who are already using these apps,” it adds.

Meta gives the example of someone posting about their struggle with self-harm. While that is an important topic to understand and discuss, it isn’t appropriate for all ages, and younger people in particular would benefit more from having these conversations brought to them by trained professionals.

The company says it will now block these kinds of posts automatically for children and teens on Instagram and Facebook.

“We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow,” it explains. Additionally, posts related to suicide, self-harm and eating disorders will also be more limited on these platforms for teen users.

There have been countless examples of the dangers that social media can pose to the mental health of young people. In 2022, an inquest into the death of UK teen Molly Russell, who died in 2017 from an apparent suicide, found that she had consumed hours upon hours of self-harm content on Pinterest and Instagram before her death. She was 14.

Various institutions have studied the negative impact that social media can have on younger people. A 2021 study found that people in the US who spend significant amounts of time on social media each day were three times more likely to develop depression than those who use social media less.

Meta’s efforts to assure regulators that it is in fact doing something to protect young people from harmful content on its platforms have only intensified in recent years, as these bodies place the firm under the magnifying glass.

In November 2023, a former Meta employee told the US Senate that Meta was aware that teens faced harassment and other harms on the company’s platforms, but had failed to act on these issues.

“It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse,” said Arturo Bejar, the former employee, who worked on well-being at Instagram for two years, as per Reuters.

Parents and guardians who want to know more about the new restrictions can read Meta’s blog post announcing them.

[Image – Photo by Mariia Shalabaieva on Unsplash]
