Facebook and Instagram parent company Meta will restrict the type of content that teenagers can see on its platforms in the near future as part of wider efforts to make the social networks safer and more “age-appropriate” for young users.
Meta said the new limitations are designed to give teenagers a more age-appropriate experience on the apps and will make it harder for teens to view and search for sensitive content such as suicide, self-harm, and eating disorders.
Teens attempting to access such content will instead be diverted to helpful resources including information from organizations like the National Alliance on Mental Illness, the social media giant said.
The company noted that it already aims not to recommend such sensitive content to teenagers in its “Reels” and “Explore” sections on the apps but that the new changes mean teenagers will also not be able to see such content in their “Feeds” and “Stories.”
“While we allow people to share content discussing their own struggles with suicide, self-harm, and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find,” the company said.
Meta noted the new policy is already being rolled out for teenagers and will be fully in place on both platforms in the coming months.
The company is also automatically placing teens into the most restrictive content control settings on Instagram and Facebook, it said.
The new updates come as Meta—which is headed by billionaire Mark Zuckerberg—faces mounting pressure from regulators and lawmakers in both the United States and Europe over claims its social media sites are addictive and harmful to the mental health of younger users.
A lawsuit brought by dozens of U.S. state attorneys general accuses Meta of having “profoundly altered the psychological and social realities of a generation of young Americans” through technologies that boost engagement, and says the company flouted its obligations under the Children’s Online Privacy Protection Act by “unlawfully collecting the personal data of its youngest users without their parents’ permission.”
Meta has denied the claims made in the lawsuit and has regularly touted its work over the past decade to bolster the safety of teenagers online, noting that it offers more than 30 tools to support teens and their parents.
However, in November, former Meta employee turned whistleblower Arturo Bejar told the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that the company was aware of the harm its products may cause to young users but failed to take appropriate action to remedy the issue.
Mr. Bejar, who worked as a Facebook engineering director from 2009 to 2015, and later as a consultant at Instagram from 2019 to 2021, told the committee that he had highlighted the issue in an email to Mr. Zuckerberg but that his warnings ultimately went unheeded.
In its blog post on Tuesday, Meta said it wants teens to have safe, age-appropriate experiences across its apps.
“Parents want to be confident their teens are viewing content online that’s appropriate for their age,” Vicki Shotbolt, CEO of ParentZone.org, said in the post. “Paired with Meta’s parental supervision tools to help shape their teens’ experiences online, Meta’s new policies to hide content that might be less age-appropriate will give parents more peace of mind.”