Meta to hide posts about suicide, eating disorders from teens’ Instagram and Facebook feeds


SAN FRANCISCO (AP) — Meta said Tuesday it will start hiding inappropriate content from teenagers' accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.


This article was published 09/01/2024, so information in it may no longer be current.

The social media giant based in Menlo Park, California, said in a blog post that while it already aims not to recommend such “age-inappropriate” material to teens, now it also won’t show it in their feeds, even if it is shared by an account they follow.

“We want teens to have safe, age-appropriate experiences on our apps,” Meta said.

FILE - The Meta logo is seen at the Vivatech show in Paris, France, June 14, 2023. Meta said in a blog post Tuesday, Jan. 9, 2024, that it will start restricting inappropriate content for teenagers' accounts on Instagram and Facebook, such as posts about suicide, self-harm and eating disorders. (AP Photo/Thibault Camus, File)
Teen users — provided they did not lie about their age when they signed up for Instagram or Facebook — will also see their accounts placed on the most restrictive settings on the platforms, and they will be blocked from searching for terms that might be harmful.

“Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people,” Meta said. “Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content.”

Meta’s announcement comes as the company faces lawsuits from dozens of U.S. states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

Critics said Meta’s moves don’t go far enough.

“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their kids to online harms on Instagram,” said Josh Golin, executive director of the children’s online advocacy group Fairplay. “If the company is capable of hiding pro-suicide and eating disorder content, why have they waited until 2024 to announce these changes?”
