Meta to limit reach of harmful content to teens on Facebook and Instagram amid scrutiny


Facing increased scrutiny over its social media platforms' effects on teen users, Meta announced Tuesday that teens on Facebook and Instagram will see less content related to self-harm and eating disorders.

Meta already filters such content from the feeds it recommends to users, such as Instagram's Reels and Explore. But under a series of changes rolling out in the coming months, harmful posts and stories will not be shown to teens “even if [they’re] shared by someone they follow,” the company said in a statement.

Harmful topics include suicide, self-harm, eating disorders, restricted goods (including firearms, drugs, and sex toys), and nudity.

Another change will automatically set users under 18 to the most restrictive content recommendation settings, with the goal of making Meta's algorithms less likely to recommend harmful content to them. However, it's unclear whether teens could simply change their settings to remove the restrictions.

The company also says the apps will restrict search results for queries related to harmful topics. Instead of surfacing the requested content, the apps will direct users to resources for help when they search for terms related to suicide, self-harm and eating disorders.

Teen users will also be asked to update their privacy settings, according to the statement.

The changes should help turn “social media platforms [into] spaces where teens can connect and be creative in age-appropriate ways,” said Rachel Rodgers, an associate professor in the Department of Applied Psychology at Northeastern University.

Facebook and Instagram have been wildly popular with teens for years. But the platforms have raised concerns among parents, experts and elected officials about their effects on younger users, partly because of what those users see and partly because of the amount of time they spend on the networks.

US Surgeon General Vivek Murthy warned in May that because the effects of social media on children and teens were largely unknown, companies needed to take “immediate action to protect children now.”

In October, California joined dozens of other states in a lawsuit against Meta, alleging that the company used “psychologically manipulative product features” to attract young users and keep them on the platforms as long as possible.

“Meta has leveraged its extraordinary innovation and technology to attract young people and teenagers to maximize the use of its products,” California Atty. Gen. Rob Bonta said at a news conference announcing the lawsuit.

In November, an unredacted version of the lawsuit revealed an allegation that Mark Zuckerberg vetoed a proposal to ban in-app camera filters that simulate the effects of plastic surgery, despite concerns that the filters could harm users' mental health.

After the unredacted complaint was published, Bonta was more emphatic: “Meta knows that what it is doing is bad for children, period,” he said in a statement, adding that the evidence is “there in black and white, and it is damning.”

The Associated Press contributed to this report.
