Ending the teen mental health crisis isn’t Meta’s priority, so it has to be ours

It’s been obvious for a while now that the leaders at Meta (owner of Facebook, Instagram, WhatsApp, and more) are well aware of the dangers their apps pose to young people. In fact, it’s been documented since Frances Haugen blew the whistle four years ago.

Back then, Haugen revealed internal Facebook research showing that 13.5% of teen girls said Instagram made their suicidal thoughts worse, and 17% said their eating disorders got worse after using the app.

That’s why this week’s news should set off alarm bells for anyone who’s a parent or who cares about kids and teens: Meta’s more recent internal research, described in documents reviewed and reported on by Reuters, found that teens exposed to more “body-focused” content on the platform felt more negatively about their own bodies.

[Image: Headline from the Wall Street Journal, September 2021]

The harm Meta knows it is causing

Just how big is the problem? In a survey of more than 1,000 teens who used Instagram throughout 2023 and 2024, those who reported feeling bad about their bodies after scrolling the platform (frighteningly, one in five respondents) had seen three times more “body-focused content” than the other teens in the survey.

As if that’s not bad enough, researchers found that the teens who reported the most negative feelings about themselves saw twice as much content that Meta classifies as “mature themes,” “risky behavior,” “harm and cruelty,” or “suffering.”

[Image: Internal Meta document headlined “‘People Disagree’ Content Seen by Teens Reporting Different Levels of Body Dissatisfaction After Viewing Content on IG,” the source for the recent reporting on Meta’s new research about Instagram content. (“People Disagree” content roughly means concerning or negative content.)]

What will Meta do about it? (hint: Not much)

Even worse, the company clearly has little intention of mitigating these effects beyond announcing small changes that will make little to no difference. In fact, court documents show that a few years ago, Mark Zuckerberg personally rejected an internal proposal to improve teen mental health on Meta’s platforms.

Not good news, but not surprising.

As Emma and Jake explain to younger teens in our Social Media Driver’s License video “How Social Media Works,” the purpose of social media companies is to make money, and they do that by keeping people engaged and scrolling.

[Image: Screenshot from the Social Media Driver’s License sneak peek, showing how the “Dopamine Loop” keeps kids scrolling]

Prevention is the path forward

This is something we at Ready Set Screen have known for years, and it’s why our previous media literacy programs were designed to help young women and girls understand and resist the negative effects of media messages on their self-esteem and body image.

Our approach was effective in doing just that: 94% of program participants reported improved self-esteem and confidence after learning the media literacy skills they needed to question what they were seeing in ads and on social media.

We all need to work toward prevention in the service of public health, not just try to “fix” affected young people with hard-to-get therapy (which doesn’t always resolve the problem).

Until the leaders at Meta make significant changes to their platforms that end these terrifying effects, we have to take that prevention into our own hands.

Jennifer Berger is Ready Set Screen’s Founder and Executive Director.
