Meta Accused of Hiding Evidence of Social Media Harm

  • Plaintiffs allege Meta hid product risks from users and authorities
  • Meta accused of ineffective youth safety features and prioritizing growth over safety
  • Meta opposed unsealing of internal documents in court
  • Meta allegedly ignored valid research findings on mental health impacts
  • TikTok allegedly influenced National PTA to publicly support its safety claims

Meta is facing allegations that it shut down internal research after discovering causal evidence linking Facebook use to negative mental health outcomes. According to unredacted court filings, a 2020 project called “Project Mercury” found that users who deactivated Facebook for a week reported lower levels of depression, anxiety, loneliness, and social comparison. Rather than publishing the results, Meta allegedly halted further work, claiming the findings were influenced by negative media narratives. Internal staff messages cited in the filings suggest researchers believed the conclusions were valid and compared the decision to tobacco companies concealing harmful evidence.

Plaintiffs Claim Risks Were Concealed

The allegations form part of a lawsuit filed by Motley Rice on behalf of U.S. school districts against Meta, Google, TikTok, and Snapchat. Plaintiffs argue that the companies intentionally hid recognized risks from parents, teachers, and users. Specific claims include encouraging underage use, failing to address child sexual abuse content, and promoting social media use among teenagers during school hours. The filings also allege that platforms sought to influence child-focused organizations by offering sponsorships, citing TikTok’s reported partnership with the National PTA.

Meta faces more detailed accusations than its rivals. Internal documents reportedly show that youth safety features were designed to be ineffective, with testing blocked to avoid harming growth. The filings claim Meta allowed accounts to rack up 17 sex-trafficking violations before removing them, optimized engagement despite harmful content, and stalled efforts to prevent predators from contacting minors. A 2021 text message from CEO Mark Zuckerberg allegedly stated that child safety was not his top priority compared to building the metaverse.

Meta’s Response

Meta spokesman Andy Stone disputed the allegations, saying the company’s teen safety measures are effective and accounts flagged for sex trafficking are removed immediately. He argued that the lawsuit misrepresents Meta’s efforts and relies on selective quotes. Stone emphasized that Meta has worked for over a decade to improve safety, listening to parents and making changes to protect teens. The company has filed a motion to strike the documents, citing concerns about the breadth of what plaintiffs seek to unseal.

TikTok, Google, and Snapchat did not immediately respond to requests for comment. A hearing on the case is scheduled for January 26 in the U.S. District Court for the Northern District of California. The outcome could have significant implications for how social media companies handle internal research and user safety. Critics argue the filings highlight a broader pattern of prioritizing growth over safeguarding young users.

Broader Context

The controversy underscores ongoing debates about the impact of social media on mental health, particularly among teenagers. Research has long suggested links between online platforms and issues such as anxiety, depression, and social comparison. What makes the Meta case notable is the allegation of causal evidence, which goes beyond correlation. If proven, the findings could reshape regulatory discussions around digital platforms and youth protection.

The comparison to the tobacco industry is not new in tech debates. In the 1990s, internal tobacco documents revealed companies knew of health risks but concealed them. Critics now argue that social media firms risk following a similar path if they suppress evidence of harm, making transparency a central issue in ongoing lawsuits.
