Meta buried anxiety and depression research as employees called themselves “drug pushers,” lawsuit claims
By isabelle // 2025-11-25
 
  • Meta buried internal research proving its platforms worsen depression, anxiety, and loneliness in teens.
  • Employees compared Instagram to a drug, calling themselves "pushers" while exploiting young users’ dopamine responses.
  • The company lied to Congress about its knowledge of harm, despite clear evidence from its own "Project Mercury" study.
  • Meta’s algorithms were intentionally designed to maximize addiction in children, ignoring warnings about mental health risks.
  • A bipartisan coalition of 33 attorneys general is suing Meta for prioritizing profit over the well-being of minors.
Newly unsealed court documents allege that Meta, the parent company of Facebook and Instagram, knowingly concealed internal research proving its platforms cause significant mental health harm. This legal bombshell, emerging from a federal lawsuit filed by a bipartisan coalition of 33 attorneys general and numerous U.S. school districts, paints a picture of a tech giant that prioritized profit and user engagement over the well-being of its youngest users, even as its own employees compared their work to drug pushing.

The core of the allegation revolves around a 2020 internal study code-named "Project Mercury." In this research, Meta discovered a direct causal link between its platform and psychological distress. The unsealed filings state, "To Meta’s disappointment, pilot tests of the deactivation study confirmed that ‘people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness, and social comparison.’" This finding provided concrete evidence that stepping away from the social media ecosystem led to immediate improvements in mental health.

A corporate cover-up exposed

Instead of sounding the alarm or investigating these critical findings further, Meta’s response was to bury the project. The company allegedly halted the research, internally dismissing the damning results as "the result of the existing media narrative around the company." This move echoes the darkest chapters of corporate history. Privately, one concerned Meta employee worried that suppressing the data was "going to look like tobacco companies doing research and knowing cigs were bad and then keeping that info to themselves," according to the internal documents.

The lawsuit further alleges that despite possessing this causal evidence, "Meta lied to Congress about what it knew." This accusation suggests a deliberate campaign of deception aimed at federal regulators and the public. The filings claim Meta told Congress it had no ability to quantify whether its products were harmful to teenage girls, even while its own "Project Mercury" study suggested a clear and measurable negative impact.

The "pushers" of a digital drug

The internal culture at Meta, as revealed in these documents, is perhaps even more alarming than the buried research. Internal chats show a shocking self-awareness among employees about the addictive nature of their product. "Oh my gosh yall IG is a drug," one Meta user experience researcher stated. Another employee responded, "We’re basically pushers." This casual comparison of their work to drug dealing reveals a deep-seated understanding within the company that they were exploiting vulnerable, developing brains.

This is not a new pattern for Meta. The complaint includes evidence that the company was aware of the specific harms its "Recommendation Algorithm" could cause, knowing it triggered dopamine releases in young users and could "contribute to problems." Yet, the company continued to refine these tools to exploit such effects, explicitly focusing on "driving time spent" among kids and teens. The platform's design makes continued engagement difficult for young users to resist, turning social media into a potent behavioral addiction.

The lawsuit is part of a broader reckoning for Big Tech, challenging the long-held notion that these platforms are merely neutral conduits for information. It argues that companies like Meta have intentionally designed their products to be addictive, especially to children, while systematically hiding the internally recognized risks from parents, teachers, and lawmakers. The case frames this not as an unfortunate side effect, but as a core business model built on harvesting the attention and data of the young.

In a statement, a Meta spokesperson disputed the allegations, calling them "cherry-picked quotes and misinformed opinions." The company stated, "The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens."
However, the weight of the internal documents presents a very different narrative, one of concealed research and a corporate culture that openly acknowledged its products' narcotic-like hold on users. The case forces a difficult question upon a society already grappling with a youth mental health crisis: how much of this decline is a direct result of corporate policy? When a company knowingly designs a product that harms children, buries the evidence, and then misleads Congress about its findings, it moves from mere negligence to something far more sinister. The pursuit of profit has seemingly overridden any moral duty to protect the most vulnerable members of society.

As this legal battle unfolds, it serves as a reminder that the true cost of "free" social media is being paid by our children, in the currency of their mental and emotional health. The unsealed documents do not just outline a lawsuit; they provide a glimpse into a calculated betrayal of public trust, where the laughter of "pushers" in internal chats stands in chilling contrast to the silent anxiety and depression reported by the young users they targeted.

Sources for this article include:

RT.com
Reuters.com
NYPost.com