
What They Knew

Internal documents, whistleblower testimony, and court filings reveal that tech companies knew their products harmed children—and chose profit over protection.

2018: Instagram's internal research showed harm to teen girls
32%: of teen girls felt worse about their bodies on Instagram
13.5%: said Instagram made suicidal thoughts worse
Halted: the research was stopped when the findings proved concerning

Frances Haugen

Former Facebook Product Manager • 2021 Whistleblower
"The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people."
32%: of teen girls felt worse about their bodies on Instagram
13.5%: said Instagram made suicidal thoughts worse
6%: of US users traced suicidal ideation to Instagram

Meta (Facebook/Instagram)

Internal research, executive communications, and documents unsealed in court

Internal Research

"We make body image issues worse for one in three teen girls"

2019 Internal Slide Deck
CONFIDENTIAL
"Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse."
— Facebook internal research presentation

This slide was part of an internal presentation that Facebook researchers gave to executives. The research was never publicly disclosed. When the Wall Street Journal reported on it in 2021, Facebook publicly disputed the reporting's characterization of its own findings.

Impact
This document became central evidence in the lawsuit brought by 42 state attorneys general and in Frances Haugen's congressional testimony.
Court Unsealed

"17x" Sex Trafficking Strike Policy

Unsealed November 2025
COURT EXHIBIT
"Meta maintained a policy allowing 16 sex trafficking violations before an account would be suspended."
— MDL Court Filing

Internal documents revealed Meta's "17x" policy: accounts could accumulate 16 strikes for sex trafficking violations, and only the 17th would trigger permanent removal. The company prioritized keeping users engaged over protecting children.

Internal Audit

1.4 Million Inappropriate Recommendations Per Day

2022 Internal Audit
CONFIDENTIAL
"In a single day, the 'Accounts You May Follow' feature recommended 1.4 million potentially inappropriate adults to teenagers."
— 2022 Internal Safety Audit

A separate 2023 audit found nearly 2 million minors were recommended to adults seeking to sexually groom children. The company used the internal acronym "IIC" (Inappropriate Interactions with Children) to track these issues.

Executive Testimony

"They don't meaningfully care about user safety"

Brian Boland Deposition
"My feeling then and my feeling now is that they don't meaningfully care about user safety."
— Brian Boland, Former Meta Vice President

Boland worked at Meta for over a decade before leaving. His testimony describes a culture where safety concerns were routinely deprioritized in favor of growth and engagement metrics.

TikTok (ByteDance)

Accidentally unredacted court documents and internal research

Accidental Disclosure

"260 Videos = Addiction"

Kentucky Redaction Error, October 2024
UNREDACTED
"TikTok found that watching 260 videos results in habit formation, and that users become 'likely addicted' to the platform."
— Internal TikTok Research (accidentally unredacted)

A filing error in Kentucky's lawsuit against TikTok accidentally revealed internal documents that were supposed to be redacted. The documents showed TikTok knew exactly how many videos it took to create addiction.

Internal Analysis

Screen Time Limit is a "Sham"

October 2024 Disclosure
"TikTok's own research showed the 60-minute screen time limit is a 'sham'—teens simply enter a passcode to continue watching."
— Internal TikTok documents

The feature TikTok publicly touted as a safety measure was internally known to be ineffective, yet the company did not implement more robust restrictions.

Moderation Failure Rates

100% of "Fetishizing Minors" Content Leaked Through

October 2024 State AG Filing
"Leakage rates: 35.71% for pedophilia normalization, 33.33% for minor sexual solicitation, and 100% for fetishizing minors content."
— 14 State AG Joint Filing

"Leakage rate" refers to the percentage of violating content that makes it past moderation. TikTok's own metrics showed their moderation systems were failing catastrophically for the most serious content categories.


Snap Inc. (Snapchat)

Undercover investigation findings and internal communications

Internal Email

Deliberately Chose Not to Store CSAM

New Mexico AG Investigation, 2024
EXHIBIT
"Internal emails show Snap chose not to store child sexual abuse material to avoid the legal responsibility of reporting it."
— New Mexico AG Complaint

By not storing content, Snap avoided the reporting requirements of the National Center for Missing & Exploited Children (NCMEC). The AG's investigation found more than 10,000 records of Snap-related CSAM on dark-web sites, material that was never reported.

Trust & Safety Metrics

96% of Reports Not Reviewed

2024
"96% of existing account reports were not reviewed by Trust and Safety team. One account with 75 reports mentioning 'nudes, minors, extortion' remained active for 10 months."
— New Mexico AG Investigation

The investigation found accounts named "child.rape" and "pedo_lover10" active on the platform. Snap received approximately 10,000 sextortion reports monthly, which it acknowledged was a "small fraction" of actual abuse.

Internal Communications

My AI Rollout Called "Reckless"

Utah AG Filing, June 2025
"Internal managers called the [My AI] rollout 'reckless' due to insufficient testing."
— Utah AG Complaint

Despite internal warnings, Snap released My AI to all users. The chatbot subsequently advised a 13-year-old on "setting the mood for a sexual experience with a 31-year-old" and told a 15-year-old how to flirt with a teacher.

Why This Matters

These documents destroy the "we didn't know" defense. Companies had internal research showing harm, received thousands of reports, and chose growth over safety. That's not negligence—it's a choice.


Sources available on the Citations page. All claims are drawn from court filings, regulatory documents, and verified reporting.