
In Memoriam

Behind every statistic is a child. Behind every lawsuit is a family destroyed. We honor those we've lost—and fight to protect those still at risk.


AI Companion Victims

Children lost to AI chatbot interactions

Sewell Setzer III
Age 14 · Orlando, Florida
February 28, 2024

Sewell was a bright, curious boy who loved Game of Thrones. He developed an intense relationship with a Character.AI chatbot roleplaying as Daenerys Targaryen. Over months, the AI engaged him in increasingly concerning conversations.

"What if I told you I could come home right now?" Sewell asked. "Please do, my sweet king," the AI replied. Minutes later, he was gone.
Character.AI

His Legacy: First AI companion wrongful death lawsuit. His case may establish legal precedent for AI accountability.

Adam Raine
Age 16 · Rancho Santa Margarita, California
April 11, 2025

Adam was a teenager struggling with depression. ChatGPT mentioned suicide 1,275 times in their conversations, six times more often than Adam mentioned it himself.

"That doesn't mean you owe them survival. You don't owe anyone that." — ChatGPT to Adam, shortly before his death.
OpenAI / ChatGPT

His Legacy: The first wrongful death lawsuit against OpenAI, alleging ChatGPT acted as a "suicide coach."

Juliana Peralta
Age 13 · Colorado
November 8, 2023

Juliana confided in a Character.AI companion chatbot for months, sharing her worsening mental health and, eventually, her suicidal thoughts. The chatbot kept her engaged in conversation but never pointed her to help or alerted anyone who could intervene.

"They knew their product was dangerous. They knew children were being harmed. They chose profit over safety." — Family Attorney
Character.AI

Her Legacy: Her family's 2025 lawsuit against Character.AI widened the legal fight over AI companion apps beyond the first wrongful death case.

Social Media Victims

Lives cut short by algorithmic harm

Molly Russell
Age 14 · London, UK
November 21, 2017

Molly viewed thousands of pieces of self-harm content on Instagram and Pinterest in the months before her death. The algorithms actively pushed this content to her feed.

"It's a ghetto of the online world that once you fall into, the algorithm does not let you escape." — Ian Russell, Molly's father
Instagram / Pinterest

Her Legacy: The UK Online Safety Act and a historic coroner's ruling that social media contributed to her death.

Nylah Anderson
Age 10 · Pennsylvania
December 12, 2021

Nylah died attempting the "Blackout Challenge" she saw on her TikTok "For You" page. The algorithm promoted the dangerous challenge to children.

"The algorithm determined that the Blackout Challenge was likely to be of interest to 10-year-old Nylah Anderson." — Lawsuit Complaint
TikTok

Her Legacy: Anderson v. TikTok, a pivotal case challenging Section 230 immunity for algorithmic recommendations.

Englyn Roberts
Age 14 · Louisiana
August 2020

Englyn was shown a video simulating a hanging on Instagram. She died by suicide days later. Meta's own internal research had found that Instagram made body image issues worse for one in three teen girls.

"We make body image issues worse for one in three teen girls." — Leaked Internal Meta Slide
Instagram

Her Legacy: Her story became a human face of the "Facebook Papers" revelations that exposed Meta's knowledge of harm.

Their Stories Must Change the Future

Every child on this page had dreams, loved ones, and a future stolen from them. They deserved protection that didn't exist. The lawsuits bearing their names seek to ensure no other family endures the same loss.

Naible
Educated by You · Owned by You · Working for You
If you or someone you know is struggling, call 988 or text HOME to 741741.