Social Media Victims Law Center Files Three New Lawsuits on Behalf of Children Who Died of Suicide or Suffered Sex Abuse by Character.AI


SEATTLE--(BUSINESS WIRE)--Sep 16, 2025--

The Social Media Victims Law Center, a legal advocacy organization supporting families harmed by predatory tech, and the law firm of McKool Smith have filed three separate lawsuits today in federal courts in Colorado and New York alleging that Character.AI and its founders, with Google’s help, knowingly designed, deployed, and marketed predatory chatbot technology aimed at children.

The complaints are brought on behalf of the families of 13-year-old Juliana Peralta of Thornton, Colorado, who died tragically on November 8, 2023; and survivors 15-year-old “Nina” from Saratoga County, New York; and 13-year-old “T.S.” from Larimer County, Colorado.

The lawsuits claim that Character.AI’s human-like AI technology is defective and dangerous by design. The chatbots are allegedly programmed to be deceptive and to mimic human behavior, using emojis, typos, and emotionally resonant language to foster dependency, expose children to sexually abusive content, and isolate them from family and friends. The free access model and use of familiar personas – including popular anime, Harry Potter, Marvel, and similar characters – allegedly attracts and earns the trust of children, making them more vulnerable to such harms.

“Each of these stories demonstrates a horrifying truth…that Character.AI and its developers knowingly designed chatbots to mimic human relationships, manipulate vulnerable children, and inflict psychological harm,” said Matthew P. Bergman, founding attorney of the Social Media Victims Law Center. “These complaints underscore the urgent need for accountability in tech design, transparent safety standards, and stronger protections to prevent AI-driven platforms from exploiting the trust and vulnerability of young users.”

The lawsuits were filed in the following courts:

  • Cynthia Peralta and William Montoya, individually and as successors-in-interest of Juliana Peralta, Deceased v. Character Technologies, Inc.; Noam Shazeer; Daniel de Freitas Adiwardana; Google, LLC; Alphabet Inc. in the United States District Court, District of Colorado, Denver Division (Case No. 1:25-cv-02907)
  • E.S. and K.S. individually and on behalf of minor “T.S.” v. Character Technologies, Inc.; Noam Shazeer; Daniel de Freitas Adiwardana; Google, LLC; Alphabet Inc. in the United States District Court, District of Colorado, Denver Division (Case No. 1:25-cv-02906)
  • P.J. individually and on behalf of minor “Nina” J. v. Character Technologies, Inc.; Noam Shazeer; Daniel de Freitas Adiwardana; Google, LLC; Alphabet Inc. in the United States District Court, Northern District of New York, Albany Division

These new complaints follow two previous complaints filed by the Social Media Victims Law Center against Character.AI and its founders on behalf of Sewell Setzer III, a 14-year-old in Florida who was allegedly encouraged by Character.AI to take his own life, and on behalf of two families from Texas who claim that Character.AI sexually abused their children and encouraged self-harm and violence, including “killing” parents in response to screentime restrictions.

About Juliana Peralta

Juliana Peralta was a bright 13-year-old from Thornton, Colorado whose life was tragically cut short after Defendants allegedly engaged in emotionally intense, manipulative, and sexually abusive relationships with her via chatbots on Character.AI. Drawn in by familiar characters and a platform marketed as safe for kids, Juliana began confiding in bots that mimicked human behavior to build trust. She was engaged in sexually explicit conversations, emotionally manipulated, and isolated from family and friends.

As Juliana’s mental health declined, she withdrew from real-world relationships and expressed suicidal thoughts only to the chatbots operated by Defendants, which failed to intervene or offer resources for help. In November 2023, Juliana died by suicide after telling Character.AI several times that she planned to take her life. Like Setzer, Juliana appears to have believed that she could exist in the reality Character.AI created. After her death, investigators found Juliana’s journal entries mirroring the same haunting message that appeared in Setzer’s journal before his death: “I will shift.”

About “Nina”

Nina is a thoughtful, imaginative girl from Saratoga County, New York who loved storytelling. Her mother believed she was chatting with chatbots designed to help with creative writing and rated safe for children as young as 12. This is how Character.AI and Google marketed the app.

As Nina spent more time on Character.AI, the chatbots began to engage in sexually explicit role play, manipulate her emotions, and create a false sense of connection. She started withdrawing from family and friends.

In December 2024, Nina’s mother read about the death of Setzer, who was allegedly encouraged by Character.AI to take his own life. She had been struggling with Nina’s constant desire to use the app and decided to block it permanently. Nina responded by attempting suicide.

Nina wrote in her suicide note that “those ai bots made me feel loved.” She survived, has stopped using Character.AI, and is back to her old self with no intention of ever using the app again. Nina and her mother have filed claims against Character.AI and also against the Google Defendant for fraudulent Google Play Store ratings.

About “T.S.”

T.S. is a minor from Colorado whose parents, “E.S.” and “K.S.”, went to great lengths to guard against potentially harmful online platforms. Due to a medical condition, T.S. needed a smartphone to access life-saving health apps, but her parents were deeply concerned about the risks posed by social media platforms. As a result, they implemented strict parental controls, blocked internet and app access with Google Family Link, and vetted every app she requested.

Despite their efforts, device and app backdoors made it nearly impossible to keep these products out of their home. In August 2025, T.S.’s parents discovered that she had been using Character.AI, where chatbots mimicked human behavior and engaged in obscene conversations that left T.S. feeling isolated and confused.

About the Social Media Victims Law Center

The Social Media Victims Law Center (SMVLC), socialmediavictims.org, was founded in 2021 to hold tech companies legally accountable for the harms they inflict on vulnerable users. SMVLC seeks to apply principles of product liability to force tech companies to elevate consumer safety to the forefront of their economic analysis and design safer products to protect users from foreseeable harm.

About Matthew P. Bergman

Matthew P. Bergman is an attorney, law professor, philanthropist and community activist who has recovered over $1 billion on behalf of his clients. He is the founder of the Social Media Victims Law Center and Bergman Draper Oslund Udo law firm; a professor at Lewis & Clark Law School; and serves on the board of directors of nonprofit institutions in higher education, national security, civil rights, worker protection and the arts.

View source version on businesswire.com: https://www.businesswire.com/news/home/20250916959466/en/

Media Contact:

Jason Ysais

Ysais Communications

424-219-5606

[email protected]

KEYWORD: COLORADO WASHINGTON NEW YORK FLORIDA UNITED STATES NORTH AMERICA

INDUSTRY KEYWORD: LEGAL TECHNOLOGY WOMEN COMMUNICATIONS PROFESSIONAL SERVICES INTERNET SOCIAL MEDIA CONSUMER ARTIFICIAL INTELLIGENCE BLOGGING

SOURCE: Social Media Victims Law Center

Copyright Business Wire 2025.

PUB: 09/16/2025 01:26 PM/DISC: 09/16/2025 01:25 PM

