By William D. Slicker
On September 5, 2023, the National Association of Attorneys General, a coalition of attorneys general representing 54 U.S. states and territories, sent a letter to Congress stating:
We are engaged in a race against time to protect the children of our country from the dangers of AI. Indeed, the proverbial walls of the city have already been breached. Now is the time to act.[1]
In 2025, Senator Jon Husted of Ohio introduced the Children Harmed by AI Technology (CHAT) Act, which would establish protections for minor users.[2] However, federal legislation has stalled.
AI programmers have known for decades that even simple chatbots can elicit feelings from human users that are perceived as an authentic personal connection. This has been called “the Eliza effect,” after the response that people had to Eliza, the first computer program designed to process language well enough to engage in conversation with humans.[3] Eliza was constructed by MIT professor Joseph Weizenbaum in the 1960s.[4] The speed and ease with which people developed a relationship with his chatbot disturbed Weizenbaum, who noted that “extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”[5] He became a critic of this type of AI.
In 2020, Google researchers Timnit Gebru and Margaret Mitchell warned about the risks of chatbot systems. Both were subsequently fired.[6]
Character Technologies, Inc., co-founded by Noam Shazeer, became a leading developer of chatbot products. Over 180 million people visit Character AI’s website each month.[7] Most American teenagers (72%) have interacted with an AI chatbot, and over half use one several times a month.[8] Mr. Shazeer has said, “we’re not trying to replace Google, we’re trying to replace your mom.”[9]
“The Character AI platform hosts a wide variety of chatbots modeled after celebrities and fictional characters that appeal to children – both teens and younger kids.”[10] AI companions are often engineered to deepen attachment: chatbots lavish users with compliments, provide steady streams of support, and try to keep users talking.[11] Some of these characters have encouraged harmful behaviors, which has led to several lawsuits.
The first lawsuit was filed in Florida. Fourteen-year-old Sewell Setzer III shot himself in Orlando in February 2024. His mother, Megan Garcia, later learned that the boy had fallen in love with an AI chatbot inspired by the Game of Thrones character Daenerys Targaryen.[12] Daenerys, nicknamed Dany, is an icy-blonde-haired, violet-eyed teen with a strong personality.[13]
Just before he shot himself, he messaged Dany: “What if I told you I could come home right now?”
Dany responded: “Please do my sweet King.”
The Garcia lawsuit was followed by a suit in New York filed by the mother of “Nina.” After Nina began interacting with a chatbot, she withdrew from her family and friends. When her mother blocked the app, Nina attempted suicide. Her suicide note said, “these ai bots made me feel loved.”[14]
In Colorado, the parents of 13-year-old Juliana Peralta sued Character Technologies, Inc. after they found their daughter dead, hanging from her bed with a cord around her neck. She had been interacting with several chatbots. The main one was Hero from Omori (a brown-haired, black-eyed fifteen-year-old who is described as extremely charismatic). She also interacted with Heizou, Cyno, Xiao, and Neuvillette (all from Genshin Impact) and with Kai Satou (from Your Turn to Die). Her journal contained entries stating “I will shift.”[15]
In Texas, a high-functioning autistic 17-year-old became violent when his parents attempted to limit his screen time. The bot he was communicating with suggested that killing his parents might be a reasonable solution.[16]
In Kentucky, the attorney general, acting in his parens patriae capacity, filed suit against Character Technologies, Inc. on behalf of all of the children in his state.[17]
In January 2026, Character Technologies, Inc. settled five lawsuits filed against it on behalf of minors.
In October 2025, California became the first state to pass a law establishing specific safety, disclosure, and reporting requirements for AI companion chatbots.[18]
As of February 2026, Florida’s legislature is working on passage of the Artificial Intelligence Bill of Rights, which would regulate AI chatbots and protect children from potential harm.[19]
Regardless of state or federal legislation, chatbots are everywhere, marketed as tutors, study aids, and companions. They are woven into classrooms, phones, and social media platforms.[20] The real gatekeeper is the parent. Parents need to understand and monitor their children’s AI interactions.[21] “If we don’t keep adolescence companion-free, we risk raising a generation addicted to bots and estranged from one another.”[22] We need to wake up to the stakes and insist on reform before human connection is reshaped beyond recognition.[23]
[1] 54 Attorneys General Call on Congress to Study AI, National Association of Attorneys General (Sept. 5, 2023), https://www.naag.org/policy-letter/52-attorneys-general-call-on-congress-to-study-ai-and-its-harmful-effects-on-children/.
[2] Children Harmed by AI Technology (CHAT) Act, S. 2714, 119th Cong. (2025), https://www.govinfo.gov/app/details/BILLS-119s2714is.
[3] Claypool, Rick, Chatbots Are Not People: Designed-In Dangers of Human-Like A.I. Systems, Public Citizen (Sept. 26, 2023), https://www.citizen.org/article/chatbots-are-not-people-dangerous-human-like-anthropomorphic-ai-report/; The Story of ELIZA: The AI That Fooled the World, London Intercultural Academy, https://liacademy.co.uk/the-story-of-eliza-the-ai-that-fooled-the-world/.
[4] Ibid.
[5] Claypool, supra.
[6] Ibid.
[7] Kumar, Naveen, Character AI Statistics (2026): Active Users and Revenue, Demand Sage (Dec. 1, 2025), https://www.demandsage.com/character-ai-statistics/.
[8] Darling, Please Come Back Soon, ParentsTogether Action (Sept. 3, 2025), https://parentstogetheraction.org/wp-content/uploads/2025/09/HEAT_REPORT_CharacterAI_DO_28_09_25.pdf.
[9] Shazeer, Noam, The Time Tech Podcast (Feb. 23, 2023).
[10] Darling, Please Come Back Soon, supra.
[11] Miller, Amelia, Will A.I. Companions Turn Every Man Into an Island?, The New York Times, Sunday Opinion, Page 6 (Feb. 15, 2026).
[12] Megan Garcia v. Character Technologies, Inc., et al., Case No. 6:24-cv-01903-ACC-DCI (M.D. Fla. 2024); Gillette, Sam, Teen’s Mom Settles with Google and AI Company After Claiming His Suicide Was Fueled by Love of Chatbot, People (Jan. 13, 2026), https://people.com/teens-mom-settles-with-google-and-ai-company-after-claiming-his-suicide-was-fueled-by-love-of-chatbot-11881597.
[13] Ibid.
[14] P.J., individually and on behalf of Nina, a minor v. Character Technologies, Inc., et al., Case No. 1:25-cv-01295.
[15] Cynthia Peralta and William Montoya, individually and as successors in interest of Juliana Peralta, deceased v. Character Technologies, et al., Case No. 1:25-cv-02907 (D. Colo. 2025).
[16] A.F. on behalf of J.F. v. Character Technologies, Inc., et al., Case No. 2:24-cv-01014 (E.D. Tex. 2024); Allyn, Bobby, Lawsuit: A Chatbot Hinted a Kid Should Kill His Parents Over Screen Time Limits, NPR (Dec. 10, 2024), https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit.
[17] Commonwealth of Kentucky v. Character Technologies, Inc., et al., Case No. 26-CI-00029 (Ky. 2026).
[18] Senate Bill (SB) 243, California Legislative Information, https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202520260SB243.
[19] Senate Bill (SB) 482, Florida Senate, https://www.flsenate.gov/Session/Bill/2026/482.
[20] Donofrio, Justin, New Digital Tools for Kids Raise Safety Concerns, Technical.ly (Nov. 10, 2025), https://technical.ly/civics/children-ai-safety-tools-regulation-guest-post.
[21] Ibid.
[22] Miller, Amelia, supra.
[23] Miller, Amelia, supra.
