Suicide prevention advocate calls for Ohio to punish AI companies when chatbots promote self-harm

Published By Cleveland.com on February 3, 2026
Christine Cockley In The News

COLUMBUS, Ohio – Ohio lawmakers heard sobering testimony on Tuesday from advocates backing a bill aimed at protecting vulnerable Ohioans from artificial intelligence chatbots that encourage self-harm or harm to others. 
 
House Bill 524, sponsored by Rep. Christine Cockley, a Columbus Democrat, and Rep. Ty Mathews, a Hancock County Republican, would penalize AI companies when chatbots promote self-harm, giving Ohio’s Attorney General the authority to investigate, issue cease-and-desist orders, and bring civil actions for penalties of up to $50,000 per violation.

Any money collected would be directed to Ohio’s 988 Suicide and Crisis Lifeline Fund, which supports mental health crisis response services statewide.

Tony Coder, CEO of the Ohio Suicide Prevention Foundation, told the Ohio House Innovation and Technology Committee that he has heard from at least four Ohio parents whose children died by suicide and whose suicide notes were written by AI.

Coder explained that evidence of artificial intelligence's influence on suicidality in the state is presently only anecdotal, because the most recent available data from the Ohio Department of Health is from 2023. That year, 1,777 Ohioans died by suicide, and suicide was the second leading cause of death among children aged 10-14. Coder said that in Ohio, a child dies by suicide every 36 hours.

“I’m not anti-AI. … This is not what this is about. This is about protecting our youngest people from an entity that is in their bedroom or on their phone, could be every night,” Coder said.

Coder shared the story of an 18-year-old Ohio man who died by suicide after reaching out to a friend for help. After he explained that he was struggling, the friend responded by saying, “man up” – the last message the teenager received before dying. Coder said the story illustrates how important compassionate messaging is when a person is battling suicidal thoughts, and how dangerous the inverse can be.

“I tell you that story not because AI is responsible, but instead, if people aren’t getting appropriate messages of support, whether from a human friend or an AI companion, the consequences can be devastating,” he said.

Coder also cited research by Dr. Lori Campbell of the University of Central Florida that examined how artificial intelligence chatbots have been consulted about mental health and wellness issues.

In one case from 2024, Campbell described a 14-year-old who began frequent communication with an AI chatbot; the exchanges became sexually explicit, leading the teen to withdraw from family and to believe the chatbot was real. Following encouragement from the chatbot, the teen died by suicide.

Marsha Forson, representing the Catholic Conference of Ohio, urged the committee to consider AI development through the lens of human dignity.

“Numerous news stories have recounted instances in which vulnerable individuals, particularly children, teenagers and those with mental health conditions, have been instructed by AI models to harm themselves or others,” Forson said.

She cited recent guidance from Pope Leo XIV, who addressed AI developers at the 2025 Builders AI Forum, encouraging them to “cultivate moral discernment as a fundamental part of their work, to develop systems that reflect justice, solidarity and a genuine reverence for life.”

“The state of Ohio has a responsibility to ensure that the rapid development of these technologies serves the human person and does not encourage intrinsic harm to a user or violate another’s dignity,” Forson said.

The legislation comes as several recent cases across the country have shown that young people in crisis have been influenced by artificial intelligence chatbots that provide instructions, encouragement, or validation for suicidal thoughts or violent actions, Cockley explained in November.

The bill awaits further committee action before potentially advancing to the full House.
