Florida Rep. Christine Hunschofsky filed legislation this week that would impose new regulations on AI companion chatbots, aiming to protect children and teens from emotional harm, sexually explicit content, and suicide-related risks as artificial intelligence tools become more widely used.
House Bill 659 would create a new section of Florida law governing artificial-intelligence systems designed to engage users in ongoing, human-like social interaction. The bill targets companion chatbots capable of sustaining relationships across multiple conversations, while excluding customer-service bots, productivity tools, limited video-game characters, and standard voice-activated assistants.
Under the proposal, operators would be required to clearly disclose that users are interacting with artificial intelligence, not a real person, and warn that such chatbots may not be suitable for some minors. For users identified as minors, platforms would have to provide recurring reminders at least every three hours emphasizing the chatbot’s artificial nature and encouraging breaks from continued interaction.
The bill also mandates safeguards to address suicide and self-harm risks. Operators would be required to maintain evidence-based protocols to detect and respond to suicidal ideation, prohibit chatbots from discussing self-harm or suicide, and direct users to the 988 Suicide & Crisis Lifeline when appropriate.
“Increasingly, we are seeing heartbreaking cases where young people form deep emotional bonds with AI companions that end up pushing them further toward self-harm,” said Rep. Hunschofsky. “These companion chatbots blur the lines to the point that people can’t tell the difference between a human and AI. While AI continues to advance at a rapid pace, we have to ensure that safeguards are in place for our youth. This bill is about protecting our children by making sure there are clear warnings, strong safeguards, and real accountability for the companies that put these companion chatbots on the market.”
Additional provisions would require operators to offer age-verification options and implement safeguards preventing chatbots from producing sexually explicit visual content or encouraging sexual conduct when interacting with minors.
Beginning in 2027, operators would be required to submit annual reports to the Department of Legal Affairs outlining their safety protocols and the number of crisis-line referrals issued, without including any personal user information. The Attorney General would enforce the measure under Florida’s unfair and deceptive trade practices law, with a 30-day cure period and no private right of action.
If enacted, the bill would take effect July 1, 2026.