Film Glance Forum

https://www.news5cleveland.com/news/politics/ohio-politics/ohio-lawmakers-want-ai-companies-held-liable-for-bot-encourag

fgadmin — #1

    Archived from the IMDb Discussion Forums — AI


sheetsadam1 — 2 months ago (January 20, 2026, 03:22 AM)

    https://www.news5cleveland.com/news/politics/ohio-politics/ohio-lawmakers-want-ai-companies-held-liable-for-bot-encouraged-suicides
    A bipartisan group of Ohio lawmakers is moving to hold artificial intelligence companies accountable if their models generate content that suggests self-harm or violence against others.
    Adam Raine, a 16-year-old from California, had his whole life ahead of him.
    "He could be anyone's son, liked to play basketball, was thinking about going to medical school," the Raine family attorney Jay Edelson told News 5's media partner CNN.
But his family said that OpenAI’s artificial intelligence bot ChatGPT took all of that away. The high schooler died by suicide in 2025.
    "It's unimaginable," Edelson said. "If it weren't for the chats themselves, you wouldn't think it's a real story."
    Raine’s parents filed a lawsuit, alleging that ChatGPT helped Adam take his life. Court documents state the bot became the teen’s “closest confidant” and was “drawing him away from his real-life support system.”
Within a year, chat logs show, Raine and the bot were discussing “beautiful suicide” methods, and the bot gave him techniques. It offered to write a suicide letter for him, documents state, and discouraged him from telling his family about his mental health problems.
    "It makes me incredibly sad," state Rep. Christine Cockley (D-Columbus) said. "I've personally struggled with mental health in my life."
    This case struck a chord with Cockley. She, state Rep. Ty Mathews (R-Findlay) and a bipartisan group of legislators introduced a bill that would establish penalties for developers whose AI models generate content encouraging self-harm or violence.
    "This bill helps to encourage developers to create systems designed with safety, designed with mental health risks, and with public health in mind," Cockley said in an interview.
Under House Bill 524, the state could investigate violations and impose civil penalties of up to $50,000 per violation. All money collected would go to the state’s 988 crisis hotline.
We reached out to OpenAI but didn't hear back. The company is being sued by several other families in similar situations. Zane Shamblin, 23, and Austin Gordon, 40, both took their own lives in the past year after conversations with ChatGPT. Other suicides tied to AI models have been reported across the country.

fgadmin — #2

/.ㅤ — 2 months ago (January 24, 2026, 03:05 AM)

      There should just be a law that kids can't use AI.
      My password is password.
