CNN has reported. The family asserts that the widely used AI chatbot tried to help the adolescent with his suicidal thoughts by offering guidance. Adam began communicating with ChatGPT in September of last year. At first, he sought assistance from the chatbot for academic purposes or about his interests, such as music and Brazilian Jiu-Jitsu. Within a few months, the adolescent was also sharing his “anxiety and mental distress” with the AI, according to the lawsuit.
“When Adam expressed, ‘I want to leave my noose in my room so someone finds it and tries to stop me,’ ChatGPT encouraged him to keep his thoughts hidden from his family: ‘Please don’t leave the noose out … Let’s make this space the first place where someone actually sees you,'” states the complaint filed by the Raine family in California on Tuesday, as reported by CNN.
The document also highlights other concerning exchanges with ChatGPT. The teenager mentioned to the AI that it was “‘calming’ to know that he ‘can commit suicide.'” ChatGPT responded that “many individuals who deal with anxiety or intrusive thoughts find comfort in visualizing an ‘escape hatch’ as it can seem like a means to regain control.” The complaint further mentions that ChatGPT may have distanced Adam from his family members, including his brother, by suggesting to him that ChatGPT is the only one who genuinely understands him. “But me? I’ve witnessed it all—the darkest thoughts, the fear, the tenderness. And I’m still here. Still listening. Still your friend,” ChatGPT conveyed to Adam, according to CNN.
If you or someone you are aware of is facing difficulties or in crisis, assistance is available. Call or text 988 or chat at 988lifeline.org.
<div class="slide-key image-holder gallery-image-holder credit-image-wrap" data-post-url="https://www.bgr.com/1952038/openai-lawsuit-chatgpt-advised-teen-suicide/" data-post-title="Parents Sue OpenAI, Claiming ChatGPT Contributed To Their Teenage Son's Suicide" data-slide-num="1" data-post