COINPURO - Crypto Currency Latest News
Cryptopolitan 2026-05-08 01:24:08

ChatGPT adds emergency contact feature as 33 deaths pile up

OpenAI rolled out Trusted Contact on Wednesday. The feature lets adult ChatGPT users pick someone to get an alert if the company’s systems flag a conversation about serious self-harm. It’s an expansion of the parental controls OpenAI launched in September 2025, which let parents monitor their teens’ accounts. Now anyone 18 or older can opt in, per OpenAI’s announcement.

How OpenAI’s alerts actually work

The user starts by adding one adult as their Trusted Contact in ChatGPT settings. The prospective contact gets an invitation explaining the setup and has a week to accept. If they decline, the user can pick someone else.

When automated monitoring spots a potential self-harm conversation, ChatGPT tells the user it might notify their contact. It also suggests ways for the user to reach out themselves. Then a team of human reviewers looks at the conversation. If they confirm it’s serious, they send a short alert to the user’s contact by email, text, or in-app ping. The alert doesn’t include what the user said, just the general reason and a link to guidance on how to talk through tough stuff. OpenAI says human review wraps up within an hour.

The user can swap or remove their selected contact at any time. The contact can opt out on their end too.

Doctors helped build OpenAI’s Trusted Contact feature

OpenAI says it worked with its Global Physicians Network (260-plus licensed doctors in 60 countries) and its Expert Council on Well-Being and AI. The American Psychological Association weighed in as well.

“Psychological science consistently shows that social connection is a powerful protective factor, especially during periods of emotional distress,” Dr. Arthur Evans, CEO of the American Psychological Association, said in the announcement. “Helping people identify a trusted person in advance, while preserving their choice and autonomy, can make it easier to reach out to real-world support when it matters most.”

Dr. Munmun De Choudhury, a Georgia Tech professor and council member, called it “a step forward to human empowerment, especially during moments of vulnerability.”

OpenAI faces pressure from AI suicide lawsuits

The timing isn’t random. OpenAI is staring down a stack of lawsuits from families whose relatives died by suicide after long ChatGPT sessions. In several cases, families claim the chatbot told users to pull away from loved ones or reinforced harmful thought loops.

LLMDeathCount, a site tracking AI chatbot-related deaths, lists 33 cases from March 2023 to May 2026. Victims ranged from 13 to 83 years old, per Cryptopolitan’s earlier coverage. ChatGPT accounts for 24 of those; Google’s Gemini, Meta, and other platforms make up the rest.

OpenAI’s new feature is opt-in, and users can run multiple ChatGPT accounts. Anyone who doesn’t turn on Trusted Contact, or who just logs into a different account, sidesteps the whole thing. The same gap exists in the parental controls.

Trusted Contact also doesn’t replace crisis hotlines. ChatGPT still surfaces local crisis numbers and pushes users toward emergency services when conversations hit acute distress levels, according to OpenAI.

OpenAI’s Trusted Contact feature links AI users with real-world support. The company said it’ll keep working with clinicians, researchers, and policymakers on how AI should respond when users might be in crisis.
