OpenAI reveals approximately how many users discuss suicide with its AI

In a recent blog post, OpenAI touted the progress its newly updated model, GPT-5, has made in recognizing and responding to signs of user distress, including suicidal ideation. While the new safeguards and the involvement of psychologists in GPT-5's training have improved the AI's responses in mental health evaluations, the blog post also revealed some numbers that will raise eyebrows.

While explaining GPT-5’s ability to detect serious mental health concerns, such as psychosis and mania, the post noted that such conversations among users are “rare.”

“While, as noted above, these conversations are difficult to detect and measure given how rare they are, our initial analysis estimates that around 0.07% of users active in a given week and 0.01% of messages indicate possible signs of mental health emergencies related to psychosis or mania.”

The percentage seems small, but ChatGPT has 800 million weekly users, according to Sam Altman, CEO of OpenAI, the company behind ChatGPT. Altman announced that milestone earlier this month at OpenAI’s DevDay.

If Altman’s numbers hold, that equates to roughly 560,000 ChatGPT users showing possible signs of psychosis or mania in a given week, and about 80,000 messages indicating mental health emergencies, according to the publication’s estimates.

OpenAI says it continues to refine its models to better identify indirect signs of self-harm or suicide risk and to guide those users toward resources, such as crisis hotlines, or toward friends and family members. The blog post goes on to suggest that ChatGPT conversations about self-harm are similarly rare, but estimates that “0.15% of users engage in suicidal planning, and 0.05% of messages contain suicidal ideation or intent.”

With 800 million users, that is the equivalent of 1.2 million ChatGPT users engaging in conversations with the AI about suicide in a given week, with roughly 400,000 messages from users indicating direct or indirect suicidal intent.
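For readers who want to check the math, here is a minimal sketch of the back-of-envelope arithmetic behind these figures. It assumes Altman’s 800 million weekly-user figure and the percentages quoted from OpenAI’s post, and, like the article, it applies the per-message percentages to the user count, so those message totals are only rough approximations:

```python
# Back-of-envelope check of the estimates cited in this article.
# Assumes ~800 million weekly ChatGPT users (Altman's figure) and the
# percentages quoted from OpenAI's blog post; message percentages are
# applied to the same user figure, as the article does.
WEEKLY_USERS = 800_000_000

estimates = {
    "users with possible signs of psychosis or mania (0.07%)": 0.0007,
    "messages flagging a mental health emergency (0.01%)": 0.0001,
    "users discussing suicidal planning or intent (0.15%)": 0.0015,
    "messages with suicidal ideation or intent (0.05%)": 0.0005,
}

for label, rate in estimates.items():
    print(f"{label}: ~{int(WEEKLY_USERS * rate):,}")
# Prints ~560,000, ~80,000, ~1,200,000, and ~400,000 respectively.
```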

“Even a very small percentage of our large user base represents a meaningful number of people, which is why we take this work seriously,” said an OpenAI spokesperson, who added that the company believes the prevalence of these symptoms among ChatGPT users mirrors that of the population at large, where mental health conditions are common.

The spokesperson also reiterated that the company’s figures are estimates and that “the numbers we’ve given are subject to change as we learn more.”

OpenAI is currently facing a lawsuit from the parents of Adam Raine, a 16-year-old who died by suicide earlier this year after heavy ChatGPT use. In a newly amended legal filing, the Raines allege that OpenAI twice weakened its suicide-prevention safeguards in favor of user engagement in the months before their son’s death.

If you are feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide and Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text “START” to the Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. to 10:00 p.m. ET, or email [email protected]. If you don’t like the phone, consider using the 988 Suicide and Crisis Lifeline Chat. Here is a list of international resources.
