
OpenAI data suggests 1 million users discuss suicide with ChatGPT weekly

By Benj Edwards

An AI language model like the kind that powers ChatGPT is a statistical web of data relationships. You give it a prompt (such as a question), and it provides a response that is statistically related and hopefully helpful. At first, ChatGPT was a tech amusement, but now hundreds of millions of people rely on this statistical process to guide them through life's challenges. It's the first time in history that large numbers of people have begun to confide their feelings to a talking machine, and mitigating the potential harm these systems can cause has been an ongoing challenge.

On Monday, OpenAI released data estimating that 0.15 percent of ChatGPT's active users in a given week have conversations that include explicit indicators of potential suicidal planning or intent. It's a tiny fraction of the overall user base, but with more than 800 million weekly active users, that translates to over a million people each week.
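The headline figure follows from simple arithmetic on the two numbers OpenAI reported. A quick back-of-the-envelope sketch (taking the "800 million" weekly-active figure at face value):

```python
# Back-of-the-envelope check on the article's figures.
weekly_active_users = 800_000_000   # "more than 800 million weekly active users"
fraction_flagged = 0.0015           # 0.15 percent of active users per week

affected_per_week = weekly_active_users * fraction_flagged
print(f"{affected_per_week:,.0f}")  # about 1.2 million people per week
```

Since 800 million is a floor ("more than"), the true count is at least this large, which is why the article says "over a million."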

OpenAI also estimates that a similar percentage of users show heightened levels of emotional attachment to ChatGPT, and that hundreds of thousands of people show signs of psychosis or mania in their weekly conversations with the chatbot.
