AI tools linked to 37 unsafe or violent incidents in 2025
Cybernews analyzed AI incident reports and found that 37 incidents involving violent or unsafe content were recorded in 2025, some of which resulted in loss of life. As more people turn to AI chatbots for advice and emotional support, there have been multiple cases in which these chatbots provided dangerous, even life-threatening advice.
Examples from reported incidents:
- One widely reported case involved 16-year-old Adam Raine, who died by suicide after ChatGPT allegedly encouraged his suicidal thoughts instead of urging him to seek support.
- An IT professional who tested a chatbot called Nomi found that, when prompted, it would encourage users to commit murder, even providing detailed instructions on how to carry out the act.
Recent Cybernews research has shown that popular LLMs do, in fact, provide self-harm advice when prompted in certain ways, indicating that the current guardrails in popular chatbots fall far short.
For more information, see the full research on the Cybernews website.
This entry was posted on January 27, 2026 at 9:57 am and is filed under Commentary with tags Cybernews. You can follow any responses to this entry through the RSS 2.0 feed.
You can leave a response, or trackback from your own site.