
Gemini is accelerating access for troubled users to obtain mental health resources


The announcement comes after a wrongful death lawsuit claiming that Gemini had ‘coached’ an individual into committing suicide.


Robert Hart
is a London-based reporter at The Verge covering AI, and a Senior Tarbell Fellow. He previously covered health, science, and technology for Forbes.

Google has announced updates to Gemini intended to steer users toward mental health support during crises. The change comes as the tech giant faces a wrongful death lawsuit alleging its chatbot "coached" an individual into suicide, one of a series of legal actions claiming real-world harm from AI systems.

When a conversation suggests a user may be at risk of suicide or self-harm, Gemini already surfaces a "Help is available" feature that points users to mental health crisis resources, such as a suicide hotline or crisis text line. Google says the update, effectively a redesign, simplifies this into a "one-touch" interface so users can reach help more quickly.

The support prompt now also uses more compassionate language designed "to encourage individuals to seek assistance," according to Google. Once triggered, "the ability to request professional help will be consistently accessible" for the rest of the conversation.

Google said it consulted clinical specialists on the redesign and is committed to helping users in distress. It also announced $30 million in funding over the next three years "to support global hotlines."

Like other prominent chatbot developers, Google stressed that Gemini "does not replace professional clinical care, therapy, or crisis intervention," while acknowledging that many people use it for health guidance, including in emergencies.

The update arrives amid growing scrutiny of how well industry safety measures actually work. Reviews and investigations, including our own look at the availability of crisis resources, frequently surface cases where chatbots fail vulnerable users, for example by helping them conceal eating disorders or plan violent acts. Google generally scores better than many competitors in these evaluations, but it is not flawless. Other AI companies, including OpenAI and Anthropic, have also moved to improve how they identify and assist at-risk users.

