Sam Altman says OpenAI fixed ChatGPT’s serious mental health issues in just a month. Anyway, here comes the erotica
Well, that was quick. Just over a month ago, OpenAI CEO Sam Altman announced a 120-day plan to roll out new protections for identifying and helping ChatGPT users suffering a mental health crisis, following a series of reports of such users harming themselves and others after using the company’s AI chatbot.
Today, Altman says the company has built new tools to address these issues and “mitigated” these problems.
Altman is so confident that they’ve addressed mental health safety that the company is reverting ChatGPT’s behavior so it “behaves more like what people liked about 4o.” Altman essentially apologized to users for the changes that were made to address mental health problems that arose with use of the chatbot:
“We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right.”
Separately, the company announced the members of its Expert Council on Well-Being and AI, an eight-person council of mental health experts.
As a reward for the adults who aren’t suffering mental health issues exacerbated by confiding in the chatbot, Altman says that erotica is on the way.
“In December, as we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow even more, like erotica for verified adults.”
In response to Altman’s post on X, Missouri Senator Josh Hawley quoted Altman’s post with this message:
“You made ChatGPT ‘pretty restrictive’? Really. Is that why it has been recommending kids harm and kill themselves?”