Altman is so confident that OpenAI has addressed mental health safety that the company is reverting ChatGPT’s behavior so it “behaves more like what people liked about 4o.” Altman essentially apologized to users for the changes that had been made to address mental health problems arising from use of the chatbot:
“We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right.”
Separately, the company announced the members of its Expert Council on Well-Being and AI, an eight-person panel of mental health experts.
As a reward for the adults who aren’t suffering mental health issues exacerbated by confiding in the chatbot, Altman says that erotica is on the way.
“In December, as we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow even more, like erotica for verified adults.”
In response, Missouri Senator Josh Hawley quoted Altman’s post on X with this message:
“You made ChatGPT ‘pretty restrictive’? Really. Is that why it has been recommending kids harm and kill themselves?”