Erotica's addition to ChatGPT draws concerns over mental health risks
A national anti-sexual exploitation group has warned that OpenAI’s decision to allow erotica for certain ChatGPT users could expose people to sexually graphic material and to mental health harms stemming from relationships with sexualized AI chatbots.
The National Center on Sexual Exploitation’s warning about ChatGPT follows a Tuesday announcement from OpenAI CEO Sam Altman that the company plans to relax some content restrictions on the chatbot after equipping it to better address mental health concerns.
“We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues,” Altman wrote in a Tuesday X post. “We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue, we wanted to get this right.”
OpenAI’s CEO promised that the updated ChatGPT will have a personality and behave more like the chatbot that people liked in a previous version of the application. Altman explained that the latest version of ChatGPT can “respond in a very human-like way, or use a ton of emoji, or act like a friend,” if that’s something that users want.