Sam Altman, the CEO of ChatGPT maker OpenAI, recently announced that the chatbot will soon allow users to generate erotic content. On X, Altman posted: “In December, as we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow even more, like erotica for verified adults.” Some users criticized the decision, and Altman doubled down. “You won’t get it unless you ask for it,” he told one such critic.
The irony of Altman’s announcement is that it came in the same post in which he proclaimed that OpenAI wants to steward its customers’ well-being. He admitted that previous ChatGPT restrictions were meant to prevent “mental health issues,” but said the new policy would allow users with “no mental health problems” to get more out of the service.
Three things are worth noting about OpenAI’s new policy.
First, Altman’s comments about mental health ring hollow. Pornography is not safe for those without mental health issues. It is a cause of them. The November 2025 issue of Harper’s contains a report that is both impossible to recommend and impossible to forget. Daniel Kolitz profiles the “gooners,” a movement of proudly porn-addicted young men who structure their lives, their homes, and their relationships around masturbation.
One of the key themes in Kolitz’s article is that the young men he profiles believe porn and masturbation are their saviors from the cruel world of real women. Kolitz summarizes their philosophy this way:
“[C]ompilations can’t give you chlamydia; a zip file can’t impugn your virility. But what a zip file also can’t do is lie to you.” One of the young men meekly admits, “I just feel like it’s exhausting. For both parties.”
Kolitz’s essay is one of the most effective arguments I’ve ever seen for the dissociative and depressive effects of pornography. As major AI companies follow the advertising money and allow users to generate their own sexual dysfunction, they assume a major role in the deepening mental and emotional freefall that Kolitz documents. This is not stewardship of public mental health. It is a greedy disregard for it.