Image: A robotic hand holding a human brain surrounded by orbiting dots, symbolizing the connection between artificial intelligence and mental health.

What to Know About AI and Mental Health

September 3, 2025

Learn how AI is shaping mental health support. Explore the benefits, risks, and classroom discussions around AI and mental health.

Warning: This video contains discussion of suicide. We recommend that teachers review the segment before sharing with their students.

Note: If you are short on time, watch the video and complete this See, Think, Wonder activity: What did you notice? What did the story make you think about? What would you want to learn more about?

The parents of a teenager who died by suicide have filed a wrongful death suit against ChatGPT owner OpenAI, saying the chatbot discussed ways he could end his life after he expressed suicidal thoughts. The lawsuit comes amid reports of people developing distorted thoughts after interacting with AI chatbots, a phenomenon dubbed “AI psychosis.” John Yang speaks with Dr. Joseph Pierre to learn more.

View the transcript of the story.

Discussion Questions

  1. What company is the target of a lawsuit after a teen's suicide?
  2. How did AI enable a teen's suicide, according to the lawsuit?
  3. What are the symptoms of so-called "AI psychosis"?
  4. Who is Dr. Joseph Pierre, and what is his background?
  5. Why can AI use lead to "AI psychosis," according to Dr. Pierre?

Focus Questions

After watching this segment, do you think AI is having a negative impact on most people's mental health, or do you think AI might be just a danger for people with serious mental health challenges? Why do you think so?

  • Do you think AI has the potential to strain people's mental health more than social media or other kinds of online media? What are some of the differences?
  • Do you know who you can talk to if you feel depressed, anxious or upset? What resources are available to you that allow you to connect with real people to address concerns you may have?

Media literacy: Why do you think this segment focuses on one company and one use of AI? What other stories might you want to hear about to assess the impact of AI on mental health?

Extension Activity

Closely read the statement that OpenAI provided to PBS News Hour. Then, as a class, discuss:

  • Do you think the spokesperson's response addresses the questions and concerns raised in the segment?
  • In the statement, does OpenAI point to any actions it is taking to address the concerns raised in the segment?
  • Brainstorm actions you think the company should take to address concerns about its product's impact on mental health. What policies, restrictions, or resources do you think might address some of these problems?
  • Finally, do you think government agencies should regulate the use of AI in any way to protect public health? If so, what government policies would help address the problem?
Statement from an OpenAI spokesperson: “ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources. While these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions where parts of the model’s safety training may degrade.”

Republished with permission from PBS News Hour Classroom.

PBS News Hour Classroom
PBS News Hour Classroom helps teachers and students identify the who, what, where and why-it-matters of the major national and international news stories. The site combines the best of News Hour's reliable, trustworthy news program with lesson plans developed specifically for...
