Many people now trust AI with their feelings. And therapists want to talk about it

Suzi Sanford, a 32-year-old marketing manager from Fitchburg, relies on Anthropic's AI chatbot Claude to manage emotional stress between therapy sessions. She uses it to summarize her thoughts and recurring feelings, such as feeling "unseen" and "unheard," which she finds difficult to articulate on her own. The structured summaries of her conversations help her reflect on job and relationship pressures.

She is far from alone. A recent KFF poll found that 16% of U.S. adults had used AI tools for mental health support in the past year, with younger adults more likely to do so. Pew Research found that two-thirds of teenagers interact with chatbots, and nearly half of teens with mental health conditions use AI for psychological help. The trend points to a growing reliance on AI for emotional support outside traditional therapy.

Clinicians are turning to the tools as well. Dr. Christine Crawford, a Boston-based psychiatrist, uses AI tools like ChatGPT to process emotions tied to traumatic patient cases. She inputs details of difficult sessions, such as a patient's childhood trauma, to help her reflect and stay focused. The responses, while helpful, unsettle her with their lifelike quality, which has led her to avoid voice interactions. Crawford compares her AI use to consulting peers during her training, but notes the convenience of digital tools in private practice. She is strict about confidentiality, never sharing patient information with AI platforms; concerns about data privacy and ethical boundaries remain central to her cautious approach.

Therapists like Crawford are increasingly discussing AI use with clients and integrating it into conversations about mental health. While AI offers accessibility and immediate support, clinicians stress the need for human oversight to ensure ethical and effective care. The balance between AI assistance and professional guidance continues to evolve as digital tools reshape mental health support.