A newly released national survey commissioned by ICANotes, a behavioural health electronic health record provider, suggests that many mental health clinicians in the United States believe the growing use of artificial intelligence tools for emotional support may be delaying patients from seeking professional care.
The February 2026 survey included responses from 174 licensed mental health clinicians across the United States. Among them, 61.07% reported that patients’ use of AI emotional-support tools often (18.12%) or sometimes (42.95%) leads to delays in accessing licensed mental health services.
These findings emerge at a time when access to mental health care remains a significant challenge. Data from the 2024 National Survey on Drug Use and Health indicates that almost half (47.9%) of U.S. adults living with a mental illness received no mental health treatment during the previous year.
Concern among clinicians about the use of AI-based emotional-support tools appears substantial. On average, respondents rated their level of concern at 3.58 on a four-point scale. Additionally, 44.83% reported awareness that clients within their caseloads are using AI technologies such as chatbots, mobile apps, or virtual assistants for emotional or mental health support either before starting therapy or alongside it.
Disclosure of AI use by patients varies widely. Over the past year, 9.83% of clinicians reported that patients consistently inform them about using AI tools for emotional support. A further 28.90% indicated that patients disclose this occasionally, while 20.81% said such disclosure happens rarely. Meanwhile, 40.46% stated that patients do not report using AI tools at all.
AI usage appears particularly common among younger adults. Of clinicians who were aware of patient use, 55.46% identified individuals aged 26 to 40 as the most frequent users, followed by those aged 18 to 25 at 42.86% (respondents could identify more than one age group). Usage was also observed among patients aged 41 to 60 (32.77%), under 18 (19.33%), and those aged 61 or above (5.88%).
Clinicians pointed to constant availability as the main reason patients turn to AI-based tools, cited by 64.24% of respondents. Other reasons included lower cost (37.75%), finding AI less intimidating than speaking with a clinician (31.13%), easier access than arranging therapy appointments (30.46%), faster responses (29.14%), greater anonymity (27.81%), difficulty finding a provider (25.17%), insurance limitations (15.89%), and previous negative therapy experiences (9.27%).
Emily Mendenhall, Professor and Medical Anthropologist at Georgetown University, said the findings reflect broader structural challenges within the U.S. mental health system.
“Mental health care in the United States is only getting more difficult to access,” she said. “Because of structural barriers and rapid shifts in AI, the low-cost immediate strategy of AI as therapist may seem like a replacement for people who are struggling and cannot access the care they need.”
Dr. October Boyles, DNP, MSN, BSN, RN, behavioural health expert and clinical consultant at ICANotes, emphasised that AI tools should not replace professional mental health assessment.
“When individuals delay seeking professional care, especially for moderate to severe symptoms, opportunities for early intervention can be missed,” said Dr. Boyles. “Technology can support clinicians and patients, but it must be implemented thoughtfully, with patient safety and evidence-based practice at the forefront.”
