AI therapy works best when you feel emotionally close to your chatbot, study reveals
Mental health chatbots work best when people form an emotional connection with their AI therapist, according to research published today by the University of Sussex.
With more than one in three UK residents now using AI to support their mental health, a new study highlights the key to effective chatbot therapy and the psychological risks of ‘synthetic intimacy’.
Analysis of feedback from 4,000 users of a market-leading mental health app found that therapy was more successful when users developed emotional intimacy with their AI therapist. However, the study also raises fresh questions about the growing phenomenon of synthetic intimacy - where people develop social, emotional or intimate bonds with artificial intelligence.
University of Sussex Assistant Professor Dr Runyu Shi said: “Forming an emotional bond with an AI sparks the healing process of self-disclosure. Extraordinary numbers of people say this works for them, but synthetic intimacy is not without its problems. People can get stuck in a self-fulfilling loop, with the chatbot failing to challenge dangerous perceptions, and vulnerable individuals end up no closer to clinical intervention.”
Reports of people around the globe in relationships or even marriages with artificial intelligence have put synthetic intimacy in the spotlight. The researchers say this is the extreme end of a common phenomenon and have pinpointed the stages by which intimacy with AI is generated.
The researchers describe the process as a loop: users engage in intimate behaviour by disclosing personal information, which prompts an emotional response - feelings of gratitude, safety and freedom from judgement. This can lead to positive changes in thinking and wellbeing, such as greater self-confidence and higher energy levels. Over time the loop builds an intimate relationship, with users attributing human-like roles to the app.
Published in Social Science & Medicine, today’s paper was based on feedback from users of Wysa, a popular mental health app prescribed under the NHS Talking Therapies programme. NHS Trusts are using the app to aid self-referral and support patients on waiting lists. The study reports that users commonly referred to the app as a friend, companion, therapist and even occasionally partner.
University of Sussex Professor Dimitra Petrakaki said: “Synthetic intimacy is a fact of modern life now. Policymakers and app designers would be wise to accept this reality and consider how to ensure cases are escalated when an AI witnesses users in serious need of clinical intervention.”
With chatbots increasingly filling the gaps left by overstretched services, charities like Mental Health UK are calling for urgent safeguards to make sure people receive safe and appropriate information.
This press release was distributed by ResponseSource Press Release Wire on behalf of University of Sussex in the following categories: Health, Women's Interest & Beauty, Medical & Pharmaceutical, for more information visit https://pressreleasewire.responsesource.com/about.