Are AI chatbots the therapists of the future?

Source: Pexels / Liza Summer

Could AI chatbots be the therapists of the future? ChatGPT, a text-generating chatbot powered by OpenAI’s third-generation GPT-3 language model, reignites this decades-old question.

GPT-3, one of the largest language models, was introduced in 2020, and its predecessors years earlier. It received widespread attention when OpenAI released ChatGPT as a free public preview during its research phase. GPT-3 is a third-generation Generative Pre-trained Transformer: a neural-network machine learning model trained on massive amounts of conversational text from the Internet and refined with feedback from human reviewers.

GPT-3 has been used in many ways across industries: to write a play produced in the UK, to create a text adventure game, to build applications for non-programmers, and to generate phishing emails as part of a study of malicious use cases. In 2021, a game developer created a chatbot that simulated his late fiancée, until OpenAI shut down the project.

AI chatbots are promising for types of therapy that are more structured, concrete, and skill-based (e.g., cognitive behavioral therapy, dialectical behavior therapy, or wellness coaching).

Research has shown that chatbots can teach people skills and coach them to quit smoking, eat more healthily, and exercise more. SlimMe, an AI chatbot with artificial empathy, helped people lose weight. During the COVID-19 pandemic, the World Health Organization developed virtual humans and chatbots to help people quit smoking. Many companies have created chatbots for mental health support, including Woebot, which launched in 2017 and is based on cognitive behavioral therapy. Other chatbots offer guided meditation or track an individual’s mood.

However, delivering other types of therapy, such as psychodynamic, relational, or humanistic therapy, may be more difficult via chatbots, because it is unclear how effective these approaches would be without the human element.

Potential advantages of chatbot therapists

  • Scalability, accessibility, and affordability. Virtual automated therapy, if delivered effectively, could bring mental health services to more people, on their own schedule and in their own homes.
  • People may be less self-conscious and more open with chatbots. Some studies have found that people feel more comfortable revealing private or embarrassing information to chatbots.
  • Standardized, consistent, and reviewable delivery of care. Chatbots can deliver a uniform and more predictable set of responses, and these interactions can be reviewed and analyzed later.
  • Multiple treatment modalities. Chatbots can be trained in a wider range of treatment approaches than any individual human therapist could offer. Algorithms could also help determine which type of treatment best fits each case.
  • Personalized treatment. ChatGPT generates conversational responses to text prompts and can remember earlier prompts within a session, which could make a personalized therapist chatbot possible.
  • Access to extensive psychoeducation resources. Chatbots can tap into, and connect clients with, widely available digital resources, including websites, books, and online tools.
  • Augmentation of, or collaboration with, human therapists. Chatbots could enhance live therapy by providing therapists with real-time feedback or suggestions, such as ways to improve expressed empathy.

Potential limitations and challenges for chatbot therapists

Chatbot therapists face several human-AI interaction barriers.

  • Authenticity and empathy. What attitudes do people hold toward chatbots, and will those attitudes be a barrier to recovery? Will people miss human contact in therapy? Even if chatbots could deliver empathetic language and the right words, that alone may not be enough. Research has shown that people prefer human-to-human interaction in certain emotional situations, such as venting or expressing frustration or anger. A 2021 study found that people’s satisfaction with a chatbot depended on how they felt: when people were angry, they were less satisfied with the chatbot’s responses.

    People may not feel understood or heard when they know there is not a real human being at the other end of the conversation. The ‘active ingredient’ in therapy may be the human-to-human relationship itself – another person bearing witness to the difficulties or suffering one is going through. An AI replacement likely won’t work in all situations.

  • Timing and subtle interactions. Many approaches to therapy require features beyond empathy, including a well-timed balance of challenge and support. Chatbots are limited to text responses and cannot communicate through eye contact or body language. This may become possible with AI-powered “virtual human” or “human avatar” therapists, although it is debatable whether they could provide the same level of comfort and trust.
  • Accountability and retention rates. People may be more likely to show up for, and feel accountable to, human therapists than chatbots. User engagement is a major challenge for mental health apps. It is estimated that only 4 percent of users who download a mental health app are still using it after 15 days, and only 3 percent after 30 days. Will people check in regularly with their chatbot therapist?
  • Complex, high-risk situations. Suicide risk assessment and crisis management benefit from human judgment and supervision. In high-risk cases, augmenting human clinicians with AI is safer than replacing them with AI. There are open ethical and legal questions regarding liability for faulty AI: who will be held responsible if a chatbot therapist fails to appropriately assess or manage an urgent crisis, or provides the wrong guidance? Will AI be trained to flag situations with potential imminent risk of harm to self or others and alert professionals?
  • Heightened need for data security, privacy, and informed consent. Mental health data require a high level of protection and confidentiality, yet many mental health apps are not transparent about what happens to user data.
  • Possible hidden bias. AI models inherit the biases in their training data, so it is important to be aware of these biases and to find ways to mitigate them.

As human-AI interaction becomes part of everyday life, more research is needed to show whether chatbot therapists can deliver effective therapy beyond behavioral skills training. Studies comparing human and chatbot therapists across different treatment modalities would clarify the potential advantages and limitations of chatbot therapists.

Copyright © 2023 Marlynn Wei, MD, PLLC. All rights reserved.

To find a therapist near you, visit the Psychology Today Therapy Directory.
