From VOA Learning English, this is the Health & Lifestyle report.
Earkick, a mental health chatbot, greets users with a friendly-looking panda that could easily fit into a children’s cartoon.
When users talk about stress, the panda offers comforting statements, much like a trained mental health professional might. It then suggests breathing exercises and other ways to manage stress.
Earkick is one of many AI tools aimed at the mental health concerns of young people. However, Karin Andrea Stephan, a co-founder of Earkick, is hesitant to call the company’s chatbot a form of therapy.
In the fast-growing field of digital health, it is important to know whether these chatbots and apps are simple self-help tools or real mental health treatments. Apps that claim to diagnose or treat medical conditions require approval from the Food and Drug Administration (FDA); apps that make no such claims do not.
The Role of AI Bots
As chatbots driven by artificial intelligence (AI) become more advanced, the industry’s position is coming under growing scrutiny. These technologies mimic human language by learning from vast amounts of data.
The advantages are clear: the bots are always available, 24/7, and people can use them privately.
But there are notable downsides, too: there is limited evidence that the bots actually improve mental health, and they are not approved by the FDA to treat conditions such as depression.
Vaile Wright, a psychologist and technology director at the American Psychological Association, notes that users of these bots have no way of knowing whether they really work. The bots are not equal to traditional mental health care, Wright says, but they could help people with less severe mental and emotional problems.
Earkick’s website states that the app does not provide any form of medical care, medical opinion, diagnosis, or treatment. Some health lawyers argue that such disclaimers may not be enough.
Glenn Cohen, a professor at Harvard Law School, says companies worried about people using their apps for mental health services should make a more direct statement. He suggests a disclaimer like “This is just for fun.”
However, amid an ongoing shortage of mental health professionals, bots are increasingly filling a role in the field.
This photo provided by Earkick in March 2024 shows the company’s mental health chatbot on a smartphone. (Earkick via AP)
Shortage of Mental Health Professionals
Britain’s National Health Service has begun offering a chatbot called Wysa to help ease stress, anxiety, and depression among adults and teens, including those waiting to see a therapist. Some health insurers, universities, and hospital systems in the United States offer similar programs.
In the state of New Jersey, Dr. Angela Skrzynski, a family medicine doctor, says patients are usually open to trying a bot once they hear how long the wait to see a therapist is. Her employer, Virtua Health, offers Woebot to some of its adult patients.
Founded in 2017 by a Stanford-trained psychologist, Woebot does not use generative AI. Instead, it uses thousands of structured scripts written by company staff members and researchers.
Alison Darcy, Woebot’s founder, says this rules-based approach is safer for use in health care. While acknowledging that generative AI has had problems, Darcy says the company is testing generative AI models.
Woebot’s results were included in a review of AI chatbots published last year in Digital Medicine. The review found the bots could reduce symptoms of depression in a short time. However, their long-term effect on mental health remains unknown.
Ross Koppel, a researcher who studies health information technology at the University of Pennsylvania, worries that these bots could wrongly be used in place of proven treatments and medications. Koppel would like to see the FDA examine, and possibly regulate, these chatbots.
Dr. Doug Opel of Seattle Children’s Hospital says there is still much to learn about this technology in order to improve children’s mental and physical well-being.
And that concludes the Health & Lifestyle report. I’m Anna Matteo.
Matthew Perrone reported this story for the Associated Press from Washington, D.C. Anna Matteo adapted it for VOA Learning English.
Quiz - AI to Aid with Mental Health Struggles
Start the quiz to find out.
___________________________________________________
Words in This Story
- Chatbot: a computer program or character designed to simulate human conversation when interacting with people (as in a game)
- Anxiety: an intense feeling of fear and apprehension, often accompanied by physical symptoms
- Diagnose: to identify (such as a disease) based on signs and symptoms
- Artificial intelligence: the capability of computer systems or algorithms to mimic intelligent human behavior
- Psychologist: a specialist in the study of mind and behavior or in the treatment of mental, emotional, and behavioral disorders
- Treatment: the medical care given to a patient for an illness or injury
- Facilitate: to aid in bringing about something
We want to hear from you. Can you relate to this story? Write to us in the Comments section, and practice using some of the words from this story. Here is how our comment system works.