Clinical applications of Artificial Intelligence (AI) for mental health care have experienced a meteoric rise in the past few years. AI-enabled chatbot software and applications have been administering significant medical treatments that were previously available only from experienced and competent healthcare professionals. Such initiatives, which range from "virtual psychiatrists" to "social robots" in mental health, strive to improve nursing performance and cost management, as well as to meet the mental health needs of vulnerable and underserved populations. Nevertheless, there is still a substantial gap between recent progress in AI for mental health and the widespread adoption of these solutions by healthcare practitioners in clinical settings. Furthermore, treatments are frequently developed without clear consideration of ethical concerns. While AI-enabled solutions show promise in the realm of mental health, further research is needed to address the ethical and social aspects of these technologies, as well as to establish efficient research and medical practices in this innovative sector. Moreover, the current literature still lacks a formal and objective review that specifically addresses research questions from both developers and psychiatrists regarding the development of AI-enabled chatbot psychologists. Taking into account all the problems outlined above, we conducted a systematic review of AI-enabled chatbots in mental healthcare covering issues at the intersection of psychotherapy and artificial intelligence. In this systematic review, we pose five research questions related to technologies in chatbot development, psychological disorders that can be treated using chatbots, types of therapies enabled in chatbots, machine learning models and techniques in chatbot psychologists, and ethical challenges.
Funding: This work was supported by the grant "Development of an intellectual system prototype for online-psychological support that can diagnose and improve youth's psychoemotional state" funded by the Ministry of Education of the Republic of Kazakhstan, Grant No. IRN AP09259140.