Would You Trust a Robot Therapist? Validating the Equivalency of Trust in Human-Robot Healthcare Scenarios

Title: Would You Trust a Robot Therapist? Validating the Equivalency of Trust in Human-Robot Healthcare Scenarios
Publication Type: Conference Paper
Year of Publication: 2018
Authors: Xu, J., Bryant, D. G., Howard, A.
Conference Name: 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
Date Published: August
Keywords: artificial intelligence, Atmospheric measurements, corrective feedback, feedback, Games, Health Care, healthcare assistance, human agent, Human Behavior, human factors, human therapist condition, human-robot healthcare, human-robot interaction, interactive robot agents, interactive systems, interpersonal interaction, medical robotics, medical treatment, multi-agent systems, Particle measurements, patient treatment, patient-agent relationship, pubcrawl, resilience, Resiliency, robot therapist condition, Robot Trust, robotic agent, robots, robust trust, therapy intervention

With recent advances in computing, artificial intelligence (AI) is quickly becoming a key component of advanced applications. In one application in particular, AI has played a major role: revolutionizing traditional healthcare assistance. Using embodied interactive agents, or interactive robots, in healthcare scenarios has emerged as an innovative way to interact with patients. As an essential factor in interpersonal interaction, trust plays a crucial role in establishing and maintaining a patient-agent relationship. In this paper, we discuss a healthcare study in which we examine aspects of trust between humans and interactive robots during a therapy intervention in which the agent provides corrective feedback. Twenty participants were randomly assigned to receive corrective feedback from either a robotic agent or a human agent. Survey results indicate that trust in a therapy intervention coupled with a robotic agent is comparable to trust in an intervention coupled with a human agent. Results also show a trend toward the agent condition having a medium-sized effect on trust. In addition, we found that participants in the robot therapist condition were 3.5 times as likely as participants in the human therapist condition to involve trust in their decision. These results indicate that deploying interactive robot agents in healthcare scenarios has the potential to maintain quality of health for future generations.

Citation Key: xu_would_2018