Traditionally, Human-Robot Interaction (HRI) has involved a human operator explicitly controlling an unmanned asset through a Human-Computer Interface (HCI). This paradigm does not allow humans and robots to interact as a team within a real-world dynamic environment (e.g., combat). Achieving that level of interaction requires more advanced yet intuitive interfaces that humans can easily learn to use to communicate with robotic team members. At the same time, robots must be able to respond in ways that Soldiers in the team can interpret quickly and with little cognitive load.
One of the principal barriers to fielding human-robot teams capable of true collaboration for operationally relevant tasking, such as force protection of fixed sites, securing new sites, and intelligence, surveillance, and reconnaissance (ISR), is a lack of viable communication between human and robot elements. Inadequate communication capabilities prevent bi-directional exchange and lead to a loss of shared team awareness. In noisy, acoustically and visually challenging situations, multi-modal communications and their mutual contextual understanding are critical to effective command, control, intelligence, and collaboration.
REFEREED PUBLICATIONS
• Hancock, P.A., Billings, D.R., & Schaefer, K.E. (2011). Can you trust your robot? Ergonomics in Design, 19(3), 24-29.
Abstract: It is proposed that trust is a critical element in the interactive relations between humans and the automated and robotic technology they create. This article presents (a) why trust is an important issue for this type of interaction, (b) a brief history of the development of human-robot trust issues, and (c) guidelines for input by human factors/ergonomics professionals to the design of human-robot systems with emphasis on trust issues. Our work considers trust an ongoing and dynamic dimension as robots evolve from simple tools to active, sentient teammates.
• Hancock, P.A., Billings, D.R., Schaefer, K.E., Chen, J.Y.C., de Visser, E.J., & Parasuraman, R. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors, 53(5), 517-527.
Application: The findings provide quantitative estimates of human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.
REFEREED CONFERENCE PUBLICATIONS
• Billings, D.R., Oleson, K.E., Chen, J.Y.C., & Hancock, P.A. (2011, August). Mitigating inappropriate trust in human-robot interactions: A review of trust calibration strategies in the literature. Poster accepted for presentation at the APA 119th Annual Convention, Washington, D.C.
• Hancock, P.A., Billings, D.R., Oleson, K.E., & Chen, J.Y.C. (2011, July). Factors impacting development of trust in human-robot teams. Poster presented at the Autonomous Systems Technical Assessment Board (TAB), Aberdeen Proving Ground, MD.
• Hancock, P.A., Billings, D.R., & Oleson, K.E. (2011, July). Influential factors in the development of human-robot team trust & future research needs. Poster presented at the 5th International Summer School on Aviation Psychology: Training and future challenges in aviation, Graz, Austria.
• Oleson, K.E., Hancock, P.A., Billings, D.R., & Schesser, C.D. (2011, May). Trust in unmanned aerial systems: A synthetic distributed trust model derived from a human-robot trust meta-analysis. Poster accepted for presentation at the 16th International Symposium on Aviation Psychology, Dayton, OH.
• Kocsis, V., Alesia, M., Billings, D.R., Oleson, K.E., & Hancock, P.A. (2011, April). Occupational stereotypes in human-robot interaction. Lecture conducted at the Human Factors and Applied Psychology Conference, Daytona, FL.
• Oleson, K.E., Billings, D.R., Kocsis, V., Chen, J.Y.C., & Hancock, P.A. (2011). Antecedents of trust in human-robot collaborations. Proceedings of the IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA).
• Sanders, T., Oleson, K.E., Billings, D.R., Chen, J.Y.C., & Hancock, P.A. (2011). A model of human-robot trust: Theoretical framework and meta-analysis. Proceedings of the 55th Annual Human Factors and Ergonomics Society Conference, Las Vegas, NV.
• Oleson, K.E., Billings, D.R., Kocsis, V., Chen, J.Y.C., & Hancock, P.A. (2010). Approaches to a meta-analysis of human-robot trust. Proceedings of the 8th Annual Meeting of the Society for Human Performance in Extreme Environments (p. 35, Abstract only).
TECHNICAL REPORTS
• Hancock, P.A., Billings, D.R., Oleson, K.E., Chen, J.Y.C., Parasuraman, R., & de Visser, E. (submitted, 2011). A meta-analysis of factors influencing the development of human-robot trust (ARL Technical Report).