Human-Robot Interaction Experiment
Photo by Jennifer Lee
Aerial Robots
Aerial robots cannot easily change their morphology to incorporate anthropomorphic or zoomorphic features that would help them communicate their intentions to people, at least not without sacrificing mobility and agility in flight. This makes it difficult for aerial robots sharing a workspace with humans to communicate intent, since humans coordinate teamwork through social cues such as gestures and gaze. Augmented reality (AR) is a potential solution to this problem. We compared different AR designs to assess which most effectively communicated drone motion intent.
Augmented Reality
AR overlays computer graphics onto real-world environments in real time. We explored designs that augmented the environment, the robot, and the user interface.
Comparing the four designs
Using the Microsoft HoloLens, we evaluated these designs, along with a no-AR baseline, in a 60-participant between-subjects user study. Participants interacted with a collocated aerial robot while performing a task. We measured objective data, such as time spent interrupted during the task, and designed a survey with 7-point Likert scales to measure subjective data, such as participants' perceptions and preferences. We also collected qualitative data through open-ended survey questions.
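A between-subjects design like this is typically analyzed by comparing an objective measure across the independent condition groups, for example with a one-way ANOVA. The sketch below illustrates that analysis on entirely hypothetical interruption-time data (the condition names and all values are invented for illustration; the study's actual analysis was done in JMP):

```python
# Illustrative between-subjects comparison of an objective measure
# (time interrupted, in seconds) across study conditions.
# All data here are made up; they are NOT the study's results.
from scipy.stats import f_oneway

no_ar   = [14.2, 15.1, 13.8, 16.0, 14.9]   # hypothetical baseline group
design_a = [9.1, 8.7, 10.2, 9.5, 8.9]      # hypothetical AR design group
design_b = [10.4, 9.8, 11.0, 10.1, 10.6]   # hypothetical AR design group

# One-way ANOVA: does mean interruption time differ across groups?
f_stat, p_value = f_oneway(no_ar, design_a, design_b)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

With clearly separated groups like these, the p-value falls below the conventional 0.05 threshold; a significant omnibus result would then normally be followed by post-hoc pairwise comparisons.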
Our task
All participants wore an AR head-mounted display, seeing one of our four designs or no AR.
For a set time, they created as many bead strings as possible, following instructions for specific color patterns. The aerial robot flew between color stations along a pre-determined path; the robot's path and the color-pattern order were the same for all participants.
What we found
We found that several of our designs significantly improved objective task efficiency compared to the no-AR baseline, in which users received only the robot's physically embodied orientation cues. The designs offered trade-offs in intent clarity and in user perceptions of the robot as a teammate. We envision this work applying to settings such as warehouse stocking, where humans and robots share a working space.
Our paper, "Communicating Robot Motion Intent with Augmented Reality," won Best Paper in the HRI Design category at the 2018 Human-Robot Interaction (HRI) Conference.
This research was a follow-up to a pilot study conducted for Professor Dan Szafir's Human-Robot Interaction (HRI) course in Spring 2017. I worked alongside the same two students from Professor Szafir's IRON Lab in both the pilot and the final experiment. The HRI course, the pilot study, the final experiment, and the conference paper submission introduced me to research methodology and design, statistical analysis, the HRI literature, and tools such as JMP and LaTeX.
Video password: hri2018