
Using Virtual Reality to Simulate Human-Robot Emergency Evacuation Scenarios
Alan R. Wagner,1 Colin Holbrook,2 Daniel Holman,2 Brett Sheeran,1 Vidullan Surendran,1
Jared Armagost,1 Savanna Spazak,1 Yinxuan Yin1
1The Pennsylvania State University, University Park, PA 16802
2The University of California, Merced, Merced, CA 95343
alan.r.wagner@psu.edu, cholbrook@ucmerced.edu, dholman@ucmerced.edu, bjs6056@psu.edu, vus133@psu.edu,
jla5715@psu.edu, szs685@psu.edu, yzy5430@psu.edu
Abstract
This paper describes our recent effort to use virtual reality
to simulate threatening emergency evacuation scenarios in
which a robot guides a person to an exit. Our prior work
has demonstrated that people will follow a robot’s guidance,
even when the robot is faulty, during an emergency evacua-
tion. Yet, because physical in-person emergency evacuation
experiments are difficult and costly to conduct and because
we would like to evaluate many different factors, we are mo-
tivated to develop a system that immerses people in the sim-
ulation environment to encourage genuine subject reactions.
We are working to complete experiments verifying the valid-
ity of our approach.
We seek to build robots capable of quickly and effectively
evacuating people during an emergency. This goal, however,
presents a variety of challenges such as general robotics
perception and navigation issues, designing robots capable
of effectively communicating evacuation directions (Robi-
nette, Wagner, and Howard 2014), recognizing the ethical
implications of these design decisions (Wagner 2021), and
understanding how people will respond to guidance by a
robot during an emergency (Robinette et al. 2016). Our prior
work has demonstrated that during emergencies people tend
to follow the robot regardless of the prior mistakes it has
made (Robinette et al. 2016; Nayyar et al. 2020). Although
this work has definitively demonstrated that evacuees have a
tendency to overtrust an emergency evacuation robot, many
factors were left unexplored. For instance, the impact of factors
such as the participant's baseline attitudes toward automation,
the reason for the emergency, and the robot's degree of
anthropomorphism on the evacuee's decision making remains
speculative. It is also unclear whether, and how, these factors
influence an evacuee's trust in the robot.
Moreover, running a physical, in-person emergency evacuation
experiment is a daunting task (Wagner 2021). One important
and challenging aspect of robot-guided emergency
evacuation research is the need to create as realistic an
emergency as possible. A large body of evidence suggests
that emergencies activate fight-or-flight responses which
strongly influence how evacuees make decisions (Jansen
et al. 1995; Klein, Calderwood, and Clinton-Cirocco 1986).
Presented at the AI-HRI Symposium at AAAI Fall Symposium Se-
ries (FSS) 2022
Fight-or-flight responses are triggered only when subjects
believe that they may be in danger. Generating fictitious yet
convincing emergencies, however, is difficult and must be
undertaken with care. In real-world experiments, sham
emergencies could put the subject at risk if they
panic. On the other hand, if the emergency is not convincing,
then the validity of the data is uncertain. Creating a
convincing sham emergency is made harder still by the fact
that subjects know they are participating in an experiment.
In the past, for example, we have used smoke machines
to fill rooms and hallways with
smoke in order to make the emergency convincing (Robi-
nette et al. 2016). But creating convincing sham emergen-
cies that do not actually endanger the participant and are
acceptable to an institutional review board is challenging.
Because of these challenges we are currently developing a
novel Virtual Reality (VR) system for evaluating different
human-robot emergency evacuation paradigms.
Process Overview
Our process for conducting emergency evacuation experi-
ments in VR begins when human subjects enter the lab. Af-
ter a short briefing by the experimenter, a physical robot (see
for example Figure 1) asks the subject a series of yes/no
questions to demonstrate its competency and to allow partic-
ipants a period of time in which to become familiar with the
robot as a physical agent. The subject interacts with the robot
by speaking into a lavalier microphone and responding with
a "yes" or "no" to the robot's questions. The robot directs the
subject to sit in a specialized seat that allows them to swivel
in a circle while the experimenters outfit them with the VR
equipment and special foot interfaces (Cybershoes) that al-
low them to walk through the virtual environment. When
the headset is placed on the participant, they find themselves
sitting in a virtual replica of the same physical room, complete
with the room's furnishings and the robot, in order to ground
the virtual experience in physical reality. The robot then
helps the subject become accustomed to walking in the vir-
tual world. Once the subject is able to walk, the robot guides
them on a tour of a series of university buildings.
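The yes/no familiarization dialogue described above can be sketched as a small loop over recognized speech. This is a minimal illustration under stated assumptions, not the system's actual implementation: the keyword sets, question text, and the `get_transcript` callback (standing in for the microphone and speech-to-text pipeline) are all hypothetical.

```python
# Hypothetical sketch of the familiarization dialogue logic: the robot
# asks scripted yes/no questions and classifies the subject's recognized
# speech transcript. Keyword sets are illustrative assumptions.

YES_WORDS = {"yes", "yeah", "yep", "sure", "correct"}
NO_WORDS = {"no", "nope", "nah", "negative"}


def classify_response(transcript):
    """Map a speech-to-text transcript to 'yes', 'no', or None."""
    tokens = [t.strip(".,!?") for t in transcript.lower().split()]
    if any(t in YES_WORDS for t in tokens):
        return "yes"
    if any(t in NO_WORDS for t in tokens):
        return "no"
    return None  # unrecognized: the robot would repeat the question


def run_dialogue(questions, get_transcript):
    """Ask each question, re-prompting until a yes/no answer is heard."""
    answers = []
    for question in questions:
        answer = None
        while answer is None:
            answer = classify_response(get_transcript(question))
        answers.append((question, answer))
    return answers
```

In practice `get_transcript` would prompt the robot to speak and return the speech-recognizer output; here it is left as a callback so the dialogue logic stays independent of any particular audio stack.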
The VR environment consists of four locations. The initial
location is a replica of the physical room where the experiment
takes place.
arXiv:2210.08414v1 [cs.RO] 16 Oct 2022