The eyes and hearts of UAV pilots: observations of physiological
responses in real-life scenarios
Alexandre Duval1, Anita Paas2, Abdalwhab Abdalwhab1 and David St-Onge1
Abstract— The drone industry is diversifying and the number
of pilots is increasing rapidly. In this context, flight schools need
adapted tools to train pilots, most importantly with regard to
their own awareness of their physiological and cognitive limits.
In civil and military aviation, pilots can train on realistic
simulators to tune their reactions and reflexes, but also
to gather data on their piloting behavior and physiological
states, which helps them improve their performance. In contrast
to cockpit scenarios, drone teleoperation is conducted outdoors
in the field, so desktop simulation training offers only limited
benefit. This work aims to provide a solution to capture
pilots' behavior out in the field and help them increase their
performance. We combined advanced object detection from a
frontal camera with gaze and heart-rate variability measurements.
We observed pilots and analyzed their behavior over three flight
challenges. We believe this tool can support pilots both in their
training and in their regular flight tasks.
I. INTRODUCTION
The industry of teleoperated drones for service, such as in
infrastructure inspection, crop monitoring and cinematog-
raphy, has expanded at least as fast as the technology that
supports it over the past decade. However, in most countries
regulation is adapting only slowly. Nevertheless, several
regulating bodies have already recognized human factors as a core
contributor to flight hazards. While the core technical features
of aerial systems are evolving, namely autonomy, flight
performance and onboard sensing, the human factors of
UAV piloting remain mostly uncharted territory.
Physiological measures, including eye-based measures
(changes in pupil diameter, gaze-based data, and blink rate),
heart rate variability, and skin conductance, are valuable
indirect indicators of cognitive workload. These measures
are increasingly used to assess workload level during a
task and are being integrated into interfaces and training
applications to optimize performance and training programs
[1].
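As an illustration of how such signals are reduced to workload-related features, the short Python sketch below computes RMSSD, a standard time-domain HRV metric, from a series of inter-beat intervals. The function name and the sample values are placeholders chosen for the example, not taken from this work.

import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD) of R-R intervals.

    rr_intervals_ms: sequence of inter-beat (R-R) intervals in milliseconds,
    e.g. exported by a chest-strap heart-rate sensor. Lower RMSSD is commonly
    associated with higher workload or stress.
    """
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)                      # successive differences between beats
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical example series (values in ms)
print(rmssd([812, 790, 805, 821, 798, 810]))  # ~18 ms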
Eye-tracking glasses can be used to monitor training
progress during various levels of task load. In a task requiring
operators to track targets and other vehicles on a map,
Coyne and Sibley [2] found a significant decrease in operator
situational awareness when task load was high, which was
related to reduced gaze time spent on the map. This suggests
that eye gaze may be useful as a predictor of situational
*We thank the NSERC USRA and Discovery programs for their financial
support. We also acknowledge the support provided by Calcul Québec and
Compute Canada.
1Alexandre Duval, Abdalwhab Abdalwhab and David St-Onge are with
the Lab INIT Robots, Department of Mechanical Engineering, École de
technologie supérieure, Canada name.surname@etsmtl.ca
2Anita Paas is with the Department of Psychology, Concordia University,
Canada anita.paas@concordia.ca
awareness. Further, Memar and Esfahani [3] found that gaze-
based data were related to target detection and situational
awareness in a tele-exploration task with a swarm of robots.
Thus, multisensory configurations can be more robust for
capturing cognitive load. While each sensor is susceptible to
some noise, these sources of noise do not overlap between
sensors; for instance, HRV is not influenced by luminance. This
work aims at extracting gaze behavior, so we also gather
pupil diameter for cognitive load estimation. In addition, we
added another device to extract HRV metrics and enhance our
cognitive load estimation.
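As a rough illustration of this multisensory idea (not the estimator used in this work), the Python sketch below combines a baseline-normalized pupil-diameter feature with an HRV feature into a single load index. The function name, the equal weights and the units are assumptions made solely for the example.

import numpy as np

def load_index(pupil_mm, rmssd_ms, baseline_pupil_mm, baseline_rmssd_ms):
    """Illustrative combined load index from pupil diameter and HRV.

    Pupil diameter tends to increase and HRV (RMSSD) to decrease under load,
    so the two baseline-normalized features are combined with opposite signs.
    All names, weights and units here are illustrative placeholders.
    """
    bp = np.asarray(baseline_pupil_mm, dtype=float)
    bh = np.asarray(baseline_rmssd_ms, dtype=float)
    z_pupil = (np.mean(pupil_mm) - bp.mean()) / (bp.std() + 1e-9)
    z_hrv = (np.mean(rmssd_ms) - bh.mean()) / (bh.std() + 1e-9)
    return 0.5 * z_pupil - 0.5 * z_hrv  # higher value = higher estimated load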
Section II provides an overview of the various domains that
inspired this work. We then build our solution on a
biophysical data capture software (sec. III) and a
detector trained on a custom dataset (sec. IV). Finally, we
present the results of a small user study in sec. V and discuss
our observations of the pilots' behavior.
II. RELATED WORKS
A. On gaze-based behavioral studies
The benefit of eye tracking is that we can measure gaze
behavior in addition to changes in pupil diameter. Gaze
behavior can provide information about the most efficient
way to scan and monitor multiple sources of input. For
example, in surveillance tasks, operators monitoring several
screens can be supported by systems that track gaze behavior
and automatically notify the operator to adjust their scanning
pattern [4]. In a simulated task, Veerabhadrappa et al.
[5] found that participants achieved higher performance on
a simulated UAV refuelling task when they maintained a
longer gaze on the relevant region of interest compared to
less relevant regions. Further, in training scenarios, gaze-
based measures can identify operator attention allocation
and quantify the progress of novice operators [6]. Gaze-based
measures of novices can also be compared with those of
experts to determine training progress and ensure efficient
use of gaze.
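To make such gaze-based measures concrete, the Python sketch below computes the fraction of gaze samples falling inside each area of interest (AOI), a simple form of dwell-time analysis. The AOI names, coordinates and rectangular shapes are hypothetical and serve only as an illustration.

import numpy as np

def dwell_fractions(gaze_x, gaze_y, aois):
    """Fraction of gaze samples falling inside each rectangular area of interest.

    gaze_x, gaze_y: per-sample normalized gaze coordinates from the eye tracker.
    aois: dict mapping an AOI name to (x_min, y_min, x_max, y_max).
    Rectangular AOIs and normalized coordinates are illustrative choices.
    """
    gx, gy = np.asarray(gaze_x, dtype=float), np.asarray(gaze_y, dtype=float)
    fractions = {}
    for name, (x0, y0, x1, y1) in aois.items():
        inside = (gx >= x0) & (gx <= x1) & (gy >= y0) & (gy <= y1)
        fractions[name] = float(inside.mean())  # share of samples in this AOI
    return fractions

# Hypothetical usage: dwell on the drone vs. the remote-controller screen
aois = {"drone": (0.4, 0.5, 0.6, 0.9), "controller": (0.3, 0.0, 0.7, 0.3)}
print(dwell_fractions([0.5, 0.55, 0.35], [0.7, 0.8, 0.1], aois))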
In a review paper focused on pilot gaze behaviour, Ziv
[7] found that expert pilots maintain more balanced visual
scanning. Expert pilots scan the environment more efficiently
and spend less time on each instrument compared to novices.
However, in complex situations, experts spend more time
on the relevant instruments which enables them to make
better decisions than novices. Overall, Ziv concluded that
the differences in gaze behavior between expert and novice
pilots are related to differences in flight performance.