Real-Time Navigation for Bipedal Robots in Dynamic Environments
Octavian A. Donca1, Chayapol Beokhaimook2, and Ayonga Hereid1

*This work was supported in part by the National Science Foundation under grant FRR-21441568.
1Mechanical and Aerospace Engineering, Ohio State University, Columbus, OH, USA. (donca.2, hereid.1)@osu.edu.
2Ottonomy Inc. chayapol.beokhaimook@gmail.com
Abstract— The popularity of mobile robots has been steadily
growing, with these robots being increasingly utilized to execute
tasks previously completed by human workers. For bipedal
robots to see this same success, robust autonomous navigation
systems need to be developed that can operate in real time and
respond to dynamic environments. These systems can be divided
into three stages: perception, planning, and control. A holistic
navigation framework for bipedal robots must successfully
integrate all three components of the autonomous navigation
problem to enable robust real-world navigation. In this paper,
we present a real-time navigation framework for bipedal robots
in dynamic environments. The proposed system addresses all
components of the navigation problem: we introduce a depth-based perception system for obstacle detection, mapping, and localization; we develop a two-stage planner to generate collision-free trajectories robust to unknown and dynamic environments; and we execute the resulting trajectories through the Digit bipedal robot’s walking gait controller. The navigation framework is validated through a series of simulation and hardware experiments involving unknown environments and dynamic obstacles.
I. INTRODUCTION
Bipedal robots have been a popular subject of research due to the wide range of tasks in which they can be utilized. Their general humanoid shape allows them to be better integrated into a society designed for humans.
However, for bipedal robots to be fully integrated into
society, robust autonomous navigation systems need to be
designed. These systems can generally be divided into three
stages: perception, planning, and control. Only by combining these three stages into a single navigation framework can bipedal robots truly integrate into human society.
In this paper, we seek to implement a holistic bipedal
robot navigation framework to enable the exploration of
unknown, dynamic environments. The three components of
autonomous navigation – perception, planning, and control
– must be combined into a single navigation system. The
perception system must be capable of extracting obstacle
and environment structure information. This environment
information must then be used to generate maps of the global
and local environment for planning. A planning framework
must generate global paths and local trajectories that are
robust to unknown environments and dynamic obstacles. Furthermore, planning must respect the kinematic constraints of
the robot while avoiding obstacles to ensure safe and feasible
navigation. Finally, these trajectories must be executed with
a low-level controller that maintains safe and stable walking
gaits. The combination of these capabilities will enable the
safe navigation of bipedal robots in complex environments.
Many works have extended the methods of A* [1]–[5], PRM [6], and RRT [7]–[9] to the unique problems of bipedal motion planning and footstep planning. However, many of these works lack several components required for autonomous navigation systems, such as real-time perception, mapping, and localization processes. Furthermore, only a few works go further and adapt these bipedal motion planning methods into more holistic bipedal navigation frameworks. Even then, many fail to address components required of holistic autonomous bipedal systems, such as on-robot perception systems [10], localization methods [11], robustness to dynamic obstacles [12], [13], or validation on hardware.
We propose a real-time navigation framework for the
Digit robot based on Move Base Flex [14], as shown in
Fig. 1. The framework utilizes two RGB-Depth cameras and
a LiDAR sensor for perception. The environment is mapped
using global and local costmaps to capture large-scale environment structure and local dynamic obstacles. Odometry and localization are calculated using LiDAR Odometry and Mapping during navigation. We developed a two-stage planner to generate collision-free paths and obstacle-avoiding
trajectories. In particular, a D* Lite global planner capable
of fast re-planning is used to generate high-level paths in
the global costmap. A Timed-Elastic-Band local planner then
follows the global path through optimization-based trajectory
generation that respects kinematic, velocity, acceleration,
and obstacle avoidance constraints. The local trajectory is
executed by generating a sequence of velocity commands
sent to Digit’s walking controller.
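To make the fast re-planning concrete, the sketch below shows the two-part priority key at the heart of D* Lite. This is a minimal illustration of the standard formulation, not our implementation: the grid heuristic and the names g, rhs, and k_m are the usual ones from the literature.

    INF = float("inf")

    def heuristic(a, b):
        # Chebyshev distance on an 8-connected 2D costmap grid.
        return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

    def calculate_key(s, g, rhs, start, k_m):
        # Two-part D* Lite priority: min(g, rhs) is the best cost-to-goal
        # estimate at s; adding heuristic(start, s) focuses the search on
        # the robot's current state, and k_m accumulates past robot motion
        # so the open list need not be rebuilt when the start state moves.
        v = min(g.get(s, INF), rhs.get(s, INF))
        return (v + heuristic(start, s) + k_m, v)

    # After the robot advances from old_start to new_start and edge costs
    # change (e.g., a dynamic obstacle enters the costmap), D* Lite only
    # performs k_m += heuristic(old_start, new_start) and re-expands the
    # locally inconsistent vertices (those with g != rhs), which is what
    # makes its re-planning fast compared to planning from scratch.

This incremental repair of the previous search is why D* Lite suits the global planning stage, where the costmap changes continually as new sensor data arrives.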
The rest of this paper is organized as follows. Section II
introduces the perception, mapping, and localization process.
Next, Section III describes the motion planning methods.
Section IV presents the simulation and hardware experiments and results. Finally, Section V concludes the paper and discusses future work.
II. DIGIT PERCEPTION, MAPPING, AND LOCALIZATION
In this section, we describe the process of building a
real-time map of the environment using Digit’s perception
sensor suite and localizing Digit within that map. The robot
is equipped with two depth cameras (Intel RealSense D430,
placed at the pelvis, with one facing forwards at a downward
angle and one facing rearwards at a downward angle), and
one LiDAR sensor (Velodyne LiDAR VLP-16, placed on top
of Digit’s torso).
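As a concrete illustration of how the depth readings feed obstacle detection, the minimal sketch below back-projects a depth image into a camera-frame point cloud with the standard pinhole model; the function name and the intrinsics fx, fy, cx, cy are illustrative placeholders, not values or APIs from the D430 driver.

    import numpy as np

    def depth_to_points(depth, fx, fy, cx, cy):
        # Back-project a depth image (meters) through the pinhole model:
        # X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth[v, u].
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]  # drop invalid zero-depth pixels

Points produced this way can then be transformed into the map frame and marked as obstacle cells in the costmaps described above.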