Real-Time Navigation for Bipedal Robots in Dynamic Environments
Octavian A. Donca1, Chayapol Beokhaimook2, and Ayonga Hereid1
Abstract—The popularity of mobile robots has been steadily
growing, with these robots being increasingly utilized to execute
tasks previously completed by human workers. For bipedal
robots to see this same success, robust autonomous navigation
systems need to be developed that can execute in real-time and
respond to dynamic environments. These systems can be divided
into three stages: perception, planning, and control. A holistic
navigation framework for bipedal robots must successfully
integrate all three components of the autonomous navigation
problem to enable robust real-world navigation. In this paper,
we present a real-time navigation framework for bipedal robots
in dynamic environments. The proposed system addresses all
components of the navigation problem: We introduce a depth-
based perception system for obstacle detection, mapping, and
localization. A two-stage planner is developed to generate
collision-free trajectories robust to unknown and dynamic envi-
ronments. These trajectories are then executed on the Digit
bipedal robot’s walking gait controller. The navigation framework is validated
through a series of simulation and hardware experiments that
contain unknown environments and dynamic obstacles.
I. INTRODUCTION
Bipedal robots have been a popular area of research due
to the wide range of tasks they can be applied to. The
general humanoid shape of bipedal robots allows them to
be better integrated into a society designed for humans.
However, for bipedal robots to be fully integrated into
society, robust autonomous navigation systems need to be
designed. These systems can generally be divided into three
stages: perception, planning, and control. Only by combining
these three stages into a single navigation framework can
bipedal robots truly integrate into human society.
In this paper, we seek to implement a holistic bipedal
robot navigation framework to enable the exploration of
unknown, dynamic environments. The three components of
autonomous navigation – perception, planning, and control
– must be combined into a single navigation system. The
perception system must be capable of extracting obstacle
and environment structure information. This environment
information must then be used to generate maps of the global
and local environment for planning. A planning framework
must generate global paths and local trajectories that are
robust to unknown environments and dynamic obstacles. Fur-
thermore, planning must respect the kinematic constraints of
the robot while avoiding obstacles to ensure safe and feasible
navigation. Finally, these trajectories must be executed with
a low-level controller that maintains safe and stable walking
gaits. The combination of these capabilities will enable the
safe navigation of bipedal robots in complex environments.
*This work was supported in part by the National Science Foundation under grant FRR-21441568.
1Mechanical and Aerospace Engineering, Ohio State University, Columbus, OH, USA. (donca.2, hereid.1)@osu.edu.
2Ottonomy Inc. chayapol.beokhaimook@gmail.com
Many works have extended the methods of A* [1]–
[5], PRM [6], and RRT [7]–[9] to the unique problems of
bipedal motion planning and footstep planning. However,
many of these works lack several components required for
autonomous navigation systems such as real-time perception,
mapping, and localization processes. Furthermore, only a few
works go further and adapt these bipedal motion planning
methods into more holistic bipedal navigation frameworks.
However, many are still unable to address components re-
quired in holistic autonomous bipedal systems such as on-
robot perception systems [10], localization methods [11],
robustness to dynamic obstacles [12], [13], or validation in
hardware.
We propose a real-time navigation framework for the
Digit robot based on Move Base Flex [14], as shown in
Fig. 1. The framework utilizes two RGB-Depth cameras and
a LiDAR sensor for perception. The environment is mapped
using global and local costmaps to capture large-scale en-
vironment structure and local dynamic obstacles. Odome-
try and localization are calculated using LiDAR Odometry
and Mapping during navigation. We developed a two-stage
planner to generate collision-free paths and obstacle-avoiding
trajectories. In particular, a D* Lite global planner capable
of fast re-planning is used to generate high-level paths in
the global costmap. A Timed-Elastic-Band local planner then
follows the global path through optimization-based trajectory
generation that respects kinematic, velocity, acceleration,
and obstacle avoidance constraints. The local trajectory is
executed by generating a sequence of velocity commands
sent to Digit’s walking controller.
The rest of this paper is organized as follows. Section II
introduces the perception, mapping, and localization process.
Next, Section III describes the motion planning methods.
Section IV introduces the simulation and hardware experi-
ments and results. Finally, Section V concludes the paper
and discusses future work.
II. DIGIT PERCEPTION, MAPPING, AND LOCALIZATION
In this section, we describe the process of building a
real-time map of the environment using Digit’s perception
sensor suite and localizing Digit within that map. The robot
is equipped with two depth cameras (Intel RealSense D430,
placed at the pelvis, with one facing forwards at a downward
angle and one facing rearwards at a downward angle), and
one LiDAR sensor (Velodyne LiDAR VLP-16, placed on top
of Digit’s torso).
arXiv:2210.03280v1 [cs.RO] 7 Oct 2022
Fig. 1. The proposed navigation framework, an architecture built on top of Move Base Flex [14]. Point cloud detections are
used for obstacle segmentation by Random Sample Consensus [15]. Global and local costmaps [16] are generated from the
obstacle segmentations. LiDAR Odometry and Mapping [17] localizes the robot. A D* Lite global planner [18] uses the
global costmap to generate an optimal, collision-free path, which is used by the Timed-Elastic-Band local planner [19] to
generate local obstacle-avoiding trajectories, which are executed through velocity commands sent to Digit’s gait controller.
Fig. 2. Pre-processing and obstacle segmentation results. a)
Original point cloud, b) Filtered cloud, and c) Filtered point
cloud with segmented obstacles in red.
A. Perception
Point Cloud Pre-processing. Before the point clouds from
the depth cameras can be used for obstacle segmentation, pre-
processing is applied to obtain a uniform density and remove outlier detections.
First, an average point cloud size reduction of 91.07% is
achieved using a Voxel Grid filter [20] for downsampling.
Additionally, inaccurate detections outside of the sensors’
reliable range are discarded by removing points farther
than 2.9 m from the depth cameras. Finally, erroneous points
detected underground due to reflections are removed using a
pass-through filter.
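The three pre-processing steps above can be sketched as follows. This is a minimal illustrative implementation in plain numpy, not the filters used on the robot (which the paper attributes to the Voxel Grid filter of [20]); the parameter values mirror those stated in the text, while the voxel size and ground cutoff are assumptions.

```python
import numpy as np

def preprocess_cloud(points, voxel=0.05, max_range=2.9, min_z=-0.05):
    """Sketch of the pre-processing pipeline: range crop, pass-through
    filter, then voxel-grid downsampling. points has shape (N, 3)."""
    # Range filter: drop detections farther than max_range from the sensor origin.
    dist = np.linalg.norm(points, axis=1)
    points = points[dist <= max_range]
    # Pass-through filter: drop erroneous points detected below the ground.
    points = points[points[:, 2] >= min_z]
    # Voxel-grid filter: replace the points in each voxel with their centroid,
    # which yields a roughly uniform density.
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inv).astype(float)
    out = np.zeros((inv.max() + 1, 3))
    for d in range(3):
        out[:, d] = np.bincount(inv, weights=points[:, d]) / counts
    return out
```

Because each output point is the centroid of the raw points inside one voxel, the downsampled cloud still satisfies the range and height bounds by convexity.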
Obstacle Segmentation. After filtering the point cloud from
both depth cameras, the resulting clouds are fused into
a combined point cloud. The Random Sample Consensus
(RANSAC) method proposed in [15] is used to segment
obstacles from the fused point cloud. For a given point cloud,
P, RANSAC randomly samples 3 points to solve a unique
plane model:
ax + by + cz + d = 0,    (1)
where a, b, c, d ∈ ℝ are the fitted coefficients, and (x, y, z) ∈
ℝ³ represents the Cartesian coordinates of a point. Then, the
absolute distance, D_i, of each point i in the cloud to the fitted
model is calculated:
D_i = |a x_i + b y_i + c z_i + d| / √(a² + b² + c²),   for i ∈ {1, 2, ..., n},    (2)
where n is the total number of points in the point cloud P.
Points within a given threshold distance, D_threshold, of the
plane model are labeled as inliers of the model, denoted P_I:
P_I = {p_i ∈ P | D_i < D_threshold}.    (3)
This process is repeated iteratively for a specified number of
iterations, N, determined statistically as:
N = round( log(1 − α) / log(1 − (1 − ε)³) ),    (4)
where α is the desired minimum probability of finding at
least one good plane from P, usually within [0.90, 0.99],
and ε = 1 − u, where u is the probability that any selected
point is an inlier. The resulting plane model is selected as the
one which generates the largest number of inliers with the
smallest standard deviation of distances. From the resulting
RANSAC plane model, inlier points are retained as ground
points, and outlier points are retained as obstacle points.
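A minimal sketch of this RANSAC ground segmentation, implementing Eqs. (1)–(4) in numpy. It simplifies the model-selection rule to "most inliers" (the paper additionally considers the smallest standard deviation of distances), and the threshold, α, and u values are illustrative assumptions, not the values used on Digit.

```python
import numpy as np

def ransac_iterations(alpha=0.99, u=0.5):
    """Eq. (4): number of iterations N needed to find at least one good
    plane with probability alpha, where u is the inlier probability."""
    eps = 1.0 - u
    return int(round(np.log(1 - alpha) / np.log(1 - (1 - eps) ** 3)))

def segment_ground(points, d_thresh=0.02, alpha=0.99, u=0.5, seed=0):
    """Fit the dominant plane by RANSAC and split the cloud into
    ground inliers and obstacle outliers. points has shape (N, 3)."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(ransac_iterations(alpha, u)):
        # Randomly sample 3 points to define a unique plane model, Eq. (1).
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)      # normal vector (a, b, c)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                     # degenerate (collinear) sample
            continue
        d = -n @ p1                         # plane offset
        D = np.abs(points @ n + d) / norm   # Eq. (2): point-to-plane distance
        inliers = D < d_thresh              # Eq. (3): inlier set P_I
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Inliers are retained as ground points, outliers as obstacle points.
    return points[best_inliers], points[~best_inliers]
```

With the defaults α = 0.99 and u = 0.5, Eq. (4) gives N = 34 iterations, so the fit stays cheap enough for a real-time loop.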
B. Mapping
To enable real-time planning in a dynamic environment,
we create two 2D layered costmaps presented in [16] to
represent the global and local environments.
1) Global Map: The global map is used to capture and
store macro-scale information of the environment. To enable
this, the global map uses a lower resolution of 0.1 m and
only updates at a rate of 2 Hz. Cells in the global map have
two possible states:
a) Occupied: Any 3D point detections from the percep-
tion sensors are projected onto the 2D plane of the global
map. Any cells that contain projected points are classified as
occupied and are considered untraversable.
b) Free: Any cell that is not occupied is considered
free and traversable for the robot. Unexplored regions of the
environment are by default considered to be free.
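The occupied/free projection described above can be sketched as a simple 2D grid update; the grid size and origin here are placeholder assumptions, and a real costmap implementation such as [16] also handles rolling windows and clearing.

```python
import numpy as np

def project_to_costmap(points, resolution=0.1, size=(100, 100), origin=(-5.0, -5.0)):
    """Project 3D obstacle detections onto a 2D grid: any cell containing
    a projected point becomes occupied (1); all other cells, including
    unexplored ones, default to free (0)."""
    grid = np.zeros(size, dtype=np.uint8)            # 0 = free by default
    ix = np.floor((points[:, 0] - origin[0]) / resolution).astype(int)
    iy = np.floor((points[:, 1] - origin[1]) / resolution).astype(int)
    # Ignore detections that fall outside the map bounds.
    valid = (ix >= 0) & (ix < size[0]) & (iy >= 0) & (iy < size[1])
    grid[ix[valid], iy[valid]] = 1                   # 1 = occupied, untraversable
    return grid
```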
2) Local Map: The local map is used for obstacle avoid-
ance to capture a smaller region local to the robot where
dynamic obstacles may be present. We use a higher resolu-
tion of 0.05 m and update the local map at 10 Hz. The local
map introduces another cell state in addition to occupied and
free cells:
c) Inflated: Cells within a certain distance of obstacles
are inflated with non-zero cell costs. These non-zero costs are
used to penalize trajectory planning through regions close to
obstacles while not completely prohibiting trajectories from
entering these regions.
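The inflation layer can be sketched as follows; the linear cost decay, the 0.3 m inflation radius, and the 254 occupied cost are illustrative assumptions (the layered costmap of [16] uses an exponential decay and its own cost convention).

```python
import numpy as np

def inflate(grid, resolution=0.05, inflation_radius=0.3, occupied_cost=254):
    """Assign non-zero, distance-decaying costs to cells near obstacles,
    penalizing but not prohibiting trajectories through those regions."""
    h, w = grid.shape
    cost = np.where(grid > 0, occupied_cost, 0).astype(float)
    r = int(np.ceil(inflation_radius / resolution))
    for ox, oy in np.argwhere(grid > 0):
        # Visit every cell within the inflation radius of this obstacle cell.
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                x, y = ox + dx, oy + dy
                if 0 <= x < h and 0 <= y < w:
                    dist = np.hypot(dx, dy) * resolution
                    if dist <= inflation_radius:
                        # Linearly decaying cost, kept below the occupied cost.
                        c = (occupied_cost - 1) * (1 - dist / inflation_radius)
                        cost[x, y] = max(cost[x, y], c)
    return cost
```

A local planner can then sum these cell costs along candidate trajectories, so paths hugging obstacles score worse than paths through fully free space while still remaining admissible.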