Preprint
Placing by Touching: An empirical study on the importance
of tactile sensing for precise object placing
Luca Lach∗1,2, Niklas Funk∗3, Robert Haschke1, Séverin Lemaignan4, Helge Joachim Ritter1,
Jan Peters3,5,6,7, Georgia Chalvatzaki3,6
Fig. 1: Stable object placing sequence with a blind robot, from left to right: (1) an object is handed to the robot in an unknown pose.
Subsequently, by leveraging the tactile sensor readings inside the gripper, we estimate how the object needs to be reoriented for stable
placement. (2) Given the estimate, a controller aligns the object with the placing surface. (3) We move the robot down to place the object
on the table. Contact is again detected using tactile sensors. (4) The gripper is opened, and the robot retracts.
Abstract— This work deals with a practical everyday problem:
stable object placement on flat surfaces starting from unknown
initial poses. Common object-placing approaches require either
complete scene specifications or extrinsic sensor measurements,
e.g., cameras, that occasionally suffer from occlusions. We
propose a novel approach for stable object placing that combines
tactile feedback and proprioceptive sensing. We devise a neural
architecture that estimates a rotation matrix, resulting in a
corrective gripper movement that aligns the object with the
placing surface for the subsequent object manipulation. We
compare models with different sensing modalities, such as
force-torque and an external motion capture system, in
real-world object placing tasks with different objects. The
experimental evaluation of our placing policies with a set of
unseen everyday objects reveals significant generalization of
our proposed pipeline, suggesting that tactile sensing plays a
vital role in the intrinsic understanding of robotic dexterous
object manipulation. Code, models, and supplementary videos are
available at https://sites.google.com/view/placing-by-touching.
I. INTRODUCTION
Human dexterity is remarkable and largely attributed to the
human sense of touch. Tactile sensing enables precise, reliable,
dexterous manipulation, a crucial component for the vision of
generalized autonomy and of more capable, intelligent
autonomous robotic systems [1]. Integrating the sense of touch
is a key research topic in robotics, with prominent works
including tactile insertion [2]–[4], in-hand manipulation
[5]–[8], assembly [9]–[11], human-robot interaction [12], [13],
and object pose estimation [14], [15].

∗Authors contributed equally.
This work was supported by the European Union’s Horizon 2020
Marie Curie Actions under grant no. 813713 NeuTouch, the Horizon
Europe research and innovation program under grant no. 101070600
SoftEnable, the DFG Emmy Noether Programme (CH 2676/1-1), the
BMBF Project Sim4Dexterity, the BMBF Project Aristotle, and the
AICO grant by the Nexplore/Hochtief Collaboration with TU
Darmstadt.
1Neuroinformatics Group, Technical Faculty, Bielefeld University
{llach, rhaschke, helge}@techfak.uni-bielefeld.de
2Institut de Robòtica i Informàtica Industrial, CSIC-UPC
3Computer Science Department, Technical University Darmstadt,
4PAL Robotics, 5German Research Center for AI (DFKI), Research
Department: Systems AI for Robot Learning, 6Hessian.AI, 7Centre
for Cognitive Science. {niklas.funk, jan.peters,
georgia.chalvatzaki}@tu-darmstadt.de
This work has been submitted to the IEEE for possible
publication. Copyright may be transferred without notice, after
which this version may no longer be accessible.
In the past few years, significant effort has been put
into designing sensitive and compact tactile sensors that
allow easy integration with robotic systems. Compared to
more traditional Force/Torque (F/T) sensors that only pro-
vide one 6-dimensional F/T measurement at the sensor’s
location, tactile sensors offer a high spatial resolution and
measurements directly at the points of contact. However,
the high-dimensional tactile signals are usually unsuitable
for direct integration in control loops and require additional
preprocessing. Tactile sensors can be realized using a wide
range of sensing principles [16]. Recently, tactile sensors
based on piezo-resistive sensing distributed in an array of
taxels [17]–[21], and those relying on cameras capturing
a soft gel’s surface [22]–[27] have become increasingly
popular amongst the robotics community.
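To make the preprocessing concern above concrete: a raw piezo-resistive taxel array can be reduced to a few low-dimensional contact features before entering a control loop. The sketch below is a generic illustration, not the paper's pipeline; the function name, feature choice, and thresholds are our own assumptions.

```python
import numpy as np

def preprocess_taxels(frame: np.ndarray):
    """Reduce a raw taxel frame to simple contact features.

    frame: (16, 16) array of raw piezo-resistive readings (assumed shape).
    Returns the normalized image, the contact centroid (row, col),
    and the total activation, a rough proxy for normal force that can
    also serve as a contact-detection signal when thresholded.
    """
    img = frame.astype(np.float64)
    img -= img.min()                      # remove the resting-level offset
    total = float(img.sum())
    if total < 1e-9:                      # no contact detected
        return img, np.array([7.5, 7.5]), 0.0
    img /= img.max()                      # scale activations to [0, 1]
    rows, cols = np.indices(img.shape)
    centroid = np.array([(rows * img).sum(),
                         (cols * img).sum()]) / img.sum()
    return img, centroid, total
```

A downstream controller would then consume the centroid and total activation (e.g., thresholding the latter to detect table contact) instead of the raw 256-dimensional signal.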
Herein, we focus on piezo-resistive tactile sensors. In par-
ticular, we use a pair of Myrmex tactile sensors [18], which
feature a 16×16 taxel array sampled at 1 kHz, i.e., matching
the rate of high-end F/T sensors, whereas visuotactile sensors
reach at best around 90 Hz. Regarding cost, tactile sensors
like Myrmex or GelSight are considerably cheaper than F/T
sensors.
This work studies the benefit of local tactile measurements
between gripper and object for stable and reliable object
placing. Stable object placement is an essential skill for
any autonomous robotic system, particularly for capable
assistive household robots. It forms the basis for many tasks,
such as object rearrangement, assembly, sorting, and storing
goods.

arXiv:2210.02054v4 [cs.RO] 27 Nov 2023

While a large body of prior works exists on stable
object placing [28]–[34], none of those works investigates the
contribution of tactile feedback in stable placing. Rather, they
rely either on vision systems, which are prone to occlusions
and require external sensors, or accurate scene descriptions,
which demand cumbersome manual labor. We attempt to fill
this gap by investigating the impact of tactile sensing in this
simple yet challenging scenario.
We propose an effective pipeline for translating taxel-
based measurements into useful features for learning a pose
correction signal to ensure optimal object placement. A
placing action is optimal if the object’s placing normal
(orthogonal to its placing surface) is collinear with the normal
of the placing surface, e.g., a table. Our method comprises a
deep convolutional neural network that predicts a corrective
rotation action for the gripper. Given the current tactile
sensor readings and potentially adding other signals, e.g., F/T
information, we predict a rotation matrix w.r.t. the current
gripper frame (cf. Fig. 3(a)). The z-axis of this predicted
frame corresponds to the object’s placing normal. This
prediction is subsequently used to plan a hand movement
to align the object’s placing normal with the table’s normal.
After this single-step prediction and alignment, we attempt to
place the object on the surface while keeping the previously
determined orientation fixed. The major challenge here is
to predict the object’s placing normal solely from tactile
and proprioceptive sensors instead of employing traditional
extrinsic vision-based methods. To assess the importance of
learning-based placing policies for this problem, we compare
our method to two classical baseline approaches.
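The alignment step described above reduces to a standard geometric computation: given the predicted placing normal expressed in the gripper frame and the table normal, find the rotation that maps one onto the other. A minimal sketch using the Rodrigues formula (our own illustration, not the paper's code) could look as follows:

```python
import numpy as np

def alignment_rotation(placing_normal: np.ndarray,
                       table_normal: np.ndarray) -> np.ndarray:
    """Rotation matrix mapping the object's placing normal onto the
    table normal. Both inputs are 3-vectors in the same (gripper) frame.
    """
    a = placing_normal / np.linalg.norm(placing_normal)
    b = table_normal / np.linalg.norm(table_normal)
    v = np.cross(a, b)                  # rotation axis (unnormalized)
    c = float(np.dot(a, b))             # cosine of the rotation angle
    if np.isclose(c, -1.0):
        # Antiparallel: rotate 180 degrees about any axis orthogonal to a.
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-9:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    # Rodrigues formula: R = I + [v]_x + [v]_x^2 / (1 + c)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + (K @ K) / (1.0 + c)
```

Applying this rotation to the gripper brings the predicted placing normal into coincidence with the table normal, after which the downward placing motion proceeds with the orientation held fixed.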
Our main contributions are twofold: (i) the development
and training of tactile-based policies for stable object placing
without requiring any extrinsic visual feedback, and (ii) an
open-source suite comprising our dataset, CAD models, pretrained
models, and the codebase of all methods (both classical and
deep-learning-based) from our extensive real-robot experiments.
Overall, our study confirms that tactile sensing can be a
powerful and valuable low-cost addition to robotic manipu-
lators: their signals provide features that increase reliability
and robot dexterity.
II. RELATED WORK
Object placing. Stable object placing is a crucial skill
for autonomous robotic systems. Many prominent tasks in
the robotic community, such as object rearrangement or
assembly, require robotic pick & place sequences that heavily
rely on this skill. The authors of [29] propose a model-based
pointcloud-conditioned approach for stable object placing
by matching polygon models of object and environment.
Similarly, [28] uses pointcloud observations for extracting
meaningful feature representations for learning to place new
objects in stable configurations. More recently, [32] proposes
to exploit learned keypoint representations from raw RGB-
D images for solving category-level manipulation tasks. [31]
also uses a combination of vision and learning for manipu-
lating unknown objects. [33] presents a planning algorithm
for stable object placement in cluttered scenes requiring a
fully specified environment. Closely related to our work is
[34], which presents an iterative learning-based approach
for placing household objects onto flat surfaces but using
a system of three external depth cameras for input.
While most of these works deal with the problem of
generating stable placing poses for unknown objects, the
main difference from our work lies in the input modality:
none of them considers tactile sensing. Instead, they all
rely on single or even multiple depth/RGB images. Relying
on image data might be problematic due to gripper-object
occlusions in highly cluttered scenes, especially if the object
ends up inside the gripper without any prior knowledge of its
pose. Additionally, external sensing systems require careful
and precise calibration w.r.t. the robot, which is often tedious,
time-consuming, and error-prone. In contrast, tactile sensors
directly provide the contact information between the object
and gripper, independent of the surrounding environment.
In-hand object pose estimation. Due to the inherent diffi-
culties of estimating a grasped object’s pose and due to its
importance for tasks like pick & place or in-hand manipu-
lation, multiple methods for object-in-hand pose estimation
have been developed. The authors of [35] solely exploit
tactile sensors and match their signal with a local patch
of the object’s geometry, thereby estimating its pose. Other
works [36], [37] make use of both visual and tactile inputs.
While [36] only requires an initial visual input for initializing
a particle filter, [37] employs an extended Kalman filter
constantly using vision & touch. Recent progress in deep
learning has fostered data-driven methods for in-hand object
pose estimation. [38]–[40] present end-to-end approaches
based on RGB images. While [38], [39] directly output pose
predictions, [40] learns observation models that can later be
exploited in optimization frameworks. [41] fuses vision and
tactile in an approach that self-determines the reliability of
each modality, while [42] exploits a learned tactile observa-
tion model in combination with a Bayes filter. Following the
successes of recent deep learning approaches, we propose
to learn an end-to-end direct mapping from tactile input for
estimating the grasped object’s placing normal. We want to
point out that we are not interested in estimating the object’s
full 6D pose. Instead, we only focus on aligning the object
with the placing surface. Moreover, our proposed method
does not require any repetitive measurements or filtering and
solely needs to be queried once. Finally, our method requires
a very small training set.
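Such a single-query, end-to-end mapping from taxel images to a placing-normal estimate can be sketched as a small convolutional network. The architecture below is purely illustrative (layer sizes, the two-channel input of one 16×16 image per finger, and the class name are all our assumptions, not the network from the paper):

```python
import torch
import torch.nn as nn

class PlacingNormalNet(nn.Module):
    """Illustrative CNN mapping two 16x16 taxel images (one per gripper
    finger, stacked as channels) to a unit placing-normal estimate
    expressed in the gripper frame."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),        # 16x16 -> 8x8 spatial resolution
            nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, 3),      # unnormalized placing-normal estimate
        )

    def forward(self, taxels: torch.Tensor) -> torch.Tensor:
        # taxels: (batch, 2, 16, 16) -> unit vectors of shape (batch, 3)
        n = self.head(self.features(taxels))
        return n / (n.norm(dim=-1, keepdim=True) + 1e-8)
```

Because the network is queried only once per grasp, a single forward pass at placement time suffices; no filtering over repeated measurements is needed.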
Insertion. Stable object placing is also related to tactile in-
sertion. Successful completion of both tasks requires suitable
alignment between object & table, or peg & hole. Several
works approach challenging insertion tasks using tactile
sensors [2]–[4], [43], [44]. The authors of [43] leverage
vision-based tactile sensors for precisely localizing small
objects inside the gripper. This information is subsequently
exploited for small-part insertions using classical control. [2],
[3] also focus on solving tight insertion tasks using learned
tactile representations. Both exploit the tactile measurements
as a feedback signal to predict residual control commands.
Recently, [4] demonstrated tactile insertion through active ex-