Paper Title
Visual Haptic Reasoning: Estimating Contact Forces by Observing Deformable Object Interactions
Paper Authors
Paper Abstract
Robotic manipulation of highly deformable cloth presents a promising opportunity to assist people with several daily tasks, such as washing dishes; folding laundry; or dressing, bathing, and hygiene assistance for individuals with severe motor impairments. In this work, we introduce a formulation that enables a collaborative robot to perform visual haptic reasoning with cloth -- the act of inferring the location and magnitude of applied forces during physical interaction. We present two distinct model representations, trained in physics simulation, that enable haptic reasoning using only visual and robot kinematic observations. We conduct quantitative evaluations of these models in simulation for robot-assisted dressing, bathing, and dishwashing tasks, and demonstrate that the trained models can generalize across different tasks with varying interactions, human body sizes, and object shapes. We also present results with a real-world mobile manipulator, which uses our simulation-trained models to estimate applied contact forces while performing physically assistive tasks with cloth. Videos can be found at our project webpage.