Paper Title

RGB-D-E: Event Camera Calibration for Fast 6-DOF Object Tracking

Authors

Dubeau, Etienne, Garon, Mathieu, Debaque, Benoit, de Charette, Raoul, Lalonde, Jean-François

Abstract

Augmented reality devices require multiple sensors to perform various tasks such as localization and tracking. Currently, popular cameras are mostly frame-based (e.g. RGB and Depth), which imposes high data bandwidth and power usage. Given the need for low-power and more responsive augmented reality systems, relying solely on frame-based sensors limits the various algorithms that need high-frequency data from the environment. As such, event-based sensors have become increasingly popular due to their low power, bandwidth, and latency, as well as their very high-frequency data acquisition capabilities. In this paper, we propose, for the first time, to use an event-based camera to increase the speed of 3D object tracking in 6 degrees of freedom. This application requires handling very high object speeds to convey compelling AR experiences. To this end, we propose a new system which combines a recent RGB-D sensor (Kinect Azure) with an event camera (DAVIS346). We develop a deep learning approach, which combines an existing RGB-D network with a novel event-based network in a cascade fashion, and demonstrate that our approach significantly improves the robustness of a state-of-the-art frame-based 6-DOF object tracker using our RGB-D-E pipeline.
