Paper Title

Resolving Camera Position for a Practical Application of Gaze Estimation on Edge Devices

Paper Authors

Linh Van Ma, Tin Trung Tran, Moongu Jeon

Paper Abstract

Most gaze estimation research only works under setup conditions in which a camera perfectly captures the eyes' gaze, and the literature has not explicitly specified how to position a camera correctly for a given position of a person. In this paper, we carry out a study on gaze estimation with a logical camera setup position. We further bring our research into a practical application by using inexpensive edge devices in a realistic scenario. That is, we first set up a shopping environment in which we want to capture customers' gazing behavior. This setup requires an optimal camera position in order to maintain the estimation accuracy reported by existing gaze estimation research. We then apply state-of-the-art few-shot gaze estimation to reduce the number of training samples needed in the inference phase. In our experiments, we run our implementation on an NVIDIA Jetson TX2 and achieve a reasonable speed of 12 FPS, faster than our reference work, without much degradation of gaze estimation accuracy. The source code is released at https://github.com/linh-gist/GazeEstimationTX2.
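The abstract reports 12 FPS on a Jetson TX2 with little loss of gaze accuracy relative to the reference few-shot model. As a rough illustration only (not the authors' released code; see the repository linked above), the sketch below shows how the two quantities could be measured. It assumes gaze is predicted as (pitch, yaw) angles in radians and uses the angle-to-vector convention common in gaze benchmarks; `predict_gaze` is a hypothetical callable wrapping the deployed model.

```python
import time

import numpy as np


def measure_fps(predict_gaze, frames, warmup=5):
    """Time a gaze-prediction callable over a list of frames and return FPS.
    `predict_gaze` is a hypothetical wrapper around the deployed model."""
    for frame in frames[:warmup]:          # warm-up passes, excluded from timing
        predict_gaze(frame)
    start = time.perf_counter()
    for frame in frames[warmup:]:
        predict_gaze(frame)
    elapsed = time.perf_counter() - start
    return (len(frames) - warmup) / elapsed


def angular_error_deg(pred, target):
    """Mean angular error (degrees) between predicted and ground-truth gaze,
    both given as arrays of (pitch, yaw) pairs in radians."""
    def to_vec(pitch_yaw):
        pitch, yaw = pitch_yaw[:, 0], pitch_yaw[:, 1]
        # Common gaze-benchmark convention for mapping angles to a 3D unit vector.
        return np.stack([-np.cos(pitch) * np.sin(yaw),
                         -np.sin(pitch),
                         -np.cos(pitch) * np.cos(yaw)], axis=1)

    a, b = to_vec(np.asarray(pred)), to_vec(np.asarray(target))
    cos = np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))).mean())
```

Timing only the prediction callable after a few warm-up frames keeps one-time initialization and camera I/O out of the throughput figure, which is how a per-model number such as 12 FPS is usually reported.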
