Paper Title

Triplet Online Instance Matching Loss for Person Re-identification

Authors

Ye Li, Guangqiang Yin, Chunhui Liu, Xiaoyu Yang, Zhiguo Wang

Abstract

Mining the shared features of the same identity across different scenes, and the unique features of different identities within the same scene, are among the most significant challenges in the field of person re-identification (ReID). The Online Instance Matching (OIM) loss function and the Triplet loss function are the main methods for person ReID. Unfortunately, both have drawbacks. OIM loss treats all samples equally and puts no emphasis on hard samples. Triplet loss requires complicated and fussy batch construction and converges slowly. To address these problems, we propose a Triplet Online Instance Matching (TOIM) loss function, which emphasizes hard samples and effectively improves the accuracy of person ReID. It combines the advantages of OIM loss and Triplet loss and simplifies batch construction, which leads to more rapid convergence. It can be trained online when handling the joint detection and identification task. To validate our loss function, we collected and annotated a large-scale benchmark dataset (UESTC-PR) based on images taken from surveillance cameras, which contains 499 identities and 60,437 images. We evaluated the proposed loss function on Duke, Market-1501, and UESTC-PR using ResNet-50; the results show that it outperforms the baseline methods, including Softmax loss, OIM loss, and Triplet loss, by up to 21.7%.
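The abstract describes combining Triplet loss's hard-sample emphasis with OIM's online lookup table of identity features. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of that idea under stated assumptions: a lookup table of L2-normalized per-identity feature centers (as in OIM), a triplet-style hinge that contrasts each sample's similarity to its own identity's stored feature against its hardest (most similar) wrong identity, and a momentum update of the table with the batch features. The function name `toim_style_loss` and all hyperparameter values are illustrative, not the authors' definitions.

```python
import numpy as np

def l2norm(x, axis=-1):
    """L2-normalize along the given axis (small epsilon avoids division by zero)."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + 1e-12)

def toim_style_loss(feats, labels, lut, margin=0.3, momentum=0.5):
    """Hypothetical TOIM-style loss sketch (not the paper's exact formula).

    feats:  (B, D) batch embeddings
    labels: (B,)   integer identity labels
    lut:    (C, D) OIM-style lookup table of per-identity feature centers
    Returns (scalar loss, updated lookup table).
    """
    f = l2norm(feats)
    lut = l2norm(lut)
    sims = f @ lut.T                 # (B, C) cosine similarity to every identity
    losses = []
    for i, y in enumerate(labels):
        pos = sims[i, y]             # similarity to own identity's stored feature
        neg = np.delete(sims[i], y)  # similarities to all other identities
        hard_neg = neg.max()         # hardest negative: most similar wrong identity
        # Triplet-style hinge: push the positive above the hardest negative by a margin
        losses.append(max(0.0, margin + hard_neg - pos))
    # OIM-style online update: blend batch features into the table, then renormalize
    new_lut = lut.copy()
    for i, y in enumerate(labels):
        new_lut[y] = momentum * new_lut[y] + (1.0 - momentum) * f[i]
    return float(np.mean(losses)), l2norm(new_lut)
```

With orthogonal identity centers and a sample matching its own center exactly, the hinge is inactive (loss 0); a sample landing on a wrong identity's center incurs the full margin-plus-gap penalty. Because the lookup table is updated from batch features rather than rebuilt, batch construction stays simple, which mirrors the online-training property the abstract claims.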
