Paper Title
RankDNN: Learning to Rank for Few-shot Learning
Paper Authors
Paper Abstract
This paper introduces a new few-shot learning pipeline that casts relevance ranking for image retrieval as binary ranking relation classification. In comparison to image classification, ranking relation classification is sample efficient and domain agnostic. Moreover, it provides a new perspective on few-shot learning and is complementary to state-of-the-art methods. The core component of our deep neural network is a simple MLP, which takes as input an image triplet encoded as the difference between two vector Kronecker products, and outputs a binary relevance ranking order. The proposed RankMLP can be built on top of any state-of-the-art feature extractor, and the entire deep neural network is called the ranking deep neural network, or RankDNN. Meanwhile, RankDNN can be flexibly fused with other post-processing methods. During meta-testing, RankDNN ranks support images according to their similarity to the query samples, and each query sample is assigned the class label of its nearest neighbor. Experiments demonstrate that RankDNN can effectively improve the performance of baselines built on a variety of backbones, and that it outperforms previous state-of-the-art algorithms on multiple few-shot learning benchmarks, including miniImageNet, tieredImageNet, Caltech-UCSD Birds, and CIFAR-FS. Furthermore, experiments on the cross-domain challenge demonstrate the superior transferability of RankDNN. The code is available at: https://github.com/guoqianyu-alberta/RankDNN.
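To make the pipeline in the abstract concrete, below is a minimal NumPy sketch. It is not the authors' implementation (see the linked repository for that): the two-layer `rank_mlp`, the random vectors standing in for backbone features, and the pairwise win-counting rule in `classify_query` are all illustrative assumptions; only the Kronecker-product relation encoding, the triplet difference, and the pairwise binary ranking idea come from the abstract itself.

```python
import numpy as np

def kron_relation(f_a, f_b):
    # Encode the relation between two feature vectors as their
    # (flattened) Kronecker product, as described in the abstract.
    return np.kron(f_a, f_b)

def rank_mlp(x, W1, b1, W2, b2):
    # Hypothetical two-layer MLP standing in for RankMLP: maps the
    # difference of two relation vectors to a binary ranking score.
    h = np.maximum(x @ W1 + b1, 0.0)             # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid: P(a ranks above b)

def classify_query(q, supports, labels, params):
    # Meta-test sketch: compare every pair of supports with respect to
    # the query, count pairwise wins, and assign the query the label of
    # the top-ranked support (the win-counting aggregation is assumed).
    wins = np.zeros(len(supports))
    for i in range(len(supports)):
        for j in range(len(supports)):
            if i == j:
                continue
            # Triplet encoding: difference of two relation vectors.
            x = kron_relation(q, supports[i]) - kron_relation(q, supports[j])
            if rank_mlp(x, *params) > 0.5:
                wins[i] += 1
    return labels[int(np.argmax(wins))]

# Toy example with random "backbone" features (all dimensions assumed).
d, n_support = 8, 5
rng = np.random.default_rng(0)
q = rng.standard_normal(d)                      # query feature
supports = rng.standard_normal((n_support, d))  # support features
labels = np.arange(n_support)                   # one class per support

W1, b1 = rng.standard_normal((d * d, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.standard_normal(16) * 0.1, 0.0
print("predicted class:", classify_query(q, supports, labels, (W1, b1, W2, b2)))
```

One design observation: the Kronecker product captures all pairwise interactions between the two feature vectors, making it a richer relation encoding than a simple difference or concatenation, but it also scales quadratically, so a d-dimensional backbone yields a d²-dimensional MLP input; the real system presumably operates at far larger dimensions than this toy.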