Title

Cross-Domain Few-Shot Classification via Learned Feature-Wise Transformation

Authors

Hung-Yu Tseng, Hsin-Ying Lee, Jia-Bin Huang, Ming-Hsuan Yang

Abstract

Few-shot classification aims to recognize novel categories with only a few labeled images in each class. Existing metric-based few-shot classification algorithms predict categories by comparing the feature embeddings of query images with those of a few labeled images (support examples) using a learned metric function. While promising performance has been demonstrated, these methods often fail to generalize to unseen domains due to the large discrepancy of feature distributions across domains. In this work, we address the problem of few-shot classification under domain shift for metric-based methods. Our core idea is to use feature-wise transformation layers to augment the image features with affine transforms, simulating various feature distributions under different domains during the training stage. To capture the variation of feature distributions across domains, we further apply a learning-to-learn approach to search for the hyper-parameters of the feature-wise transformation layers. We conduct extensive experiments and ablation studies under the domain generalization setting using five few-shot classification datasets: mini-ImageNet, CUB, Cars, Places, and Plantae. Experimental results demonstrate that the proposed feature-wise transformation layer is applicable to various metric-based models and provides consistent improvements in few-shot classification performance under domain shift.
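To make the mechanism concrete, below is a minimal sketch of the channel-wise affine augmentation the abstract describes: a scale sampled around 1 and a bias sampled around 0 are applied per channel to a feature map, with learnable hyper-parameters controlling the spread of the sampling distributions. The function name, the softplus parameterization, and the numpy-only setting are illustrative assumptions; in the paper these layers sit inside the metric-based model's feature encoder, and the hyper-parameters are tuned by a learning-to-learn procedure.

```python
import numpy as np


def feature_wise_transform(features, theta_gamma, theta_beta, rng=None):
    """Apply a sampled channel-wise affine transform to a feature map.

    features:    array of shape (N, C, H, W) -- a batch of feature maps.
    theta_gamma: shape (C,), hyper-parameters for the spread of the scale.
    theta_beta:  shape (C,), hyper-parameters for the spread of the bias.

    Illustrative sketch of the idea, not the authors' implementation:
    the spread parameters are passed through softplus to stay positive,
    then a per-sample, per-channel scale ~ N(1, std) and bias ~ N(0, std)
    are drawn and applied, simulating feature statistics of other domains.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, c, h, w = features.shape
    # softplus(x) = log(1 + exp(x)) keeps the standard deviations positive.
    std_gamma = np.log1p(np.exp(theta_gamma))
    std_beta = np.log1p(np.exp(theta_beta))
    # Sample one scale/bias per example and per channel.
    gamma = rng.normal(1.0, std_gamma, size=(n, c))
    beta = rng.normal(0.0, std_beta, size=(n, c))
    # Broadcast over the spatial dimensions (H, W).
    return gamma[:, :, None, None] * features + beta[:, :, None, None]
```

With very negative hyper-parameters the sampled scale and bias collapse to 1 and 0, so the layer degrades to the identity; larger values yield stronger perturbations of the feature distribution, which is the knob the learning-to-learn search tunes.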
