Paper Title

A Simple General Approach to Balance Task Difficulty in Multi-Task Learning

Paper Authors

Sicong Liang, Yu Zhang

Abstract

In multi-task learning, the difficulty levels of different tasks vary. Many works handle this situation, and we classify them into five categories: the direct sum approach, the weighted sum approach, the maximum approach, the curriculum learning approach, and the multi-objective optimization approach. These approaches have their own limitations, for example, relying on manually designed rules to update task weights, a non-smooth objective function, and an inability to incorporate functions other than training losses. In this paper, to alleviate those limitations, we propose a Balanced Multi-Task Learning (BMTL) framework. Different from existing studies which rely on task weighting, the BMTL framework transforms the training loss of each task to balance difficulty levels among tasks, based on the intuitive idea that tasks with larger training losses will receive more attention during the optimization procedure. We analyze the transformation function and derive necessary conditions. The proposed BMTL framework is very simple and can be combined with most multi-task learning models. Empirical studies demonstrate the state-of-the-art performance of the proposed BMTL framework.
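The loss-transformation idea in the abstract can be sketched as follows. The exponential transform and the `temperature` parameter below are illustrative assumptions, not necessarily the authors' exact choice: the paper only states that it derives necessary conditions on the transformation function, and a monotonically increasing, convex transform such as the exponential is one natural candidate that gives larger-loss tasks larger implicit gradient weights.

```python
import numpy as np

def bmtl_objective(task_losses, temperature=1.0):
    """Aggregate per-task training losses through a monotonically
    increasing, convex transformation (hypothetical exponential choice),
    so tasks with larger losses dominate the overall objective."""
    losses = np.asarray(task_losses, dtype=float)
    return np.sum(np.exp(losses / temperature))

def implicit_task_weights(task_losses, temperature=1.0):
    """The gradient of exp(l/T) w.r.t. l is exp(l/T)/T, so each task
    implicitly receives a weight that grows with its current loss.
    Returned weights are normalized for easy comparison."""
    losses = np.asarray(task_losses, dtype=float)
    w = np.exp(losses / temperature) / temperature
    return w / w.sum()

# A task with a larger training loss receives a larger implicit weight,
# matching the intuition described in the abstract.
weights = implicit_task_weights([2.0, 1.0])
```

Because the transform only wraps each task's scalar loss, this sketch can sit on top of most multi-task models, which matches the abstract's claim that BMTL combines with existing approaches.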
