Title
Monotonicity for Multiobjective Accelerated Proximal Gradient Methods
Authors
Abstract
Accelerated proximal gradient methods, also called fast iterative shrinkage-thresholding algorithms (FISTA), are known to be efficient for many applications. Recently, Tanabe et al. proposed an extension of FISTA for multiobjective optimization problems. However, similarly to the single-objective minimization case, the objective function values may increase in some iterations, and inexact computation of the subproblems can also lead to divergence. Motivated by this, we propose here a variant of FISTA for multiobjective optimization that imposes some monotonicity on the objective function values. In the single-objective case, we recover the so-called MFISTA proposed by Beck and Teboulle. We also prove that our method converges globally with rate $O(1/k^2)$, where $k$ is the number of iterations, and we show some numerical advantages of requiring monotonicity.
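To illustrate the monotone safeguard recovered in the single-objective case, the following is a minimal sketch of MFISTA (Beck and Teboulle's monotone FISTA) applied to a LASSO problem. The paper's multiobjective method is not shown; the problem instance, function names, and step-size choice (`L` as the largest eigenvalue of $A^\top A$) are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def mfista_lasso(A, b, lam, L, num_iters=100):
    """MFISTA sketch for min 0.5*||Ax - b||^2 + lam*||x||_1.

    L is a Lipschitz constant of the smooth gradient, e.g. the
    largest eigenvalue of A^T A.
    """
    n = A.shape[1]
    F = lambda u: 0.5 * np.linalg.norm(A @ u - b) ** 2 + lam * np.abs(u).sum()
    x = np.zeros(n)   # x_0
    y = x.copy()      # y_1
    t = 1.0           # t_1
    history = [F(x)]
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)
        z = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        # Monotone safeguard: keep whichever of z and the previous
        # iterate has the smaller objective value.
        x_new = z if F(z) <= F(x) else x
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation uses both z and x_new (MFISTA update rule).
        y = x_new + (t / t_new) * (z - x_new) \
                  + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        history.append(F(x))
    return x, history
```

The safeguard `F(z) <= F(x)` is exactly what makes the objective values nonincreasing, in contrast to plain FISTA, where they may oscillate.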