Paper Title
Gradient and Hessian approximations in Derivative Free Optimization
Paper Authors
Paper Abstract
This work investigates finite differences and the use of interpolation models to obtain approximations to the first and second derivatives of a function. Here, it is shown that if a particular set of points is used in the interpolation model, then the solution to the associated linear system (i.e., approximations to the gradient and diagonal of the Hessian) can be obtained in $\mathcal{O}(n)$ computations, which is the same cost as finite differences, and is a saving over the $\mathcal{O}(n^3)$ cost of solving a general unstructured linear system. Moreover, if the interpolation points are formed using a `regular minimal positive basis', then the error bound for the gradient approximation is the same as for a finite difference approximation. Numerical experiments are presented that show how the derivative estimates can be employed within an existing derivative free optimization algorithm, thus demonstrating one of the potential practical uses of these derivative approximations.
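As context for the $\mathcal{O}(n)$ cost comparison in the abstract, the standard central-difference estimates of the gradient and the diagonal of the Hessian can themselves be computed with $\mathcal{O}(n)$ function evaluations. The sketch below illustrates this baseline only; it is not the interpolation-based method of the paper, and the function name and step size `h` are illustrative choices:

```python
import numpy as np

def fd_gradient_and_diag_hessian(f, x, h=1e-5):
    """Central-difference estimates of the gradient and the diagonal of
    the Hessian of f at x, using 2n + 1 function evaluations (O(n))."""
    n = len(x)
    g = np.zeros(n)
    d = np.zeros(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = f(x + e), f(x - e)
        g[i] = (fp - fm) / (2.0 * h)           # central-difference gradient entry
        d[i] = (fp + fm - 2.0 * fx) / h**2     # second-difference diagonal Hessian entry
    return g, d
```

For a quadratic such as $f(x) = x_1^2 + 2x_2^2$, these formulas recover the exact gradient and diagonal Hessian up to rounding error, which makes them a convenient sanity check for any derivative-approximation scheme.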