Description:
Like the steepest descent methods, quasi-Newton methods require only that the gradient of the objective function be available at each iteration. By measuring changes in the gradient, they construct a model of the objective function that is good enough to produce superlinear convergence. These methods greatly outperform steepest descent, especially on difficult problems. Moreover, because quasi-Newton methods do not require second-derivative information, they are sometimes more efficient than Newton's method. Today, optimization software contains a large number of quasi-Newton algorithms for solving unconstrained, constrained, and large-scale optimization problems.
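As a concrete illustration of the idea, below is a minimal sketch of BFGS, the best-known quasi-Newton method, paired with a simple backtracking line search. It shows how the change in gradient (`y`) and the step (`s`) drive the update of the inverse-Hessian approximation. The function names and the Rosenbrock test problem are illustrative assumptions, not part of this repository.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f with BFGS, using only gradients (a hypothetical helper)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                   # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                  # quasi-Newton search direction
        # Backtracking line search satisfying the Armijo condition
        alpha, c, shrink = 1.0, 1e-4, 0.5
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= shrink
        x_new = x + alpha * p
        g_new = grad(x_new)
        s = x_new - x               # step taken
        y = g_new - g               # measured change in the gradient
        sy = s @ y
        if sy > 1e-10:              # curvature condition; otherwise skip update
            rho = 1.0 / sy
            I = np.eye(n)
            # Standard BFGS update of the inverse-Hessian approximation
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from a standard start point
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(bfgs(f, grad, np.array([-1.2, 1.0])))   # converges near (1, 1)
```

Note how the update uses only first-order information: `y = grad(x_new) - grad(x)` stands in for curvature, which is what lets the method achieve superlinear convergence without ever forming the true Hessian.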