The model of the objective function constructed by the Newton method and the steepest descent method

quasi-Newton iteration  unconstrained optimization  quasi-Newton method  derivative constraint  optimization  S-function

Views: 473

Downloads: 0

File size: 1 KB

Category: General algorithms

Platform: matlab

Points required to download: 1


Code description



English Description:

Like the steepest descent method, a quasi-Newton method only requires the gradient of the objective function at each iteration. By measuring changes in the gradient, it constructs a model of the objective function that is good enough to produce superlinear convergence. These methods greatly outperform steepest descent, especially on difficult problems. In addition, because quasi-Newton methods do not need second-derivative information, they are sometimes more efficient than Newton's method. Today, optimization software includes many quasi-Newton algorithms for solving unconstrained, constrained, and large-scale optimization problems.
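The download itself contains MATLAB files (BFGS.m, Golden_section.m, grad.m), which are not reproduced here. As an illustration only, the following is a minimal Python sketch of the BFGS flavour of quasi-Newton method described above: it uses only gradient evaluations and updates an inverse-Hessian model from the measured gradient change. The function names and the simple backtracking (Armijo) line search are my assumptions, not part of the download.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimize f by BFGS; only gradient values are required, as described above."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation (the "model")
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        # backtracking (Armijo) line search -- a simple stand-in for a 1-D search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g    # step and measured change of the gradient
        sy = s @ y
        if sy > 1e-12:                 # curvature condition: update only if it holds
            rho = 1.0 / sy
            I = np.eye(n)
            # BFGS update of the inverse-Hessian model
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

For example, on the quadratic f(x) = (x0 - 1)^2 + 10*(x1 + 2)^2 with gradient [2*(x0 - 1), 20*(x1 + 2)], `bfgs(f, grad, np.zeros(2))` converges to the minimizer (1, -2) in a handful of iterations, without ever evaluating second derivatives.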


Code preview

BFGS.m

ex03_13062324.m

Golden_section.m

grad.m