English Description:
The conjugate gradient (CG) method sits between the steepest descent method and Newton's method. It uses only first-derivative (gradient) information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, compute, and invert the Hessian matrix. CG is one of the most useful methods for solving large linear systems of equations, and also one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is especially important: it requires little storage, has finite-step convergence (in exact arithmetic it solves an n-by-n symmetric positive-definite system in at most n steps), is numerically stable, and needs no externally tuned parameters.
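The linear-system form of the method described above can be sketched as follows. This is a minimal illustration (function name and defaults are my own choices, not from the original text), assuming `A` is a symmetric positive-definite NumPy array; it uses only matrix-vector products and a handful of vectors, which is why the storage cost stays small.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A using CG."""
    n = len(b)
    if max_iter is None:
        max_iter = n  # exact arithmetic converges in at most n steps
    x = np.zeros(n)
    r = b - A @ x          # residual (negative gradient of the quadratic)
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # residual small enough: done
            break
        # new direction is A-conjugate to all previous ones
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Example: a 2x2 SPD system, solved in at most 2 iterations
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Note that, unlike Newton's method, the matrix `A` is only ever touched through products `A @ p`, so CG also works with sparse matrices or implicit operators that never form `A` explicitly.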