The conjugate gradient method sits between the steepest descent method and Newton's method. It requires only first-order derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, compute, and invert the Hessian matrix. The conjugate gradient method is not only one of the most useful methods for solving large linear systems, but also one of the most effective algorithms for large-scale nonlinear optimization.

  The conjugate gradient method was first proposed by Hestenes and Stiefel (1952) for solving linear systems with positive definite coefficient matrices. Building on this, Fletcher and Reeves (1964) first proposed a conjugate gradient method for nonlinear optimization problems. Because the method requires no matrix storage and offers fast convergence and quadratic termination, it is now widely applied to practical problems.

  The conjugate gradient method is a typical conjugate direction method: its search directions are mutually conjugate, and each search direction d is simply a combination of the negative gradient direction and the search direction of the previous iteration. Storage requirements are therefore small and the computation is straightforward, as the sketch below illustrates.
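
A minimal MATLAB sketch of this direction update, in the Fletcher-Reeves style, is given below. The function handles f and gradf, the backtracking line search, the tolerance, and the restart safeguard are illustrative assumptions, not part of the original text.

% Minimal sketch of Fletcher-Reeves nonlinear conjugate gradient.
% f: objective function handle, gradf: gradient handle (assumptions for illustration).
function x = fr_cg_sketch(f, gradf, x0, tol, maxit)
    x = x0;
    g = gradf(x);
    d = -g;                              % first direction: steepest descent
    for k = 1:maxit
        if norm(g) < tol, break; end
        if g'*d >= 0, d = -g; end        % safeguard: restart if not a descent direction
        alpha = 1;                       % simple backtracking (Armijo) line search
        while f(x + alpha*d) > f(x) + 1e-4*alpha*(g'*d)
            alpha = alpha/2;
        end
        x = x + alpha*d;
        gnew = gradf(x);
        beta = (gnew'*gnew)/(g'*g);      % Fletcher-Reeves coefficient
        d = -gnew + beta*d;              % new direction = negative gradient + beta * previous direction
        g = gnew;
    end
end

Applied to the quadratic f(x) = 0.5*x'*A*x - b'*x with an exact line search, this reduces to the linear conjugate gradient method discussed below.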

 

MATLAB toolbox

The conjugate gradient method solves a system of linear equations Ax = b, where A is symmetric positive definite, without computing the inverse of A. It requires only a very small amount of memory and is therefore particularly suitable for large-scale systems.
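
The timing example below calls a function named conjgrad, which is not a built-in MATLAB routine (MATLAB's own iterative solver for this problem is pcg). A minimal sketch of such a function, assuming A is symmetric positive definite and using an illustrative tolerance and iteration cap, could be:

% Minimal sketch of a linear conjugate gradient solver.
% Assumes A is symmetric positive definite; the tolerance and iteration cap
% are illustrative choices, not part of the original post.
function x = conjgrad(A, b, tol)
    if nargin < 3, tol = 1e-10; end
    x = zeros(size(b));
    r = b - A*x;                 % residual
    d = r;                       % initial search direction
    rs = r'*r;
    for k = 1:numel(b)
        Ad = A*d;
        alpha = rs/(d'*Ad);      % step length along d
        x = x + alpha*d;
        r = r - alpha*Ad;
        rsnew = r'*r;
        if sqrt(rsnew) < tol, break; end
        d = r + (rsnew/rs)*d;    % next direction is A-conjugate to the previous ones
        rs = rsnew;
    end
end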

It is faster than other approaches such as Gaussian elimination when A is well-conditioned. For example,

n=1000;
[U,S,V]=svd(randn(n));           % random orthogonal basis U
s=diag(S);
A=U*diag(s+max(s))*U';           % symmetric positive definite, well-conditioned
b=randn(n,1);
tic,x=conjgrad(A,b);toc          % conjugate gradient solve
tic,x1=A\b;toc                   % direct solve (Gaussian elimination)
norm(x-x1)                       % difference between the two solutions
norm(A*x-b)                      % residual of the conjugate gradient solution

Conjugate gradient is about two to three times faster than A\b, which uses Gaussian elimination.