A conjugate direction optimization method performs sequential line searches along directions that satisfy a strict mathematical relationship (conjugacy) with one another.
In particular, the conjugate-gradient method generates the next search direction by adding the vector β_k d_k to the negative gradient −g_{k+1}, i.e. d_{k+1} = −g_{k+1} + β_k d_k.
The algorithm was originally developed for quadratic problems.
For convex quadratic problems with exact line searches, the algorithm converges in at most n iterations,
where n is the number of variables.
It still has good convergence properties when applied to non-quadratic problems.
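As a minimal sketch of the update rule above, the following NumPy implementation (names and the example matrix are illustrative, not from the source) minimizes the quadratic f(x) = ½ xᵀAx − bᵀx using the Fletcher–Reeves choice of β_k, with the exact line-search step that a quadratic objective admits:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A.

    Each new search direction is the negative gradient plus
    beta_k times the previous direction: d_{k+1} = -g_{k+1} + beta_k d_k.
    """
    x = x0.astype(float)
    g = A @ x - b          # gradient of f at x
    d = -g                 # first direction: steepest descent
    for _ in range(len(b)):            # at most n iterations on a quadratic
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = (g @ g) / (d @ Ad)     # exact line search for quadratic f
        x = x + alpha * d
        g_new = g + alpha * Ad         # updated gradient
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d          # conjugate direction update
        g = g_new
    return x

# 3-variable example: convergence within n = 3 iterations
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = conjugate_gradient(A, b, np.zeros(3))
```

For non-quadratic objectives, alpha would instead come from a numerical line search, and β_k admits several formulas (Fletcher–Reeves, Polak–Ribière) that coincide in the quadratic case.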