
CONJUGATE GRADIENT METHODS

For better performance, natural gradient learning can be combined with a standard superlinear optimization algorithm. One such algorithm is the nonlinear conjugate gradient (CG) method (Nocedal, 1991). The conjugate gradient method is a standard tool for solving high-dimensional nonlinear optimization problems. At each iteration, a new search direction is generated by conjugating the residuals from previous iterations. With this choice the search directions span a Krylov subspace, and only the previous search direction and the current gradient are needed for the conjugation, which keeps both the time and memory requirements of the algorithm low (Nocedal, 1991).
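As a rough illustration of how little state the method carries between iterations, the following Python sketch implements nonlinear CG ascent with the Polak-Ribière conjugation rule. The gradient function, the fixed step size (standing in for a proper line search), and the restart rule are assumptions made for this example and are not taken from the text.

```python
import numpy as np

def nonlinear_cg_ascent(grad, x0, step=1e-2, n_iter=100, tol=1e-8):
    """Nonlinear conjugate gradient ascent (Polak-Ribiere rule).

    grad(x) is assumed to return the gradient of the objective at x.
    A fixed step size `step` replaces a line search for simplicity.
    """
    x = x0.copy()
    g = grad(x)
    d = g.copy()                       # initial direction: steepest ascent
    for _ in range(n_iter):
        x = x + step * d               # move along the current search direction
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        # Polak-Ribiere coefficient: only the current gradient and the
        # previous gradient/direction are needed for the conjugation.
        beta = g_new @ (g_new - g) / (g @ g)
        beta = max(beta, 0.0)          # restart with steepest ascent if negative
        d = g_new + beta * d
        g = g_new
    return x
```

The sketch only stores the previous gradient and search direction, which is the property referred to above: no history beyond the last iteration is kept, so memory use stays linear in the number of parameters.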



