
Conjugate gradient optimization

Instead of the previous iteration scheme, which is essentially a quasi-Newton scheme, it is also possible to optimize the expectation value of the Hamiltonian using a succession of conjugate-gradient steps. The first step is identical to the steepest-descent step of section 7.1.3. In all subsequent steps the preconditioned gradient $ g^N_{n}$ is conjugated to the previous search direction. The resulting conjugate-gradient algorithm is almost as efficient as the algorithm given in section 7.1.4. For further reading see [20,21,28].
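The structure of such a scheme can be sketched on a simple model problem. The following is an illustrative sketch only, not VASP's actual implementation: it minimizes a quadratic function with a preconditioned conjugate-gradient loop, where (as described above) the first step is plain preconditioned steepest descent and each later preconditioned gradient is conjugated to the previous search direction via a Polak-Ribière coefficient. All names (`preconditioned_cg`, `K_inv`, etc.) are hypothetical.

```python
import numpy as np

def preconditioned_cg(A, b, K_inv, x0, n_steps=50, tol=1e-10):
    """Minimize f(x) = 1/2 x^T A x - b^T x by preconditioned CG.
    K_inv is an approximate inverse of A (the preconditioner).
    Illustrative sketch, not VASP's implementation."""
    x = x0.copy()
    g = A @ x - b                 # gradient of f at x
    d = -(K_inv @ g)              # first step: preconditioned steepest descent
    for _ in range(n_steps):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)   # exact line minimum along d (quadratic f)
        x += alpha * d
        g_new = A @ x - b
        z_new = K_inv @ g_new
        # conjugate the new preconditioned gradient to the previous direction
        beta = ((g_new - g) @ z_new) / (g @ (K_inv @ g))  # Polak-Ribiere
        d = -z_new + beta * d
        g = g_new
    return x

# Usage: small symmetric positive-definite system, Jacobi preconditioner.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
K_inv = np.diag(1.0 / np.diag(A))
x = preconditioned_cg(A, b, K_inv, np.zeros(20))
print(np.allclose(A @ x, b, atol=1e-6))
```

In the electronic-structure setting the role of $ f$ is played by the expectation value of the Hamiltonian and the search is constrained to normalized wavefunctions, but the conjugation of the preconditioned gradient follows the same pattern.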
