Geometry Optimisations

Geometry optimisation is a local optimisation method that forms the backbone of structure prediction.

The goal of a geometry optimisation is to modify the atomic coordinates and lattice parameters (in the case of periodic systems) to minimise the energy of the cell.

Geometry optimisations are typically poor at overcoming energy barriers, but they are the state of the art for identifying local minima. They are not true structure-searching methods by themselves, but are very effective when combined with a global search method such as random structure searching.

Several algorithms have been developed for this task.

Gradient Descent

Perhaps the simplest geometry optimisation algorithm is gradient descent. This is an iterative, gradient-based method that updates an atom's coordinates based on the force acting on the atom, F, and a scalar step size, γ, which is set by the user. It is commonly written as,

x_{n+1} = x_n + γF_x,

where x_n is the current value of the x-coordinate, x_{n+1} is the new value of the x-coordinate, and F_x is the x-component of the force acting on the atom.

While simple to implement, gradient descent requires careful selection of γ, which is somewhat system dependent. A value that is too small will take too long to find the minimum, while a value that is too large will frequently overshoot it.
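
To make the update rule concrete, a minimal sketch in Python/NumPy is given below. The forces_fn callable is a hypothetical stand-in for whatever model or calculator supplies the forces, and the default values of gamma and fmax are illustrative only.

    import numpy as np

    def gradient_descent(positions, forces_fn, gamma=0.01, fmax=0.05, max_steps=10000):
        """Relax atomic positions by plain gradient descent.

        positions : (N, 3) array of Cartesian coordinates
        forces_fn : callable returning the (N, 3) forces, F = -dE/dx
        gamma     : user-chosen scalar step size
        fmax      : stop once every force component is below this value
        """
        x = np.asarray(positions, dtype=float).copy()
        for _ in range(max_steps):
            F = forces_fn(x)
            if np.abs(F).max() < fmax:   # converged: all forces below tolerance
                break
            x = x + gamma * F            # x_{n+1} = x_n + gamma * F, applied to every component
        return x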

Gradient descent often struggles to converge to an acceptable tolerance within an acceptable time frame, and is therefore not widely used.

Newton’s Method

Newton’s method is a derivative-based method that will optimise a quadratic function in a single step. For general applications, Newton’s method approximates a function as a series of quadratic functions, iteratively moving the variable(s) to the minimum of the current iteration’s quadratic approximation.

For a single variable, Newton’s method may be written as,

X_{n+1} = X_n - f'(X_n)/f''(X_n),

where f'(X_n) is the first derivative of the function being optimised, f''(X_n) is its second derivative, X_n is the current value of the variable, and X_{n+1} is the new value of the variable.
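
As a hedged, self-contained illustration of this one-variable update, the sketch below applies the formula to an example function chosen purely for demonstration; the function, its derivatives, and the starting point are not taken from the text.

    def newton_1d(f_prime, f_double_prime, x0, tol=1e-10, max_steps=50):
        """Find a stationary point of f using the one-dimensional Newton update."""
        X = x0
        for _ in range(max_steps):
            step = f_prime(X) / f_double_prime(X)   # f'(X_n) / f''(X_n)
            X = X - step                            # X_{n+1} = X_n - f'(X_n)/f''(X_n)
            if abs(step) < tol:
                break
        return X

    # Example: f(X) = (X - 2)**4 + (X - 2)**2 has its minimum at X = 2
    x_min = newton_1d(lambda X: 4*(X - 2)**3 + 2*(X - 2),
                      lambda X: 12*(X - 2)**2 + 2,
                      x0=0.0)

For a purely quadratic f, the first update already lands exactly on the minimum, consistent with the single-step claim above.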

In the context of geometry optimisations, it is often written as,

X_{n+1} = X_n + H_f^{-1}F,

where F is the force vector (note that the sign is now a + because the force is the negative gradient of the energy) and H_f is the Hessian matrix of the energy. While powerful, calculating the exact Hessian is computationally demanding for high-dimensional systems, such as those encountered in structure prediction.
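
A minimal sketch of this vector form is given below, assuming hypothetical forces_fn and hessian_fn callables that return the forces and the Hessian of the energy for a set of positions; the matrix "division" is carried out as a linear solve rather than by forming the inverse explicitly.

    import numpy as np

    def newton_step(positions, forces_fn, hessian_fn):
        """Apply one Newton update to the flattened coordinate vector.

        forces_fn  : callable returning the force vector F = -dE/dX, shape (3N,)
        hessian_fn : callable returning the Hessian of the energy, shape (3N, 3N)
        """
        pos = np.asarray(positions, dtype=float)
        X = pos.ravel()
        F = forces_fn(X)
        H = hessian_fn(X)
        # X_{n+1} = X_n + H_f^{-1} F, with the inverse applied via a linear solve
        return (X + np.linalg.solve(H, F)).reshape(pos.shape)

A full optimiser would repeat this step until the forces fall below a tolerance; the expensive part is evaluating hessian_fn at every iteration, which is exactly the cost that quasi-Newton methods avoid.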

Newton’s method is therefore rarely implemented for geometry optimisations; quasi-Newton methods are preferred instead.

Quasi-Newton Methods

…..

BFGS

….

L-BFGS