

Rational Function Optimization

Standard Newton-Raphson is based on the optimization of a quadratic model; replacing this quadratic model by a rational function approximation yields the Rational Function Optimization (RFO) method [130,131].

$\displaystyle \Delta E= E({\bf q}_k+\Delta {\bf q}_k)-E({\bf q}_k)\cong \frac{\frac{1}{2} \left(\begin{array}{cc} 1 & \Delta {\bf q}_k^T \end{array}\right) \left(\begin{array}{cc} 0 & {\bf g}_k^T \\  {\bf g}_k & {\bf B}_k \end{array}\right) \left(\begin{array}{c} 1 \\  \Delta {\bf q}_k \end{array}\right)}{\left(\begin{array}{cc} 1 & \Delta {\bf q}_k^T \end{array}\right) \left(\begin{array}{cc} 1 & {\bf 0}^T \\  {\bf 0} & {\bf S}_k \end{array}\right) \left(\begin{array}{c} 1 \\  \Delta {\bf q}_k \end{array}\right)}$ (2.76)

The numerator in equation 2.76 is the quadratic model of equation 2.74, and the matrix in this numerator is the so-called Augmented Hessian (AH). $ {\bf B}_k$ is the Hessian (analytic or approximated). The $ {\bf S}_k$ matrix is a symmetric matrix that has to be specified, but is normally taken as the unit matrix $ {\bf I}$. The solution of the RFO equation, that is, the displacement vector $ \Delta {\bf q}_k$ that makes $ \Delta E$ stationary (i.e. $ \nabla_{\bf q}(\Delta E )=0$), is obtained by diagonalizing the Augmented Hessian matrix, solving the $ (N+1)$-dimensional eigenvalue equation 2.77

$\displaystyle \left(\begin{array}{cc} 0 & {\bf g}^T_k \\  {\bf g}_k & {\bf B}_k \end{array}\right) {\bf v}^{(k)}_\theta = \lambda^{(k)}_\theta {\bf v}^{(k)}_\theta \qquad \forall \quad \theta=1,\ldots,N+1$ (2.77)

and then the displacement vector $ \Delta {\bf q}_k$ for the $ k$th step is evaluated as

$\displaystyle \Delta {\bf q}_k=\frac{1}{ v^{(k)}_{1,\theta}}{\bf v}^{'(k)}_\theta$ (2.78)

where

$\displaystyle ({\bf v}^{'(k)}_\theta)^T=( v^{(k)}_{2,\theta},\ldots, v^{(k)}_{N+1,\theta})$ (2.79)

In equation 2.79, if one is interested in locating a minimum then $ \theta=1$ (the lowest eigenpair), whereas for a transition structure $ \theta=2$. As the optimization process converges, $ v^{(k)}_{1,\theta}$ tends to 1 and $ \lambda^{(k)}_\theta$ tends to 0.
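The procedure of equations 2.77-2.79 can be sketched as follows. This is a minimal NumPy illustration, not the thesis implementation: it assumes $ {\bf S}_k={\bf I}$, and the function name <code>rfo_step</code> and the quadratic test surface are invented for the example. The index <code>theta</code> is 0-based, so <code>theta=0</code> corresponds to $ \theta=1$ (minimum search) and <code>theta=1</code> to $ \theta=2$ (transition structure).

```python
import numpy as np

def rfo_step(g, B, theta=0):
    """One RFO step: build the (N+1)-dimensional Augmented Hessian
    (eq. 2.77, with S_k = I), diagonalize it, and rescale the chosen
    eigenvector so its first component is 1 (eqs. 2.78-2.79).
    Returns the displacement vector and the selected eigenvalue."""
    n = len(g)
    AH = np.zeros((n + 1, n + 1))
    AH[0, 1:] = g          # first row:    g^T
    AH[1:, 0] = g          # first column: g
    AH[1:, 1:] = B         # Hessian block B_k
    lam, V = np.linalg.eigh(AH)       # eigenvalues in ascending order
    v = V[:, theta]                   # theta-th eigenvector
    # divide the remaining N components by the first one (eq. 2.78);
    # the ratio is insensitive to the arbitrary sign of the eigenvector
    return v[1:] / v[0], lam[theta]

# Illustrative quadratic surface E = 1/2 q^T B q, minimum at the origin
B = np.diag([2.0, 4.0])
q = np.array([1.0, 1.0])
g = B @ q                             # gradient at q
dq, lam = rfo_step(g, B, theta=0)     # downhill, step-restricted move
```

Because the lowest eigenvalue $ \lambda^{(k)}_1$ is negative away from convergence, the resulting step $ -({\bf B}_k-\lambda{\bf I})^{-1}{\bf g}_k$ is shorter than the plain Newton step, which is the implicit step control that makes RFO robust far from the stationary point; as $ {\bf g}_k\to 0$ the step reduces to the Newton-Raphson one.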
Xavier Prat Resina 2004-09-09