Typos
This commit is contained in: parent 11c9c7bb71, commit d5b2568371
@@ -321,9 +321,9 @@ It is also worth pointing out that, after each orbital rotation, the one- and tw
To enhance the convergence of the microiteration process, we employ a variant of the Newton-Raphson method known as ``trust region''. \cite{Nocedal_1999}
This popular variant defines a region where the quadratic approximation \eqref{eq:EvarTaylor} is an adequate representation of the objective energy function \eqref{eq:Evar_c_k}. This region evolves during the optimization process so as to preserve this adequacy, via a constraint on the step size that prevents overstepping, \ie, $\norm{\bk^{(\ell+1)}} \leq \Delta^{(\ell)}$, where $\Delta^{(\ell)}$ is the trust radius at the $\ell$th microiteration.
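For concreteness, the constrained problem solved at each microiteration is the standard trust-region subproblem; the explicit form below is a textbook restatement in the notation of the step formula that follows, not an equation taken from the manuscript:

\begin{equation*}
  \min_{\bk} \; \bg^{(\ell)} \cdot \bk + \frac{1}{2} \, \bk \cdot \bH^{(\ell)} \cdot \bk
  \quad \text{subject to} \quad \norm{\bk} \leq \Delta^{(\ell)},
\end{equation*}

whose unconstrained minimizer recovers the plain Newton step $- (\bH^{(\ell)})^{-1} \cdot \bg^{(\ell)}$.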
- By introduction a Lagrange multiplier $\lambda$ to control the trust-region size, one obtains $\bk^{(\ell+1)} = - (\bH^{(\ell)} + \lambda \bI)^{-1} \cdot \bg^{(\ell)}$.
+ By introducing a Lagrange multiplier $\lambda$ to control the trust-region size, one obtains $\bk^{(\ell+1)} = - (\bH^{(\ell)} + \lambda \bI)^{-1} \cdot \bg^{(\ell)}$.
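As a minimal numerical sketch of this level-shifted step (Python/NumPy; the helper names are illustrative and this is not the implementation behind the reported calculations), the value of $\lambda$ that brings the step inside a given radius can be located by bisection, since the step length decreases monotonically once $\bH^{(\ell)} + \lambda \bI$ is positive definite:

import numpy as np

# Illustrative sketch only, not the code used for the calculations in this work.
def level_shifted_step(H, g, lam):
    """kappa(lam) = -(H + lam*I)^{-1} g for a given level shift lam."""
    return -np.linalg.solve(H + lam * np.eye(len(g)), g)

def find_level_shift(H, g, radius, tol=1e-10):
    """Smallest shift lam >= 0 such that ||kappa(lam)|| <= radius, by bisection."""
    lam0 = max(0.0, -np.linalg.eigvalsh(H)[0] + 1e-12)   # make H + lam*I positive definite
    if np.linalg.norm(level_shifted_step(H, g, lam0)) <= radius:
        return lam0                                       # step already inside the region
    lo, hi = lam0, lam0 + 1.0
    while np.linalg.norm(level_shifted_step(H, g, hi)) > radius:
        hi *= 2.0                                         # bracket the target shift
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(level_shifted_step(H, g, mid)) > radius:
            lo = mid
        else:
            hi = mid
    return hi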
The addition of the level shift $\lambda \geq 0$ removes the negative eigenvalues and ensures the positive definiteness of the Hessian matrix, while reducing the step size.
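This is easiest to see in the eigenbasis of the Hessian (a standard one-line argument, spelled out here in a notation introduced only for this remark): with $\bH^{(\ell)} \mathbf{v}_i = h_i \mathbf{v}_i$ and $\tilde{g}_i = \mathbf{v}_i \cdot \bg^{(\ell)}$, the level-shifted step reads

\begin{equation*}
  \bk^{(\ell+1)} = - \sum_i \frac{\tilde{g}_i}{h_i + \lambda} \, \mathbf{v}_i,
\end{equation*}

so any $\lambda > -\min_i h_i$ makes every denominator positive, and each component (hence the step length) shrinks monotonically as $\lambda$ grows.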
- By choosing the right value of $\lambda$, the step size is constraint into a hypersphere of radius $\Delta^{(\ell)}$ and is able to evolve from the Newton direction at $\lambda = 0$ to the steepest descent direction as $\lambda$ grows.
+ By choosing the right value of $\lambda$, the step size is constrained into a hypersphere of radius $\Delta^{(\ell)}$ and is able to evolve from the Newton direction at $\lambda = 0$ to the steepest descent direction as $\lambda$ grows.
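Both limits are easy to verify numerically; in the toy snippet below (Python/NumPy), the matrix and vector are random stand-ins for the actual Hessian and gradient:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = A + A.T                                   # symmetric (possibly indefinite) toy Hessian
g = rng.standard_normal(4)

def step(lam):
    return -np.linalg.solve(H + lam * np.eye(4), g)

print(np.allclose(step(0.0), -np.linalg.solve(H, g)))   # lam = 0: pure Newton step

k = step(1e8)                                 # very large lam
cos = k @ (-g) / (np.linalg.norm(k) * np.linalg.norm(g))
print(np.isclose(cos, 1.0))                   # step becomes parallel to -g (steepest descent)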
The evolution of the trust radius during the optimization and the use of a condition to cancel the step when the energy rises ensure the convergence of the algorithm.
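A compact sketch of this outer logic (Python/NumPy, with the two-dimensional Rosenbrock function standing in for the variational energy; the grow/shrink factors, the maximum radius, and the plain energy-decrease acceptance test are illustrative choices, the quantitative update rules being discussed in Ref.~\onlinecite{Nocedal_1999}):

import numpy as np

# Illustrative sketch: toy surface and arbitrary trust-radius settings.
def energy(x):                                   # toy stand-in for the variational energy
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def gradient(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def hessian(x):
    return np.array([[2 + 1200 * x[0]**2 - 400 * x[1], -400 * x[0]],
                     [-400 * x[0], 200.0]])

def constrained_step(H, g, radius):
    """Level-shifted Newton step restricted to ||k|| <= radius (bisection on lam)."""
    lam = max(0.0, -np.linalg.eigvalsh(H)[0] + 1e-12)
    k = -np.linalg.solve(H + lam * np.eye(len(g)), g)
    if np.linalg.norm(k) <= radius:
        return k
    lo, hi = lam, lam + 1.0
    while np.linalg.norm(-np.linalg.solve(H + hi * np.eye(len(g)), g)) > radius:
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        k = -np.linalg.solve(H + mid * np.eye(len(g)), g)
        lo, hi = (mid, hi) if np.linalg.norm(k) > radius else (lo, mid)
    return -np.linalg.solve(H + hi * np.eye(len(g)), g)

x, radius = np.array([-1.2, 1.0]), 0.5
for it in range(500):
    g, H = gradient(x), hessian(x)
    if np.linalg.norm(g) < 1e-8:
        break                                    # converged
    k = constrained_step(H, g, radius)
    if energy(x + k) < energy(x):
        x, radius = x + k, min(2.0 * radius, 1.0)   # accept the step, let the region grow
    else:
        radius *= 0.5                               # energy rose: cancel the step, shrink the region
print(it, x, energy(x))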
More details can be found in Ref.~\onlinecite{Nocedal_1999}.