module Bfgs: BFGS (Broyden-Fletcher-Goldfarb-Shanno) quasi-Newton local optimization method.
The pure Newton method uses a local quadratic approximation of the
function f being minimized:

f(x) = f(x0) + t(x-x0).G + 1/2.t(x-x0).H.(x-x0)

where G and H are respectively the gradient vector and the hessian
matrix evaluated at x0, and t is the transpose operator.
When this approximation is valid, and when the function is locally convex,
the minimum is found where the gradient of the approximation vanishes:

G + H.(x-x0) = 0

This gives us the Newton step x = x0 - inv(H).G, where
inv(H) is the inverse hessian.
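To make the Newton step concrete, here is a minimal one-dimensional sketch
(the names newton_step, f' and f'' are ours, not part of this module); in
one dimension the hessian reduces to the second derivative:

  (* One-dimensional Newton step: x1 = x0 - f'(x0) / f''(x0).
     Illustrative sketch only. *)
  let newton_step ~f' ~f'' x0 = x0 -. f' x0 /. f'' x0

  (* Example: f(x) = (x - 3)^2, f'(x) = 2(x - 3), f''(x) = 2.
     Since f is exactly quadratic, a single step from any x0
     lands on the minimum x = 3. *)
  let () =
    let x1 = newton_step ~f':(fun x -> 2. *. (x -. 3.)) ~f'':(fun _ -> 2.) 10. in
    Printf.printf "Newton step from 10.0 gives %g\n" x1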
Quasi-Newton methods rely on an iterative approximation of the inverse
hessian, usually starting with the identity matrix as an initial
approximation (which means that we follow the gradient descent direction).
At each step k of the algorithm, the function is minimized along the
quasi-Newton direction -h(k).G(k). Let us call
lambda the step size along the quasi-Newton direction,
h(k) the approximation of the INVERSE hessian at step k, and
G(k) the gradient evaluated at the current point x(k).
The algorithm is the following. At each step k:

1) minimize the function along the quasi-Newton direction. The line search
minimizes f(x(k) - lambda.h(k).G(k)) as a function of the step size lambda.

2) make a step in the quasi-Newton direction. If the minimum along
that direction is found at lambda_k, the quasi-Newton step is then

x(k+1) = x(k) - lambda_k.h(k).G(k)

3) compute a new approximation h(k+1) of the inverse hessian.
Several expressions can be found for this update. We used the following one:

h(k+1) = (I - sk.t(yk)/(t(yk).sk)).h(k).(I - yk.t(sk)/(t(yk).sk)) + sk.t(sk)/(t(yk).sk)

where

yk = G(k+1) - G(k)
sk = x(k+1) - x(k)
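As an illustration of step 3, here is a sketch of the update above on dense
matrices (all helper names and the float array array representation are our
own assumptions, not this module's internals):

  (* dot u v = t(u).v *)
  let dot u v = Array.fold_left ( +. ) 0. (Array.map2 ( *. ) u v)

  (* outer u v = u.t(v), the matrix of products u_i * v_j *)
  let outer u v = Array.map (fun ui -> Array.map (fun vj -> ui *. vj) v) u

  let mat_op op a b = Array.map2 (Array.map2 op) a b
  let mat_scale r = Array.map (Array.map (( *. ) r))
  let mat_mul a b =
    let n = Array.length b and p = Array.length b.(0) in
    Array.map
      (fun row ->
        Array.init p (fun j ->
          let acc = ref 0. in
          for k = 0 to n - 1 do acc := !acc +. row.(k) *. b.(k).(j) done;
          !acc))
      a

  (* BFGS inverse-hessian update: h is h(k), s = x(k+1)-x(k), y = G(k+1)-G(k);
     computes (I - rho.s.t(y)).h.(I - rho.y.t(s)) + rho.s.t(s)
     with rho = 1 / (t(y).s). *)
  let bfgs_update h s y =
    let rho = 1. /. dot y s in
    let n = Array.length s in
    let id = Array.init n (fun i -> Array.init n (fun j -> if i = j then 1. else 0.)) in
    let left = mat_op ( -. ) id (mat_scale rho (outer s y)) in
    let right = mat_op ( -. ) id (mat_scale rho (outer y s)) in
    mat_op ( +. ) (mat_mul (mat_mul left h) right) (mat_scale rho (outer s s))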
The search stops when the improvement of the objective function falls below an absolute tolerance, or when it cannot be improved anymore.
Now that we have presented the main features of the algorithm, let us talk about a few additional subtleties. Three line search methods have been implemented: backtracking, golden section search, and Brent's parabolic interpolation.
The backtracking starts with an initial step (default size is 1) along
the quasi-Newton direction.
val verbose :
val logpoints :
val linesearch_method :
bt_acctol, bt_reduc and bt_epsilon are parameters for the
backtracking method. Backtracking consists in trying a candidate step
along the search direction. If the objective function is sufficiently
decreased, then the step is accepted. The decrease must be at least equal
to !bt_acctol multiplied by the norm of the gradient projection along the
search direction (simplified Armijo-Goldstein condition).
Otherwise, if the current step is not accepted, a new step is tried
by multiplying the current step by the reduction factor !bt_reduc.
The search stops when the step size falls under !bt_epsilon.
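A sketch of this backtracking loop, under our own naming (d is the search
direction -h(k).G(k), g_dot_d = t(G(k)).d < 0, and acctol, reduc, epsilon
play the roles of !bt_acctol, !bt_reduc and !bt_epsilon):

  (* Backtracking with a sufficient-decrease (Armijo-style) test. *)
  let backtrack f x d g_dot_d ~acctol ~reduc ~epsilon =
    let fx = f x in
    let move lambda = Array.mapi (fun i xi -> xi +. lambda *. d.(i)) x in
    let rec loop lambda =
      if lambda < epsilon then None  (* step fell under epsilon: give up *)
      else if f (move lambda) <= fx +. acctol *. lambda *. g_dot_d
      then Some lambda               (* sufficient decrease: accept the step *)
      else loop (lambda *. reduc)    (* otherwise reduce the step and retry *)
    in
    loop 1.                          (* initial step size is 1 *)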
ls_max_iter, ls_abstol and ls_reltol are
parameters for golden section search and Brent's parabolic interpolation.
For these methods, the search stops when the range of the search interval
falls under an absolute tolerance
!ls_abstol, or when it cannot be
significantly reduced (relative tolerance
!ls_reltol). If none of these
criteria is met, the search stops after !ls_max_iter iterations.
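For comparison, here is a sketch of golden section search with the same kind
of stopping criteria (the exact relative-tolerance test is our own
formulation; abstol, reltol and max_iter play the roles of !ls_abstol,
!ls_reltol and !ls_max_iter):

  (* Golden section search for a minimum of f on [a, b]. *)
  let golden f a b ~abstol ~reltol ~max_iter =
    let phi = (sqrt 5. -. 1.) /. 2. in
    let rec loop a b i =
      let range = b -. a in
      if range < abstol
         || range < reltol *. (abs_float a +. abs_float b)
         || i >= max_iter
      then (a +. b) /. 2.
      else
        let x1 = b -. phi *. range and x2 = a +. phi *. range in
        (* keep the sub-interval that must contain the minimum *)
        if f x1 < f x2 then loop a x2 (i + 1) else loop x1 b (i + 1)
    in
    loop a b 0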
val bt_acctol : float ref
val bt_reduc : float ref
val bt_epsilon : float ref
val ls_max_iter : int ref
val ls_abstol : float ref
val ls_reltol : float ref
val step : float ref
val bfgs :
(float array -> float) ->
(float array -> float array) ->
float array ->
int -> float -> float -> int -> (int -> bool) ->
int * float array * float * float array * float array array
bfgs f g xinit max_iter abstol reltol freq_verbose debug returns a tuple
(i, x, fx, gx, hx), where
i is the iteration at which the BFGS search stopped,
x is the best local minimum found for the function f,
fx is the value of f at this point,
gx is the value of the gradient g at this point, and
hx is the approximation of the inverse hessian at the minimum.
The search of a minimum stops when the maximum number of iterations
max_iter is reached, when two successive values f1 and f2 of the objective
function are closer than the absolute tolerance abstol, or when the
relative difference between the two values falls under the relative
tolerance reltol.
The exact formulation of the two last conditions is:

|f2-f1| < abstol or |f2-f1| < reltol.|f1|

(a small constant is added to |f1| to take account of the case when
f1 is zero).
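For concreteness, these two tests can be written as follows (the value of
the small constant here is our own choice):

  (* Convergence test on two successive objective values f1 and f2. *)
  let converged f1 f2 ~abstol ~reltol =
    let epsilon = 1e-10 in  (* guards against f1 = 0; illustrative value *)
    abs_float (f2 -. f1) < abstol
    || abs_float (f2 -. f1) < reltol *. (abs_float f1 +. epsilon)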
xinit is the initial point at which the search starts, and
freq_verbose is the frequency at which some information is printed
on the standard output.
debug is a function of the iteration number i:
outputs for debugging are printed when debug i is true.
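A hypothetical call, assuming the signature documented above (the Rosenbrock
function and its gradient are standard test material, not part of this
module):

  (* Minimize the 2-D Rosenbrock function; its minimum is at (1, 1). *)
  let f x = 100. *. (x.(1) -. x.(0) ** 2.) ** 2. +. (1. -. x.(0)) ** 2.
  let g x =
    [| -400. *. x.(0) *. (x.(1) -. x.(0) ** 2.) -. 2. *. (1. -. x.(0));
       200. *. (x.(1) -. x.(0) ** 2.) |]

  let () =
    let i, x, fx, _gx, _hx =
      Bfgs.bfgs f g [| -1.2; 1. |] 200 1e-10 1e-10 10 (fun _ -> false)
    in
    Printf.printf "stopped at iteration %d: f(%g, %g) = %g\n" i x.(0) x.(1) fx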