
 Algorithms.ox

Methods to Optimize an Objective or Solve a Non-Linear System.

Algorithms are methods that iterate on a vector of parameters \(\psi\), either to maximize an objective or to solve a system of equations. Each algorithm is derived from the Algorithm class.

See the FiveO class hierarchy for all the available algorithms.

Author:
© 2011-2023 Christopher Ferrall

Documentation of Items Defined in Algorithms.ox

 Algorithm

Base class for optimization and system-solving algorithms.

To use an algorithm:
Declare a class for your Objective (e.g. a BlackBox or System).
Create an object of your objective class.
Create an object of the algorithm class you want to use, sending your objective to the creator.
If desired, call the algorithm's Tune() method to tweak parameters and/or change the volume setting.
Call the Iterate() method of the algorithm.
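Putting these steps together, a minimal sketch (MyObjective and the choice of BFGS are illustrative; any Algorithm-derived class follows the same pattern):

class MyObjective : BlackBox{
    ⋮   // parameters declared as members
    vfunc(subp=DoAll);  // the objective to maximize
    }
⋮
decl myobj = new MyObjective();
decl alg = new BFGS(myobj);   // or NelderMead, Newton, SimulatedAnnealing, ...
alg -> Tune(100);             // optional: cap iterations at 100
alg -> Iterate();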
Public fields
 convergence Convergence code.
 holdF
 IIterate Running on the client node or without MPI.
 inparallel Running in parallel.
 istr internal.
 iter current iteration count.
 itoler static const Default top-level convergence tolerance.
 logf log file
 lognm const name of log file
 logpfx const prefix on log before timestamp added.
 maxiter maximum iterations
 N .
 NormalStart Not restarting from an algorithm checkpoint.
 O const User's objective.
 path sequence of structural parameters.
 StorePath Store path of iterations in path.
 tolerance top-level convergence tolerance.
 Volume output level, NoiseLevels
Public methods
 Algorithm Base class for non-linear programming algorithms.
 CheckPoint virtual
 ItEnd virtual Finish up iteration.
 Iterate virtual
 ItStartCheck virtual This routine is called at the start of Iterate().
 out virtual
 Paths virtual
 Tune virtual Tune Parameters of the Algorithm.

 BFGS : QuasiNewton : HillClimbing : GradientBased : Algorithm

Broyden-Fletcher-Goldfarb-Shanno (BFGS) updating of the Hessian H.

See Wikipedia::BFGS

To use this algorithm:
Declare a class for your Objective (e.g. a BlackBox or System).
Create an object of your objective class.
Create a BFGS object, sending your objective to the creator.
If desired, call the Tune() method to tweak parameters and/or change the volume setting.
Call BFGS::Iterate().

class MyObjective : BlackBox{
    ⋮   // parameters should be declared as members of your class
    vfunc(subp=DoAll);  // you have to define the objective
    }
⋮
decl myobj = new MyObjective();
⋮
decl b = new BFGS(myobj);
⋮
b->Iterate();
See GetStarted for an example of using BFGS.
Tune the parameters of the algorithm with Tune().
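For instance, a sketch of tuning using the GradientBased::Tune() signature documented below (argument values are illustrative; 0 keeps the current/default value):

b -> Tune(250, 1e-6, 0, 50, 0);  // maxiter, toler, nfuncmax, LMitmax, LMmaxstep
b -> Iterate();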
Public methods
 BFGS Create an object of the BFGS algorithm.
 Hupdate virtual Apply BFGS updating on the Hessian.
Inherited methods from HillClimbing:
HillClimbing
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, Gupdate, HHupdate, Iterate, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 BHHH : Newton : HillClimbing : GradientBased : Algorithm

Berndt-Hall-Hall-Hausman (BHHH) updating: update the Hessian with the outer product of the gradient.
Public methods
 BHHH Create an object of the BHHH algorithm.
 Hupdate virtual Update the Hessian for the BHHH algorithm.
Inherited methods from Newton:
Newton
Inherited methods from HillClimbing:
HillClimbing
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, Gupdate, HHupdate, Iterate, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 Broyden : RootFinding : GradientBased : Algorithm

Broyden approximation to the Jacobian.
Public methods
 Broyden Create a new Broyden solution for a system of equations.
 Jupdate virtual Apply Broyden's formula to update the Jacobian.
Inherited methods from RootFinding:
Direction, Gupdate, Iterate, ItStartCheck, JJupdate, RootFinding
Inherited methods from GradientBased:
CheckPoint, GradientBased, HHupdate, Hupdate, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from RootFinding:
dg, USELM
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 CLineMax : LineMax : LineMethod : NonGradient : Algorithm

Constrained line maximization.
Public methods
 CLineMax Create a Constrained Line Maximization object.
Inherited methods from LineMax:
LineMax
Inherited methods from LineMethod:
Bracket, Golden, Iterate, LineMethod, PTry
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths, Tune
Inherited fields from LineMethod:
Delta, improved, maxstp
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 DFP : QuasiNewton : HillClimbing : GradientBased : Algorithm

[not coded yet] Davidon-Fletcher-Powell (DFP) updating of H.
Public methods
 DFP Create an object of the DFP algorithm.
 Hupdate virtual
Inherited methods from HillClimbing:
HillClimbing
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, Gupdate, HHupdate, Iterate, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 GradientBased : Algorithm

Gradient based algorithms.

Algorithms of this class use the gradient, \(\nabla f(\psi)\).
Public fields
 cmsg static const
 deltaG \(|\nabla_m - \nabla_{m-1}|\).
 deltaX \(|x_m - x_{m-1}|\).
 gradtoler tolerance for strong convergence.
 Hresetcnt # of times H was reset to I.
 IamBHHH BHHH version.
 IamNewt Newton version.
 igradtoler static const
 LM const
 LMitmax max iterations on line search.
 LMmaxstep maximum step size in line search.
Public methods
 CheckPoint virtual Handle gradient based checkpoint files.
 Direction virtual Compute the direction for the current line search.
 GradientBased Initialize a Gradient-Based algorithm.
 Gupdate virtual Update the gradient ∇ f(ψ).
 HHupdate Update the Hessian Hf.
 Hupdate virtual Default Hessian Update: Steepest Descent, so H f(ψ) does not change.
 Iterate virtual Iterate on a gradient-based algorithm.
 ItStartCheck virtual This routine is called at the start of Iterate().
 Tune virtual Tune Parameters of the Algorithm.

Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 HillClimbing : GradientBased : Algorithm

Algorithms that optimize an objective based on gradient and/or Hessian information. This is a container class. If you create an object of this type, it is the same as creating a GradientBased object (i.e., steepest descent).
Public methods
 HillClimbing Base for Hill Climbing objects.
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, Gupdate, HHupdate, Hupdate, Iterate, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 LineMax : LineMethod : NonGradient : Algorithm

One-dimensional line search for a maximum.

Bracket a maximum, then use golden-ratio iteration to reduce the interval.

This method is called by GradientBased routines, but it can be used for one-dimensional search.
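A sketch of stand-alone use, based on the LineMethod::Iterate() signature documented below; it assumes myobj is an existing Objective with a single free parameter, and the limits are illustrative:

decl lm = new LineMax(myobj);
lm -> Iterate(<1.0>, 50, 1.0);  // Delta (direction), maxiter, maxstp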
Public methods
 LineMax A method to optimize along a line.

Inherited methods from LineMethod:
Bracket, Golden, Iterate, LineMethod, PTry
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths, Tune
Inherited fields from LineMethod:
Delta, improved, maxstp
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 LineMethod : NonGradient : Algorithm

Methods specific to solving or optimizing in one dimension.

These methods are embedded into other algorithms and in some cases can be used on their own.
Public fields
 Delta Direction vector.
 improved .
 maxstp .
Public methods
 Bracket Bracket a local maximum along the line.
 Golden Golden ratio line search.
 Iterate Optimize along the line.
 LineMethod
 PTry virtual

Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths, Tune
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 NelderMead : NonGradient : Algorithm

The Nelder and Mead Amoeba (Simplex) algorithm.

at Wikipedia

To use this algorithm:
Declare a class for your Objective (e.g. a BlackBox or System).
Create an object of your objective class.
Create a NelderMead object, sending your objective to the creator.
If desired, call the Tune() method to tweak parameters and/or change the volume setting.
Call Iterate().

class MyObjective : BlackBox{
    ⋮
    vfunc(subp=DoAll);
    }
⋮
decl myobj = new MyObjective();
⋮
decl nm = new NelderMead(myobj);
⋮
nm -> Iterate();
See GetStarted for an example of using NelderMead
Public fields
 alpha static const α.
 beta static const β.
 fdiff |f()-mean(f)|
 gamma static const γ.
 istep static const default initial step size.
 itoler static const default plexsize tolerance.
 mxstarts number of simplex resets.
 n_func function evaluations.
 nodeV function values on the simplex.
 nodeX free parameter simplex.
 plexsize current size of the simplex.
 step current step size used to create the simplex.
Public methods
 CheckPoint
 Iterate Iterate on the Amoeba algorithm.
 ItStartCheck
 NelderMead Initialize a Nelder-Mead Simplex Maximization.
 Reflect Reflect through simplex.
 SimplexSize Compute size of the current simplex.
 Tune Tune NelderMead parameters.
Enumerations
 TryResults Results from an amoeba try.
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 Newton : HillClimbing : GradientBased : Algorithm

Newton updating of H.

Evaluate the Hessian at each iteration.

To use this algorithm:
Declare a class for your Objective (e.g. a BlackBox or System).
Create an object of your objective class.
Create a Newton object, sending your objective to the creator.
If desired, call the Tune() method to tweak parameters and/or change the volume setting.
Call Newton::Iterate().

class MyObjective : BlackBox{
    ⋮   // parameters should be declared as members of your class
    vfunc(subp=DoAll);  // you have to define the objective
    }
⋮
decl myobj = new MyObjective();
⋮
decl newt = new Newton(myobj);
⋮
newt -> Iterate();
Public methods
 Hupdate virtual Compute the objective's Hessian at the current parameter vector.
 Newton Create an object of the Newton optimization algorithm.
Inherited methods from HillClimbing:
HillClimbing
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, Gupdate, HHupdate, Iterate, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 NewtonRaphson : RootFinding : GradientBased : Algorithm

Update the Jacobian on each iteration.
Public methods
 Jupdate virtual Compute the objective's Jacobian.
 NewtonRaphson Create a Newton-Raphson object to solve a system of equations.
Inherited methods from RootFinding:
Direction, Gupdate, Iterate, ItStartCheck, JJupdate, RootFinding
Inherited methods from GradientBased:
CheckPoint, GradientBased, HHupdate, Hupdate, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from RootFinding:
dg, USELM
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 NonGradient : Algorithm

Container for algorithms that do not rely on gradients.
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, Iterate, ItStartCheck, out, Paths, Tune
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 OneDimRoot : SysMax : LineMethod : NonGradient : Algorithm

Solve for the root of a OneDimSystem system using Bracket-Bisect.
Public fields
 defmxiter static const
 istepmin static const minimum initial step size.
 itoler static const
Public methods
 Bracket Bracket a root of the equation.
 EndOneDim
 Iterate Find a 1-dimensional root.
 OneDimRoot Object to iterate to find the root of a non-linear equation (OneDimSystem).
Inherited methods from SysMax:
SysMax
Inherited methods from LineMethod:
Golden, LineMethod, PTry
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths, Tune
Inherited fields from LineMethod:
Delta, improved, maxstp
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 QuasiNewton : HillClimbing : GradientBased : Algorithm

Container for algorithms that use but do not compute the Hessian matrix H.

at Wikipedia

Inherited methods from HillClimbing:
HillClimbing
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, Gupdate, HHupdate, Hupdate, Iterate, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 RandomSearch : SimulatedAnnealing : NonGradient : Algorithm

A special case of annealing in which the temperature stays the same and only improvements are accepted.
See also:
Explore
Public methods
 RandomSearch Random search without annealing.
Inherited methods from SimulatedAnnealing:
Iterate, Metropolis, SimulatedAnnealing, Tune
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths
Inherited fields from SimulatedAnnealing:
accept, chol, cooling, heat, holdpt, M, shrinkage, tries, Vtries, vtries
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 RootFinding : GradientBased : Algorithm

Solve system of equations using Jacobian information.
Public fields
 dg
 USELM use line search.
Public methods
 Direction Compute the direction.
 Gupdate Update the Jacobian when using line-search aggregation of the system.
 Iterate Iterate to solve a non-linear system.
 ItStartCheck virtual This routine is called at the start of Iterate().
 JJupdate Wrapper for updating the Jacobian of the system.
 Jupdate virtual Compute the objective's Jacobian.
 RootFinding Solve a non-linear system of equations.
Inherited methods from GradientBased:
CheckPoint, GradientBased, HHupdate, Hupdate, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 SimulatedAnnealing : NonGradient : Algorithm

Metropolis simulated annealing algorithm.

at Wikipedia

To use an algorithm:
Declare a class for your Objective (e.g. a BlackBox or System).
Create an object of your objective class.
Create a SimulatedAnnealing object, sending your objective to the creator.
If desired, call the Tune() method to tweak parameters and/or change the volume setting.
Call Iterate().

class MyObjective : BlackBox{
    ⋮
    vfunc(subp=DoAll);
    }
⋮
decl myobj = new MyObjective();
⋮
decl sa = new SimulatedAnnealing(myobj);
⋮
sa -> Iterate();
See ?? for an example
Tune the parameters of the algorithm with Tune().
Public fields
 accept
 chol Cholesky matrix for random parameter changes, default I.
 cooling rate to cool down.
 heat current heat, default initial value = 1.0.
 holdpt
 M
 shrinkage rate to shrink the variance, default = 0.5.
 tries
 Vtries
 vtries
Public methods
 Iterate Carry out annealing.
 Metropolis accept or reject.
 SimulatedAnnealing Initialize Simulated Annealing.
 Tune Tune annealing parameters.
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 SQP : GradientBased : Algorithm

Sequential Quadratic Programming for constrained optimization.
Public fields
 BETA static const
 ne
 ni
Public methods
 Gupdate .
 HHupdate .
 Hupdate virtual .
 Iterate Iterate.
 SQP Create a new Sequential Quadratic Programming object.
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 SQPBFGS : SQP : GradientBased : Algorithm

Public methods
 SQPBFGS
Inherited methods from SQP:
Gupdate, HHupdate, Hupdate, Iterate, SQP
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from SQP:
BETA, ne, ni
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 SQPNEWTON : SQP : GradientBased : Algorithm

Public methods
 SQPNEWTON
Inherited methods from SQP:
Gupdate, HHupdate, Hupdate, Iterate, SQP
Inherited methods from GradientBased:
CheckPoint, Direction, GradientBased, ItStartCheck, Tune
Inherited methods from Algorithm:
Algorithm, ItEnd, out, Paths
Inherited fields from SQP:
BETA, ne, ni
Inherited fields from GradientBased:
cmsg, deltaG, deltaX, gradtoler, Hresetcnt, IamBHHH, IamNewt, igradtoler, LM, LMitmax, LMmaxstep
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 SysMax : LineMethod : NonGradient : Algorithm

Line maximization for systems. This specializes line optimization to be used inside the system solver.

The negative of the norm of the system is used as the objective to evaluate improvement.

A special case is OneDimRoot, which brackets the root rather than a local optimum of the norm.
Public methods
 SysMax A method to optimize along a line.

Inherited methods from LineMethod:
Bracket, Golden, Iterate, LineMethod, PTry
Inherited methods from Algorithm:
Algorithm, CheckPoint, ItEnd, ItStartCheck, out, Paths, Tune
Inherited fields from LineMethod:
Delta, improved, maxstp
Inherited fields from Algorithm:
convergence, holdF, IIterate, inparallel, istr, iter, itoler, logf, lognm, logpfx, maxiter, N, NormalStart, O, path, StorePath, tolerance, Volume

 Algorithm

 Algorithm

Algorithm :: Algorithm ( O )
Base class for non-linear programming algorithms.
Parameters:
O Objective to work on.

A timestamped log file is created for the algorithm when it is created. You can control the output level by setting Volume to the desired NoiseLevels. The default volume level is SILENT.
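For instance, a minimal sketch of raising the volume before iterating, assuming myobj is an existing objective (LOUD stands in for one of the NoiseLevels):

decl alg = new BFGS(myobj);
alg.Volume = LOUD;   // default is SILENT
alg -> Iterate();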


 CheckPoint


 convergence

decl convergence [public]
Convergence code. See ConvergenceResults

 holdF

decl holdF [public]

 IIterate

decl IIterate [public]
Running on the client node or without MPI.

 inparallel

decl inparallel [public]
Running in parallel.

 istr

decl istr [public]
internal.

 ItEnd

virtual Algorithm :: ItEnd ( )
Finish up the iteration. The current free parameter vector is decoded into structural parameters. If running in parallel, the client node will tell the servers to stop and then announce the new structural parameter vector; the servers will drop out of the server loop and all nodes will get past Iterate().

 iter

decl iter [public]
current iteration count.

 Iterate


 itoler

static const decl itoler [public]
Default top-level convergence tolerance.

 ItStartCheck

virtual Algorithm :: ItStartCheck ( ReadCheckPoint )
This routine is called at the start of Iterate().
Parameters:
ReadCheckPoint TRUE means starting point coming from the checkpoint file.

If running in parallel it is this routine that server nodes go into the server loop. The client node will enter into the main iteration loop.


 logf

decl logf [public]
log file

 lognm

const decl lognm [public]
name of log file

 logpfx

const decl logpfx [public]
prefix on log before timestamp added.

 maxiter

decl maxiter [public]
maximum iterations

 N

decl N [public]
.

 NormalStart

decl NormalStart [public]
Not restarting from an algorithm checkpoint.

 O

const decl O [public]
User's objective.

 out

virtual Algorithm :: out ( fn )

 path

decl path [public]
sequence of structural parameters.

 Paths

virtual Algorithm :: Paths ( starts )

 StorePath

decl StorePath [public]
Store path of iterations in path.

 tolerance

decl tolerance [public]
top-level convergence tolerance.

 Tune

virtual Algorithm :: Tune ( maxiter , toler , nfuncmax )
Tune Parameters of the Algorithm. This is the base version. Some derived algorithms replace this with their own Tune methods.
Parameters:
maxiter integer max. number of iterations
0 use current/default value
toler double, simplex tolerance for convergence
0 use current/default value
nfuncmax integer, number of function evaluations before resetting the simplex
0 use current/default value
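A sketch of a call, assuming alg is an existing algorithm object (values illustrative; 0 keeps a current/default value):

alg -> Tune(500, 1e-5, 0);  // maxiter=500, toler=1e-5, nfuncmax unchanged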

 Volume

decl Volume [public]
output level, NoiseLevels

 BFGS

 BFGS

BFGS :: BFGS ( O )
Create an object of the BFGS algorithm.
Parameters:
O the Objective object to apply this algorithm to.

Example:
decl myobj = new MyObjective();
decl bfgs = new BFGS(myobj);
bfgs->Iterate();

See GetStarted

 Hupdate

virtual BFGS :: Hupdate ( )
Apply BFGS updating on the Hessian.

Active Ox code in this routine:
decl
  dg = (OC.G - oldG),              // change in the gradient
  A = double(dx*dg'),              // scalar: dx times dg'
  B = double(outer(dx,OC.H));      // scalar: quadratic form of dx in H
if (fabs(A) < SQRT_EPS*norm(dg,2)*deltaX) return FAIL;   // changes too small: do not update
OC.H += outer(dg,<>,'o')/A - outer(dx*OC.H,<>,'o')/B;    // rank-two BFGS correction
return NONE;
Returns:
FAIL if the updating tolerances are not met.
Otherwise NONE
See also:
ConvergenceResults, NoiseLevels

 BHHH

 BHHH

BHHH :: BHHH ( O )
Create an object of the BHHH algorithm.
Parameters:
O the Objective object to apply this algorithm to.

Example:
decl myobj = new MyObjective();
decl bhhh = new BHHH(myobj);
bhhh->Iterate();

See GetStarted

 Hupdate

virtual BHHH :: Hupdate ( )
Update the Hessian for the BHHH algorithm.
Returns:
NONE

 Broyden

 Broyden

Broyden :: Broyden ( O , USELM )
Create a new Broyden solution for a system of equations.
Parameters:
O the System-derived object to solve.
USELM FALSE [default] do not do a line search each iteration.
TRUE do a line search each iteration.

Example:
decl obj = new MyObjective();
decl broy = new Broyden(obj);
broy -> Iterate();

Also see GetStarted

 Jupdate

virtual Broyden :: Jupdate ( dx )
Apply Broyden's formula to update the Jacobian.
Parameters:
dx \(x_{t+1} - x_t\)

 CLineMax

 CLineMax

CLineMax :: CLineMax ( O )
Create a Constrained Line Maximization object.
Parameters:
O Objective

 DFP

 DFP

DFP :: DFP ( O )
Create an object of the DFP algorithm.
Parameters:
O the Objective object to apply this algorithm to.

Example:
decl myobj = new MyObjective();
decl dfp = new DFP(myobj);
dfp->Iterate();

 Hupdate

 GradientBased

 CheckPoint

virtual GradientBased :: CheckPoint ( WriteOut )
Handle gradient based checkpoint files.
Parameters:
WriteOut TRUE - write mode
FALSE read from file mode

 cmsg

static const decl cmsg [public]

 deltaG

decl deltaG [public]
\(|\nabla_m - \nabla_{m-1}|\).

 deltaX

decl deltaX [public]
\(|x_m - x_{m-1}|\).

 Direction

virtual GradientBased :: Direction ( )
Compute the direction for the current line search. If inversion of H fails, reset to I.
Returns:
direction vector.

 GradientBased

GradientBased :: GradientBased ( O )
Initialize a Gradient-Based algorithm.

 gradtoler

decl gradtoler [public]
tolerance for strong convergence.

 Gupdate

virtual GradientBased :: Gupdate ( )
Update the gradient ∇ f(ψ).

This routine is called inside Iterate().
Creates a copy of the current gradient.
Calls the Gradient() routine to compute ∇f(ψ), which is stored internally in G.
Then computes the size of ∇ using the built-in Ox norm(m,2) routine.
If Volume is louder than QUIET, ∇f() is printed to the screen.
Returns:
TRUE if the norm of ∇f(ψ) is less than gradtoler
FALSE otherwise

 HHupdate

GradientBased :: HHupdate ( FORCE )
Update the Hessian Hf.
Parameters:
FORCE see below.

This is a wrapper for the virtual Hupdate() routine and is called on each iteration (except possibly the first if convergence is already achieved).

Computes the difference in parameter values from this iteration and the last, as well as the norm (both used by QuasiNewton algorithms).
If FORCE is TRUE, call Gupdate() and check both STRONG and WEAK convergence criteria.
Then call the algorithm-specific Hupdate() routine.

Returns:
the output of Hupdate()
See also:
ConvergenceResults

 Hresetcnt

decl Hresetcnt [public]
# of time H reset to I

 Hupdate

virtual GradientBased :: Hupdate ( )
Default Hessian update: steepest descent, so Hf(ψ) does not change.

Since the default gradient-based algorithm is steepest descent, which does not use the Hessian, this routine does nothing and returns

Returns:
NONE
See also:
ConvergenceResults

 IamBHHH

decl IamBHHH [public]
BHHH version.

 IamNewt

decl IamNewt [public]
Newton version.

 igradtoler

static const decl igradtoler [public]

 Iterate

virtual GradientBased :: Iterate ( H )
Iterate on a gradient-based algorithm.
Parameters:
H matrix, initial Hessian
integer, use the identity I as the initial value, or compute H if Newton.

This routine uses CFMPI to execute the parallel task of computing the gradient.

Gradient-based algorithms conduct a LineMaximization on each iteration.
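A sketch of the two ways to supply the initial Hessian, assuming bfgs is an existing algorithm object and myH a hypothetical matrix:

bfgs -> Iterate(0);    // integer: start from H = I (Newton computes H instead)
bfgs -> Iterate(myH);  // matrix: use myH as the initial Hessian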

See also:
ConvergenceResults

 ItStartCheck

virtual GradientBased :: ItStartCheck ( H )
This routine is called at the start of Iterate().
Parameters:
ReadCheckPoint TRUE means starting point coming from the checkpoint file.

If running in parallel it is this routine that server nodes go into the server loop. The client node will enter into the main iteration loop.


 LM

const decl LM [public]

 LMitmax

decl LMitmax [public]
max iterations on line search.
See also:
Tune

 LMmaxstep

decl LMmaxstep [public]
maximum step size in line search.

 Tune

virtual GradientBased :: Tune ( maxiter , toler , nfuncmax , LMitmax , LMmaxstep )
Tune Parameters of the Algorithm.
Parameters:
maxiter integer max. number of iterations
0 use current/default value
toler double, simplex tolerance for convergence
0 use current/default value
nfuncmax integer, number of function evaluations before resetting the simplex
0 use current/default value
LMitmax integer, max number of line maximization iterations
0 use current/default value
LMmaxstep double, maximum change in line maximization

 HillClimbing

 HillClimbing

HillClimbing :: HillClimbing ( O )
Base for Hill Climbing objects.

 LineMax

 LineMax

LineMax :: LineMax ( O )
A method to optimize along a line.
Parameters:
O Objective

 LineMethod

 Bracket

LineMethod :: Bracket ( )
Bracket a local maximum along the line.

 Delta

decl Delta [public]
Direction vector.

 Golden

LineMethod :: Golden ( )
Golden ratio line search.

 improved

decl improved [public]
.

 Iterate

LineMethod :: Iterate ( Delta , maxiter , maxstp )
Optimize along a line.
Parameters:
Delta vector of directions to define the line
maxiter > 0 max. number of iterations
0 set to 1000
maxstp > 0, maximum step size in parameters.

 LineMethod

LineMethod :: LineMethod ( O )

 maxstp

decl maxstp [public]
.

 PTry

virtual LineMethod :: PTry ( pt , left , right )

 NelderMead

Enumerations
TryResults Results from an amoeba try. hi, nxtlo, lo, worst, TryResults

 alpha

static const decl alpha [public]
α.

 beta

static const decl beta [public]
β.

 CheckPoint

NelderMead :: CheckPoint ( WriteOut )

 fdiff

decl fdiff [public]
|f()-mean(f)|

 gamma

static const decl gamma [public]
γ.

 istep

static const decl istep [public]
default initial step size.

 Iterate

NelderMead :: Iterate ( iplex )
Iterate on the Amoeba algorithm.
Parameters:
iplex N×(N+1) matrix of initial steps.
double, initial step size (iplex is set to 0~unit(N))
integer > 0, max starts of the algorithm
0 (default), use current mxstarts.
= -1 (UseGradient), compute gradient and use its normed value as initial simplex
= -2 (UseCheckPoint), read in checkpoint file to return to last state.
Example:
See GetStarted
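A sketch of the argument variants, continuing the nm example above (values illustrative):

nm -> Iterate(0);    // integer 0: use the current mxstarts
nm -> Iterate(0.5);  // double: initial step size for the simplex
nm -> Iterate(-2);   // UseCheckPoint: return to the last saved state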

 itoler

static const decl itoler [public]
default plexsize tolerance.

 ItStartCheck

NelderMead :: ItStartCheck ( iplex )

 mxstarts

decl mxstarts [public]
number simplex resets.

 n_func

decl n_func [public]
function evaluations.

 NelderMead

NelderMead :: NelderMead ( O )
Initialize a Nelder-Mead Simplex Maximization.
Parameters:
O Objective to work on.

 nodeV

decl nodeV [public]
function values on the simplex.

 nodeX

decl nodeX [public]
free parameter simplex.

 plexsize

decl plexsize [public]
current size of the simplex.

 Reflect

NelderMead :: Reflect ( fac )
Reflect through simplex.
Parameters:
fac factor

 SimplexSize

NelderMead :: SimplexSize ( )
Compute size of the current simplex.
Returns:
\(\sum_j |X_j - \mu|\)

 step

decl step [public]
current step size to create simplex

 Tune

NelderMead :: Tune ( mxstarts , toler , nfuncmax , maxiter )
Tune NelderMead parameters.
Parameters:
mxstarts integer number of simplex restarts
0 use current/default value
toler double, simplex tolerance for convergence
0 use current/default value
nfuncmax integer, number of function evaluations before resetting the simplex
0 use current/default value
maxiter integer max. number of iterations
0 use current/default value
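For example, continuing the nm example above (values illustrative; 0 leaves a setting unchanged):

nm -> Tune(3, 1e-5, 1000, 0);  // mxstarts, toler, nfuncmax, maxiter unchanged
nm -> Iterate(0);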

 Newton

 Hupdate

virtual Newton :: Hupdate ( )
Compute the objective's Hessian at the current parameter vector.
Returns:
NONE
See also:
ConvergenceResults, NoiseLevels

 Newton

Newton :: Newton ( O )
Create an object of the Newton optimization algorithm.
Parameters:
O the Objective object to apply this algorithm to.

Example:
decl myobj = new MyObjective();
decl newt = new Newton(myobj);
newt->Iterate();

See GetStarted

 NewtonRaphson

 Jupdate

virtual NewtonRaphson :: Jupdate ( dx )
Compute the objective's Jacobian.

 NewtonRaphson

NewtonRaphson :: NewtonRaphson ( O , USELM )
Create a Newton-Raphson object to solve a system of equations.
Parameters:
O the System-derived object to solve.
USELM FALSE [default] do not do a line search each iteration.
TRUE do a line search each iteration.

Example:
decl obj = new MyObjective();
decl nr = new NewtonRaphson(obj);
nr -> Iterate();

Also see GetStarted

 OneDimRoot

 Bracket

OneDimRoot :: Bracket ( )
Bracket a root of the equation.

 defmxiter

static const decl defmxiter [public]

 EndOneDim


 istepmin

static const decl istepmin [public]
minimum initial step size.

 Iterate

OneDimRoot :: Iterate ( istep , maxiter , toler )
Find a 1-dimensional root.
Parameters:
istep initial step, can be positive or negative [default=1.0]
maxiter > 0, max. number of iterations [default=50]
toler > 0, tolerance
0 [default=SSQ_TOLER]. This uses the fact that vcur.v contains the norm of the equation, not the negative.
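A sketch, assuming mysys is an existing OneDimSystem object (values illustrative):

decl r = new OneDimRoot(mysys);
r -> Iterate(0.5, 100, 1e-8);  // istep, maxiter, toler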

 itoler

static const decl itoler [public]

 OneDimRoot

OneDimRoot :: OneDimRoot ( O )
Object to iterate to find the root of a non-linear equation (OneDimSystem).

 RandomSearch

 RandomSearch

RandomSearch :: RandomSearch ( O )
Random search without annealing.

 RootFinding

 dg

decl dg [public]

 Direction

RootFinding :: Direction ( )
Compute the direction. If inversion of J fails, reset to I.
Returns:
direction

 Gupdate

RootFinding :: Gupdate ( )
Update the Jacobian when using line-search aggregation of the system.

 Iterate

RootFinding :: Iterate ( J )
Iterate to solve a non-linear system.
Parameters:
J matrix, initial Jacobian for Broyden.
integer, set to identity

This routine is shared (inherited) by derived algorithms.
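For instance, a sketch for a Broyden solver, which inherits this routine (mysys is a hypothetical System object):

decl broy = new Broyden(mysys);
broy -> Iterate(0);  // integer: the Jacobian starts as the identity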


 ItStartCheck

virtual RootFinding :: ItStartCheck ( J )
This routine is called at the start of Iterate().
Parameters:
ReadCheckPoint TRUE means starting point coming from the checkpoint file.

If running in parallel it is this routine that server nodes go into the server loop. The client node will enter into the main iteration loop.


 JJupdate

RootFinding :: JJupdate ( )
Wrapper for updating the Jacobian of the system.

This is a wrapper for the virtual Jupdate() routine.
Returns:
FAIL if change in parameter values is too small. Otherwise, return NONE
See also:
GradientBased::HHupdate

 Jupdate

virtual RootFinding :: Jupdate ( dx )
Compute the objective's Jacobian.
Parameters:
dx argument is ignored (default=0)

 RootFinding

RootFinding :: RootFinding ( O , USELM )
Solve a non-linear system of equations.
Parameters:
O System object
USELM FALSE [default] do not do a line search each iteration.
TRUE do a line search each iteration.

 USELM

decl USELM [public]
use line search.

 SimulatedAnnealing

 accept

decl accept [public]

 chol

decl chol [public]
Cholesky matrix for random parameter changes, default I.

 cooling

decl cooling [public]
rate to cool down, default = 0.85.

 heat

decl heat [public]
current heat, default initial value = 1.0.

 holdpt

decl holdpt [public]

 Iterate

SimulatedAnnealing :: Iterate ( chol )
Carry out annealing.
Parameters:
chol Determines the Cholesky matrix for random draws, C
0 [default] \(C = I\), the identity matrix
matrix, \(C =\) chol.
double, common standard deviation, \(C = chol\cdot I\).
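A sketch of each variant, continuing the sa example above (mychol is a hypothetical Cholesky matrix; values illustrative):

sa -> Iterate(0);       // C = I
sa -> Iterate(0.25);    // common standard deviation: C = 0.25 I
sa -> Iterate(mychol);  // C = mychol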

 M

decl M [public]

 Metropolis

SimulatedAnnealing :: Metropolis ( )
Accept or reject the latest try.

 shrinkage

decl shrinkage [public]
rate to shrink the variance, default = 0.5.

 SimulatedAnnealing

SimulatedAnnealing :: SimulatedAnnealing ( O )
Initialize Simulated Annealing.
Parameters:
O Objective to work on.

 tries

decl tries [public]

 Tune

SimulatedAnnealing :: Tune ( maxiter , heat , cooling , shrinkage )
Tune annealing parameters.
Parameters:
maxiter > 0, number of iterations
0 do not change
heat > 0, temperature parameter
0 do not change
cooling ∈ (0,1], rate to cool,
0 do not change
shrinkage ∈ (0,1], rate to shrink,
0 do not change
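For example, continuing the sa example above (values illustrative; 0 leaves a parameter unchanged):

sa -> Tune(2000, 1.5, 0.9, 0.5);  // maxiter, heat, cooling, shrinkage
sa -> Iterate(0);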

 Vtries

decl Vtries [public]

 vtries

decl vtries [public]

 SQP

 BETA

static const decl BETA [public]

 Gupdate

SQP :: Gupdate ( )
.

 HHupdate

SQP :: HHupdate ( FORCE )
.

 Hupdate

virtual SQP :: Hupdate ( )
.

 Iterate

SQP :: Iterate ( H )
Iterate.
Parameters:
H initial Hessian
integer, use I

 ne

decl ne [public]

 ni

decl ni [public]

 SQP

SQP :: SQP ( O )
Create a new Sequential Quadratic Programming object.
Parameters:
O Constrained objective

 SQPBFGS

 SQPBFGS

SQPBFGS :: SQPBFGS ( O )

 SQPNEWTON

 SQPNEWTON

 SysMax

 SysMax

SysMax :: SysMax ( O )
A method to optimize along a line.
Parameters:
O Objective