de.jstacs.algorithms.optimization
Class Optimizer

java.lang.Object
  extended by de.jstacs.algorithms.optimization.Optimizer

public class Optimizer
extends Object

This class can be used for optimization purposes. The function will be minimized. To maximize a function, use the classes NegativeFunction and NegativeDifferentiableFunction as input. All methods for multidimensional optimization need an implementation of DifferentiableFunction, which will be minimized. The optimization methods try to use as few function evaluations as possible, which is important for functions that are costly to evaluate. So if you have a costly function and want to speed up the computation, invest as much effort as possible into making the methods evaluateFunction and evaluateGradientOfFunction of your implementation as fast as possible.
You can choose between different termination modes.
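
The following sketch illustrates the intended usage. It is only an illustration, not part of this class: it assumes that a DifferentiableFunction implementation has to provide the methods evaluateFunction(double[]), evaluateGradientOfFunction(double[]) and getDimensionOfScope(), that ConstantStartDistance is an available StartDistanceForecaster implementation, and it leaves the concrete Optimizer.TerminationCondition constant open; adapt the names to the actual API if they differ.

import de.jstacs.algorithms.optimization.*;

// f(x) = sum_i (x_i - 1)^2, minimal at x = (1,...,1); gradient component i: 2 * (x_i - 1)
public class QuadraticExample extends DifferentiableFunction {

    public double evaluateFunction( double[] x ) {
        double res = 0;
        for( double xi : x ) {
            res += ( xi - 1 ) * ( xi - 1 );
        }
        return res;
    }

    public double[] evaluateGradientOfFunction( double[] x ) {
        double[] grad = new double[x.length];
        for( int i = 0; i < x.length; i++ ) {
            grad[i] = 2 * ( x[i] - 1 );
        }
        return grad;
    }

    public int getDimensionOfScope() {
        return 2;
    }

    public static void main( String[] args ) throws Exception {
        DifferentiableFunction f = new QuadraticExample();
        // to maximize instead of minimize, wrap the function (constructor assumed):
        // f = new NegativeDifferentiableFunction( f );

        double[] start = { 5, -3 };
        // placeholder: pick whichever constant of the enum fits your needs
        Optimizer.TerminationCondition mode = Optimizer.TerminationCondition.values()[0];

        int iterations = Optimizer.optimize( Optimizer.QUASI_NEWTON_BFGS, f, start, mode,
                1E-6, 1E-9, new ConstantStartDistance( 1 ), null ); // out == null: no progress output
        // start now contains the (approximate) minimum vector
        System.out.println( iterations + " iterations, minimum at (" + start[0] + ", " + start[1] + ")" );
    }
}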

Author:
Jens Keilwagen
See Also:
OneDimensionalFunction, DifferentiableFunction, NegativeOneDimensionalFunction, NegativeDifferentiableFunction, NumericalDifferentiableFunction

Nested Class Summary
static class Optimizer.TerminationCondition
          This enum defines all kinds of termination conditions for the Optimizer.
 
Field Summary
static byte CONJUGATE_GRADIENTS_FR
          This constant can be used to specify that conjugate gradients (by Fletcher and Reeves) should be used in the optimize-method.
static byte CONJUGATE_GRADIENTS_PRP
          This constant can be used to specify that conjugate gradients (by Polak and Ribiere) should be used in the optimize-method.
static byte QUASI_NEWTON_BFGS
          This constant can be used to specify that the quasi Newton method of Broyden-Fletcher-Goldfarb-Shanno should be used in the optimize-method.
static byte QUASI_NEWTON_DFP
          This constant can be used to specify that the quasi Newton method of Davidon-Fletcher-Powell should be used in the optimize-method.
static byte STEEPEST_DESCENT
          This constant can be used to specify that steepest descent should be used in the optimize-method.
 
Constructor Summary
Optimizer()
           
 
Method Summary
static double[] brentsMethod(OneDimensionalFunction f, double a, double x, double b, double tol)
          Approximates a minimum (not necessarily the global one) in the interval [a,b].
static double[] brentsMethod(OneDimensionalFunction f, double a, double x, double fx, double b, double tol)
          Approximates a minimum (not necessarily the global one) in the interval [a,b].
static int conjugateGradientsFR(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The conjugate gradient algorithm by Fletcher and Reeves.
static int conjugateGradientsPR(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The conjugate gradient algorithm by Polak and Ribiere.
static int conjugateGradientsPRP(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The conjugate gradient algorithm by Polak and Ribiere in the variant called Polak-Ribiere-Positive.
static double[] findBracket(OneDimensionalFunction f, double lower, double startDistance)
          This method returns a bracket containing a minimum.
static double[] findBracket(OneDimensionalFunction f, double lower, double fLower, double startDistance)
          This method returns a bracket containing a minimum.
static double[] goldenRatio(OneDimensionalFunction f, double lower, double upper, double eps)
          Approximates a minimum (not necessarily the global one) in the interval [lower,upper].
static double[] goldenRatio(OneDimensionalFunction f, double lower, double p1, double fP1, double upper, double eps)
          Approximates a minimum (not necessarily the global one) in the interval [lower,upper].
static int limitedMemoryBFGS(DifferentiableFunction f, double[] currentValues, byte m, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno version of the limited memory quasi Newton methods.
static int optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out)
          This method enables you to use all of the implemented optimization algorithms via one single method.
static int optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          This method enables you to use all of the implemented optimization algorithms via one single method.
static double[] parabolicInterpolation(OneDimensionalFunction f, double lower, double p1, double upper, double eps)
          Approximates a minimum (not necessarily the global one) in the interval [lower,upper].
static double[] parabolicInterpolation(OneDimensionalFunction f, double lower, double fLower, double p1, double fP1, double upper, double fUpper, double eps)
          Approximates a minimum (not necessarily the global one) in the interval [lower,upper].
static int quasiNewtonBFGS(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno version of the quasi Newton method.
static int quasiNewtonDFP(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The Davidon-Fletcher-Powell version of the quasi Newton method.
static int steepestDescent(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The steepest descent.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

STEEPEST_DESCENT

public static final byte STEEPEST_DESCENT
This constant can be used to specify that steepest descent should be used in the optimize-method.

See Also:
Constant Field Values

CONJUGATE_GRADIENTS_FR

public static final byte CONJUGATE_GRADIENTS_FR
This constant can be used to specify that conjugate gradients (by Fletcher and Reeves) should be used in the optimize-method.

See Also:
Constant Field Values

CONJUGATE_GRADIENTS_PRP

public static final byte CONJUGATE_GRADIENTS_PRP
This constant can be used to specify that conjugate gradients (by Polak and Ribiere) should be used in the optimize-method.

See Also:
Constant Field Values

QUASI_NEWTON_DFP

public static final byte QUASI_NEWTON_DFP
This constant can be used to specify that the quasi Newton method of Davidon-Fletcher-Powell should be used in the optimize-method.

See Also:
Constant Field Values

QUASI_NEWTON_BFGS

public static final byte QUASI_NEWTON_BFGS
This constant can be used to specify that the quasi Newton method of Broyden-Fletcher-Goldfarb-Shanno should be used in the optimize-method.

See Also:
Constant Field Values
Constructor Detail

Optimizer

public Optimizer()
Method Detail

findBracket

public static double[] findBracket(OneDimensionalFunction f,
                                   double lower,
                                   double fLower,
                                   double startDistance)
                            throws EvaluationException
This method returns a bracket containing a minimum. This method can be used to find an initial bracket for a minimization method.

Parameters:
f - the function to be minimized
lower - the initial value of x, lower bound
fLower - the value f(x)
startDistance - the initial distance for bracketing the minimum
Returns:
an array containing x_lower, f(x_lower), x_middle, f(x_middle), x_upper and f(x_upper) at positions 0 to 5
Throws:
EvaluationException - if there was a mistake in evaluating the function
See Also:
brentsMethod( OneDimensionalFunction f, double a, double x, double fx, double b, double tol ), goldenRatio( OneDimensionalFunction f, double lower, double p1, double fP1, double upper, double eps )

findBracket

public static double[] findBracket(OneDimensionalFunction f,
                                   double lower,
                                   double startDistance)
                            throws EvaluationException
This method returns a bracket containing a minimum.

Parameters:
f - the function to be minimized
lower - the initial value of x, lower bound
startDistance - the initial distance for bracketing the minimum
Returns:
an array containing x_lower, f(x_lower), x_inside, f(x_inside), x_upper and f(x_upper) at positions 0 to 5
Throws:
EvaluationException - if there was a mistake in evaluating the function
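
The following sketch shows how a bracket found by this method can be passed to brentsMethod, using the documented return layout (positions 0 to 5: x_lower, f(x_lower), x_inside, f(x_inside), x_upper, f(x_upper)). It is only an illustration and assumes that OneDimensionalFunction is an abstract class whose only abstract method is evaluateFunction(double); if the actual abstract methods differ, the anonymous subclass has to be adapted.

import de.jstacs.algorithms.optimization.*;

public class LineSearchExample {

    public static void main( String[] args ) throws Exception {
        // f(x) = (x - 3)^2, minimal at x = 3
        OneDimensionalFunction f = new OneDimensionalFunction() {
            public double evaluateFunction( double x ) {
                return ( x - 3 ) * ( x - 3 );
            }
        };

        // bracket the minimum, starting at x = 0 with initial distance 1
        double[] bracket = Optimizer.findBracket( f, 0, 1 );

        // refine the bracket with Brent's method: a = bracket[0], x = bracket[2], fx = bracket[3], b = bracket[4]
        double[] min = Optimizer.brentsMethod( f, bracket[0], bracket[2], bracket[3], bracket[4], 1E-9 );

        System.out.println( "minimizer: " + min[0] + ", value: " + min[1] );
    }
}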

brentsMethod

public static double[] brentsMethod(OneDimensionalFunction f,
                                    double a,
                                    double x,
                                    double fx,
                                    double b,
                                    double tol)
                             throws EvaluationException
Approximates a minimum (not necessarily the global one) in the interval [a,b]. The method uses Brent's method.

This method is a modified copy of a method downloaded from the internet.

Parameters:
f - the function to be minimized
a - the initial lower boundary
x - an abscissa inside the interval [a,b]
fx - the ordinate corresponding to x
b - the initial upper bound
tol - the termination parameter (interval size)
Returns:
the minimum x (position 0) and the value of minimum f(x) (position 1)
Throws:
EvaluationException - if there was a mistake in evaluating the function

brentsMethod

public static double[] brentsMethod(OneDimensionalFunction f,
                                    double a,
                                    double x,
                                    double b,
                                    double tol)
                             throws EvaluationException
Approximates a minimum (not necessarily the global one) in the interval [a,b]. The method uses Brent's method.

This method is a modified copy of a method downloaded from the internet.

Parameters:
f - the function to be minimized
a - the initial lower boundary
x - an abscissa inside the interval [a,b]
b - the initial upper bound
tol - the termination parameter (interval size)
Returns:
the minimum x (position 0) and the value of minimum f(x) (position 1)
Throws:
EvaluationException - if there was a mistake in evaluating the function

goldenRatio

public static double[] goldenRatio(OneDimensionalFunction f,
                                   double lower,
                                   double p1,
                                   double fP1,
                                   double upper,
                                   double eps)
                            throws EvaluationException
Approximates a minimum (not necessarily the global one) in the interval [lower,upper]. The method uses the golden section search.

Parameters:
f - the function to be minimized
lower - the lower interval boundary
p1 - an abscissa in the interval
fP1 - the ordinate corresponding to p1
upper - the upper interval boundary
eps - the minimal interval size
Returns:
the minimum x (position 0) and the value of minimum f(x) (position 1)
Throws:
EvaluationException - if there was a mistake in evaluating the function

goldenRatio

public static double[] goldenRatio(OneDimensionalFunction f,
                                   double lower,
                                   double upper,
                                   double eps)
                            throws EvaluationException
Approximates a minimum (not necessarily the global one) in the interval [lower,upper]. The method uses the golden section search.

Parameters:
f - the function to be minimized
lower - the lower interval boundary
upper - the upper interval boundary
eps - the minimal interval size
Returns:
the minimum x (position 0) and the value of minimum f(x) (position 1)
Throws:
EvaluationException - if there was a mistake in evaluating the function

parabolicInterpolation

public static double[] parabolicInterpolation(OneDimensionalFunction f,
                                              double lower,
                                              double fLower,
                                              double p1,
                                              double fP1,
                                              double upper,
                                              double fUpper,
                                              double eps)
                                       throws Exception
Approximates a minimum (not necessarily the global one) in the interval [lower,upper]. The method uses parabolic interpolation.

This method should work well for convex functions.

Parameters:
f - the function to be minimized
lower - the lower interval boundary
fLower - the ordinate corresponding to lower
p1 - an abscissa in the interval
fP1 - the ordinate corresponding to p1
upper - the upper interval boundary
fUpper - the ordinate corresponding to upper
eps - the minimal interval size
Returns:
the minimum x (position 0) and the value of minimum f(x) (position 1)
Throws:
Exception - if there was a mistake in evaluating the function

parabolicInterpolation

public static double[] parabolicInterpolation(OneDimensionalFunction f,
                                              double lower,
                                              double p1,
                                              double upper,
                                              double eps)
                                       throws Exception
Approximates a minimum (not necessarily the global one) in the interval [lower,upper]. The method uses parabolic interpolation.

Parameters:
f - the function to be minimized
lower - the lower interval boundary
p1 - an abscissa in the interval
upper - the upper interval boundary
eps - the minimal interval size
Returns:
the minimum x (position 0) and the value of minimum f(x) (position 1)
Throws:
Exception - if there was a mistake in evaluating the function

steepestDescent

public static int steepestDescent(DifferentiableFunction f,
                                  double[] currentValues,
                                  Optimizer.TerminationCondition terminationMode,
                                  double eps,
                                  double linEps,
                                  StartDistanceForecaster startDistance,
                                  SafeOutputStream out,
                                  Time t)
                           throws DimensionException,
                                  TerminationException,
                                  EvaluationException,
                                  IOException
The steepest descent.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

conjugateGradientsFR

public static int conjugateGradientsFR(DifferentiableFunction f,
                                       double[] currentValues,
                                       Optimizer.TerminationCondition terminationMode,
                                       double eps,
                                       double linEps,
                                       StartDistanceForecaster startDistance,
                                       SafeOutputStream out,
                                       Time t)
                                throws DimensionException,
                                       TerminationException,
                                       IOException,
                                       EvaluationException
The conjugate gradient algorithm by Fletcher and Reeves.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

conjugateGradientsPR

public static int conjugateGradientsPR(DifferentiableFunction f,
                                       double[] currentValues,
                                       Optimizer.TerminationCondition terminationMode,
                                       double eps,
                                       double linEps,
                                       StartDistanceForecaster startDistance,
                                       SafeOutputStream out,
                                       Time t)
                                throws DimensionException,
                                       TerminationException,
                                       IOException,
                                       EvaluationException
The conjugate gradient algorithm by Polak and Ribiere.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

conjugateGradientsPRP

public static int conjugateGradientsPRP(DifferentiableFunction f,
                                        double[] currentValues,
                                        Optimizer.TerminationCondition terminationMode,
                                        double eps,
                                        double linEps,
                                        StartDistanceForecaster startDistance,
                                        SafeOutputStream out,
                                        Time t)
                                 throws DimensionException,
                                        TerminationException,
                                        IOException,
                                        EvaluationException
The conjugate gradient algorithm by Polak and Ribiere in the variant called Polak-Ribiere-Positive.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

quasiNewtonDFP

public static int quasiNewtonDFP(DifferentiableFunction f,
                                 double[] currentValues,
                                 Optimizer.TerminationCondition terminationMode,
                                 double eps,
                                 double linEps,
                                 StartDistanceForecaster startDistance,
                                 SafeOutputStream out,
                                 Time t)
                          throws DimensionException,
                                 TerminationException,
                                 IOException,
                                 EvaluationException
The Davidon-Fletcher-Powell version of the quasi Newton method.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

quasiNewtonBFGS

public static int quasiNewtonBFGS(DifferentiableFunction f,
                                  double[] currentValues,
                                  Optimizer.TerminationCondition terminationMode,
                                  double eps,
                                  double linEps,
                                  StartDistanceForecaster startDistance,
                                  SafeOutputStream out,
                                  Time t)
                           throws DimensionException,
                                  TerminationException,
                                  IOException,
                                  EvaluationException
The Broyden-Fletcher-Goldfarb-Shanno version of the quasi Newton method.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

limitedMemoryBFGS

public static int limitedMemoryBFGS(DifferentiableFunction f,
                                    double[] currentValues,
                                    byte m,
                                    Optimizer.TerminationCondition terminationMode,
                                    double eps,
                                    double linEps,
                                    StartDistanceForecaster startDistance,
                                    SafeOutputStream out,
                                    Time t)
                             throws DimensionException,
                                    TerminationException,
                                    IOException,
                                    EvaluationException
The Broyden-Fletcher-Goldfarb-Shanno version of the limited memory quasi Newton methods.

Parameters:
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
m - the number of vectors used to approximate the Hessian, typically between 3 and 10
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output

optimize

public static int optimize(byte algorithm,
                           DifferentiableFunction f,
                           double[] currentValues,
                           Optimizer.TerminationCondition terminationMode,
                           double eps,
                           double linEps,
                           StartDistanceForecaster startDistance,
                           SafeOutputStream out)
                    throws DimensionException,
                           TerminationException,
                           IOException,
                           EvaluationException
This method enables you to use all of the implemented optimization algorithms via one single method. You just have to change the parameter algorithm.

Parameters:
algorithm - the algorithm that should be used; use a value x with 2 < x < 11 for limitedMemoryBFGS with m=x, or one of the constants of this class
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output
See Also:
STEEPEST_DESCENT, CONJUGATE_GRADIENTS_FR, CONJUGATE_GRADIENTS_PRP, QUASI_NEWTON_DFP, QUASI_NEWTON_BFGS
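
For illustration only, reusing f, start and mode from the sketch in the class description (and again assuming the ConstantStartDistance implementation), the algorithm parameter can be chosen as follows:

// quasi Newton BFGS via the named constant
Optimizer.optimize( Optimizer.QUASI_NEWTON_BFGS, f, start, mode, 1E-6, 1E-9, new ConstantStartDistance( 1 ), null );

// limited memory BFGS with m = 5: any byte x with 2 < x < 11 selects limitedMemoryBFGS with m = x
Optimizer.optimize( (byte) 5, f, start, mode, 1E-6, 1E-9, new ConstantStartDistance( 1 ), null );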

optimize

public static int optimize(byte algorithm,
                           DifferentiableFunction f,
                           double[] currentValues,
                           Optimizer.TerminationCondition terminationMode,
                           double eps,
                           double linEps,
                           StartDistanceForecaster startDistance,
                           SafeOutputStream out,
                           Time t)
                    throws DimensionException,
                           TerminationException,
                           IOException,
                           EvaluationException
This method enables you to use all of the implemented optimization algorithms via one single method. You just have to change the parameter algorithm.

Parameters:
algorithm - the algorithm that should be used; use a value x with 2 < x < 11 for limitedMemoryBFGS with m=x, or one of the constants of this class
f - the function to be minimized
currentValues - the start vector at the beginning; contains the minimum vector at the end
terminationMode - the choice of how to terminate the algorithm
eps - the bound for stopping the algorithm
linEps - the bound for stopping the linear search
startDistance - the initial step for the linear search
out - the SafeOutputStream for writing some information about the iterations; if null, nothing will be written
t - a Time instance used to measure the elapsed time
Returns:
the number of iterations
Throws:
EvaluationException - if there was a mistake in evaluating the function
TerminationException - if the termination condition is unknown
DimensionException - if the dimension of the scope of the function and the array of arguments do not match
IOException - if an I/O error occurred while writing the output
See Also:
STEEPEST_DESCENT, CONJUGATE_GRADIENTS_FR, CONJUGATE_GRADIENTS_PRP, QUASI_NEWTON_DFP, QUASI_NEWTON_BFGS