Uses of Class
de.jstacs.algorithms.optimization.DimensionException

Packages that use DimensionException
de.jstacs.algorithms.optimization Provides classes for different types of algorithms that are not directly linked to the modelling components of Jstacs: algorithms on graphs, algorithms for numerical optimization, and a basic alignment algorithm.
de.jstacs.classifiers.differentiableSequenceScoreBased Provides the classes for Classifiers that are based on SequenceScores. 
de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix Provides an implementation of a classifier that allows training the parameters of a set of DifferentiableStatisticalModels by a unified generative-discriminative learning principle.
de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior Provides a general definition of a parameter log-prior and a number of implementations of Laplace and Gaussian priors.
 

Uses of DimensionException in de.jstacs.algorithms.optimization
 

Methods in de.jstacs.algorithms.optimization that throw DimensionException
static int Optimizer.conjugateGradientsFR(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The conjugate gradient algorithm by Fletcher and Reeves.
static int Optimizer.conjugateGradientsPR(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The conjugate gradient algorithm by Polak and Ribière.
static int Optimizer.conjugateGradientsPRP(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The variant of the conjugate gradient algorithm by Polak and Ribière known as "Polak-Ribière-Positive".
 double OneDimensionalFunction.evaluateFunction(double[] x)
           
 double NegativeOneDimensionalFunction.evaluateFunction(double[] x)
           
 double NegativeFunction.evaluateFunction(double[] x)
           
 double NegativeDifferentiableFunction.evaluateFunction(double[] x)
           
 double Function.evaluateFunction(double[] x)
          Evaluates the function at a certain vector (in the mathematical sense) x.
 double[] NumericalDifferentiableFunction.evaluateGradientOfFunction(double[] x)
          Evaluates the gradient of a function at a certain vector (in the mathematical sense) x numerically.
 double[] NegativeDifferentiableFunction.evaluateGradientOfFunction(double[] x)
           
abstract  double[] DifferentiableFunction.evaluateGradientOfFunction(double[] x)
          Evaluates the gradient of a function at a certain vector (in the mathematical sense) x, i.e., $\nabla f(\underline{x}) = \left(\frac{\partial f(\underline{x})}{\partial x_1},\ldots,\frac{\partial f(\underline{x})}{\partial x_n}\right)$.
protected  double[] DifferentiableFunction.findOneDimensionalMin(double[] x, double[] d, double alpha_0, double fAlpha_0, double linEps, double startDistance)
          This method is used to find an approximation of the minimum of a one-dimensional subfunction.
static int Optimizer.limitedMemoryBFGS(DifferentiableFunction f, double[] currentValues, byte m, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno version of limited memory quasi-Newton methods.
static int Optimizer.optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out)
          This method provides a single entry point to all implemented optimization algorithms.
static int Optimizer.optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          This method provides a single entry point to all implemented optimization algorithms.
static int Optimizer.quasiNewtonBFGS(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno version of the quasi-Newton method.
static int Optimizer.quasiNewtonDFP(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The Davidon-Fletcher-Powell version of the quasi-Newton method.
static int Optimizer.steepestDescent(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The steepest descent algorithm.
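All of the methods above declare DimensionException for the same reason: the parameter vector x passed to evaluateFunction or evaluateGradientOfFunction must match the dimension of the function's scope. The following is a minimal, self-contained sketch of that contract; the classes here are simplified stand-ins for the Jstacs types, not the actual implementation.

```java
// Simplified stand-in for de.jstacs.algorithms.optimization.DimensionException,
// so the dimension contract can be shown without the Jstacs jar.
class DimensionException extends Exception {
    DimensionException(String message) { super(message); }
}

// Illustrative function f(x) = sum_i x_i^2 with gradient 2x; both methods
// reject a vector whose length does not match the expected dimension.
class Quadratic {
    private final int dimension;

    Quadratic(int dimension) { this.dimension = dimension; }

    double evaluateFunction(double[] x) throws DimensionException {
        checkDimension(x);
        double sum = 0;
        for (double v : x) sum += v * v;
        return sum;
    }

    double[] evaluateGradientOfFunction(double[] x) throws DimensionException {
        checkDimension(x);
        double[] gradient = new double[dimension];
        for (int i = 0; i < dimension; i++) gradient[i] = 2 * x[i];
        return gradient;
    }

    private void checkDimension(double[] x) throws DimensionException {
        if (x.length != dimension)
            throw new DimensionException("expected " + dimension + " parameters, got " + x.length);
    }
}

class DimensionDemo {
    public static void main(String[] args) throws DimensionException {
        Quadratic f = new Quadratic(2);
        System.out.println(f.evaluateFunction(new double[]{3, 4})); // prints 25.0
        try {
            f.evaluateFunction(new double[]{1, 2, 3});              // wrong length
        } catch (DimensionException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

An optimizer driving such a function can therefore fail fast on a mis-sized start vector instead of silently reading out of bounds.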
 

Constructors in de.jstacs.algorithms.optimization that throw DimensionException
OneDimensionalSubFunction(Function f, double[] current, double[] d)
          Creates a new OneDimensionalSubFunction from a Function f for the line search.
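The constructor above reduces a multi-dimensional Function to a one-dimensional one for the line search: g(alpha) = f(current + alpha * d). A self-contained sketch of this reduction follows; the class name is an illustrative stand-in (not the actual Jstacs implementation), and IllegalArgumentException stands in for DimensionException.

```java
import java.util.function.ToDoubleFunction;

// Sketch of the line-search reduction performed by OneDimensionalSubFunction:
// g(alpha) = f(current + alpha * d). Illustrative stand-in, not the Jstacs code.
class OneDimensionalSubFunctionSketch {
    private final ToDoubleFunction<double[]> f; // stand-in for Function.evaluateFunction
    private final double[] current;
    private final double[] d;

    OneDimensionalSubFunctionSketch(ToDoubleFunction<double[]> f, double[] current, double[] d) {
        if (current.length != d.length) // stands in for throwing DimensionException
            throw new IllegalArgumentException("current and d must have equal length");
        this.f = f;
        this.current = current.clone();
        this.d = d.clone();
    }

    // Evaluates g(alpha) = f(current + alpha * d).
    double evaluate(double alpha) {
        double[] x = new double[current.length];
        for (int i = 0; i < x.length; i++) x[i] = current[i] + alpha * d[i];
        return f.applyAsDouble(x);
    }

    public static void main(String[] args) {
        // f(x) = x1^2 + x2^2, start at (3, 0), search direction (-1, 0):
        OneDimensionalSubFunctionSketch g = new OneDimensionalSubFunctionSketch(
                x -> x[0] * x[0] + x[1] * x[1], new double[]{3, 0}, new double[]{-1, 0});
        System.out.println(g.evaluate(1.0)); // f(2, 0) -> prints 4.0
        System.out.println(g.evaluate(3.0)); // f(0, 0) -> prints 0.0
    }
}
```

This is exactly the shape a one-dimensional minimizer needs: a single scalar argument alpha, with the vector arithmetic hidden inside.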
 

Uses of DimensionException in de.jstacs.classifiers.differentiableSequenceScoreBased
 

Methods in de.jstacs.classifiers.differentiableSequenceScoreBased that throw DimensionException
 double AbstractMultiThreadedOptimizableFunction.evaluateFunction(double[] x)
           
 double[] AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(double[] x)
           
protected abstract  double AbstractMultiThreadedOptimizableFunction.joinFunction()
          This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int).
abstract  void OptimizableFunction.setParams(double[] current)
          Sets the current values as parameters.
 void AbstractMultiThreadedOptimizableFunction.setParams(double[] params)
           
protected  void DiffSSBasedOptimizableFunction.setParams(int index)
           
protected abstract  void AbstractMultiThreadedOptimizableFunction.setParams(int index)
          This method sets the parameters for the thread with the given index.
protected  void DiffSSBasedOptimizableFunction.setThreadIndependentParameters()
           
protected abstract  void AbstractMultiThreadedOptimizableFunction.setThreadIndependentParameters()
          This method allows setting thread-independent parameters.
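The split between setParams(double[]), the per-thread setParams(int), and setThreadIndependentParameters() suggests a broadcast pattern: store the parameter vector once, set shared state once, then push per-thread state for each worker. A hedged sketch of that pattern, with illustrative names that are not the Jstacs implementation:

```java
// Hedged sketch of the parameter-broadcast pattern suggested by
// AbstractMultiThreadedOptimizableFunction; names and structure are
// assumptions for illustration, not the Jstacs implementation.
abstract class MultiThreadedFunctionSketch {
    protected double[] params;
    private final int threads;

    MultiThreadedFunctionSketch(int threads) { this.threads = threads; }

    // Entry point: keep one canonical copy of the parameters, set state
    // shared by all threads once, then set per-thread state per worker.
    void setParams(double[] current) {
        this.params = current.clone();
        setThreadIndependentParameters();
        for (int index = 0; index < threads; index++) {
            setParams(index);
        }
    }

    // Per-thread hook: push the current parameters to the worker with this index.
    protected abstract void setParams(int index);

    // Shared hook: state that does not depend on the thread, set exactly once.
    protected abstract void setThreadIndependentParameters();
}

// Minimal subclass that records how often each hook fires.
class CountingFunction extends MultiThreadedFunctionSketch {
    int perThreadCalls = 0;
    int sharedCalls = 0;

    CountingFunction(int threads) { super(threads); }

    @Override protected void setParams(int index) { perThreadCalls++; }
    @Override protected void setThreadIndependentParameters() { sharedCalls++; }

    public static void main(String[] args) {
        CountingFunction f = new CountingFunction(4);
        f.setParams(new double[]{0.5, 1.5});
        System.out.println(f.perThreadCalls); // prints 4
        System.out.println(f.sharedCalls);    // prints 1
    }
}
```

The design choice this illustrates: shared work (e.g. normalization constants) is done once, while each worker only receives the slice of state it needs.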
 

Uses of DimensionException in de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix
 

Methods in de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix that throw DimensionException
protected  double LogGenDisMixFunction.joinFunction()
           
 

Uses of DimensionException in de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior
 

Methods in de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior that throw DimensionException
 double SeparateLaplaceLogPrior.evaluateFunction(double[] x)
           
 double SeparateGaussianLogPrior.evaluateFunction(double[] x)
           
 double CompositeLogPrior.evaluateFunction(double[] x)
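For the priors above, DimensionException again signals a parameter vector of the wrong length. As a worked example of what such an evaluateFunction computes, here is a hedged sketch in the spirit of SeparateGaussianLogPrior: a zero-mean Gaussian log-prior with one common variance, evaluated up to its additive normalization constant. The class name and the single-variance simplification are assumptions, not the Jstacs implementation, and IllegalArgumentException stands in for DimensionException.

```java
// Hedged sketch of a Gaussian log-prior: log p(x) = -sum_i x_i^2 / (2 sigma^2)
// up to an additive constant. Illustrative stand-in, not the Jstacs code.
class GaussianLogPriorSketch {
    private final double sigmaSquared;
    private final int dimension;

    GaussianLogPriorSketch(double sigmaSquared, int dimension) {
        this.sigmaSquared = sigmaSquared;
        this.dimension = dimension;
    }

    // A wrong-length vector is the condition DimensionException signals in Jstacs;
    // here IllegalArgumentException stands in for it.
    double evaluateFunction(double[] x) {
        if (x.length != dimension)
            throw new IllegalArgumentException("expected " + dimension + " parameters, got " + x.length);
        double sumOfSquares = 0;
        for (double v : x) sumOfSquares += v * v;
        return -sumOfSquares / (2 * sigmaSquared);
    }

    public static void main(String[] args) {
        GaussianLogPriorSketch prior = new GaussianLogPriorSketch(1.0, 2);
        System.out.println(prior.evaluateFunction(new double[]{1, 1})); // prints -1.0
    }
}
```

Because the log-prior is just added to the log-likelihood during optimization, it must accept exactly the same parameter vector as the function it regularizes, which is why it shares the DimensionException contract.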