Uses of Class
de.jstacs.algorithms.optimization.DifferentiableFunction

Packages that use DifferentiableFunction
de.jstacs.algorithms.optimization Provides classes for different types of algorithms that are not directly linked to the modelling components of Jstacs: Algorithms on graphs, algorithms for numerical optimization, and a basic alignment algorithm.
de.jstacs.classifier.scoringFunctionBased Provides the classes for classifiers that are based on ScoringFunctions. 
de.jstacs.classifier.scoringFunctionBased.cll Provides the implementation of the log conditional likelihood as an OptimizableFunction, and a classifier that uses the log conditional likelihood or the supervised posterior to learn the parameters of a set of ScoringFunctions. 
de.jstacs.classifier.scoringFunctionBased.logPrior Provides a general definition of a parameter log-prior and a number of implementations, including Laplace and Gaussian priors. 
 

Uses of DifferentiableFunction in de.jstacs.algorithms.optimization
 

Subclasses of DifferentiableFunction in de.jstacs.algorithms.optimization
 class NegativeDifferentiableFunction
          The function -f for a given function f.
 class NumericalDifferentiableFunction
          This class is the framework for any function f: R^n -> R whose partial derivatives are not computed analytically but approximated numerically.
 
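The two subclasses above capture a common pattern: a differentiable function either negates another one (to turn maximization into minimization) or approximates its gradient by finite differences. The following is a self-contained sketch of that pattern in plain Java; the interface and method names here are illustrative stand-ins, not the actual Jstacs API.

```java
// Sketch of the DifferentiableFunction pattern. The names Differentiable,
// numerical, and negate are hypothetical analogues of DifferentiableFunction,
// NumericalDifferentiableFunction, and NegativeDifferentiableFunction.
public class DifferentiableFunctionSketch {

    // Analogue of DifferentiableFunction: a function f: R^n -> R with a gradient.
    interface Differentiable {
        double evaluate(double[] x);
        double[] gradient(double[] x);
    }

    // Analogue of NumericalDifferentiableFunction: the gradient is
    // approximated by symmetric finite differences with step size eps.
    static Differentiable numerical(java.util.function.ToDoubleFunction<double[]> f, double eps) {
        return new Differentiable() {
            public double evaluate(double[] x) { return f.applyAsDouble(x); }
            public double[] gradient(double[] x) {
                double[] g = new double[x.length];
                for (int i = 0; i < x.length; i++) {
                    double old = x[i];
                    x[i] = old + eps; double up   = f.applyAsDouble(x);
                    x[i] = old - eps; double down = f.applyAsDouble(x);
                    x[i] = old;
                    g[i] = (up - down) / (2 * eps); // central difference
                }
                return g;
            }
        };
    }

    // Analogue of NegativeDifferentiableFunction: -f, so that a minimizer
    // applied to -f maximizes f.
    static Differentiable negate(Differentiable f) {
        return new Differentiable() {
            public double evaluate(double[] x) { return -f.evaluate(x); }
            public double[] gradient(double[] x) {
                double[] g = f.gradient(x);
                for (int i = 0; i < g.length; i++) g[i] = -g[i];
                return g;
            }
        };
    }

    public static void main(String[] args) {
        Differentiable f = numerical(x -> x[0] * x[0] + 3 * x[1] * x[1], 1e-6);
        double[] g = f.gradient(new double[]{1.0, 2.0}); // ≈ [2, 12]
        System.out.printf("gradient: %.3f %.3f%n", g[0], g[1]);
        System.out.printf("negated value: %.3f%n", negate(f).evaluate(new double[]{1.0, 2.0}));
    }
}
```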

Methods in de.jstacs.algorithms.optimization with parameters of type DifferentiableFunction
static int Optimizer.conjugateGradientsFR(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The conjugate gradient algorithm by Fletcher and Reeves.
static int Optimizer.conjugateGradientsPR(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The conjugate gradient algorithm by Polak and Ribiere.
static int Optimizer.conjugateGradientsPRP(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The Polak-Ribiere-Positive (PRP+) variant of the conjugate gradient algorithm by Polak and Ribiere.
static int Optimizer.limitedMemoryBFGS(DifferentiableFunction f, double[] currentValues, byte m, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) quasi-Newton method.
static int Optimizer.optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out)
          This method provides a single entry point to all implemented optimization algorithms; the desired algorithm is selected via the algorithm parameter.
static int Optimizer.optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          This method provides a single entry point to all implemented optimization algorithms; this variant additionally accepts a Time instance for measuring the elapsed time.
static int Optimizer.quasiNewtonBFGS(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method.
static int Optimizer.quasiNewtonDFP(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The Davidon-Fletcher-Powell (DFP) quasi-Newton method.
static int Optimizer.steepestDescent(DifferentiableFunction f, double[] currentValues, Optimizer.TerminationCondition terminationMode, double eps, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, Time t)
          The steepest descent (gradient descent) method.
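All of the methods above share the same shape: they take a DifferentiableFunction f, a start point currentValues that is updated in place, and tolerances eps and linEps, and they return the number of iterations used. The following plain-Java sketch shows this shape for the simplest case, steepest descent with a fixed step size; it is an illustration of the calling pattern, not the Jstacs implementation, which additionally performs a line search and supports configurable termination conditions and progress output.

```java
// Illustrative steepest-descent loop in the spirit of Optimizer.steepestDescent.
// The fixed learningRate is a simplification; the real method uses a line search.
public class SteepestDescentSketch {

    // Updates currentValues in place and returns the number of iterations,
    // mirroring the int return value of the Optimizer methods.
    public static int steepestDescent(java.util.function.Function<double[], double[]> gradient,
                                      double[] currentValues, double learningRate,
                                      double eps, int maxIterations) {
        for (int it = 0; it < maxIterations; it++) {
            double[] g = gradient.apply(currentValues);
            double norm = 0;
            for (double gi : g) norm += gi * gi;
            if (Math.sqrt(norm) < eps) return it;        // converged: tiny gradient
            for (int i = 0; i < currentValues.length; i++)
                currentValues[i] -= learningRate * g[i]; // step against the gradient
        }
        return maxIterations;
    }

    public static void main(String[] args) {
        // Minimize f(x, y) = (x - 1)^2 + (y + 2)^2; its gradient is (2(x-1), 2(y+2)).
        double[] x = {5.0, 5.0};
        int iterations = steepestDescent(
            v -> new double[]{2 * (v[0] - 1), 2 * (v[1] + 2)}, x, 0.1, 1e-8, 10000);
        System.out.printf("minimum near (%.4f, %.4f) after %d iterations%n",
                          x[0], x[1], iterations);
    }
}
```

The same calling convention explains why NegativeDifferentiableFunction exists: these routines minimize, so a likelihood to be maximized is wrapped in its negation first.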
 

Constructors in de.jstacs.algorithms.optimization with parameters of type DifferentiableFunction
NegativeDifferentiableFunction(DifferentiableFunction f)
          Creates the negated function -f for the given DifferentiableFunction f.
 

Uses of DifferentiableFunction in de.jstacs.classifier.scoringFunctionBased
 

Subclasses of DifferentiableFunction in de.jstacs.classifier.scoringFunctionBased
 class OptimizableFunction
          This is the main function for the ScoreClassifier.
 

Uses of DifferentiableFunction in de.jstacs.classifier.scoringFunctionBased.cll
 

Subclasses of DifferentiableFunction in de.jstacs.classifier.scoringFunctionBased.cll
 class NormConditionalLogLikelihood
          This class implements the normalized log conditional likelihood.
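For orientation, the conditional log-likelihood of class labels given input sequences has the standard form below; the exact normalization used by NormConditionalLogLikelihood (e.g. division by the number of sequences) is an implementation detail not shown here.

```latex
\log CL(\boldsymbol{\lambda}) \;=\; \sum_{n=1}^{N} \log P(c_n \mid \mathbf{x}_n, \boldsymbol{\lambda}),
```

where $c_n$ is the class of the $n$-th sequence $\mathbf{x}_n$ and $\boldsymbol{\lambda}$ are the parameters of the ScoringFunctions.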
 

Uses of DifferentiableFunction in de.jstacs.classifier.scoringFunctionBased.logPrior
 

Subclasses of DifferentiableFunction in de.jstacs.classifier.scoringFunctionBased.logPrior
 class DoesNothingLogPrior
          This class defines a LogPrior that does not penalize any parameter.
 class LogPrior
          The abstract class for any log-prior used, e.g., for maximum supervised posterior optimization.
 class SeparateGaussianLogPrior
          Class for a LogPrior that defines a Gaussian prior on the parameters of a set of NormalizableScoringFunctions and a set of class-parameters.
 class SeparateLaplaceLogPrior
          Class for a LogPrior that defines a Laplace prior on the parameters of a set of NormalizableScoringFunctions and a set of class-parameters.
 class SeparateLogPrior
          Abstract class for priors that penalize each parameter value independently and have a variance (and possibly a mean) as hyper-parameters.
 class SimpleGaussianSumLogPrior
          This class implements a prior that is a product of Gaussian distributions with mean 0 and equal variance for each parameter.
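A product of independent Gaussians with mean 0 and common variance $\sigma^2$ yields, up to an additive constant, the familiar L2 penalty as its log-density; this follows directly from the class description above:

```latex
\log p(\boldsymbol{\lambda}) \;=\; -\sum_{i} \frac{\lambda_i^2}{2\sigma^2} \;+\; \mathrm{const}.
```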