Uses of Class
de.jstacs.algorithms.optimization.DifferentiableFunction

Packages that use DifferentiableFunction
de.jstacs.algorithms.optimization Provides classes for different types of algorithms that are not directly linked to the modelling components of Jstacs: Algorithms on graphs, algorithms for numerical optimization, and a basic alignment algorithm.
de.jstacs.classifiers.differentiableSequenceScoreBased Provides the classes for Classifiers that are based on SequenceScores. 
de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix Provides an implementation of a classifier that allows training the parameters of a set of DifferentiableStatisticalModels by a unified generative-discriminative learning principle. 
de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior Provides a general definition of a parameter log-prior and a number of implementations of Laplace and Gaussian priors. 
de.jstacs.motifDiscovery This package provides the framework, including the interface, for any de novo motif discoverer. 
 

Uses of DifferentiableFunction in de.jstacs.algorithms.optimization
 

Subclasses of DifferentiableFunction in de.jstacs.algorithms.optimization
 class NegativeDifferentiableFunction
          The negative function -f for a given DifferentiableFunction f.
 class NumericalDifferentiableFunction
          This class is the framework for any numerically differentiable function $f: \mathbb{R}^n \to \mathbb{R}$, i.e., a function whose gradient is approximated numerically rather than computed analytically.
 
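As an illustration of this hierarchy, the sketch below implements a small DifferentiableFunction by hand and compares its analytic gradient against a finite-difference approximation, which is the idea that NumericalDifferentiableFunction automates. The abstract method names evaluateFunction, evaluateGradientOfFunction, and getDimensionOfScope follow the DifferentiableFunction contract shown in this documentation; everything else is illustrative.

import de.jstacs.algorithms.optimization.DifferentiableFunction;

public class GradientCheckSketch {
    public static void main(String[] args) throws Exception {
        // A simple quadratic with its minimum at (1, -2) and a known gradient.
        // Method names follow the DifferentiableFunction contract.
        DifferentiableFunction f = new DifferentiableFunction() {
            public double evaluateFunction(double[] x) {
                return (x[0] - 1) * (x[0] - 1) + (x[1] + 2) * (x[1] + 2);
            }
            public double[] evaluateGradientOfFunction(double[] x) {
                return new double[]{ 2 * (x[0] - 1), 2 * (x[1] + 2) };
            }
            public int getDimensionOfScope() {
                return 2;
            }
        };
        // Forward finite differences, as a numerical gradient would compute them.
        double eps = 1E-7;
        double[] x = { 0.5, 0.5 };
        double[] analytic = f.evaluateGradientOfFunction(x);
        for (int i = 0; i < x.length; i++) {
            double[] shifted = x.clone();
            shifted[i] += eps;
            double numerical = (f.evaluateFunction(shifted) - f.evaluateFunction(x)) / eps;
            System.out.println("partial " + i + ": analytic=" + analytic[i] + ", numerical=" + numerical);
        }
    }
}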

Methods in de.jstacs.algorithms.optimization with parameters of type DifferentiableFunction
static int Optimizer.conjugateGradientsFR(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The conjugate gradient algorithm by Fletcher and Reeves.
static int Optimizer.conjugateGradientsPR(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The conjugate gradient algorithm by Polak and Ribière.
static int Optimizer.conjugateGradientsPRP(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The variant of the Polak-Ribière conjugate gradient algorithm known as "Polak-Ribière-Positive".
static int Optimizer.limitedMemoryBFGS(DifferentiableFunction f, double[] currentValues, byte m, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno version of limited memory quasi-Newton methods.
static int Optimizer.optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out)
          This method provides a single entry point to all implemented optimization algorithms.
static int Optimizer.optimize(byte algorithm, DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          This method provides a single entry point to all implemented optimization algorithms, with an explicitly supplied Time object.
static int Optimizer.quasiNewtonBFGS(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The Broyden-Fletcher-Goldfarb-Shanno version of the quasi-Newton method.
static int Optimizer.quasiNewtonDFP(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The Davidon-Fletcher-Powell version of the quasi-Newton method.
static int Optimizer.steepestDescent(DifferentiableFunction f, double[] currentValues, TerminationCondition terminationMode, double linEps, StartDistanceForecaster startDistance, OutputStream out, Time t)
          The steepest descent algorithm.
 
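All of these routines minimize f, overwriting currentValues in place with the parameter values reached. A minimal usage sketch follows; the algorithm constant Optimizer.QUASI_NEWTON_BFGS and the helper classes SmallDifferenceOfFunctionEvaluationsCondition and ConstantStartDistance are standard Jstacs names that do not appear on this page, so treat them as assumptions:

import java.util.Arrays;
import de.jstacs.algorithms.optimization.ConstantStartDistance;
import de.jstacs.algorithms.optimization.DifferentiableFunction;
import de.jstacs.algorithms.optimization.Optimizer;
import de.jstacs.algorithms.optimization.termination.SmallDifferenceOfFunctionEvaluationsCondition;

public class OptimizeSketch {
    public static void main(String[] args) throws Exception {
        // The same quadratic as above; its minimum is at (1, -2).
        DifferentiableFunction f = new DifferentiableFunction() {
            public double evaluateFunction(double[] x) {
                return (x[0] - 1) * (x[0] - 1) + (x[1] + 2) * (x[1] + 2);
            }
            public double[] evaluateGradientOfFunction(double[] x) {
                return new double[]{ 2 * (x[0] - 1), 2 * (x[1] + 2) };
            }
            public int getDimensionOfScope() {
                return 2;
            }
        };

        double[] current = new double[]{ 0, 0 }; // start values, overwritten in place
        Optimizer.optimize(
            Optimizer.QUASI_NEWTON_BFGS,                             // assumed algorithm constant
            f,
            current,
            new SmallDifferenceOfFunctionEvaluationsCondition(1E-9), // terminationMode
            1E-6,                                                    // linEps: line-search accuracy
            new ConstantStartDistance(1.0),                          // startDistance forecaster
            System.out                                               // progress output
        );
        System.out.println(Arrays.toString(current)); // approximately [1.0, -2.0]
    }
}

To maximize a function such as a log-likelihood, pass its NegativeDifferentiableFunction instead (see the constructor below).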

Constructors in de.jstacs.algorithms.optimization with parameters of type DifferentiableFunction
NegativeDifferentiableFunction(DifferentiableFunction f)
          Creates the negative function -f for the given DifferentiableFunction f.
 
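Because the Optimizer routines above minimize their argument, a function that should be maximized is typically passed through this constructor first. A brief sketch:

import de.jstacs.algorithms.optimization.DifferentiableFunction;
import de.jstacs.algorithms.optimization.NegativeDifferentiableFunction;

public class NegateSketch {
    /** Maximizing f is equivalent to minimizing -f. */
    static DifferentiableFunction forMinimizer(DifferentiableFunction f) {
        return new NegativeDifferentiableFunction(f);
    }
}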

Uses of DifferentiableFunction in de.jstacs.classifiers.differentiableSequenceScoreBased
 

Subclasses of DifferentiableFunction in de.jstacs.classifiers.differentiableSequenceScoreBased
 class AbstractMultiThreadedOptimizableFunction
          This class enables the user to exploit all CPUs of a computer by using threads.
 class AbstractOptimizableFunction
          This class extends OptimizableFunction and implements some common methods.
 class DiffSSBasedOptimizableFunction
          This abstract class is the basis of all multi-threaded OptimizableFunctions that are based on DifferentiableSequenceScores.
 class OptimizableFunction
          This is the main function for the ScoreClassifier.
 
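AbstractMultiThreadedOptimizableFunction parallelizes the evaluation of the objective by partitioning the data among threads and summing the partial results. The Jstacs internals are not shown on this page, so the following generic sketch (plain java.util.concurrent, all names hypothetical) only illustrates that idea:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSumSketch {
    // Hypothetical illustration, not the Jstacs implementation: sums
    // per-sequence scores in parallel chunks, the way a multi-threaded
    // objective function accumulates its value over a partitioned data set.
    static double parallelSum(double[] perSequenceScores, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Future<Double>> parts = new ArrayList<>();
        int chunk = (perSequenceScores.length + threads - 1) / threads;
        for (int t = 0; t < threads; t++) {
            final int from = t * chunk;
            final int to = Math.min(perSequenceScores.length, from + chunk);
            Callable<Double> task = () -> {
                double s = 0;
                for (int i = from; i < to; i++) {
                    s += perSequenceScores[i];
                }
                return s;
            };
            parts.add(pool.submit(task));
        }
        double sum = 0;
        for (Future<Double> p : parts) {
            sum += p.get();
        }
        pool.shutdown();
        return sum;
    }
}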

Uses of DifferentiableFunction in de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix
 

Subclasses of DifferentiableFunction in de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix
 class LogGenDisMixFunction
          This class implements the following function
          \[f(\underline{\lambda}|C,D,\underline{\alpha},\underline{\beta}) = \beta_0 \log p(C|D,\underline{\lambda}) + \beta_1 \log p(C,D|\underline{\lambda}) + \beta_2 \log p(\underline{\lambda}|\underline{\alpha}),\]
          where the weights $\beta_i$ have to sum to 1.
 class OneDataSetLogGenDisMixFunction
          This class implements the following function
          \[f(\underline{\lambda}|C,D,\underline{w},\underline{\alpha},\underline{\beta}) = \beta_0 \sum_{c,n} w_{c,n} \log p(c|d_n,\underline{\lambda}) + \beta_1 \sum_{c,n} w_{c,n} \log p(c,d_n|\underline{\lambda}) + \beta_2 \log p(\underline{\lambda}|\underline{\alpha}),\]
          where $w_{c,n}$ is the weight for sequence $d_n$ and class $c$, and the weights $\beta_i$ again sum to 1.
 
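With the objective above, the classical learning principles arise as special cases of the weight vector. The summary below follows the common GenDisMix conventions and the index order used in the formula (conditional likelihood, likelihood, prior); treat the exact indices as an assumption:

\begin{align*}
\underline{\beta} = (1,0,0) &\quad\Rightarrow\quad \text{maximum conditional likelihood (discriminative)}\\
\underline{\beta} = (0,1,0) &\quad\Rightarrow\quad \text{maximum likelihood (generative)}\\
\underline{\beta} = (0,\beta_1,\beta_2) &\quad\Rightarrow\quad \text{maximum a posteriori}\\
\underline{\beta} = (\beta_0,0,\beta_2) &\quad\Rightarrow\quad \text{maximum supervised posterior}
\end{align*}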

Uses of DifferentiableFunction in de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior
 

Subclasses of DifferentiableFunction in de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior
 class CompositeLogPrior
          This class implements a composite prior that can be used for DifferentiableStatisticalModel.
 class DoesNothingLogPrior
          This class defines a LogPrior that does not penalize any parameter.
 class LogPrior
          The abstract class for any log-prior used e.g. for maximum supervised posterior optimization.
 class SeparateGaussianLogPrior
          Class for a LogPrior that defines a Gaussian prior on the parameters of a set of DifferentiableStatisticalModels and a set of class parameters.
 class SeparateLaplaceLogPrior
          Class for a LogPrior that defines a Laplace prior on the parameters of a set of DifferentiableStatisticalModels and a set of class parameters.
 class SeparateLogPrior
          Abstract class for priors that penalize each parameter value independently and have some variances (and possible means) as hyperparameters.
 class SimpleGaussianSumLogPrior
          This class implements a prior that is a product of Gaussian distributions with mean 0 and equal variance for each parameter.
 
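Since every LogPrior is itself a DifferentiableFunction, a prior term can be evaluated and differentiated like any other part of the objective. The following hand-rolled sketch mirrors the idea behind SimpleGaussianSumLogPrior (mean 0, common variance); the real class's constructor and normalization are not shown on this page, so this is only an illustration:

import de.jstacs.algorithms.optimization.DifferentiableFunction;

public class GaussianLogPriorSketch {
    /**
     * Hand-rolled illustration, not the Jstacs class: the log of a product of
     * N(0, sigma^2) densities over all parameters, up to an additive constant.
     */
    static DifferentiableFunction gaussianSumLogPrior(final double sigma, final int dim) {
        return new DifferentiableFunction() {
            public double evaluateFunction(double[] lambda) {
                double s = 0;
                for (double l : lambda) {
                    s -= l * l / (2 * sigma * sigma);
                }
                return s;
            }
            public double[] evaluateGradientOfFunction(double[] lambda) {
                // d/d lambda_i of -lambda_i^2 / (2 sigma^2) is -lambda_i / sigma^2.
                double[] g = new double[lambda.length];
                for (int i = 0; i < lambda.length; i++) {
                    g[i] = -lambda[i] / (sigma * sigma);
                }
                return g;
            }
            public int getDimensionOfScope() {
                return dim;
            }
        };
    }
}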

Uses of DifferentiableFunction in de.jstacs.motifDiscovery
 

Methods in de.jstacs.motifDiscovery with parameters of type DifferentiableFunction
static boolean MutableMotifDiscovererToolbox.doHeuristicSteps(DifferentiableSequenceScore[] funs, DataSet[] data, double[][] weights, DiffSSBasedOptimizableFunction opt, DifferentiableFunction neg, byte algorithm, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, boolean breakOnChanged, History[][] hist, int[][] minimalNewLength, boolean maxPos)
          This method tries to perform a heuristic step if at least one DifferentiableSequenceScore is a MutableMotifDiscoverer.
static boolean MutableMotifDiscovererToolbox.findModification(int clazz, int motif, MutableMotifDiscoverer mmd, DifferentiableSequenceScore[] score, DataSet[] data, double[][] weights, DiffSSBasedOptimizableFunction opt, DifferentiableFunction neg, byte algo, double linEps, StartDistanceForecaster startDistance, SafeOutputStream out, History hist, int minimalNewLength, boolean maxPos)
          This method tries to find a promising modification of a motif, i.e., shifting, shrinking, or expanding it.