de.jstacs.classifier.scoringFunctionBased.gendismix
Class OneSampleLogGenDisMixFunction

java.lang.Object
  extended by de.jstacs.algorithms.optimization.DifferentiableFunction
      extended by de.jstacs.classifier.scoringFunctionBased.OptimizableFunction
          extended by de.jstacs.classifier.scoringFunctionBased.AbstractOptimizableFunction
              extended by de.jstacs.classifier.scoringFunctionBased.AbstractMultiThreadedOptimizableFunction
                  extended by de.jstacs.classifier.scoringFunctionBased.SFBasedOptimizableFunction
                      extended by de.jstacs.classifier.scoringFunctionBased.gendismix.LogGenDisMixFunction
                          extended by de.jstacs.classifier.scoringFunctionBased.gendismix.OneSampleLogGenDisMixFunction
All Implemented Interfaces:
Function

public class OneSampleLogGenDisMixFunction
extends LogGenDisMixFunction

This class implements the following function

\[f(\underline{\lambda}|C,D,\underline{w},\underline{\alpha},\underline{\beta}) = \beta_0 \sum_{c,n} w_{c,n} \log p(c|d_n,\underline{\lambda}) + \beta_1 \sum_{c,n} w_{c,n} \log p(c,d_n|\underline{\lambda}) + \beta_2 \log p(\underline{\lambda}|\underline{\alpha})\]

where $w_{c,n}$ is the weight for sequence $d_n$ and class $c$. The weights $\beta_i$ have to sum to 1. For special choices of the weights $\beta_i$, the optimization reduces to well-known learning principles, including maximum likelihood (ML), maximum a posteriori (MAP), maximum conditional likelihood (MCL), and maximum supervised posterior (MSP). Of course, there are also some very interesting cases with other weights.
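As a toy illustration of how the beta-weights blend the three terms of the learning principle (this is not the Jstacs API; all names and probabilities below are made up):

```java
public class GenDisMixToy {

    /**
     * Toy version of the objective:
     * f = beta0 * sum w*log p(c|d_n) + beta1 * sum w*log p(c,d_n) + beta2 * log p(lambda).
     * condProb and jointProb hold the (made-up) per-sequence probabilities.
     */
    public static double f(double[] beta, double[] w,
                           double[] condProb, double[] jointProb, double logPrior) {
        double cond = 0, lik = 0;
        for (int n = 0; n < w.length; n++) {
            cond += w[n] * Math.log(condProb[n]);   // conditional likelihood term
            lik  += w[n] * Math.log(jointProb[n]);  // (generative) likelihood term
        }
        return beta[0] * cond + beta[1] * lik + beta[2] * logPrior;
    }

    public static void main(String[] args) {
        double[] w     = {1, 1, 2};              // sequence weights w_{c,n}
        double[] cond  = {0.9, 0.8, 0.7};        // p(c | d_n, lambda), invented
        double[] joint = {0.09, 0.04, 0.14};     // p(c, d_n | lambda), invented
        double logPrior = -0.5;                  // log p(lambda | alpha), invented

        // beta = (1,0,0): pure conditional likelihood (MCL-like)
        System.out.println(f(new double[]{1, 0, 0}, w, cond, joint, logPrior));
        // beta = (0,1,0): pure likelihood (ML-like)
        System.out.println(f(new double[]{0, 1, 0}, w, cond, joint, logPrior));
        // a genuine mixture; the beta-weights sum to 1
        System.out.println(f(new double[]{0.4, 0.4, 0.2}, w, cond, joint, logPrior));
    }
}
```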

It can be used to optimize the parameters of a classifier. This can also be done with LogGenDisMixFunction. However, this class operates on a single Sample with class-specific sequence weights, which allows for a faster evaluation of the function and its gradient and hence a faster optimization. This becomes especially interesting as the number of classes increases.

This class enables the user to exploit all CPUs of the computer by using threads. The number of compute threads is specified in the constructor.

It is very important for this class that the ScoringFunction.clone() method works correctly, since each thread works on its own clones.
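The threading pattern described above can be sketched as follows (assumed names; a simplified stand-in for the actual Jstacs implementation, not its code): each thread receives its own clone of the scorer, evaluates the function on its slice of the data, and the partial results are joined afterwards.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ThreadedEval {

    /** Stand-in for a ScoringFunction: clone() must work correctly,
     *  since every thread works on its own copy. */
    static class Scorer implements Cloneable {
        double logScore(double x) { return Math.log(x); }
        @Override public Scorer clone() { return new Scorer(); }
    }

    /** Evaluates the sum of log-scores over data, split across threads. */
    public static double evaluate(double[] data, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            int chunk = (data.length + threads - 1) / threads;
            List<Future<Double>> parts = new ArrayList<>();
            Scorer prototype = new Scorer();
            for (int t = 0; t < threads; t++) {
                final int start = Math.min(data.length, t * chunk);
                final int end = Math.min(data.length, start + chunk);
                final Scorer own = prototype.clone();   // one clone per thread
                parts.add(pool.submit(() -> {           // evaluate one part
                    double sum = 0;
                    for (int n = start; n < end; n++) sum += own.logScore(data[n]);
                    return sum;
                }));
            }
            double total = 0;                           // join the partial results
            for (Future<Double> p : parts) total += p.get();
            return total;
        } finally {
            pool.shutdown();
        }
    }
}
```

If clone() returned a shared instance instead of an independent copy, the threads would race on the scorer's internal state, which is why a correctly working clone method is essential.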

Author:
Jens Keilwagen

Nested Class Summary
 
Nested classes/interfaces inherited from class de.jstacs.classifier.scoringFunctionBased.OptimizableFunction
OptimizableFunction.KindOfParameter
 
Field Summary
 
Fields inherited from class de.jstacs.classifier.scoringFunctionBased.gendismix.LogGenDisMixFunction
beta, cllGrad, helpArray, llGrad, prGrad
 
Fields inherited from class de.jstacs.classifier.scoringFunctionBased.SFBasedOptimizableFunction
dList, iList, prior, score, shortcut
 
Fields inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractMultiThreadedOptimizableFunction
params
 
Fields inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractOptimizableFunction
cl, clazz, data, freeParams, logClazz, norm, sum, weights
 
Constructor Summary
OneSampleLogGenDisMixFunction(int threads, ScoringFunction[] score, Sample data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams)
          The constructor for creating an instance that can be used in an Optimizer.
 
Method Summary
protected  void evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq)
          This method evaluates the function for a part of the data.
protected  void evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq)
          This method evaluates the gradient of the function for a part of the data.
 Sample[] getData()
          Returns the data for each class used in this OptimizableFunction.
 void setDataAndWeights(Sample[] data, double[][] weights)
          This method sets the data set and the sequence weights to be used.
 
Methods inherited from class de.jstacs.classifier.scoringFunctionBased.gendismix.LogGenDisMixFunction
joinFunction, joinGradients, reset
 
Methods inherited from class de.jstacs.classifier.scoringFunctionBased.SFBasedOptimizableFunction
addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters
 
Methods inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractMultiThreadedOptimizableFunction
evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, setParams, stopThreads
 
Methods inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractOptimizableFunction
getParameters, getSequenceWeights
 
Methods inherited from class de.jstacs.algorithms.optimization.DifferentiableFunction
findOneDimensionalMin
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

OneSampleLogGenDisMixFunction

public OneSampleLogGenDisMixFunction(int threads,
                                     ScoringFunction[] score,
                                     Sample data,
                                     double[][] weights,
                                     LogPrior prior,
                                     double[] beta,
                                     boolean norm,
                                     boolean freeParams)
                              throws IllegalArgumentException
The constructor for creating an instance that can be used in an Optimizer.

Parameters:
threads - the number of threads used for evaluating the function and determining the gradient of the function
score - an array containing the ScoringFunctions that are used for determining the scores of the sequences; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be NormalizableScoringFunctions
data - the Sample containing the data that is needed to evaluate the function
weights - the weights for each Sequence in data and each class
prior - the prior that is used for learning the parameters
beta - the beta-weights for the three terms of the learning principle
norm - the switch for using the normalization (division by the number of sequences)
freeParams - the switch for using only the free parameters
Throws:
IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct
Method Detail

setDataAndWeights

public void setDataAndWeights(Sample[] data,
                              double[][] weights)
                       throws IllegalArgumentException
Description copied from class: OptimizableFunction
This method sets the data set and the sequence weights to be used. It also allows for further preparation of the computation on this data.

Overrides:
setDataAndWeights in class AbstractMultiThreadedOptimizableFunction
Parameters:
data - the data sets
weights - the sequence weights for each sequence in each data set
Throws:
IllegalArgumentException - if the data or the weights cannot be used

getData

public Sample[] getData()
Description copied from class: OptimizableFunction
Returns the data for each class used in this OptimizableFunction.

Overrides:
getData in class AbstractOptimizableFunction
Returns:
the data for each class
See Also:
OptimizableFunction.getSequenceWeights()

evaluateGradientOfFunction

protected void evaluateGradientOfFunction(int index,
                                          int startClass,
                                          int startSeq,
                                          int endClass,
                                          int endSeq)
Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the gradient of the function for a part of the data.

Overrides:
evaluateGradientOfFunction in class LogGenDisMixFunction
Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

evaluateFunction

protected void evaluateFunction(int index,
                                int startClass,
                                int startSeq,
                                int endClass,
                                int endSeq)
                         throws EvaluationException
Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the function for a part of the data.

Overrides:
evaluateFunction in class LogGenDisMixFunction
Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)
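A minimal sketch of how a part of the class/sequence grid is traversed, assuming the index semantics given in the parameter descriptions above (endClass inclusive, endSeq exclusive; names are invented for illustration):

```java
public class PartIteration {

    /**
     * Visits the slice of the class/sequence grid from (startClass, startSeq)
     * to (endClass, endSeq): the class index runs up to endClass inclusively,
     * the sequence index of the last class runs up to endSeq exclusively, and
     * all classes strictly in between are traversed completely.
     * Returns the number of (class, sequence) pairs visited.
     */
    public static int countVisited(int numSeqsPerClass,
                                   int startClass, int startSeq,
                                   int endClass, int endSeq) {
        int visited = 0;
        for (int c = startClass; c <= endClass; c++) {             // inclusive
            int from = (c == startClass) ? startSeq : 0;
            int to   = (c == endClass) ? endSeq : numSeqsPerClass; // exclusive
            for (int n = from; n < to; n++) {
                visited++;  // here the real code would score sequence n of class c
            }
        }
        return visited;
    }
}
```

Splitting the grid into such parts at matching boundaries covers every (class, sequence) pair exactly once, which is what makes the per-thread partial evaluations join up correctly.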
Throws:
EvaluationException - if the gradient could not be evaluated properly