public class OneDataSetLogGenDisMixFunction extends LogGenDisMixFunction

This class implements the same objective function as LogGenDisMixFunction, but it is based on a single DataSet. However, we implemented this class to allow a faster function and gradient evaluation, leading to a faster optimization. This becomes especially interesting if the number of classes increases. It is very important for this class that the DifferentiableSequenceScore.clone() method works correctly, since each thread works on its own clones.

Nested classes/interfaces inherited from class OptimizableFunction: OptimizableFunction.KindOfParameter
Fields inherited from class LogGenDisMixFunction: beta, cllGrad, helpArray, llGrad, prGrad
Fields inherited from class DiffSSBasedOptimizableFunction: dList, iList, prior, score, shortcut
Fields inherited from class AbstractMultiThreadedOptimizableFunction: params, worker
Fields inherited from class AbstractOptimizableFunction: cl, clazz, data, freeParams, logClazz, norm, sum, weights
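The note above, that each thread must work on its own clone of a DifferentiableSequenceScore, can be illustrated with a minimal, self-contained sketch. This is plain Java, not Jstacs code; `Scorer` and `scoreInParallel` are hypothetical names that only demonstrate the one-clone-per-thread pattern:

```java
// Sketch (not Jstacs code): why clone() must work correctly. Each worker
// thread receives its own clone of a stateful scorer, so the threads can
// score in parallel without sharing mutable state.
public class CloneSketch {
    public static class Scorer implements Cloneable {
        double lastScore; // mutable per-thread state

        public double score(double x) {
            lastScore = 2 * x;
            return lastScore;
        }

        @Override
        public Scorer clone() {
            try {
                return (Scorer) super.clone();
            } catch (CloneNotSupportedException e) {
                throw new AssertionError(e);
            }
        }
    }

    public static double[] scoreInParallel(double[] xs, Scorer template) {
        final double[] out = new double[xs.length];
        Thread[] threads = new Thread[xs.length];
        for (int i = 0; i < xs.length; i++) {
            final int j = i;
            final Scorer own = template.clone(); // one clone per thread
            threads[i] = new Thread(() -> out[j] = own.score(xs[j]));
            threads[i].start();
        }
        for (Thread t : threads) {
            try {
                t.join();
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
        }
        return out;
    }
}
```

If `clone()` returned the template itself instead of an independent copy, the threads would race on `lastScore`.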
Constructor and Description |
---|
OneDataSetLogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams)
The constructor for creating an instance that can be used in an Optimizer. |
Modifier and Type | Method and Description |
---|---|
protected void | evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq)
This method evaluates the function for a part of the data. |
protected void | evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq)
This method evaluates the gradient of the function for a part of the data. |
DataSet[] | getData()
Returns the data for each class used in this OptimizableFunction. |
void | setDataAndWeights(DataSet[] data, double[][] weights)
This method sets the data set and the sequence weights to be used. |
Methods inherited from class LogGenDisMixFunction: joinFunction, joinGradients, reset
Methods inherited from class DiffSSBasedOptimizableFunction: addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters
Methods inherited from class AbstractMultiThreadedOptimizableFunction: evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, prepareThreads, setParams, stopThreads
Methods inherited from class AbstractOptimizableFunction: getParameters, getSequenceWeights
Methods inherited from class OptimizableFunction: findOneDimensionalMin
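The constructor below takes beta-weights for the three terms of the learning principle and a `norm` switch that divides by the number of sequences. As a rough, hypothetical sketch of how such weights could act (this is not the Jstacs implementation, and the term order is an assumption; Jstacs addresses the terms via LearningPrinciple index constants):

```java
// Hypothetical sketch: combining three weighted terms of a learning
// principle. The order (conditional log-likelihood, log-likelihood,
// log-prior) is assumed here, not taken from Jstacs.
public class BetaSketch {
    public static double combine(double[] beta, double logCondLik,
                                 double logLik, double logPrior,
                                 boolean norm, int numSequences) {
        double value = beta[0] * logCondLik + beta[1] * logLik + beta[2] * logPrior;
        // norm: division by the number of sequences, as documented for the constructor
        return norm ? value / numSequences : value;
    }
}
```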
public OneDataSetLogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) throws IllegalArgumentException

The constructor for creating an instance that can be used in an Optimizer.

Parameters:
threads - the number of threads used for evaluating the function and determining the gradient of the function
score - an array containing the DifferentiableSequenceScores that are used for determining the sequence scores; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be DifferentiableStatisticalModels
data - the DataSet containing the data that is needed to evaluate the function
weights - the weights for each Sequence in data for each class
prior - the prior that is used for learning the parameters
beta - the beta-weights for the three terms of the learning principle
norm - the switch for using the normalization (division by the number of sequences)
freeParams - the switch for using only the free parameters

Throws:
IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct

public void setDataAndWeights(DataSet[] data, double[][] weights) throws IllegalArgumentException

This method sets the data set and the sequence weights to be used.

Overrides:
setDataAndWeights in class AbstractMultiThreadedOptimizableFunction

Parameters:
data - the data sets
weights - the sequence weights for each sequence in each data set

Throws:
IllegalArgumentException - if the data or the weights can not be used

public DataSet[] getData()

Description copied from class: OptimizableFunction
Returns the data for each class used in this OptimizableFunction.

Overrides:
getData in class AbstractOptimizableFunction

See Also:
OptimizableFunction.getSequenceWeights()
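The IllegalArgumentException conditions above ("the dimension of the weights is not correct", "the data or the weights can not be used") boil down to a shape check. A hypothetical, self-contained version of such a check (not the Jstacs implementation; `dataSizes` stands in for the number of sequences per data set):

```java
// Hypothetical shape check: one weight array per class, and one weight per
// sequence in the corresponding data set.
public class WeightCheckSketch {
    public static void checkDataAndWeights(int[] dataSizes, double[][] weights) {
        if (weights.length != dataSizes.length) {
            throw new IllegalArgumentException("one weight array per class expected");
        }
        for (int c = 0; c < weights.length; c++) {
            if (weights[c].length != dataSizes[c]) {
                throw new IllegalArgumentException("weights of class " + c
                        + " do not match the number of sequences");
            }
        }
    }
}
```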
protected void evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq)

Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the gradient of the function for a part of the data.

Overrides:
evaluateGradientOfFunction in class LogGenDisMixFunction

Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

protected void evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) throws EvaluationException

Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the function for a part of the data.

Overrides:
evaluateFunction in class LogGenDisMixFunction

Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

Throws:
EvaluationException - if the function could not be evaluated properly
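The index convention above (endClass inclusive, endSeq exclusive) can be made concrete with a small sketch. This is plain Java, not Jstacs code; `evaluatePart` is a hypothetical stand-in for summing the scores of one part of the class-by-sequence grid:

```java
// Sketch of the documented index convention: a part runs from
// (startClass, startSeq) to (endClass, endSeq), where endClass is
// inclusive and endSeq is exclusive.
public class PartSketch {
    public static double evaluatePart(double[][] scores, int startClass,
                                      int startSeq, int endClass, int endSeq) {
        double sum = 0;
        for (int c = startClass; c <= endClass; c++) {             // endClass inclusive
            int first = (c == startClass) ? startSeq : 0;
            int last = (c == endClass) ? endSeq : scores[c].length; // endSeq exclusive
            for (int s = first; s < last; s++) {
                sum += scores[c][s];
            }
        }
        return sum;
    }
}
```

With this convention, two parts that split at some (class, sequence) boundary cover every sequence exactly once, which is what lets the threads later join their partial results.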