public class LogGenDisMixFunction extends DiffSSBasedOptimizableFunction
This multi-threaded implementation relies on the DifferentiableSequenceScore.clone() method working correctly, since each thread works on its own clones.

Nested classes/interfaces inherited from OptimizableFunction: OptimizableFunction.KindOfParameter
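The beta array weights the three terms of the GenDisMix learning principle (conditional log-likelihood, log-likelihood, and prior, matching the cllGrad, llGrad, and prGrad fields below). As a minimal sketch of how such a beta-weighted objective combines, here is plain Java that is independent of the Jstacs API (all names are illustrative, not Jstacs code):

```java
public class GenDisMixObjectiveSketch {
    // Illustrative only: combines the three terms of the GenDisMix
    // learning principle with their beta weights. The real
    // LogGenDisMixFunction computes these terms from the data,
    // in parallel, and which beta index belongs to which term is
    // defined by Jstacs (e.g. LearningPrinciple.LIKELIHOOD_INDEX).
    static double objective(double betaCll, double cll,
                            double betaLl, double ll,
                            double betaPrior, double logPrior) {
        return betaCll * cll + betaLl * ll + betaPrior * logPrior;
    }

    public static void main(String[] args) {
        // pure discriminative learning: all weight on the
        // conditional log-likelihood term
        System.out.println(objective(1.0, -12.5, 0.0, -40.0, 0.0, -3.0)); // prints -12.5
    }
}
```

Setting all weight on the log-likelihood term instead would recover purely generative learning; intermediate beta values mix both criteria.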
Modifier and Type | Field and Description |
---|---|
protected double[] | beta - The mixture parameters of the GenDisMix |
protected double[][] | cllGrad - Array for the gradient of the conditional log-likelihood |
protected double[][] | helpArray - General temporary array |
protected double[][] | llGrad - Array for the gradient of the log-likelihood |
protected double[] | prGrad - Array for the gradient of the prior |
Fields inherited from superclasses:
- dList, iList, prior, score, shortcut
- params, worker
- cl, clazz, data, freeParams, logClazz, norm, sum, weights
Constructor and Description |
---|
LogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet[] data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) The constructor for creating an instance that can be used in an Optimizer. |
Modifier and Type | Method and Description |
---|---|
protected void | evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) This method evaluates the function for a part of the data. |
protected void | evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq) This method evaluates the gradient of the function for a part of the data. |
protected double | joinFunction() This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int). |
protected double[] | joinGradients() This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int). |
void | reset(DifferentiableSequenceScore[] funs) This method allows resetting the internally used functions and the corresponding objects. |
Methods inherited from superclasses:
- addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters
- evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, prepareThreads, setDataAndWeights, setParams, stopThreads
- getData, getParameters, getSequenceWeights
- findOneDimensionalMin
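The evaluateFunction/joinFunction pair above follows a partition-and-join pattern: each thread evaluates the function on its own part of the data, writes a partial result into a per-thread slot (compare the helpArray field), and joinFunction combines the partial results. A minimal sketch of this pattern in plain Java (not the Jstacs implementation; all names are illustrative):

```java
import java.util.Arrays;

public class PartitionJoinSketch {
    final double[] data;    // per-sequence scores, stands in for the data set
    final double[] partial; // one slot per thread, like helpArray

    PartitionJoinSketch(double[] data, int threads) {
        this.data = data;
        this.partial = new double[threads];
    }

    // each thread evaluates its part [start, end) of the data
    void evaluateFunction(int index, int start, int end) {
        double sum = 0;
        for (int i = start; i < end; i++) sum += data[i];
        partial[index] = sum;
    }

    // joins the partial results computed by evaluateFunction
    double joinFunction() {
        return Arrays.stream(partial).sum();
    }

    public static void main(String[] args) throws InterruptedException {
        double[] scores = {1, 2, 3, 4, 5, 6};
        PartitionJoinSketch f = new PartitionJoinSketch(scores, 2);
        Thread a = new Thread(() -> f.evaluateFunction(0, 0, 3));
        Thread b = new Thread(() -> f.evaluateFunction(1, 3, 6));
        a.start(); b.start();
        a.join(); b.join();
        System.out.println(f.joinFunction()); // prints 21.0
    }
}
```

Because each thread writes only to its own slot, no locking is needed during evaluation; this is also why the class documentation stresses that each thread must work on its own clones.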
protected double[][] helpArray
General temporary array

protected double[][] llGrad
Array for the gradient of the log-likelihood

protected double[][] cllGrad
Array for the gradient of the conditional log-likelihood

protected double[] beta
The mixture parameters of the GenDisMix

protected double[] prGrad
Array for the gradient of the prior
public LogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet[] data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) throws IllegalArgumentException

The constructor for creating an instance that can be used in an Optimizer.

Parameters:
- threads - the number of threads used for evaluating the function and determining the gradient of the function
- score - an array containing the DifferentiableSequenceScore instances that are used for determining the sequence scores; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be DifferentiableStatisticalModel instances
- data - the array of DataSet instances containing the data that is needed to evaluate the function
- weights - the weights for each Sequence in each DataSet of data
- prior - the prior that is used for learning the parameters
- beta - the beta-weights for the three terms of the learning principle
- norm - the switch for using the normalization (division by the number of sequences)
- freeParams - the switch for using only the free parameters

Throws:
- IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct

protected double[] joinGradients() throws EvaluationException
This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int).

Specified by:
- joinGradients in class AbstractMultiThreadedOptimizableFunction

Throws:
- EvaluationException - if the gradient could not be evaluated properly

protected void evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq)
This method evaluates the gradient of the function for a part of the data.

Specified by:
- evaluateGradientOfFunction in class AbstractMultiThreadedOptimizableFunction

Parameters:
- index - the index of the part
- startClass - the index of the start class
- startSeq - the index of the start sequence
- endClass - the index of the end class (inclusive)
- endSeq - the index of the end sequence (exclusive)

protected double joinFunction() throws DimensionException, EvaluationException
This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int).

Specified by:
- joinFunction in class AbstractMultiThreadedOptimizableFunction

Throws:
- DimensionException - if the parameters could not be set
- EvaluationException - if the gradient could not be evaluated properly

protected void evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) throws EvaluationException
This method evaluates the function for a part of the data.

Specified by:
- evaluateFunction in class AbstractMultiThreadedOptimizableFunction

Parameters:
- index - the index of the part
- startClass - the index of the start class
- startSeq - the index of the start sequence
- endClass - the index of the end class (inclusive)
- endSeq - the index of the end sequence (exclusive)

Throws:
- EvaluationException - if the gradient could not be evaluated properly

public void reset(DifferentiableSequenceScore[] funs) throws Exception
This method allows resetting the internally used functions and the corresponding objects.

Overrides:
- reset in class DiffSSBasedOptimizableFunction

Parameters:
- funs - the new instances

Throws:
- Exception - if something went wrong
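joinGradients collects the per-thread gradient buffers (such as llGrad and cllGrad) into a single gradient vector. As a sketch, assuming the per-thread contributions are simply summed element-wise (an assumption for illustration; the actual weighting of the terms by beta is internal to LogGenDisMixFunction), in plain Java with illustrative names:

```java
public class JoinGradientsSketch {
    // perThread holds one gradient buffer per thread, analogous to
    // the rows of llGrad or cllGrad; the result has one entry per
    // function parameter.
    static double[] joinGradients(double[][] perThread) {
        double[] grad = new double[perThread[0].length];
        for (double[] part : perThread)
            for (int i = 0; i < grad.length; i++)
                grad[i] += part[i];
        return grad;
    }

    public static void main(String[] args) {
        double[][] parts = {{1.0, 0.5}, {2.0, -0.5}};
        double[] g = joinGradients(parts);
        System.out.println(g[0] + " " + g[1]); // prints 3.0 0.0
    }
}
```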