public class LogGenDisMixFunction extends DiffSSBasedOptimizableFunction

This class implements the following function

f(λ | C, D, β, α) = β₀ log p(C | D, λ) + β₁ log p(C, D | λ) + β₂ log p(λ | α)

that is, the weighted sum of the conditional log-likelihood, the log-likelihood, and the log-prior, where the weights β₀, β₁, β₂ have to sum to 1. For special weights the optimization turns out to be well known (see the sketch below):

- if β₂ = 0, one obtains the generative-discriminative trade-off,
- if β₂ = 0.5, one obtains the penalized generative-discriminative trade-off.

It is very important for this class that the DifferentiableSequenceScore.clone() method works correctly, since each thread works on its own clones.

Nested classes/interfaces inherited from class OptimizableFunction: OptimizableFunction.KindOfParameter
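The two special cases above correspond to particular choices of the beta array that is later passed to the constructor. The following sketch assembles such weight vectors; LearningPrinciple.LIKELIHOOD_INDEX appears in the constructor documentation below, while CONDITIONAL_LIKELIHOOD_INDEX and PRIOR_INDEX are assumed analogues for the other two terms, and the mixing value lambda is purely hypothetical.

```java
import de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix.LearningPrinciple;

public class BetaWeights {
    public static void main(String[] args) {
        double lambda = 0.7; // hypothetical mixing value in [0, 1]

        // generative-discriminative trade-off: beta_2 = 0, i.e. no prior term
        double[] gdt = new double[3];
        gdt[LearningPrinciple.CONDITIONAL_LIKELIHOOD_INDEX] = lambda;
        gdt[LearningPrinciple.LIKELIHOOD_INDEX]             = 1.0 - lambda;
        gdt[LearningPrinciple.PRIOR_INDEX]                  = 0.0;

        // penalized generative-discriminative trade-off: beta_2 = 0.5,
        // the remaining weight is split between the two likelihood terms
        double[] pgdt = new double[3];
        pgdt[LearningPrinciple.CONDITIONAL_LIKELIHOOD_INDEX] = 0.5 * lambda;
        pgdt[LearningPrinciple.LIKELIHOOD_INDEX]             = 0.5 * (1.0 - lambda);
        pgdt[LearningPrinciple.PRIOR_INDEX]                  = 0.5;

        // in every case the entries have to sum to 1
        System.out.println(gdt[0] + gdt[1] + gdt[2]);    // 1.0
        System.out.println(pgdt[0] + pgdt[1] + pgdt[2]); // 1.0
    }
}
```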
| Modifier and Type | Field and Description |
|---|---|
| protected double[] | beta: the mixture parameters of the GenDisMix |
| protected double[][] | cllGrad: array for the gradient of the conditional log-likelihood |
| protected double[][] | helpArray: general temporary array |
| protected double[][] | llGrad: array for the gradient of the log-likelihood |
| protected double[] | prGrad: array for the gradient of the prior |
Fields inherited from class DiffSSBasedOptimizableFunction: dList, iList, prior, score, shortcut

Fields inherited from class AbstractMultiThreadedOptimizableFunction: params, worker

Fields inherited from class AbstractOptimizableFunction: cl, clazz, data, freeParams, logClazz, norm, sum, weights

| Constructor and Description |
|---|
| LogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet[] data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) The constructor for creating an instance that can be used in an Optimizer. |

| Modifier and Type | Method and Description |
|---|---|
| protected void | evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) This method evaluates the function for a part of the data. |
| protected void | evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq) This method evaluates the gradient of the function for a part of the data. |
| protected double | joinFunction() This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int). |
| protected double[] | joinGradients() This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int). |
| void | reset(DifferentiableSequenceScore[] funs) This method allows resetting the internally used functions and the corresponding objects. |
Methods inherited from class DiffSSBasedOptimizableFunction: addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters

Methods inherited from class AbstractMultiThreadedOptimizableFunction: evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, prepareThreads, setDataAndWeights, setParams, stopThreads

Methods inherited from class AbstractOptimizableFunction: getData, getParameters, getSequenceWeights

Methods inherited from class DifferentiableFunction: findOneDimensionalMin

protected double[][] helpArray
General temporary array

protected double[][] llGrad
Array for the gradient of the log-likelihood

protected double[][] cllGrad
Array for the gradient of the conditional log-likelihood

protected double[] beta
The mixture parameters of the GenDisMix

protected double[] prGrad
Array for the gradient of the prior
public LogGenDisMixFunction(int threads,
DifferentiableSequenceScore[] score,
DataSet[] data,
double[][] weights,
LogPrior prior,
double[] beta,
boolean norm,
boolean freeParams)
throws IllegalArgumentException
The constructor for creating an instance that can be used in an Optimizer.

Parameters:
threads - the number of threads used for evaluating the function and determining the gradient of the function
score - an array containing the DifferentiableSequenceScores that are used for determining the sequence scores; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be DifferentiableStatisticalModels
data - the array of DataSets containing the data that is needed to evaluate the function
weights - the weights for each Sequence in each DataSet of data
prior - the prior that is used for learning the parameters
beta - the beta-weights for the three terms of the learning principle
norm - the switch for using the normalization (division by the number of sequences)
freeParams - the switch for using only the free parameters

Throws:
IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct
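A possible way of setting up the function is sketched below. The score, data, weights, prior, and beta arguments are assumed to be prepared elsewhere, the import paths follow the usual Jstacs package layout, and the Optimizer.optimize(...) call itself is omitted because it takes several additional tuning arguments.

```java
import de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix.LogGenDisMixFunction;
import de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior.LogPrior;
import de.jstacs.data.DataSet;
import de.jstacs.sequenceScores.differentiable.DifferentiableSequenceScore;

public class GenDisMixSetup {

    // Sketch: builds the objective function; callers supply one score and one
    // DataSet per class, one weight per sequence, a prior, and beta-weights
    // that sum to 1.
    public static LogGenDisMixFunction create(DifferentiableSequenceScore[] score,
            DataSet[] data, double[][] weights, LogPrior prior, double[] beta)
            throws IllegalArgumentException {
        // one clone of each score per thread, so clone() must work correctly
        int threads = Runtime.getRuntime().availableProcessors();
        boolean norm = true;       // normalize, i.e. divide by the number of sequences
        boolean freeParams = true; // use only the free parameters
        return new LogGenDisMixFunction(threads, score, data, weights,
                prior, beta, norm, freeParams);
    }
}
```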
protected double[] joinGradients()
                          throws EvaluationException

Description copied from class: AbstractMultiThreadedOptimizableFunction
This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int).

Specified by:
joinGradients in class AbstractMultiThreadedOptimizableFunction

Throws:
EvaluationException - if the gradient could not be evaluated properly

protected void evaluateGradientOfFunction(int index,
int startClass,
int startSeq,
int endClass,
int endSeq)
Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the gradient of the function for a part of the data.

Specified by:
evaluateGradientOfFunction in class AbstractMultiThreadedOptimizableFunction

Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)
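The five index arguments delimit the slice of the data a single compute thread is responsible for, and joinGradients() later combines the per-part results. The following is an illustration of that contract only, not the Jstacs implementation: it mixes hypothetical per-part gradient arrays, laid out like the cllGrad and llGrad fields above, with the beta-weights of the class formula.

```java
public class JoinSketch {

    // Illustration only (not Jstacs source): cllGrad[part][p] and llGrad[part][p]
    // are assumed to hold the partial gradients written by thread `part` for
    // parameter p; beta0, beta1, beta2 weight the three terms of the principle.
    static double[] join(double[][] cllGrad, double[][] llGrad, double[] prGrad,
            double beta0, double beta1, double beta2) {
        int n = prGrad.length;   // number of parameters
        double[] grad = new double[n];
        for (int part = 0; part < cllGrad.length; part++) {
            for (int p = 0; p < n; p++) {
                grad[p] += beta0 * cllGrad[part][p]  // conditional log-likelihood
                         + beta1 * llGrad[part][p];  // log-likelihood
            }
        }
        for (int p = 0; p < n; p++) {
            grad[p] += beta2 * prGrad[p];            // the prior is not partitioned
        }
        return grad;
    }
}
```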
protected double joinFunction()
                       throws DimensionException,
                              EvaluationException

Description copied from class: AbstractMultiThreadedOptimizableFunction
This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int).

Specified by:
joinFunction in class AbstractMultiThreadedOptimizableFunction

Throws:
DimensionException - if the parameters could not be set
EvaluationException - if the gradient could not be evaluated properly

protected void evaluateFunction(int index,
int startClass,
int startSeq,
int endClass,
int endSeq)
throws EvaluationException
Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the function for a part of the data.

Specified by:
evaluateFunction in class AbstractMultiThreadedOptimizableFunction

Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

Throws:
EvaluationException - if the gradient could not be evaluated properly

public void reset(DifferentiableSequenceScore[] funs)
           throws Exception
Description copied from class: DiffSSBasedOptimizableFunction
This method allows resetting the internally used functions and the corresponding objects.

Specified by:
reset in class DiffSSBasedOptimizableFunction

Parameters:
funs - the new instances

Throws:
Exception - if something went wrong