public class OneDataSetLogGenDisMixFunction extends LogGenDisMixFunction

This class implements the function of the GenDisMix learning principle for a single data set, i.e., the weighted sum

f(\lambda \mid D, \tilde{w}, \beta) = \sum_{c} \sum_{n} \tilde{w}_{c,n} \left( \beta_{dis} \log p_{\lambda}(c \mid d_n) + \beta_{gen} \log p_{\lambda}(c, d_n) \right) + \beta_{prior} \log p(\lambda),

where \tilde{w}_{c,n} is the weight for sequence d_n and class c. The weights \beta have to sum to 1. For special weights the optimization turns out to be well known:

- \beta_{prior} = 0: one obtains the generative-discriminative trade-off,
- \beta_{prior} = 0.5: one obtains the penalized generative-discriminative trade-off.

The same function can also be computed with LogGenDisMixFunction. However, we implemented this class to allow a faster function and gradient evaluation, leading to a faster optimization. This becomes especially interesting if the number of classes increases.
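For orientation, the following sketch shows how the three beta-weights could be assembled for the two special cases above. Only LearningPrinciple.LIKELIHOOD_INDEX is confirmed by the constructor documentation below; the other two index constants and the concrete values are illustrative assumptions.

```java
// A minimal sketch of the beta-weights for the special cases named above.
// LearningPrinciple.LIKELIHOOD_INDEX appears in the constructor documentation
// below; the two other index constants are assumptions about LearningPrinciple.
double[] beta = new double[3];

// generative-discriminative trade-off: the prior weight is 0
beta[LearningPrinciple.CONDITIONAL_LIKELIHOOD_INDEX] = 0.3; // discriminative term
beta[LearningPrinciple.LIKELIHOOD_INDEX]             = 0.7; // generative term
beta[LearningPrinciple.PRIOR_INDEX]                  = 0.0; // prior term

// penalized generative-discriminative trade-off: the prior weight is 0.5,
// e.g., beta = { 0.2, 0.3, 0.5 }; in all cases the weights have to sum to 1
```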
It is mandatory that the DifferentiableSequenceScore.clone() method works correctly, since each thread works on its own clones.

Nested classes/interfaces inherited from class OptimizableFunction: OptimizableFunction.KindOfParameter

Fields inherited from class LogGenDisMixFunction: beta, cllGrad, helpArray, llGrad, prGrad

Fields inherited from class DiffSSBasedOptimizableFunction: dList, iList, prior, score, shortcut

Fields inherited from class AbstractMultiThreadedOptimizableFunction: params, worker

Fields inherited from class AbstractOptimizableFunction: cl, clazz, data, freeParams, logClazz, norm, sum, weights

| Constructor and Description |
|---|
| OneDataSetLogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) The constructor for creating an instance that can be used in an Optimizer. |
| Modifier and Type | Method and Description |
|---|---|
| protected void | evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) This method evaluates the function for a part of the data. |
| protected void | evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq) This method evaluates the gradient of the function for a part of the data. |
| DataSet[] | getData() Returns the data for each class used in this OptimizableFunction. |
| void | setDataAndWeights(DataSet[] data, double[][] weights) This method sets the data set and the sequence weights to be used. |
Methods inherited from class LogGenDisMixFunction: joinFunction, joinGradients, reset

Methods inherited from class DiffSSBasedOptimizableFunction: addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters

Methods inherited from class AbstractMultiThreadedOptimizableFunction: evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, prepareThreads, setParams, stopThreads

Methods inherited from class AbstractOptimizableFunction: getParameters, getSequenceWeights

Methods inherited from class DifferentiableFunction: findOneDimensionalMin

public OneDataSetLogGenDisMixFunction(int threads,
DifferentiableSequenceScore[] score,
DataSet data,
double[][] weights,
LogPrior prior,
double[] beta,
boolean norm,
boolean freeParams)
throws IllegalArgumentException
The constructor for creating an instance that can be used in an Optimizer.

Parameters:
threads - the number of threads used for evaluating the function and determining the gradient of the function
score - an array containing the DifferentiableSequenceScores that are used for determining the scores of the sequences; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be DifferentiableStatisticalModels
data - the DataSet containing the data that is needed to evaluate the function
weights - the weights for each Sequence in data for each class
prior - the prior that is used for learning the parameters
beta - the beta-weights for the three terms of the learning principle
norm - the switch for using the normalization (division by the number of sequences)
freeParams - the switch for using only the free parameters
Throws:
IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct
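A hedged usage sketch of the constructor follows; the variables scores, data, weights, and prior stand for objects that have to be created beforehand and are only assumed here.

```java
// Sketch only: `scores`, `data`, `weights`, and `prior` are assumed to be
// created elsewhere (e.g., one DifferentiableSequenceScore per class, a
// DataSet, per-class sequence weights, and a LogPrior).
int threads = 4;
double[] beta = { 0.5, 0.5, 0.0 }; // generative-discriminative trade-off
OneDataSetLogGenDisMixFunction function =
        new OneDataSetLogGenDisMixFunction( threads, scores, data, weights,
                prior, beta, true, true ); // norm = true, freeParams = true
```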
public void setDataAndWeights(DataSet[] data,
                              double[][] weights)
                       throws IllegalArgumentException

Description copied from class: OptimizableFunction
This method sets the data set and the sequence weights to be used.

Overrides:
setDataAndWeights in class AbstractMultiThreadedOptimizableFunction
Parameters:
data - the data sets
weights - the sequence weights for each sequence in each data set
Throws:
IllegalArgumentException - if the data or the weights can not be used
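A brief, hypothetical call sketch; note that the inherited signature expects an array of data sets even though this class works on a single one, so the exact array layout (e.g., one entry per class) is an assumption.

```java
// Sketch only: `newData` and `newWeights` are hypothetical; whether the single
// data set has to be repeated per class is an assumption about this class.
function.setDataAndWeights( new DataSet[]{ newData, newData }, newWeights );
```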
public DataSet[] getData()

Description copied from class: OptimizableFunction
Returns the data for each class used in this OptimizableFunction.

Overrides:
getData in class AbstractOptimizableFunction
See Also:
OptimizableFunction.getSequenceWeights()

protected void evaluateGradientOfFunction(int index,
int startClass,
int startSeq,
int endClass,
int endSeq)
Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the gradient of the function for a part of the data.

Overrides:
evaluateGradientOfFunction in class LogGenDisMixFunction
Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

protected void evaluateFunction(int index,
int startClass,
int startSeq,
int endClass,
int endSeq)
throws EvaluationException
Description copied from class: AbstractMultiThreadedOptimizableFunction
This method evaluates the function for a part of the data.

Overrides:
evaluateFunction in class LogGenDisMixFunction
Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)
Throws:
EvaluationException - if the function could not be evaluated properly
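To clarify the inclusive/exclusive bounds shared by evaluateFunction and evaluateGradientOfFunction, here is a schematic loop, not actual Jstacs code:

```java
// Sketch (not Jstacs code) of how a worker thread could traverse its part of
// the data, illustrating the bound conventions: endClass is inclusive while
// endSeq is exclusive; `numberOfSequences` is assumed context.
for( int c = startClass; c <= endClass; c++ ) {
    int first = (c == startClass) ? startSeq : 0;
    int last  = (c == endClass) ? endSeq : numberOfSequences;
    for( int n = first; n < last; n++ ) {
        // accumulate the contribution of sequence n under class c
        // (the function value here, the partial derivatives in
        // evaluateGradientOfFunction)
    }
}
```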