java.lang.Object
  de.jstacs.algorithms.optimization.DifferentiableFunction
    de.jstacs.classifier.scoringFunctionBased.OptimizableFunction
      de.jstacs.classifier.scoringFunctionBased.AbstractOptimizableFunction
        de.jstacs.classifier.scoringFunctionBased.AbstractMultiThreadedOptimizableFunction
          de.jstacs.classifier.scoringFunctionBased.SFBasedOptimizableFunction
            de.jstacs.classifier.scoringFunctionBased.gendismix.LogGenDisMixFunction
public class LogGenDisMixFunction
extends SFBasedOptimizableFunction

This class implements the following function:

F(λ | β) = β₁ · (log-likelihood) + β₂ · (conditional log-likelihood) + β₃ · (log prior)

where the weights β₁, β₂ and β₃ have to sum to 1. For special weights the optimization turns out to be well known:

- for β₃ = 0, one obtains the generative-discriminative trade-off,
- for β₃ = 0.5, one obtains the penalized generative-discriminative trade-off.

Since the function is evaluated by multiple threads, it is essential that the ScoringFunction.clone() method works correctly, since each thread works on its own clones.
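The weighted combination above can be sketched in a few lines. The class name, the `objective` helper and the index constants below are assumptions for illustration only; in Jstacs the actual index convention is defined by `LearningPrinciple`, and the real computation is multi-threaded over the data.

```java
public class GenDisMixSketch {

    // Hypothetical index convention for the three beta-weights;
    // in Jstacs these indices are defined by LearningPrinciple.
    static final int LIKELIHOOD = 0, CONDITIONAL_LIKELIHOOD = 1, PRIOR = 2;

    // Weighted sum of the three log-terms; the beta-weights must sum to 1.
    static double objective(double[] beta, double logLik, double condLogLik, double logPrior) {
        double sum = beta[LIKELIHOOD] + beta[CONDITIONAL_LIKELIHOOD] + beta[PRIOR];
        if (Math.abs(sum - 1.0) > 1e-9) {
            throw new IllegalArgumentException("beta-weights have to sum to 1");
        }
        return beta[LIKELIHOOD] * logLik
                + beta[CONDITIONAL_LIKELIHOOD] * condLogLik
                + beta[PRIOR] * logPrior;
    }

    public static void main(String[] args) {
        // prior weight 0 yields the generative-discriminative trade-off
        double[] gdt = {0.4, 0.6, 0.0};
        System.out.println(objective(gdt, -10.0, -2.0, -1.0)); // prints -5.2
    }
}
```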
| Nested Class Summary |
|---|
| Nested classes/interfaces inherited from class de.jstacs.classifier.scoringFunctionBased.OptimizableFunction: OptimizableFunction.KindOfParameter |
| Field Summary | |
|---|---|
| protected double[] | beta: The mixture parameters of the GenDisMix |
| protected double[][] | cllGrad: Array for the gradient of the conditional log-likelihood |
| protected double[][] | helpArray: General temporary array |
| protected double[][] | llGrad: Array for the gradient of the log-likelihood |
| protected double[] | prGrad: Array for the gradient of the prior |
| Fields inherited from class de.jstacs.classifier.scoringFunctionBased.SFBasedOptimizableFunction |
|---|
dList, iList, prior, score, shortcut |
| Fields inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractMultiThreadedOptimizableFunction |
|---|
params |
| Fields inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractOptimizableFunction |
|---|
cl, clazz, data, freeParams, logClazz, norm, sum, weights |
| Constructor Summary | |
|---|---|
| LogGenDisMixFunction(int threads, ScoringFunction[] score, Sample[] data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) | The constructor for creating an instance that can be used in an Optimizer. |
| Method Summary | |
|---|---|
| protected void | evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq): This method evaluates the function for a part of the data. |
| protected void | evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq): This method evaluates the gradient of the function for a part of the data. |
| protected double | joinFunction(): This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int). |
| protected double[] | joinGradients(): This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int). |
| void | reset(ScoringFunction[] funs): This method allows resetting the internally used functions and the corresponding objects. |
| Methods inherited from class de.jstacs.classifier.scoringFunctionBased.SFBasedOptimizableFunction |
|---|
addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters |
| Methods inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractMultiThreadedOptimizableFunction |
|---|
evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, setDataAndWeights, setParams, stopThreads |
| Methods inherited from class de.jstacs.classifier.scoringFunctionBased.AbstractOptimizableFunction |
|---|
getData, getParameters, getSequenceWeights |
| Methods inherited from class de.jstacs.algorithms.optimization.DifferentiableFunction |
|---|
findOneDimensionalMin |
| Methods inherited from class java.lang.Object |
|---|
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait |
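The four-index arguments of evaluateFunction and evaluateGradientOfFunction delimit one contiguous slice of the data for one thread. A minimal self-contained sketch of iterating such a slice is shown below; the class name, the `sumPart` helper and the `weights[c][n]` data layout are assumptions for illustration, not the Jstacs implementation. Note the convention stated in the method detail: the end class is inclusive, the end sequence exclusive.

```java
public class PartSketch {

    // Hypothetical data layout: weights[c][n] is the weight of
    // sequence n in the Sample of class c.
    static double sumPart(double[][] weights, int startClass, int startSeq,
                          int endClass, int endSeq) {
        double sum = 0;
        for (int c = startClass; c <= endClass; c++) {              // end class inclusive
            int first = (c == startClass) ? startSeq : 0;
            int last = (c == endClass) ? endSeq : weights[c].length; // end sequence exclusive
            for (int n = first; n < last; n++) {
                sum += weights[c][n];
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        double[][] w = { {1, 1, 1}, {2, 2} };
        // slice covering sequences (0,1), (0,2) and (1,0)
        System.out.println(sumPart(w, 0, 1, 1, 1)); // prints 4.0
    }
}
```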
| Field Detail |
|---|

protected double[][] helpArray

General temporary array.

protected double[][] llGrad

Array for the gradient of the log-likelihood.

protected double[][] cllGrad

Array for the gradient of the conditional log-likelihood.

protected double[] beta

The mixture parameters of the GenDisMix.

protected double[] prGrad

Array for the gradient of the prior.
| Constructor Detail |
|---|
public LogGenDisMixFunction(int threads,
                            ScoringFunction[] score,
                            Sample[] data,
                            double[][] weights,
                            LogPrior prior,
                            double[] beta,
                            boolean norm,
                            boolean freeParams)
                     throws IllegalArgumentException

The constructor for creating an instance that can be used in an Optimizer.

Parameters:
threads - the number of threads used for evaluating the function and determining the gradient of the function
score - an array containing the ScoringFunctions that are used for determining the scores of the sequences; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be NormalizableScoringFunctions
data - the array of Samples containing the data that is needed to evaluate the function
weights - the weights for each Sequence in each Sample of data
prior - the prior that is used for learning the parameters
beta - the beta-weights for the three terms of the learning principle
norm - the switch for using the normalization (division by the number of sequences)
freeParams - the switch for using only the free parameters

Throws:
IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct

| Method Detail |
|---|
protected double[] joinGradients()
                          throws EvaluationException

This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int).

Specified by:
joinGradients in class AbstractMultiThreadedOptimizableFunction
Throws:
EvaluationException - if the gradient could not be evaluated properly

protected void evaluateGradientOfFunction(int index,
                                          int startClass,
                                          int startSeq,
                                          int endClass,
                                          int endSeq)

This method evaluates the gradient of the function for a part of the data.

Specified by:
evaluateGradientOfFunction in class AbstractMultiThreadedOptimizableFunction
Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

protected double joinFunction()
                       throws DimensionException,
                              EvaluationException

This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int).

Specified by:
joinFunction in class AbstractMultiThreadedOptimizableFunction
Throws:
DimensionException - if the parameters could not be set
EvaluationException - if the function could not be evaluated properly

protected void evaluateFunction(int index,
                                int startClass,
                                int startSeq,
                                int endClass,
                                int endSeq)
                         throws EvaluationException

This method evaluates the function for a part of the data.

Specified by:
evaluateFunction in class AbstractMultiThreadedOptimizableFunction
Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)
Throws:
EvaluationException - if the function could not be evaluated properly

public void reset(ScoringFunction[] funs)
           throws Exception

This method allows resetting the internally used functions and the corresponding objects.

Overrides:
reset in class SFBasedOptimizableFunction
Parameters:
funs - the new instances
Throws:
Exception - if something went wrong
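The join step amounts to accumulating the per-thread partial gradients into one gradient vector. The sketch below is a self-contained illustration under assumed names (`JoinGradientsSketch`, `partialGrads`); the real class keeps the per-thread buffers in its protected fields (helpArray, llGrad, cllGrad) and also combines them with the beta-weights and the prior gradient.

```java
public class JoinGradientsSketch {

    // Hypothetical per-thread gradient buffers: one row per part,
    // analogous to the two-dimensional gradient fields above.
    static double[] joinGradients(double[][] partialGrads) {
        double[] grad = new double[partialGrads[0].length];
        for (double[] part : partialGrads) {       // one part per thread
            for (int i = 0; i < grad.length; i++) {
                grad[i] += part[i];                // accumulate the partial results
            }
        }
        return grad;
    }

    public static void main(String[] args) {
        double[][] parts = { {1.0, 2.0}, {3.0, 4.0} };
        double[] g = joinGradients(parts);
        System.out.println(g[0] + " " + g[1]); // prints 4.0 6.0
    }
}
```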