Provides all DifferentiableStatisticalModels, which can compute the gradient with respect to their parameters for a given input Sequence.
| Interface Summary | |
|---|---|
| DifferentiableStatisticalModel | The interface for normalizable DifferentiableSequenceScores. |
| SamplingDifferentiableStatisticalModel | Interface for DifferentiableStatisticalModels that can be used for Metropolis-Hastings sampling in a SamplingScoreBasedClassifier. |
| VariableLengthDiffSM | This is an interface for all DifferentiableStatisticalModels that allow scoring subsequences of arbitrary length. |
| Class Summary | |
|---|---|
| AbstractDifferentiableStatisticalModel | This class is the main part of any ScoreClassifier. |
| AbstractVariableLengthDiffSM | This abstract class implements some methods declared in DifferentiableStatisticalModel based on the declaration of methods in VariableLengthDiffSM. |
| CyclicMarkovModelDiffSM | This scoring function implements a cyclic Markov model of arbitrary order and periodicity for any sequence length. |
| DifferentiableStatisticalModelFactory | This class allows the easy creation of some frequently used models. |
| IndependentProductDiffSM | This class enables the user to model parts of a sequence independently of each other. |
| MappingDiffSM | This class implements a DifferentiableStatisticalModel that works on mapped Sequences. |
| MarkovRandomFieldDiffSM | This class implements the scoring function for any Markov random field (MRF). |
| NormalizedDiffSM | This class turns an unnormalized DifferentiableStatisticalModel into a normalized DifferentiableStatisticalModel. |
| UniformDiffSM | This DifferentiableStatisticalModel assigns a uniform score to every Sequence and therefore has no parameters that need to be optimized. |
Provides all DifferentiableStatisticalModels, which can compute the gradient with
respect to their parameters for a given input Sequence.
The parameters of a DifferentiableStatisticalModel are learned numerically, typically by gradient-based methods like those provided in Optimizer.
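For illustration, the following sketch evaluates the log-score of a model together with its gradient with respect to the model parameters for a single input Sequence. It is only a sketch: the factory method createPWM, the equivalent sample size of 4.0, and the exact signatures of Sequence.create, initializeFunctionRandomly, and getLogScoreAndPartialDerivation are recalled from the Jstacs API and should be checked against the Javadoc of DifferentiableSequenceScore and DifferentiableStatisticalModelFactory.

```java
import de.jstacs.data.alphabets.DNAAlphabetContainer;
import de.jstacs.data.sequences.Sequence;
import de.jstacs.sequenceScores.statisticalModels.differentiable.DifferentiableStatisticalModel;
import de.jstacs.sequenceScores.statisticalModels.differentiable.DifferentiableStatisticalModelFactory;
import de.jstacs.utils.DoubleList;
import de.jstacs.utils.IntList;

public class GradientSketch {

    public static void main(String[] args) throws Exception {
        // Assumed factory call: a PWM of length 10 over the DNA alphabet with ESS 4.0.
        DifferentiableStatisticalModel pwm =
                DifferentiableStatisticalModelFactory.createPWM(DNAAlphabetContainer.SINGLETON, 10, 4.0);

        // Draw random initial parameter values before scoring.
        pwm.initializeFunctionRandomly(false);

        Sequence seq = Sequence.create(DNAAlphabetContainer.SINGLETON, "ACGTACGTAC");

        // The gradient is returned sparsely: the indices of the parameters with
        // non-zero partial derivatives and the corresponding values.
        IntList indices = new IntList();
        DoubleList partials = new DoubleList();
        double logScore = pwm.getLogScoreAndPartialDerivation(seq, indices, partials);

        System.out.println("log score: " + logScore);
        for (int i = 0; i < indices.length(); i++) {
            System.out.println("d logScore / d theta_" + indices.get(i) + " = " + partials.get(i));
        }
    }
}
```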
In Jstacs, such gradient-based learning is especially used for learning the parameters according to discriminative learning principles like maximum conditional likelihood or maximum supervised posterior (see MSPClassifier), or according to a unified learning principle (see GenDisMixClassifier).
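Building on this, the sketch below outlines maximum supervised posterior training with an MSPClassifier using one DifferentiableStatisticalModel per class. The file names foreground.fa and background.fa are placeholders, and the constructor arguments of GenDisMixClassifierParameterSet as well as the factory and data-loading calls are assumptions recalled from Jstacs cookbook examples, not authoritative signatures.

```java
import de.jstacs.algorithms.optimization.Optimizer;
import de.jstacs.classifiers.differentiableSequenceScoreBased.OptimizableFunction.KindOfParameter;
import de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix.GenDisMixClassifierParameterSet;
import de.jstacs.classifiers.differentiableSequenceScoreBased.msp.MSPClassifier;
import de.jstacs.data.DNADataSet;
import de.jstacs.data.DataSet;
import de.jstacs.data.alphabets.DNAAlphabetContainer;
import de.jstacs.sequenceScores.statisticalModels.differentiable.DifferentiableStatisticalModel;
import de.jstacs.sequenceScores.statisticalModels.differentiable.DifferentiableStatisticalModelFactory;

public class MspTrainingSketch {

    public static void main(String[] args) throws Exception {
        int length = 10;

        // Two hypothetical FASTA files with sequences of the above length;
        // any pair of DataSets of equal-length sequences would work here.
        DataSet fg = new DNADataSet("foreground.fa");
        DataSet bg = new DNADataSet("background.fa");

        // One DifferentiableStatisticalModel per class; in a real application the
        // background would typically be a homogeneous Markov model from the
        // "homogeneous" sub-package rather than a second PWM.
        DifferentiableStatisticalModel fgModel =
                DifferentiableStatisticalModelFactory.createPWM(DNAAlphabetContainer.SINGLETON, length, 4.0);
        DifferentiableStatisticalModel bgModel =
                DifferentiableStatisticalModelFactory.createPWM(DNAAlphabetContainer.SINGLETON, length, 4.0);

        // Settings for the numerical optimization (algorithm, termination thresholds,
        // start distance, parameter initialization, number of threads); the exact
        // argument list of this constructor is an assumption.
        GenDisMixClassifierParameterSet params = new GenDisMixClassifierParameterSet(
                DNAAlphabetContainer.SINGLETON, length, Optimizer.QUASI_NEWTON_BFGS,
                1E-6, 1E-6, 1, false, KindOfParameter.PLUGIN, true, 2);

        // Maximum supervised posterior (MSP) training of both models at once.
        MSPClassifier cl = new MSPClassifier(params, fgModel, bgModel);
        cl.train(fg, bg);

        System.out.println(cl);
    }
}
```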
The sub-package de.jstacs.sequenceScores.statisticalModels.differentiable.directedGraphicalModels contains Bayesian networks and inhomogeneous Markov models.
The sub-package de.jstacs.sequenceScores.statisticalModels.differentiable.homogeneous provides homogeneous models like homogeneous Markov models.
The sub-package de.jstacs.sequenceScores.statisticalModels.differentiable.mixture provides mixture models including an extended ZOOPS model
for de-novo motif discovery.
Some of the provided DifferentiableStatisticalModels also implement the interface
SamplingDifferentiableStatisticalModel and can be used for
Metropolis-Hastings parameter sampling in a SamplingGenDisMixClassifier.