
Package de.jstacs.sequenceScores.statisticalModels.differentiable


Provides all DifferentiableStatisticalModels, which can compute the gradient with respect to their parameters for a given input Sequence. The parameters of a DifferentiableStatisticalModel are learned numerically, typically by gradient-based methods like those provided in Optimizer.
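The central capability of a DifferentiableStatisticalModel is returning the log-score of a Sequence together with the partial derivatives with respect to its parameters. The following sketch collects that sparse gradient into a dense array; the helper class and method names are illustrative, the import paths follow recent Jstacs versions, and the model is assumed to have been initialized (e.g., via initializeFunction or training) beforehand.

import de.jstacs.data.sequences.Sequence;
import de.jstacs.sequenceScores.statisticalModels.differentiable.DifferentiableStatisticalModel;
import de.jstacs.utils.DoubleList;
import de.jstacs.utils.IntList;

public class GradientSketch {

	/**
	 * Illustrative helper: computes the log-score of <code>seq</code> under
	 * <code>model</code> and accumulates the partial derivatives into a dense
	 * gradient array indexed by parameter index.
	 */
	public static double[] denseGradient( DifferentiableStatisticalModel model, Sequence seq ) throws Exception {
		IntList indices = new IntList();        // parameter indices with non-zero partial derivatives
		DoubleList partials = new DoubleList(); // the corresponding partial derivatives
		double logScore = model.getLogScoreAndPartialDerivation( seq, indices, partials );
		double[] gradient = new double[ model.getNumberOfParameters() ];
		for( int i = 0; i < indices.length(); i++ ) {
			gradient[ indices.get( i ) ] += partials.get( i );
		}
		System.out.println( "log score: " + logScore );
		return gradient;
	}
}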
In Jstacs, this is used especially for learning parameters by discriminative learning principles such as maximum conditional likelihood or maximum supervised posterior (see MSPClassifier), or by a unified learning principle (see GenDisMixClassifier).
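As a hedged sketch of how such models plug into discriminative learning, the following builds an MSPClassifier from two DifferentiableStatisticalModels and trains it by maximum supervised posterior. The GenDisMixClassifierParameterSet arguments are assumptions based on typical Jstacs recipes and should be checked against the constructor documentation of the version in use.

import de.jstacs.algorithms.optimization.Optimizer;
import de.jstacs.classifiers.differentiableSequenceScoreBased.OptimizableFunction.KindOfParameter;
import de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix.GenDisMixClassifierParameterSet;
import de.jstacs.classifiers.differentiableSequenceScoreBased.msp.MSPClassifier;
import de.jstacs.data.DataSet;
import de.jstacs.sequenceScores.statisticalModels.differentiable.DifferentiableStatisticalModel;

public class MspSketch {

	/**
	 * Illustrative sketch: trains an MSPClassifier on foreground and background
	 * data, given foreground and background models of matching length. The
	 * parameter-set arguments are assumptions and may differ between versions.
	 */
	public static MSPClassifier trainMsp( DifferentiableStatisticalModel fg, DifferentiableStatisticalModel bg,
			DataSet fgData, DataSet bgData ) throws Exception {
		GenDisMixClassifierParameterSet params = new GenDisMixClassifierParameterSet(
				fgData.getAlphabetContainer(), // alphabet of the training data
				fg.getLength(),                // sequence length the models are defined for
				Optimizer.QUASI_NEWTON_BFGS,   // numerical optimization algorithm
				1E-6,                          // convergence threshold (assumed placement)
				1E-6,                          // line-search threshold (assumed placement)
				1,                             // start distance for the line search (assumed)
				false,                         // free parameterization? (assumed)
				KindOfParameter.PLUGIN,        // start optimization from plug-in parameters
				true,                          // normalize the objective (assumed)
				2 );                           // number of compute threads
		MSPClassifier classifier = new MSPClassifier( params, fg, bg );
		classifier.train( fgData, bgData );    // class 0 = foreground, class 1 = background
		return classifier;
	}
}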
The sub-package de.jstacs.sequenceScores.statisticalModels.differentiable.directedGraphicalModels contains Bayesian networks and inhomogeneous Markov models.
The sub-package de.jstacs.sequenceScores.statisticalModels.differentiable.homogeneous provides homogeneous models like homogeneous Markov models.
The sub-package de.jstacs.sequenceScores.statisticalModels.differentiable.mixture provides mixture models, including an extended ZOOPS model for de novo motif discovery; a construction sketch using these sub-packages is given below.
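To make the sub-package pointers concrete, the following hedged sketch builds two position weight matrices as Bayesian networks with an order-0 inhomogeneous Markov structure (directedGraphicalModels sub-package) and combines them into a mixture (mixture sub-package). The concrete argument values (equivalent sample size, number of starts) are illustrative, and constructor signatures may differ between Jstacs versions.

import de.jstacs.data.AlphabetContainer;
import de.jstacs.data.alphabets.DNAAlphabetContainer;
import de.jstacs.sequenceScores.statisticalModels.differentiable.directedGraphicalModels.BayesianNetworkDiffSM;
import de.jstacs.sequenceScores.statisticalModels.differentiable.directedGraphicalModels.structureLearning.measures.InhomogeneousMarkov;
import de.jstacs.sequenceScores.statisticalModels.differentiable.mixture.MixtureDiffSM;

public class ModelSketch {

	/** Illustrative sketch: a two-component mixture of PWMs over DNA sequences of the given length. */
	public static MixtureDiffSM twoComponentPwmMixture( int length ) throws Exception {
		AlphabetContainer con = DNAAlphabetContainer.SINGLETON; // DNA alphabet
		double ess = 4; // equivalent sample size of the prior (illustrative value)

		// A PWM is a Bayesian network whose structure is an inhomogeneous Markov model of order 0.
		BayesianNetworkDiffSM pwm1 = new BayesianNetworkDiffSM( con, length, ess, true, new InhomogeneousMarkov( 0 ) );
		BayesianNetworkDiffSM pwm2 = new BayesianNetworkDiffSM( con, length, ess, true, new InhomogeneousMarkov( 0 ) );

		// Mixture of the two PWMs; the first argument is the number of optimization starts (assumed meaning).
		return new MixtureDiffSM( 3, true, pwm1, pwm2 );
	}
}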
Some of the provided DifferentiableStatisticalModels also implement the interface SamplingDifferentiableStatisticalModel and can be used for Metropolis-Hastings parameter sampling in a SamplingGenDisMixClassifier.