| Package | Description |
|---|---|
| de.jstacs.algorithms.optimization | Provides classes for different types of algorithms that are not directly linked to the modelling components of Jstacs: algorithms on graphs, algorithms for numerical optimization, and a basic alignment algorithm. |
| de.jstacs.classifiers.differentiableSequenceScoreBased | Provides the classes for classifiers that are based on SequenceScores. It includes a sub-package for discriminative objective functions, namely conditional likelihood and supervised posterior, and a separate sub-package for the parameter priors that can be used for the supervised posterior. |
| de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix | Provides an implementation of a classifier that allows training the parameters of a set of DifferentiableStatisticalModels by a unified generative-discriminative learning principle. |
| de.jstacs.classifiers.differentiableSequenceScoreBased.logPrior | Provides a general definition of a parameter log-prior and a number of implementations of Laplace and Gaussian priors. |
| de.jstacs.sequenceScores.statisticalModels.trainable.discrete.inhomogeneous | This package contains various inhomogeneous models. |
| Modifier and Type | Class and Description |
|---|---|
| class | DifferentiableFunction: This class is the framework for any (at least) one-time differentiable function. |
| class | NegativeDifferentiableFunction |
| class | NegativeFunction: The negative function -f for a given Function f. |
| class | NegativeOneDimensionalFunction: This class extends the class OneDimensionalFunction. |
| class | NumericalDifferentiableFunction: This class is the framework for any numerically differentiable function. |
| class | OneDimensionalFunction: This class implements the interface Function for a one-dimensional function. |
| class | OneDimensionalSubFunction: This class is used to do the line search. |
| class | QuadraticFunction: This class implements a quadratic function. |
| Constructor and Description |
|---|
| NegativeFunction(Function f) |
| OneDimensionalSubFunction(Function f, double[] current, double[] d) |
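The wrappers listed above can be sketched in plain Java; the `Function` interface below is a simplified stand-in for the Jstacs interface of the same name, not the real API, and `negate` and `subFunction` only mirror the ideas behind `NegativeFunction` and `OneDimensionalSubFunction`:

```java
// Sketch of the negation and line-search wrappers; simplified stand-ins, not Jstacs API.
public class LineSearchSketch {

    // Simplified stand-in for the Jstacs Function interface.
    interface Function {
        double evaluate(double[] x);
    }

    // Idea behind NegativeFunction: turn maximization of f into minimization of -f.
    static Function negate(Function f) {
        return x -> -f.evaluate(x);
    }

    // Idea behind OneDimensionalSubFunction: restrict f to the line
    // current + alpha * d, yielding a one-dimensional function of alpha.
    static double subFunction(Function f, double[] current, double[] d, double alpha) {
        double[] x = new double[current.length];
        for (int i = 0; i < current.length; i++) {
            x[i] = current[i] + alpha * d[i];
        }
        return f.evaluate(x);
    }

    public static void main(String[] args) {
        Function quadratic = x -> x[0] * x[0] + x[1] * x[1];
        System.out.println(negate(quadratic).evaluate(new double[] { 1.0, 2.0 })); // -5.0
        // Walk from (1, 0) along direction (0, 1) with step alpha = 2: f(1, 2) = 5.0
        System.out.println(subFunction(quadratic, new double[] { 1.0, 0.0 },
                new double[] { 0.0, 1.0 }, 2.0)); // 5.0
    }
}
```

A line-search routine only ever sees the one-dimensional function of the step size `alpha`, which is exactly what the `OneDimensionalSubFunction(Function f, double[] current, double[] d)` constructor above packages up.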
| Modifier and Type | Class and Description |
|---|---|
| class | AbstractMultiThreadedOptimizableFunction: This class enables the user to exploit all CPUs of a computer by using threads. |
| class | AbstractOptimizableFunction: This class extends OptimizableFunction and implements some common methods. |
| class | DiffSSBasedOptimizableFunction: This abstract class is the basis of all multi-threaded OptimizableFunctions that are based on DifferentiableSequenceScores. |
| class | OptimizableFunction: This is the main function for the ScoreClassifier. |
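The multi-threading idea behind `AbstractMultiThreadedOptimizableFunction` can be sketched as follows: an objective that decomposes into independent per-sequence terms can be summed in parallel across all CPUs. The data layout and `logScore` function below are made-up placeholders for illustration, not the Jstacs API:

```java
import java.util.stream.IntStream;

// Sketch of evaluating a per-sequence objective on all CPUs in parallel.
// Data layout and scoring function are hypothetical placeholders.
public class ParallelObjectiveSketch {

    // Hypothetical per-sequence log score; stands in for a DifferentiableSequenceScore.
    static double logScore(double[] sequenceFeatures, double[] params) {
        double s = 0;
        for (int i = 0; i < params.length; i++) {
            s += params[i] * sequenceFeatures[i];
        }
        return s;
    }

    // Evaluates the objective as a parallel sum over sequences.
    static double evaluate(double[][] data, double[] params) {
        return IntStream.range(0, data.length)
                .parallel()
                .mapToDouble(n -> logScore(data[n], params))
                .sum();
    }

    public static void main(String[] args) {
        double[][] data = { { 1, 0 }, { 0, 1 }, { 1, 1 } };
        double[] params = { 2, 3 };
        System.out.println(evaluate(data, params)); // 2 + 3 + 5 = 10.0
    }
}
```

Because each sequence contributes its term independently, the gradient can be parallelized the same way, which is what makes this decomposition attractive for numerical optimization.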
| Modifier and Type | Class and Description |
|---|---|
| class | LogGenDisMixFunction: This class implements the objective function of the unified generative-discriminative learning principle, whose mixture weights have to sum to 1. |
| class | OneDataSetLogGenDisMixFunction: This class implements a variant of the objective function in which an individual weight is specified for each combination of sequence and class. |
| Modifier and Type | Class and Description |
|---|---|
| class | CompositeLogPrior: This class implements a composite prior that can be used for a DifferentiableStatisticalModel. |
| class | DoesNothingLogPrior: This class defines a LogPrior that does not penalize any parameter. |
| class | LogPrior: The abstract class for any log-prior used, e.g., for the supervised posterior. |
| class | SeparateGaussianLogPrior: Class for a LogPrior that defines a Gaussian prior on the parameters of a set of DifferentiableStatisticalModels and a set of class parameters. |
| class | SeparateLaplaceLogPrior: Class for a LogPrior that defines a Laplace prior on the parameters of a set of DifferentiableStatisticalModels and a set of class parameters. |
| class | SeparateLogPrior: Abstract class for priors that penalize each parameter value independently and have some variances (and possibly means) as hyperparameters. |
| class | SimpleGaussianSumLogPrior: This class implements a prior that is a product of Gaussian distributions with mean 0 and equal variance for each parameter. |
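The density behind `SimpleGaussianSumLogPrior` can be sketched directly from its description: the log of a product of zero-mean Gaussians with common variance sigma^2 is -sum_i(lambda_i^2) / (2 * sigma^2) plus a constant. The constant is dropped below because it does not affect optimization; this is an illustration of the formula, not the Jstacs implementation:

```java
// Sketch of a log-prior that is the log of a product of zero-mean Gaussians
// with equal variance; the normalization constant is omitted since it does
// not change the location of the optimum. Illustration only, not Jstacs code.
public class GaussianSumLogPriorSketch {

    static double logPrior(double[] params, double sigma) {
        double sumOfSquares = 0;
        for (double p : params) {
            sumOfSquares += p * p;
        }
        return -sumOfSquares / (2 * sigma * sigma);
    }

    public static void main(String[] args) {
        // Larger parameter values are penalized more strongly.
        System.out.println(logPrior(new double[] { 1.0, 2.0 }, 1.0)); // -2.5
        System.out.println(logPrior(new double[] { 2.0, 4.0 }, 1.0)); // -10.0
    }
}
```

Adding this term to a log-likelihood during maximization acts as an L2 regularizer on the parameters; the Laplace variant listed above would instead penalize the sum of absolute values.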
| Modifier and Type | Class and Description |
|---|---|
| static class | MEMTools.DualFunction: The dual function to the constrained problem of learning MEMs. |