Documentation ¶
Index ¶
- func CoI(pxyz [][][]float64) float64
- func ConditionalEntropy(pxy [][]float64, log lnFunc) float64
- func ConditionalEntropyBase2(pxy [][]float64) float64
- func ConditionalEntropyBaseE(pxy [][]float64) float64
- func ConditionalMutualInformation(pxyz [][][]float64, ln lnFunc) float64
- func ConditionalMutualInformationBase2(pxyz [][][]float64) float64
- func ConditionalMutualInformationBaseE(pxyz [][][]float64) float64
- func Create2D(xDim, yDim int) [][]float64
- func Create2DInt(xDim, yDim int) [][]int
- func Create3D(xDim, yDim, zDim int) [][][]float64
- func Create3DInt(xDim, yDim, zDim int) [][][]int
- func Create4D(xDim, yDim, zDim, wDim int) [][][][]float64
- func Empirical1D(d []int) []float64
- func Empirical2D(d [][]int) [][]float64
- func Empirical2DSparse(d [][]int) sm.SparseMatrix
- func Empirical3D(d [][]int) [][][]float64
- func Empirical3DSparse(d [][]int) sm.SparseMatrix
- func Empirical4D(d [][]int) [][][][]float64
- func Entropy(p []float64, ln lnFunc) float64
- func EntropyBase2(p []float64) float64
- func EntropyBase2Sparse(p sm.SparseMatrix) float64
- func EntropyBaseE(p []float64) float64
- func EntropyBaseESparse(p sm.SparseMatrix) float64
- func EntropyChaoShen(data []int, ln lnFunc) float64
- func EntropyChaoShenBase2(data []int) float64
- func EntropyChaoShenBaseE(data []int) float64
- func EntropyHorvitzThompson(data []int, ln lnFunc) float64
- func EntropyHorvitzThompsonBase2(data []int) float64
- func EntropyHorvitzThompsonBaseE(data []int) float64
- func EntropyMLBC(data []int, ln lnFunc) float64
- func EntropyMLBCBase2(data []int) float64
- func EntropyMLBCBaseE(data []int) float64
- func EntropySparse(p sm.SparseMatrix, ln lnFunc) float64
- func H1(px []float64) float64
- func H2(pxy [][]float64) float64
- func H3(pxyz [][][]float64) float64
- func InformationDecomposition(pxyz [][][]float64, resolution int) (float64, float64, float64)
- func MiXvY(pxyz [][][]float64) float64
- func MiXvYZ(pxyz [][][]float64) float64
- func MiXvYgZ(pxyz [][][]float64) float64
- func MiXvZgY(pxyz [][][]float64) float64
- func MinMax(pxyz [][][]float64, resolution int) (float64, float64, float64, float64, float64, float64)
- func MutualInformation(pxy [][]float64, log lnFunc) float64
- func MutualInformationBase2(pxy [][]float64) float64
- func MutualInformationBaseE(pxy [][]float64) float64
- func Normalise1D(a []float64) []float64
- func Normalise2D(a [][]float64) [][]float64
- func Normalise3D(a [][][]float64) [][][]float64
- func Normalise4D(a [][][][]float64) [][][][]float64
- func PX(pxyz [][][]float64) []float64
- func PXY(pxyz [][][]float64) [][]float64
- func PXZ(pxyz [][][]float64) [][]float64
- func PY(pxyz [][][]float64) []float64
- func PYZ(pxyz [][][]float64) [][]float64
- func PZ(pxyz [][][]float64) []float64
- func Pt(pxyz [][][]float64, a, b float64) [][][]float64
- type IterativeScaling
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func ConditionalEntropy ¶
ConditionalEntropy calculates the conditional entropy of a probability distribution. It takes the log function as an additional parameter, so that the base can be chosen:
H(X|Y) = -\sum_{x,y} p(x,y) lnFunc(p(x,y)/p(y))
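For orientation, the following is a minimal, self-contained sketch of the quantity this function computes. lnFunc is taken to be a func(float64) float64 and math.Log2 stands in for it; none of the names below are part of the package itself:

package main

import (
	"fmt"
	"math"
)

// conditionalEntropy computes H(X|Y) = -\sum_{x,y} p(x,y) * log(p(x,y)/p(y)),
// skipping zero-probability entries (0*log(0) is treated as 0).
func conditionalEntropy(pxy [][]float64, log func(float64) float64) float64 {
	py := make([]float64, len(pxy[0])) // marginal p(y)
	for _, row := range pxy {
		for y, p := range row {
			py[y] += p
		}
	}
	h := 0.0
	for _, row := range pxy {
		for y, p := range row {
			if p > 0.0 && py[y] > 0.0 {
				h -= p * log(p/py[y])
			}
		}
	}
	return h
}

func main() {
	// X is fully determined by Y, so H(X|Y) is 0 bits.
	pxy := [][]float64{{0.5, 0.0}, {0.0, 0.5}}
	fmt.Println(conditionalEntropy(pxy, math.Log2))
}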
func ConditionalEntropyBase2 ¶
ConditionalEntropyBase2 calculates the conditional entropy of a probability distribution in bits
H(X|Y) = -\sum_{x,y} p(x,y) log2(p(x,y)/p(y))
func ConditionalEntropyBaseE ¶
ConditionalEntropyBaseE calculates the conditional entropy of a probability distribution in nats
H(X|Y) = -\sum_{x,y} p(x,y) ln(p(x,y)/p(y))
func ConditionalMutualInformation ¶
ConditionalMutualInformation calculates the conditional mutual information with the given lnFunc function
I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (lnFunc(p(x,y|z)) - lnFunc(p(x|z)p(y|z)))
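Expanding the conditionals, the summand equals p(x,y,z) lnFunc(p(x,y,z)p(z)/(p(x,z)p(y,z))), which can be computed directly from the joint distribution. A hedged sketch of that computation, not the package's code:

// conditionalMI computes I(X,Y|Z) via
// \sum_{x,y,z} p(x,y,z) * log( p(x,y,z)*p(z) / (p(x,z)*p(y,z)) ),
// skipping zero-probability terms.
func conditionalMI(pxyz [][][]float64, log func(float64) float64) float64 {
	xd, yd, zd := len(pxyz), len(pxyz[0]), len(pxyz[0][0])
	pz := make([]float64, zd)
	pxz := make([][]float64, xd)
	pyz := make([][]float64, yd)
	for x := range pxz {
		pxz[x] = make([]float64, zd)
	}
	for y := range pyz {
		pyz[y] = make([]float64, zd)
	}
	for x := 0; x < xd; x++ {
		for y := 0; y < yd; y++ {
			for z := 0; z < zd; z++ {
				p := pxyz[x][y][z]
				pz[z] += p
				pxz[x][z] += p
				pyz[y][z] += p
			}
		}
	}
	mi := 0.0
	for x := 0; x < xd; x++ {
		for y := 0; y < yd; y++ {
			for z := 0; z < zd; z++ {
				p := pxyz[x][y][z]
				if p > 0.0 && pxz[x][z] > 0.0 && pyz[y][z] > 0.0 {
					mi += p * log(p*pz[z]/(pxz[x][z]*pyz[y][z]))
				}
			}
		}
	}
	return mi
}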
func ConditionalMutualInformationBase2 ¶
ConditionalMutualInformationBase2 calculates the conditional mutual information with base 2
I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (log2(p(x,y|z)) - log2(p(x|z)p(y|z)))
func ConditionalMutualInformationBaseE ¶
ConditionalMutualInformationBaseE calculates the conditional mutual information with base e
I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (ln(p(x,y|z)) - ln(p(x|z)p(y|z)))
func Create2DInt ¶
Create2DInt creates a 2-dimensional int slice
func Create3DInt ¶
Create3DInt creates a 3-dimensional int slice
func Empirical1D ¶
Empirical1D is an empirical estimator for a one-dimensional probability distribution
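A plausible reading of this estimator, sketched under the assumption that the data contains states labelled 0..max(d); the package's actual handling of the state space may differ:

// empirical1D estimates p(x) as the normalised frequency of each state.
func empirical1D(d []int) []float64 {
	max := 0
	for _, v := range d {
		if v > max {
			max = v
		}
	}
	p := make([]float64, max+1)
	for _, v := range d {
		p[v]++
	}
	n := float64(len(d))
	for i := range p {
		p[i] /= n
	}
	return p
}

Empirical2D, Empirical3D, and Empirical4D presumably do the same with the rows of d interpreted as joint states.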
func Empirical2D ¶
Empirical2D is an empirical estimator for a two-dimensional probability distribution
func Empirical2DSparse ¶
func Empirical2DSparse(d [][]int) sm.SparseMatrix
Empirical2DSparse is an empirical estimator for a two-dimensional probability distribution
func Empirical3D ¶
Empirical3D is an empirical estimator for a three-dimensional probability distribution
func Empirical3DSparse ¶
func Empirical3DSparse(d [][]int) sm.SparseMatrix
Empirical3DSparse is an empirical estimator for a three-dimensional probability distribution
func Empirical4D ¶
Empirical4D is an empirical estimator for a four-dimensional probability distribution
func Entropy ¶
Entropy calculates the entropy of a probability distribution. It takes the log function as an additional parameter, so that the base can be chosen:
H(X) = -\sum_x p(x) lnFunc(p(x))
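As a concrete rendering of the formula (a sketch, not the package implementation; the 0*log(0) = 0 convention is assumed):

// entropy computes H(X) = -\sum_x p(x) * log(p(x)).
func entropy(p []float64, log func(float64) float64) float64 {
	h := 0.0
	for _, px := range p {
		if px > 0.0 {
			h -= px * log(px)
		}
	}
	return h
}

Passing math.Log2 yields bits and math.Log yields nats, matching the Base2 and BaseE variants below.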
func EntropyBase2 ¶
EntropyBase2 calculates the entropy of a probability distribution with base 2
H(X) = -\sum_x p(x) log2(p(x))
func EntropyBase2Sparse ¶
func EntropyBase2Sparse(p sm.SparseMatrix) float64
EntropyBase2Sparse calculates the entropy of a probability distribution with base 2
H(X) = -\sum_x p(x) log2(p(x))
func EntropyBaseE ¶
EntropyBaseE calculates the entropy of a probability distribution with base e
H(X) = -\sum_x p(x) ln(p(x))
func EntropyBaseESparse ¶
func EntropyBaseESparse(p sm.SparseMatrix) float64
EntropyBaseESparse calculates the entropy of a probability distribution with base e
H(X) = -\sum_x p(x) ln(p(x))
func EntropyChaoShen ¶
EntropyChaoShen is the Chao-Shen entropy estimator. It takes discretised data and the log function as input. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
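The estimator in the cited paper shrinks the maximum-likelihood probabilities by the estimated sample coverage C = 1 - f1/n (f1 being the number of singleton states) and weights each term by its Horvitz-Thompson inclusion probability. A sketch of that construction, assuming the package follows the paper:

package sketch

import "math"

// chaoShen estimates H from discretised data: coverage-adjusted
// plug-in probabilities, each term divided by 1 - (1 - C*p_i)^n.
func chaoShen(data []int, log func(float64) float64) float64 {
	counts := map[int]int{}
	for _, v := range data {
		counts[v]++
	}
	n := float64(len(data))
	f1 := 0.0 // number of states observed exactly once
	for _, c := range counts {
		if c == 1 {
			f1++
		}
	}
	coverage := 1.0 - f1/n
	h := 0.0
	for _, c := range counts {
		p := coverage * float64(c) / n
		if p > 0.0 {
			h -= p * log(p) / (1.0 - math.Pow(1.0-p, n))
		}
	}
	return h
}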
func EntropyChaoShenBase2 ¶
EntropyChaoShenBase2 is the Chao-Shen entropy estimator. It takes discretised data and returns the entropy in bits. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
func EntropyChaoShenBaseE ¶
EntropyChaoShenBaseE is the Chao-Shen entropy estimator. It takes discretised data and returns the entropy in nats. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
func EntropyHorvitzThompson ¶
EntropyHorvitzThompson is the Horvitz-Thompson entropy estimator. It takes discretised data and the log function as input. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
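For comparison with the Chao-Shen sketch above, the Horvitz-Thompson construction applies the inclusion-probability weighting without the coverage correction. Again a sketch under the same assumptions, not the package's code:

package sketch

import "math"

// horvitzThompson weights each plug-in term -p_i*log(p_i) by the
// probability 1 - (1-p_i)^n that state i appears in a sample of size n.
func horvitzThompson(data []int, log func(float64) float64) float64 {
	counts := map[int]int{}
	for _, v := range data {
		counts[v]++
	}
	n := float64(len(data))
	h := 0.0
	for _, c := range counts {
		p := float64(c) / n
		h -= p * log(p) / (1.0 - math.Pow(1.0-p, n))
	}
	return h
}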
func EntropyHorvitzThompsonBase2 ¶
EntropyHorvitzThompsonBase2 is the Horvitz-Thompson entropy estimator. It takes discretised data as input and returns the entropy in bits. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
func EntropyHorvitzThompsonBaseE ¶
EntropyHorvitzThompsonBaseE is the Horvitz-Thompson entropy estimator. It takes discretised data as input and returns the entropy in nats. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
func EntropyMLBC ¶
EntropyMLBC is the maximum likelihood estimator with bias correction. It takes discretised data and the log function as input. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
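The bias-corrected maximum-likelihood estimator discussed in the cited paper is the plug-in entropy plus the Miller-Madow term (S-1)/(2n), with S the number of observed states. The sketch below assumes that form and may not match the package's exact implementation:

package sketch

import "math"

// entropyMLBC: plug-in entropy plus (S-1)/(2n). The correction term is
// naturally in nats, so log(math.E) rescales it to the chosen base.
func entropyMLBC(data []int, log func(float64) float64) float64 {
	counts := map[int]int{}
	for _, v := range data {
		counts[v]++
	}
	n := float64(len(data))
	h := 0.0
	for _, c := range counts {
		p := float64(c) / n
		h -= p * log(p)
	}
	s := float64(len(counts))
	return h + (s-1.0)/(2.0*n)*log(math.E)
}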
func EntropyMLBCBase2 ¶
EntropyMLBCBase2 is the maximum likelihood estimator with bias correction. It takes discretised data as input and returns the entropy in bits. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
func EntropyMLBCBaseE ¶
EntropyMLBCBaseE is the maximum likelihood estimator with bias correction. It takes discretised data as input and returns the entropy in nats. Implemented from A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
func EntropySparse ¶
func EntropySparse(p sm.SparseMatrix, ln lnFunc) float64
EntropySparse calculates the entropy of a probability distribution. It takes the log function as an additional parameter, so that the base can be chosen:
H(X) = -\sum_x p(x) lnFunc(p(x))
func InformationDecomposition ¶
InformationDecomposition returns UI(X;Y\Z), UI(X;Z\Y), CI(X;Y,Z), and SI(X;Y,Z) according to N. Bertschinger, J. Rauh, E. Olbrich, J. Jost, and N. Ay. Quantifying unique information. CoRR, 2013.
func MutualInformation ¶
MutualInformation calculates the mutual information with the given lnFunc function
I(X,Y) = \sum_{x,y} p(x,y) (lnFunc(p(x,y)) - lnFunc(p(x)p(y)))
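A direct transcription of this formula as a hedged sketch (the marginals are computed from the joint; zero-probability terms are skipped):

// mutualInformation computes
// I(X,Y) = \sum_{x,y} p(x,y) * (log(p(x,y)) - log(p(x)*p(y))).
func mutualInformation(pxy [][]float64, log func(float64) float64) float64 {
	px := make([]float64, len(pxy))
	py := make([]float64, len(pxy[0]))
	for x, row := range pxy {
		for y, p := range row {
			px[x] += p
			py[y] += p
		}
	}
	mi := 0.0
	for x, row := range pxy {
		for y, p := range row {
			if p > 0.0 && px[x] > 0.0 && py[y] > 0.0 {
				mi += p * (log(p) - log(px[x]*py[y]))
			}
		}
	}
	return mi
}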
func MutualInformationBase2 ¶
MutualInformationBase2 calculates the mutual information with base 2
I(X,Y) = \sum_{x,y} p(x,y) (log2(p(x,y)) - log2(p(x)p(y)))
func MutualInformationBaseE ¶
MutualInformationBaseE calculates the mutual information with base e
I(X,Y) = \sum_{x,y} p(x,y) (ln(p(x,y)) - ln(p(x)p(y)))
Types ¶
type IterativeScaling ¶
type IterativeScaling struct {
	PTarget             []float64
	PEstimate           []float64
	Features            map[string][]int
	NrOfIterations      int
	ErrorThreshold      float64
	Alphabet            [][]int
	NrOfStates          []int
	NrOfVariables       int
	CurrentFeatureIndex int
	CurrentIteration    int
	LastKLStep          float64
	Keys                []string
}
IterativeScaling contains the configuration and data required for the iterative scaling algorithm
func NewIterativeScaling ¶
func NewIterativeScaling() *IterativeScaling
NewIterativeScaling creates a new IterativeScaling struct
func (*IterativeScaling) CalculateMarginalProbability ¶
func (data *IterativeScaling) CalculateMarginalProbability(feature []int) float64
CalculateMarginalProbability calculates the marginal probability p(x)
func (*IterativeScaling) CreateAlphabet ¶
func (data *IterativeScaling) CreateAlphabet()
CreateAlphabet creates the alphabet given NrOfStates and NrOfVariables
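One plausible reading, sketched under the assumption that the alphabet is the full joint state space enumerated as a mixed-radix counter (illustrative only; the package may choose a different ordering):

// createAlphabet enumerates every joint state of len(nrOfStates)
// variables, where variable i takes values 0..nrOfStates[i]-1.
func createAlphabet(nrOfStates []int) [][]int {
	total := 1
	for _, s := range nrOfStates {
		total *= s
	}
	alphabet := make([][]int, total)
	for i := 0; i < total; i++ {
		row := make([]int, len(nrOfStates))
		rest := i
		for v := len(nrOfStates) - 1; v >= 0; v-- {
			row[v] = rest % nrOfStates[v]
			rest /= nrOfStates[v]
		}
		alphabet[i] = row
	}
	return alphabet
}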
func (*IterativeScaling) Init ¶
func (data *IterativeScaling) Init()
Init extracts the feature names for faster processing
func (*IterativeScaling) Iterate ¶
func (data *IterativeScaling) Iterate()
Iterate implements the iterative scaling algorithm as described in I. Csiszar. I-divergence geometry of probability distributions and minimization problems. Ann. Probab., 3(1):146–158, 1975. Input is a probability distribution, a feature set, and a number of iterations. The output is the maximum entropy estimate of p given the feature set:

p_est^(n+1)(x) = p_target(x_a) * p_est^(n)(x_{without a}|x_a)

where a cycles through the list of features. We calculate it in the equivalent form:

p_est^(n+1)(x) = p_est^(n)(x) * p_target(x_a) / p_est^(n)(x_a)
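A hedged sketch of a single scaling step in the second form above, over a flat indexing of joint states; marginal is a hypothetical helper for this sketch, not part of the package:

// iterateOnce rescales every state x by p_target(x_a) / p_est^(n)(x_a),
// where x_a is the projection of x onto the variables of feature a.
// The marginals are taken from a snapshot so that all states of one
// step use the same p_est^(n).
func iterateOnce(pEstimate, pTarget []float64, alphabet [][]int, feature []int) {
	old := append([]float64(nil), pEstimate...) // snapshot of p_est^(n)
	for i := range pEstimate {
		t := marginal(pTarget, alphabet, feature, i)
		e := marginal(old, alphabet, feature, i)
		if e > 0.0 {
			pEstimate[i] = old[i] * t / e
		}
	}
}

// marginal sums p over all joint states that agree with state i on
// the feature variables (hypothetical helper).
func marginal(p []float64, alphabet [][]int, feature []int, i int) float64 {
	sum := 0.0
	for j := range p {
		match := true
		for _, v := range feature {
			if alphabet[j][v] != alphabet[i][v] {
				match = false
				break
			}
		}
		if match {
			sum += p[j]
		}
	}
	return sum
}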