Documentation ¶
Index ¶
- func Achlioptas(m mat.Matrix, generator *rand.LockedRand)
- func Constant(m mat.Matrix, n mat.Float)
- func Gain(f ag.OpName) mat.Float
- func Normal(m mat.Matrix, mean, std mat.Float, generator *rand.LockedRand)
- func Ones(m mat.Matrix)
- func Uniform(m mat.Matrix, min, max mat.Float, generator *rand.LockedRand)
- func XavierNormal(m mat.Matrix, gain mat.Float, generator *rand.LockedRand)
- func XavierUniform(m mat.Matrix, gain mat.Float, generator *rand.LockedRand)
- func Zeros(m mat.Matrix)
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func Achlioptas ¶
func Achlioptas(m mat.Matrix, generator *rand.LockedRand)
Achlioptas fills the input matrix with values according to the method described in "Database-friendly random projections: Johnson-Lindenstrauss with binary coins" by Dimitris Achlioptas, 2001 (https://core.ac.uk/download/pdf/82724427.pdf).
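A minimal usage sketch follows. The mat.NewEmptyDense and rand.NewLockedRand constructors and the import paths are assumptions about the surrounding library and may differ in your version; only the Achlioptas call itself is documented by this package.

package main

import (
	mat "github.com/nlpodyssey/spago/pkg/mat32"       // assumed import path
	"github.com/nlpodyssey/spago/pkg/mat32/rand"      // assumed import path
	"github.com/nlpodyssey/spago/pkg/ml/initializers" // assumed import path
)

func main() {
	// Allocate a 3x4 zero matrix (NewEmptyDense is assumed to exist with this signature).
	m := mat.NewEmptyDense(3, 4)
	// Create a seeded, thread-safe random generator (NewLockedRand is assumed).
	gen := rand.NewLockedRand(42)
	// Fill the matrix according to the Achlioptas random-projection scheme.
	initializers.Achlioptas(m, gen)
}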
func Gain ¶
func Gain(f ag.OpName) mat.Float
Gain returns a coefficient that helps initialize the params in a way that keeps gradients stable. Use it to find the gain value for Xavier initializations.
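For instance, continuing the assumed setup of the sketch above, and assuming the ag package defines operator names such as ag.OpTanh:

	// Look up the gain recommended for a tanh activation (ag.OpTanh is assumed to be a defined operator name).
	gain := initializers.Gain(ag.OpTanh)
	// gain can now be passed as the second argument of XavierNormal or XavierUniform (see below).
	_ = gain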
func Normal ¶
func Normal(m mat.Matrix, mean, std mat.Float, generator *rand.LockedRand)
Normal fills the input matrix with random samples from a normal (Gaussian) distribution.
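Continuing the same assumed setup (m and gen as in the Achlioptas sketch), a typical call is:

	// Fill m with samples drawn from a Gaussian with mean 0 and standard deviation 0.01.
	initializers.Normal(m, 0.0, 0.01, gen)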
func Uniform ¶
func Uniform(m mat.Matrix, min, max mat.Float, generator *rand.LockedRand)
Uniform fills the input matrix m with a uniform distribution where min is the lower bound and max is the upper bound.
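With the same assumed setup, for example:

	// Fill m with samples drawn uniformly between -0.1 (lower bound) and 0.1 (upper bound).
	initializers.Uniform(m, -0.1, 0.1, gen)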
func XavierNormal ¶
func XavierNormal(m mat.Matrix, gain mat.Float, generator *rand.LockedRand)
XavierNormal fills the input matrix with values according to the method described in "Understanding the difficulty of training deep feedforward neural networks" - Glorot, X. & Bengio, Y. (2010), using a normal distribution.
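With the same assumed setup, using a gain of 1 (no activation-specific scaling):

	// Xavier (Glorot) initialization drawn from a normal distribution, gain 1.0.
	initializers.XavierNormal(m, 1.0, gen)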
func XavierUniform ¶
func XavierUniform(m mat.Matrix, gain mat.Float, generator *rand.LockedRand)
XavierUniform fills the input matrix with values according to the method described in "Understanding the difficulty of training deep feedforward neural networks" - Glorot, X. & Bengio, Y. (2010), using a uniform distribution.
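With the same assumed setup, and assuming ag.OpTanh is a defined operator name, Gain and XavierUniform are typically combined:

	// Scale the Xavier bounds by the gain recommended for a tanh activation.
	initializers.XavierUniform(m, initializers.Gain(ag.OpTanh), gen)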
Types ¶
This section is empty.