Documentation ¶
Overview ¶
Package base contains miscellaneous utilities common to the other packages.
Index ¶
- Variables
- func FromDense(dst mat.Mutable, dense *mat.Dense) *mat.Dense
- func MatDenseColSlice(src mat.RawMatrixer, j, l int) *mat.Dense
- func MatDenseRowSlice(src mat.RawMatrixer, i, k int) *mat.Dense
- func MatDenseSlice(src mat.RawMatrixer, i, k, j, l int) *mat.Dense
- func MatDimsCheck(op string, R, X, Y mat.Matrix)
- func MatDimsString(mats ...mat.Matrix) string
- func MatGeneralColSlice(M blas64.General, j, l int) blas64.General
- func MatGeneralRowSlice(M blas64.General, i, k int) blas64.General
- func MatGeneralSlice(M blas64.General, i, k, j, l int) blas64.General
- func MatStr(Xs ...mat.Matrix) string
- func NewSource(seed uint64) *randomkit.RKState
- func Parallelize(threads, NSamples int, f func(th, start, end int))
- func ToDense(m mat.Matrix) *mat.Dense
- type Activation
- type Fiter
- type Float64er
- type GOMethodCreator
- type Identity
- type Intner
- type LockedSource
- type Logistic
- type MatConst
- type MatOnesPrepended
- type MatRowSlice
- type MatTranspose
- type NormFloat64er
- type OptimCreator
- type Optimizer
- type Permer
- type Predicter
- type RandomState
- type ReLU
- type SGDOptimizer
- func (s *SGDOptimizer) GetTheta() *mat.Dense
- func (s *SGDOptimizer) GetTimeStep() uint64
- func (s *SGDOptimizer) GetUpdate(update *mat.Dense, grad mat.Matrix)
- func (s *SGDOptimizer) Init(dim, tasks int) int
- func (s *SGDOptimizer) Run(operation chan<- optimize.Task, result <-chan optimize.Task, ...)
- func (s *SGDOptimizer) SetTheta(Theta *mat.Dense)
- func (s *SGDOptimizer) String() string
- func (s *SGDOptimizer) UpdateParams(grad mat.Matrix)
- func (*SGDOptimizer) Uses(has optimize.Available) (optimize.Available, error)
- type Shuffler
- type Sigmoid
- type Source
- type SourceCloner
- type Tanh
- type Transformer
Constants ¶
This section is empty.
Variables ¶
var Activations map[string]Activation
Activations is the map of implemented activation functions
var GOMethodCreators = map[string]GOMethodCreator{
	"sgd":             func() optimize.Method { return NewSGDOptimizer() },
	"adagrad":         func() optimize.Method { return NewAdagradOptimizer() },
	"rmsprop":         func() optimize.Method { return NewRMSPropOptimizer() },
	"adadelta":        func() optimize.Method { return NewAdadeltaOptimizer() },
	"adam":            func() optimize.Method { return NewAdamOptimizer() },
	"bfgs":            func() optimize.Method { return &optimize.BFGS{} },
	"cg":              func() optimize.Method { return &optimize.CG{} },
	"gradientdescent": func() optimize.Method { return &optimize.GradientDescent{} },
	"lbfgs":           func() optimize.Method { return &optimize.LBFGS{} },
	"neldermead":      func() optimize.Method { return &optimize.NelderMead{} },
}
GOMethodCreators is the map of all gonum optimize method creators (gonum native method creators + base.Optimizer creators)
var Solvers = map[string]OptimCreator{
	"sgd":      func() Optimizer { return NewSGDOptimizer() },
	"adagrad":  func() Optimizer { return NewAdagradOptimizer() },
	"rmsprop":  func() Optimizer { return NewRMSPropOptimizer() },
	"adadelta": func() Optimizer { return NewAdadeltaOptimizer() },
	"adam":     func() Optimizer { return NewAdamOptimizer() },
}
Solvers is the map of common Optimizer creators: sgd, adagrad, rmsprop, adadelta, adam
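A minimal usage sketch; the import path github.com/pa-m/sklearn/base is an assumption, as it is not stated on this page:

package main

import (
	"fmt"

	"github.com/pa-m/sklearn/base" // assumed import path
)

func main() {
	// Look up an Optimizer creator by name; valid keys are
	// sgd, adagrad, rmsprop, adadelta and adam.
	creator, ok := base.Solvers["adam"]
	if !ok {
		panic("unknown solver")
	}
	opt := creator()
	fmt.Println(opt.String()) // describes the configured solver
}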
Functions ¶
func MatDenseColSlice ¶
func MatDenseColSlice(src mat.RawMatrixer, j, l int) *mat.Dense
MatDenseColSlice returns a *mat.Dense view of part of the underlying data of src
func MatDenseRowSlice ¶
func MatDenseRowSlice(src mat.RawMatrixer, i, k int) *mat.Dense
MatDenseRowSlice returns a *mat.Dense view of part of the underlying data of src
func MatDenseSlice ¶
func MatDenseSlice(src mat.RawMatrixer, i, k, j, l int) *mat.Dense
MatDenseSlice returns a *mat.Dense view of part of the underlying data of src
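A short sketch, assuming the index ranges are half-open ([i,k) rows, [j,l) columns) as in (*mat.Dense).Slice; the range convention and the import path are assumptions:

package main

import (
	"fmt"

	"github.com/pa-m/sklearn/base" // assumed import path
	"gonum.org/v1/gonum/mat"
)

func main() {
	X := mat.NewDense(4, 4, []float64{
		1, 2, 3, 4,
		5, 6, 7, 8,
		9, 10, 11, 12,
		13, 14, 15, 16,
	})
	// View of rows 1..2 and columns 2..3 of X; no data is copied.
	V := base.MatDenseSlice(X, 1, 3, 2, 4)
	V.Set(0, 0, -7) // writes through to X, since V shares X's backing array
	fmt.Println(mat.Formatted(X))
}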
func MatDimsCheck ¶
func MatDimsCheck(op string, R, X, Y mat.Matrix)
MatDimsCheck checks the compatibility of operator op with the Dims of its Matrix parameters. R is the result of op, X and Y are the operands of op. "." is the dot product; "+", "-", "*", "/" are elementwise operations.
func MatDimsString ¶
func MatDimsString(mats ...mat.Matrix) string
MatDimsString returns a string representing the Dims of its Matrix parameters
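A tiny sketch (import path assumed; the exact output formatting is up to the package):

package main

import (
	"fmt"

	"github.com/pa-m/sklearn/base" // assumed import path
	"gonum.org/v1/gonum/mat"
)

func main() {
	a := mat.NewDense(2, 3, nil)
	b := mat.NewDense(3, 1, nil)
	// Handy when building error messages about incompatible shapes.
	fmt.Println(base.MatDimsString(a, b))
}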
func MatGeneralColSlice ¶
func MatGeneralColSlice(M blas64.General, j, l int) blas64.General
MatGeneralColSlice returns a blas64.General view of part of the underlying data of M
func MatGeneralRowSlice ¶
func MatGeneralRowSlice(M blas64.General, i, k int) blas64.General
MatGeneralRowSlice returns a blas64.General view of part of the underlying data of M
func MatGeneralSlice ¶
func MatGeneralSlice(M blas64.General, i, k, j, l int) blas64.General
MatGeneralSlice returns a blas64.General view of part of the underlying data of M
func Parallelize ¶
func Parallelize(threads, NSamples int, f func(th, start, end int))
Parallelize splits the processing of NSamples samples across threads workers, calling f with the bounds of each chunk
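A minimal sketch, assuming f receives a half-open [start, end) sample range for worker th (this range convention and the import path are assumptions):

package main

import (
	"fmt"

	"github.com/pa-m/sklearn/base" // assumed import path
	"gonum.org/v1/gonum/mat"
)

func main() {
	X := mat.NewDense(1000, 3, nil)
	nSamples, _ := X.Dims()
	rowSums := make([]float64, nSamples)
	base.Parallelize(4, nSamples, func(th, start, end int) {
		// Worker th handles samples start..end-1; workers write to
		// disjoint parts of rowSums, so no extra locking is needed.
		for i := start; i < end; i++ {
			for _, v := range X.RawRowView(i) {
				rowSums[i] += v
			}
		}
	})
	fmt.Println(rowSums[0])
}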
Types ¶
type Activation ¶
Activation is the interface for an activation function
type Float64er ¶
type Float64er interface {
Float64() float64
}
Float64er is implemented by a random source having a method Float64() float64
type GOMethodCreator ¶
GOMethodCreator is a func that creates a gonum/optimize.Method
type LockedSource ¶
type LockedSource struct {
// contains filtered or unexported fields
}
LockedSource is an implementation of Source that is concurrency-safe. It is just a standard Source with its operations protected by a sync.Mutex.
func NewLockedSource ¶
func NewLockedSource(seed uint64) *LockedSource
NewLockedSource returns a rand.Source safe for concurrent access
func (*LockedSource) WithLock ¶
func (s *LockedSource) WithLock(f func(Source))
WithLock executes f while s is locked
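A short sketch, assuming rand here is golang.org/x/exp/rand (so the locked source can back a *rand.Rand) and that base.Source exposes the x/exp/rand Source methods; both points and the import path are assumptions:

package main

import (
	"fmt"

	"github.com/pa-m/sklearn/base" // assumed import path
	"golang.org/x/exp/rand"
)

func main() {
	src := base.NewLockedSource(42)
	rng := rand.New(src) // safe to share across goroutines because src is locked
	fmt.Println(rng.Float64())

	// Reseed atomically with respect to concurrent users of rng.
	src.WithLock(func(s base.Source) { s.Seed(7) })
	fmt.Println(rng.Float64())
}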
type MatOnesPrepended ¶
MatOnesPrepended is a matrix override that presents its initializer with an initial column of ones prepended
type MatRowSlice ¶
MatRowSlice is a matrix row chunk
type MatTranspose ¶
MatTranspose is a matrix override to transpose a mat.Matrix from its initializer
type NormFloat64er ¶
type NormFloat64er interface {
NormFloat64() float64
}
NormFloat64er is implemented by a random source having a method NormFloat64() float64
type OptimCreator ¶
type OptimCreator func() Optimizer
OptimCreator is the type for functions returning an Optimizer
type Optimizer ¶
type Optimizer interface {
	GetUpdate(update *mat.Dense, grad mat.Matrix)
	UpdateParams(grad mat.Matrix)
	SetTheta(Theta *mat.Dense)
	GetTheta() *mat.Dense
	GetTimeStep() uint64
	String() string
}
Optimizer has an UpdateParams method to update theta from the gradient
func NewOptimizer ¶
NewOptimizer only accepts SGD|adagrad|adadelta|rmsprop|adam
type Predicter ¶
type Predicter interface {
	Fiter
	GetNOutputs() int
	Predict(X mat.Matrix, Y mat.Mutable) *mat.Dense
	Score(X, Y mat.Matrix) float64
	IsClassifier() bool
	PredicterClone() Predicter
}
Predicter has Predict(Matrix, Mutable). If the 2nd arg (the result receiver) is nil, it will be allocated and returned by Predict
type RandomState ¶
type RandomState = Source
RandomState represents a bit more than the pythonic random_state attribute: it is not only a seed but a source with a state, as its name indicates
type SGDOptimizer ¶
type SGDOptimizer struct {
// StepSize is used for all variants
// Momentum can be used for all variants
// GradientClipping is used if >0 to limit gradient L2 norm
// RMSPropGamma is the momentum for rmsprop and adadelta
// Epsilon is used to avoid division by zero in adagrad,rmsprop,adadelta,adam
StepSize, Momentum, GradientClipping, RMSPropGamma, Epsilon, BatchPart float64
// Adagrad, Adadelta, RMSProp, Adam are variants. At most one should be true
Adagrad, Adadelta, RMSProp, Adam bool
// NFeatures, NOutputs need only be initialized when SGDOptimizer is used as an optimize.Method
NFeatures, NOutputs int
// running Parameters (don't set them yourself)
GtNorm, Theta, PrevUpdate, Update, AdagradG, AdadeltaU *mat.Dense
TimeStep float64
// Adam specific
Beta1, Beta2 float64
Mt, Vt *mat.Dense
// contains filtered or unexported fields
}
SGDOptimizer is the struct for the SGD solver. See https://en.wikipedia.org/wiki/Stochastic_gradient_descent
func NewAdadeltaOptimizer ¶
func NewAdadeltaOptimizer() *SGDOptimizer
NewAdadeltaOptimizer returns a *SGDOptimizer set up for adadelta
func NewAdagradOptimizer ¶
func NewAdagradOptimizer() *SGDOptimizer
NewAdagradOptimizer returns a *SGDOptimizer set up for adagrad
func NewAdamOptimizer ¶
func NewAdamOptimizer() *SGDOptimizer
NewAdamOptimizer returns an initialized adam solver
func NewRMSPropOptimizer ¶
func NewRMSPropOptimizer() *SGDOptimizer
NewRMSPropOptimizer returns a *SGDOptimizer set up for rmsprop
func NewSGDOptimizer ¶
func NewSGDOptimizer() *SGDOptimizer
NewSGDOptimizer returns an initialized *SGDOptimizer with stepsize 1e-4 and momentum 0.9
func (*SGDOptimizer) GetTheta ¶
func (s *SGDOptimizer) GetTheta() *mat.Dense
GetTheta can be called anytime after SetTheta to get read access to theta
func (*SGDOptimizer) GetTimeStep ¶
func (s *SGDOptimizer) GetTimeStep() uint64
GetTimeStep returns the number of theta updates that have already occurred
func (*SGDOptimizer) GetUpdate ¶
func (s *SGDOptimizer) GetUpdate(update *mat.Dense, grad mat.Matrix)
GetUpdate computes the update from grad
func (*SGDOptimizer) Init ¶
func (s *SGDOptimizer) Init(dim, tasks int) int
Init initializes the method for a problem of dimension dim, given the number of concurrently available tasks, and returns the number of concurrent processes to use, as specified by gonum's optimize.Method interface.
func (*SGDOptimizer) Run ¶
func (s *SGDOptimizer) Run(operation chan<- optimize.Task, result <-chan optimize.Task, tasks []optimize.Task)
Run implements optimize.Method.Run for SGDOptimizer
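A minimal sketch of using SGDOptimizer as a gonum optimize.Method (the quadratic objective, the NFeatures/NOutputs values and the import path are illustrative assumptions; settings and convergence tuning are omitted):

package main

import (
	"fmt"
	"log"

	"github.com/pa-m/sklearn/base" // assumed import path
	"gonum.org/v1/gonum/optimize"
)

func main() {
	p := optimize.Problem{
		Func: func(x []float64) float64 { return x[0]*x[0] + x[1]*x[1] },
		Grad: func(grad, x []float64) { grad[0], grad[1] = 2*x[0], 2*x[1] },
	}
	method := base.NewAdamOptimizer()
	method.NFeatures, method.NOutputs = 2, 1 // assumption: dim = NFeatures*NOutputs
	res, err := optimize.Minimize(p, []float64{1, 1}, nil, method)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(res.X)
}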
func (*SGDOptimizer) SetTheta ¶
func (s *SGDOptimizer) SetTheta(Theta *mat.Dense)
SetTheta should be called before the first call to UpdateParams to give the solver the theta pointer
func (*SGDOptimizer) String ¶
func (s *SGDOptimizer) String() string
func (*SGDOptimizer) UpdateParams ¶
func (s *SGDOptimizer) UpdateParams(grad mat.Matrix)
UpdateParams updates theta from the gradient. The first call allocates the required temporary storage
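A minimal sketch of the native (non gonum-optimize) usage; computeGradient is a hypothetical placeholder for your own model's gradient computation and the import path is assumed:

package main

import (
	"fmt"

	"github.com/pa-m/sklearn/base" // assumed import path
	"gonum.org/v1/gonum/mat"
)

// computeGradient is a hypothetical helper standing in for a model's
// gradient; it fills grad (same shape as theta), here with the gradient
// of ||theta||^2.
func computeGradient(grad, theta *mat.Dense) {
	grad.Scale(2, theta)
}

func main() {
	s := base.NewSGDOptimizer()
	theta := mat.NewDense(2, 1, []float64{1, -1})
	s.SetTheta(theta)
	grad := mat.NewDense(2, 1, nil)
	for epoch := 0; epoch < 100; epoch++ {
		computeGradient(grad, s.GetTheta())
		s.UpdateParams(grad) // theta is updated in place
	}
	fmt.Println(mat.Formatted(s.GetTheta()), s.GetTimeStep())
}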
type Shuffler ¶
Shuffler is implemented by a random source having a method Shuffle(int,func(int,int))
type Source ¶
A Source represents a source of uniformly-distributed pseudo-random uint64 values in the range [0, 1<<64).
type SourceCloner ¶
type SourceCloner interface {
SourceClone() Source
}
SourceCloner is a "golang.org/x/exp/rand".Source with a SourceClone method