Documentation
Overview
Package optimization defines commonly used optimization algorithms.
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Adam
type Adam struct {
	LearningRate  float64
	SmoothFactor1 float64
	SmoothFactor2 float64
	// contains filtered or unexported fields
}
Adam runs the Adam optimization algorithm.
func (*Adam) UpdateParameters
UpdateParameters updates the layer parameters using the Adam algorithm.
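The fields above map onto the standard Adam update: a learning rate plus two smoothing factors for the first- and second-moment running averages. Since the struct's internal state is unexported, the sketch below is a standalone illustration of that update rule; the helper name `adamStep` and the slice-based parameter layout are assumptions, not this package's API.

```go
package main

import (
	"fmt"
	"math"
)

// adamStep applies one Adam update to params in place.
// m and v hold the running first- and second-moment estimates
// (smoothed by beta1 and beta2); t is the 1-based step count
// used for bias correction.
func adamStep(params, grads, m, v []float64, lr, beta1, beta2, eps float64, t int) {
	for i := range params {
		m[i] = beta1*m[i] + (1-beta1)*grads[i]
		v[i] = beta2*v[i] + (1-beta2)*grads[i]*grads[i]
		mHat := m[i] / (1 - math.Pow(beta1, float64(t)))
		vHat := v[i] / (1 - math.Pow(beta2, float64(t)))
		params[i] -= lr * mHat / (math.Sqrt(vHat) + eps)
	}
}

func main() {
	params := []float64{1.0, -0.5}
	grads := []float64{0.2, -0.1}
	m := make([]float64, 2)
	v := make([]float64, 2)
	adamStep(params, grads, m, v, 0.001, 0.9, 0.999, 1e-8, 1)
	fmt.Println(params)
}
```

On the first step the bias correction cancels the smoothing, so each parameter moves by roughly the learning rate in the direction opposite its gradient.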
type Alg
type Alg interface {
UpdateParameters(l Layer)
}
An Alg uses gradients to update model parameters.
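Any type with an UpdateParameters(l Layer) method satisfies Alg, so custom optimizers can be swapped in. The Layer type's fields are not shown on this page, so the sketch below substitutes a hypothetical Layer interface exposing parameter and gradient slices, plus a plain-SGD optimizer as an example implementation; none of these names come from the package itself.

```go
package main

import "fmt"

// Layer is a stand-in for the package's Layer type; here it is
// assumed to expose parameters and their gradients as slices.
type Layer interface {
	Parameters() []float64
	Gradients() []float64
}

// Alg mirrors the interface documented above.
type Alg interface {
	UpdateParameters(l Layer)
}

// SGD is a hypothetical plain gradient-descent optimizer
// implementing Alg.
type SGD struct{ LearningRate float64 }

func (s *SGD) UpdateParameters(l Layer) {
	p, g := l.Parameters(), l.Gradients()
	for i := range p {
		p[i] -= s.LearningRate * g[i] // step against the gradient
	}
}

// dense is a toy layer used only for demonstration.
type dense struct{ w, dw []float64 }

func (d *dense) Parameters() []float64 { return d.w }
func (d *dense) Gradients() []float64  { return d.dw }

func main() {
	l := &dense{w: []float64{1, 2}, dw: []float64{0.5, -0.5}}
	var a Alg = &SGD{LearningRate: 0.1}
	a.UpdateParameters(l)
	fmt.Println(l.w) // weights moved against their gradients
}
```

Because training loops depend only on Alg, switching from SGD to Adam or RMSProp requires no changes to the loop itself.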
type Momentum
type Momentum struct {
	LearningRate float64
	SmoothFactor float64
	// contains filtered or unexported fields
}
Momentum runs an exponentially weighted average over gradients.
func NewMomentum
NewMomentum creates a new Momentum SGD algorithm object.
func (*Momentum) UpdateParameters
UpdateParameters modifies the layer parameters using the momentum algorithm.
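The exponentially weighted average the type description mentions can be written in a few lines. The velocity state is unexported in the package, so the sketch below keeps it in an explicit slice; the helper name `momentumStep` is an assumption, not this package's API.

```go
package main

import "fmt"

// momentumStep applies one momentum-SGD update in place.
// vel holds the exponentially weighted average of past gradients,
// smoothed by the smooth factor.
func momentumStep(params, grads, vel []float64, lr, smooth float64) {
	for i := range params {
		vel[i] = smooth*vel[i] + (1-smooth)*grads[i]
		params[i] -= lr * vel[i]
	}
}

func main() {
	params := []float64{1.0}
	vel := make([]float64, 1)
	// Repeated steps with a constant gradient: the velocity grows
	// toward the gradient, so each step moves a little farther.
	for step := 0; step < 3; step++ {
		momentumStep(params, []float64{0.5}, vel, 0.1, 0.9)
	}
	fmt.Println(params[0])
}
```

A smooth factor near 1 means the average changes slowly, damping noisy gradients at the cost of slower reaction to new directions.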
type RMSProp
type RMSProp struct {
	LearningRate float64
	SmoothFactor float64
	// contains filtered or unexported fields
}
RMSProp runs the RMSProp optimization algorithm.
func NewRMSProp
NewRMSProp creates a new RMSProp optimization algorithm.
func (*RMSProp) UpdateParameters
UpdateParameters modifies the layer parameters using the RMSProp algorithm.
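Unlike Momentum, RMSProp smooths the squared gradients and uses the result to scale the learning rate per parameter. The running cache is unexported in the package, so the sketch below keeps it in an explicit slice; the helper name `rmspropStep` is an assumption, not this package's API.

```go
package main

import (
	"fmt"
	"math"
)

// rmspropStep applies one RMSProp update in place. cache keeps an
// exponentially weighted average of squared gradients; dividing by
// its square root gives each parameter its own effective step size.
func rmspropStep(params, grads, cache []float64, lr, smooth, eps float64) {
	for i := range params {
		cache[i] = smooth*cache[i] + (1-smooth)*grads[i]*grads[i]
		params[i] -= lr * grads[i] / (math.Sqrt(cache[i]) + eps)
	}
}

func main() {
	params := []float64{1.0}
	cache := make([]float64, 1)
	rmspropStep(params, []float64{0.5}, cache, 0.01, 0.9, 1e-8)
	fmt.Println(params[0])
}
```

Parameters with persistently large gradients accumulate a large cache and take smaller steps, which helps keep updates balanced across dimensions.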