optimization

package
v3.0.3 Latest
Warning

This package is not in the latest version of its module.

Published: Jul 5, 2024 License: MIT Imports: 1 Imported by: 0

Documentation

Overview

Package optimization defines commonly used optimization algorithms.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Adam

type Adam struct {
	LearningRate  float64
	SmoothFactor1 float64
	SmoothFactor2 float64
	// contains filtered or unexported fields
}

Adam runs the Adam optimization algorithm.

func NewAdam

func NewAdam(to tensor.Operator, learningRate float64) *Adam

NewAdam creates a new Adam optimization algorithm.

func (*Adam) UpdateParameters

func (r *Adam) UpdateParameters(layer Layer)

UpdateParameters updates the layer parameters using the Adam algorithm.
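The doc comment does not spell out the update rule, but the standard Adam step can be sketched with plain slices. This is a hypothetical analogue only: the real method operates on tensor.Tensor values, and mapping SmoothFactor1/SmoothFactor2 to the usual beta1/beta2 coefficients is an assumption.

```go
package main

import (
	"fmt"
	"math"
)

// adamStep applies one Adam update in place. m and v hold the running
// first- and second-moment estimates, t is the 1-based step count, and
// beta1/beta2 play the role of SmoothFactor1/SmoothFactor2 (assumed).
func adamStep(params, grads, m, v []float64, lr, beta1, beta2 float64, t int) {
	const eps = 1e-8
	for i := range params {
		m[i] = beta1*m[i] + (1-beta1)*grads[i]
		v[i] = beta2*v[i] + (1-beta2)*grads[i]*grads[i]
		// Bias-correct the moment estimates, which start at zero.
		mHat := m[i] / (1 - math.Pow(beta1, float64(t)))
		vHat := v[i] / (1 - math.Pow(beta2, float64(t)))
		params[i] -= lr * mHat / (math.Sqrt(vHat) + eps)
	}
}

func main() {
	params := []float64{1.0, -2.0}
	grads := []float64{0.5, -0.5}
	m, v := make([]float64, 2), make([]float64, 2)
	adamStep(params, grads, m, v, 0.001, 0.9, 0.999, 1)
	fmt.Println(params)
}
```

After bias correction, the very first step has magnitude close to the learning rate for each parameter, regardless of the gradient's scale.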

type Alg

type Alg interface {
	UpdateParameters(l Layer)
}

An Alg uses gradients to update model parameters.

type Layer

type Layer interface {
	Parameters() tensor.Tensor
	Gradients() tensor.Tensor
}

Layer defines the interface through which optimization algorithms access a layer's parameters and gradients.
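Together, Alg and Layer decouple optimizers from layer implementations: any optimizer can update any layer that exposes its parameters and gradients. A simplified analogue of the pattern, using []float64 in place of tensor.Tensor (the denseLayer and sgd types below are hypothetical, for illustration only):

```go
package main

import "fmt"

// Simplified stand-ins for the package's Layer and Alg interfaces,
// with []float64 replacing tensor.Tensor.
type Layer interface {
	Parameters() []float64
	Gradients() []float64
}

type Alg interface {
	UpdateParameters(l Layer)
}

// denseLayer is a hypothetical layer holding its own parameters and
// the gradients computed by a backward pass.
type denseLayer struct {
	params, grads []float64
}

func (d *denseLayer) Parameters() []float64 { return d.params }
func (d *denseLayer) Gradients() []float64  { return d.grads }

// sgd implements Alg with the plain SGD rule: p -= lr * g.
type sgd struct{ lr float64 }

func (s sgd) UpdateParameters(l Layer) {
	p, g := l.Parameters(), l.Gradients()
	for i := range p {
		p[i] -= s.lr * g[i]
	}
}

func main() {
	layer := &denseLayer{params: []float64{1, 2}, grads: []float64{0.5, -0.5}}
	var opt Alg = sgd{lr: 0.1}
	opt.UpdateParameters(layer)
	fmt.Println(layer.params)
}
```

A training loop would call UpdateParameters once per layer after each backward pass, holding the optimizer only through the Alg interface.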

type Momentum

type Momentum struct {
	LearningRate float64
	SmoothFactor float64
	// contains filtered or unexported fields
}

Momentum runs SGD with momentum, maintaining an exponentially weighted average over gradients.

func NewMomentum

func NewMomentum(
	to tensor.Operator,
	learningRate float64, smoothFactor float64) *Momentum

NewMomentum creates a new Momentum SGD algorithm object.

func (*Momentum) UpdateParameters

func (m *Momentum) UpdateParameters(layer Layer)

UpdateParameters modifies the layer parameters using the momentum algorithm.
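One common formulation of the momentum update, sketched on plain slices. The real method works on tensor.Tensor values, and the exact rule the package uses is an assumption inferred from the SmoothFactor field:

```go
package main

import "fmt"

// momentumStep keeps an exponentially weighted average of gradients in
// vel and steps the parameters against it. beta plays the role of the
// SmoothFactor field (assumed).
func momentumStep(params, grads, vel []float64, lr, beta float64) {
	for i := range params {
		vel[i] = beta*vel[i] + (1-beta)*grads[i]
		params[i] -= lr * vel[i]
	}
}

func main() {
	params := []float64{1.0}
	vel := make([]float64, 1)
	// With a constant gradient, the velocity builds up over steps.
	for step := 0; step < 3; step++ {
		momentumStep(params, []float64{1.0}, vel, 0.1, 0.9)
	}
	fmt.Println(params[0])
}
```

Because the velocity accumulates, repeated gradients in the same direction produce progressively larger steps, which is the point of the technique.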

type RMSProp

type RMSProp struct {
	LearningRate float64
	SmoothFactor float64
	// contains filtered or unexported fields
}

RMSProp runs the RMSProp optimization algorithm.

func NewRMSProp

func NewRMSProp(to tensor.Operator, learningRate float64) *RMSProp

NewRMSProp creates a new RMSProp optimization algorithm.

func (*RMSProp) UpdateParameters

func (r *RMSProp) UpdateParameters(layer Layer)

UpdateParameters modifies the layer parameters using the RMSProp algorithm.
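The standard RMSProp step can be sketched as follows (a plain-slice analogue; since NewRMSProp above only takes a learning rate, the smoothing factor of 0.9 used here is an assumed default):

```go
package main

import (
	"fmt"
	"math"
)

// rmspropStep keeps a running average of squared gradients in sq and
// scales each parameter's step by the inverse root of that average.
func rmspropStep(params, grads, sq []float64, lr, beta float64) {
	const eps = 1e-8
	for i := range params {
		sq[i] = beta*sq[i] + (1-beta)*grads[i]*grads[i]
		params[i] -= lr * grads[i] / (math.Sqrt(sq[i]) + eps)
	}
}

func main() {
	params := []float64{1.0}
	sq := make([]float64, 1)
	rmspropStep(params, []float64{0.5}, sq, 0.01, 0.9)
	fmt.Println(params[0])
}
```

The per-parameter scaling makes the effective step size roughly independent of the gradient's magnitude, so parameters with small gradients still move.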

type SGD

type SGD struct {
	LearningRate float64
	// contains filtered or unexported fields
}

SGD is an optimizer that runs the SGD algorithm.

func NewSGD

func NewSGD(to tensor.Operator, learningRate float64) *SGD

NewSGD creates a new SGD object.

func (*SGD) UpdateParameters

func (s *SGD) UpdateParameters(layer Layer)

UpdateParameters modifies the layer parameters using the SGD algorithm.
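Vanilla SGD is the simplest of the four update rules. A minimal sketch, with a plain slice standing in for the tensor.Tensor values the real method uses:

```go
package main

import "fmt"

// sgdStep applies the vanilla SGD rule p -= lr * g in place.
func sgdStep(params, grads []float64, lr float64) {
	for i := range params {
		params[i] -= lr * grads[i]
	}
}

func main() {
	params := []float64{1.0, 2.0}
	sgdStep(params, []float64{0.5, -0.5}, 0.1)
	fmt.Println(params)
}
```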
