package batchnorm

v0.0.0-...-98db5b7
Published: May 7, 2020 License: CC-BY-4.0 Imports: 6 Imported by: 2

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Layer

type Layer struct {
	// contains filtered or unexported fields
}

Layer holds the operations and state of a batch normalization layer.

func PerActivationPreset

func PerActivationPreset(handle *cudnn.Handler) (*Layer, error)

PerActivationPreset presets some values for a batch norm layer in PerActivation mode.

func SpatialPersistantPreset

func SpatialPersistantPreset(handle *cudnn.Handler) (*Layer, error)

SpatialPersistantPreset presets some values for a batch norm layer in spatial-persistent mode.

func SpatialPreset

func SpatialPreset(handle *cudnn.Handler) (*Layer, error)

SpatialPreset presets some values for a batch norm layer in Spatial mode.

func (*Layer) BackProp

func (l *Layer) BackProp(handle *cudnn.Handler, x, dx, dy *layers.Tensor) error

BackProp performs the backward pass when training the layer.

func (*Layer) Bias

func (l *Layer) Bias() *layers.Tensor

Bias returns the bias of the batch norm

func (*Layer) ForwardInference

func (l *Layer) ForwardInference(handle *cudnn.Handler, x, y *layers.Tensor) error

ForwardInference performs the inference forward pass, used for testing and production.

func (*Layer) ForwardProp

func (l *Layer) ForwardProp(handle *cudnn.Handler, x, y *layers.Tensor) error

ForwardProp performs the training forward pass of the batch norm layer.

func (*Layer) LoadTrainer

func (l *Layer) LoadTrainer(handle *cudnn.Handler, trainerscale, trainerbias trainer.Trainer) error

LoadTrainer sets up the momentum trainers for the scale and bias weights.

func (*Layer) Scale

func (l *Layer) Scale() *layers.Tensor

Scale returns the scale of the batch norm.

func (*Layer) SetBackwardScalars

func (l *Layer) SetBackwardScalars(alpha, beta float64)

SetBackwardScalars sets the alpha and beta scaling factors used in the backward pass.

func (*Layer) SetEps

func (l *Layer) SetEps(eps float64)

SetEps sets the epsilon value used to avoid division by zero during normalization.

func (*Layer) SetForwardScalars

func (l *Layer) SetForwardScalars(alpha, beta float64)

SetForwardScalars sets the alpha and beta scaling factors used in the forward pass.

func (*Layer) SetOtherScalars

func (l *Layer) SetOtherScalars(alpha, beta float64)

SetOtherScalars sets the alpha and beta scaling factors used when updating the weights.

func (*Layer) SetupPreset

func (l *Layer) SetupPreset(handle *cudnn.Handler, x *layers.Tensor) (err error)

SetupPreset allocates all the memory the batch norm layer needs, using the values stored by one of the Preset functions and the dimensions of x.

func (*Layer) Trainers

func (l *Layer) Trainers() (scale, bias trainer.Trainer)

Trainers returns the scale and bias trainers.

func (*Layer) UpdateWeights

func (l *Layer) UpdateWeights(handle *cudnn.Handler, batch, epoch int) error

UpdateWeights updates the layer's weights using the loaded trainers.

type Settings

type Settings struct {
	Mode    gocudnn.BatchNormMode `json:"mode,omitempty"`
	Managed bool                  `json:"managed,omitempty"`
}

Settings contains all the parameters needed to build a batchnorm layer.
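Because Settings carries json struct tags, it can be populated from a configuration file. A minimal sketch of such a fragment, assuming Mode marshals to the underlying integer value of gocudnn.BatchNormMode (the concrete value shown is illustrative, not a documented constant):

```json
{
  "mode": 1,
  "managed": true
}
```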
