layer

package
v0.0.0-...-bed4406
Published: Jul 12, 2020 License: MIT Imports: 7 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var ActivationFunctions = map[string]fnType{
	"relu":        relu,
	"identity":    identity,
	"binary_step": binaryStep,
	"sigmoid":     sigmoid,
	"tanh":        tanH,
	"lrelu":       lReLU,
	"rrelu":       rReLU,
	"arctan":      arcTan,
	"softmax":     softmax,
}

ActivationFunctions is a map of the activation functions available for the forward and backward propagation steps. If the requested function is not in the map, the Layer won't be instantiated. Their mathematical formulas can be found at https://en.wikipedia.org/wiki/Activation_function.
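For instance, a caller can mirror the lookup New performs before constructing a layer. A minimal sketch; the import path is hypothetical, so substitute the real module path:

package main

import (
	"fmt"

	"github.com/user/nn/layer" // hypothetical import path
)

func main() {
	// The same membership check New performs before building a layer.
	if _, ok := layer.ActivationFunctions["sigmoid"]; ok {
		fmt.Println("sigmoid is available")
	}
	if _, ok := layer.ActivationFunctions["swish"]; !ok {
		fmt.Println("swish is not in the map, so New would return an error")
	}
}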

var DerivativeFunctions = map[string]fnType{
	"relu":        dRelu,
	"identity":    dIdentity,
	"binary_step": dBinaryStep,
	"sigmoid":     dSigmoid,
	"tanh":        dTanH,
	"lrelu":       dLReLU,
	"rrelu":       dRReLU,
	"arctan":      dArcTan,
	"softmax":     dSoftmax,
}

DerivativeFunctions is a map of the derivatives of the available activation functions. It uses the formulas defined in https://en.wikipedia.org/wiki/Activation_function.
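As a standalone illustration of one of those formulas (re-implemented here independently; the package's sigmoid and dSigmoid are unexported), the sigmoid derivative can be written in terms of sigmoid itself, σ'(x) = σ(x)(1 - σ(x)):

package main

import (
	"fmt"
	"math"
)

func main() {
	// Independent restatement of the sigmoid pair from the Wikipedia
	// formulas, not the package's own unexported implementations.
	sigmoid := func(x float64) float64 { return 1 / (1 + math.Exp(-x)) }
	dSigmoid := func(x float64) float64 { s := sigmoid(x); return s * (1 - s) }

	fmt.Println(sigmoid(0))  // 0.5
	fmt.Println(dSigmoid(0)) // 0.25
}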

Functions

This section is empty.

Types

type Layer

type Layer struct {
	Weights, Output, Sum, Bias matrix.Matrix
	// contains filtered or unexported fields
}

Layer is an abstraction of an array of perceptrons. It contains the weights of all perceptrons as well as the layer's activation function. The Output and Sum matrices store the vectors calculated in ForwProp that are needed for BackProp.

func New

func New(actFn string, inSize, outSize int) (Layer, error)

New creates a layer with a given input and output size. It generates a random matrix of weights and allocates memory for the sum and output vectors. It checks whether the input size is valid and whether the activation function exists in the map. The bias is initialised as a zero vector.
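A minimal construction sketch (the import path is again hypothetical):

package main

import (
	"fmt"
	"log"

	"github.com/user/nn/layer" // hypothetical import path
)

func main() {
	// A layer mapping 3 inputs to 2 outputs through tanh.
	l, err := layer.New("tanh", 3, 2)
	if err != nil {
		log.Fatal(err) // unknown activation name or invalid sizes
	}
	fmt.Println(l.InSize(), l.OutSize()) // 3 2
}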

func (*Layer) BackProp

func (l *Layer) BackProp(prevDelta, weights matrix.Matrix) (matrix.Matrix, error)

BackProp implements the backpropagation algorithm for any layer. It takes the delta loss and weights from layer (l+1) and calculates the error of the current layer.

func (*Layer) BackPropOutLayer

func (l *Layer) BackPropOutLayer(expected matrix.Matrix) (matrix.Matrix, error)

BackPropOutLayer is the special case of the backpropagation algorithm performed starting at the output layer. It is very similar to the normal one when the cost function used is the quadratic function. TODO: It will stay as a separate function to make the refactoring process easier when generalising to any cost function.
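A hedged sketch of one backward sweep over a two-layer network, combining the two methods above. How the expected matrix is built depends on the matrix package's API, which is not documented here, so both import paths are hypothetical and the matrix arrives as a parameter:

package train

import (
	"github.com/user/nn/layer"  // hypothetical import paths
	"github.com/user/nn/matrix"
)

// backwardPass runs BackPropOutLayer on the output layer, then BackProp
// on the hidden layer, returning the hidden layer's delta.
func backwardPass(hidden, out *layer.Layer, expected matrix.Matrix) (matrix.Matrix, error) {
	deltaOut, err := out.BackPropOutLayer(expected)
	if err != nil {
		return deltaOut, err
	}
	// Each earlier layer takes the delta and weights of the layer above it.
	return hidden.BackProp(deltaOut, out.Weights)
}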

func (*Layer) Equal

func (l *Layer) Equal(other Layer) bool

Equal is a comparison method that, given another layer, checks the sizes, the activation function, and the Weights and Bias matrices. Sum and Output are just intermediate buffers, so they don't need to be compared.

func (*Layer) ForwProp

func (l *Layer) ForwProp(input matrix.Matrix) (matrix.Matrix, error)

ForwProp gets called from the MultiLayerPerceptron for each layer sequentially. It calculates the layer's output, saving the intermediate results in the Sum and Output matrices, which are needed for the BackProp method.
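A sketch of that sequential pass, again under hypothetical import paths:

package train

import (
	"github.com/user/nn/layer"  // hypothetical import paths
	"github.com/user/nn/matrix"
)

// forwardPass feeds the input through each layer in order, as the
// MultiLayerPerceptron is described as doing.
func forwardPass(layers []layer.Layer, input matrix.Matrix) (matrix.Matrix, error) {
	out := input
	for i := range layers {
		var err error
		// Each layer validates the vector size against its InSize.
		if out, err = layers[i].ForwProp(out); err != nil {
			return out, err
		}
	}
	return out, nil
}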

func (*Layer) InSize

func (l *Layer) InSize() int

InSize returns the expected input vector size. It is needed to check for valid input in the ForwProp method.

func (*Layer) MarshalBinary

func (l *Layer) MarshalBinary() ([]byte, error)

func (*Layer) OutSize

func (l *Layer) OutSize() int

OutSize returns the expected output vector size. Both InSize and OutSize need to be exported so they are accessible from the multilayerperceptron package.

func (*Layer) String

func (l *Layer) String() (s string)

func (*Layer) UnmarshalBinary

func (l *Layer) UnmarshalBinary(data []byte) error
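MarshalBinary and UnmarshalBinary carry no doc comments, but their signatures match the standard encoding.BinaryMarshaler and encoding.BinaryUnmarshaler interfaces, so a serialisation round trip presumably looks like this sketch (import path hypothetical):

package train

import (
	"github.com/user/nn/layer" // hypothetical import path
)

// roundTrip serialises a layer and decodes the bytes into a fresh value.
func roundTrip(l *layer.Layer) (layer.Layer, error) {
	var restored layer.Layer
	data, err := l.MarshalBinary()
	if err != nil {
		return restored, err
	}
	// If this succeeds, l.Equal(restored) should report true.
	err = restored.UnmarshalBinary(data)
	return restored, err
}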

func (*Layer) UpdateBias

func (l *Layer) UpdateBias(derived matrix.Matrix) error

UpdateBias receives the error already multiplied by the learning rate and subtracts it from the layer's bias vector.

func (*Layer) UpdateWeights

func (l *Layer) UpdateWeights(derived matrix.Matrix) error

UpdateWeights receives the error already multiplied by the learning rate and subtracts it from the layer's weight matrix.
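A hedged sketch of the update step. Scaling the gradients by the learning rate happens in the caller through the matrix package's API, which is not shown here:

package train

import (
	"github.com/user/nn/layer"  // hypothetical import paths
	"github.com/user/nn/matrix"
)

// applyUpdates subtracts the pre-scaled gradients from the layer's
// parameters, one call per parameter set.
func applyUpdates(l *layer.Layer, scaledWeightGrad, scaledBiasGrad matrix.Matrix) error {
	if err := l.UpdateWeights(scaledWeightGrad); err != nil {
		return err
	}
	return l.UpdateBias(scaledBiasGrad)
}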
