decoder

package v1.1.50
Published: Feb 2, 2022 License: BSD-3-Clause Imports: 7 Imported by: 6

README


The decoder package provides standalone decoders that can sample variables from emer network layers and provide a supervised one-layer categorical decoding of what is being represented in those layers. This can provide an important point of reference relative to whatever the network itself is generating, and is especially useful for more self-organizing networks that may not have supervised training at all.

SoftMax

The SoftMax decoder is the best choice for a 1-hot classification decoder, using the widely-used SoftMax function: https://en.wikipedia.org/wiki/Softmax_function

Here's the basic API:

  • InitLayer to initialize with number of categories and layer(s) for input.

  • Decode with a variable name to record that variable from the layers and decode based on its current state. You can also access the full sorted list of category outputs in the Sorted field of the SoftMax object.

  • Train after Decode with the index of the current ground-truth category value.

It is also possible to use the decoder without reference to emer Layers by just calling Init, Forward, Sort, and Train.

A learning rate of about 0.05 works well for large layers, and 0.1 can be used for smaller, less complex cases.

Sigmoid

The Sigmoid decoder is the best choice for factorial, independent categories where any number of them might be active at a time. It learns using the delta rule for each output unit. It uses the same API as above, except Train takes a full slice of target values for each category output, and the results are found in the Units[i].Act variable, which can be returned into a slice using the Output method.

Vote

TopVoteInt takes a slice of ints representing votes for which category index was selected (or anything really), and returns the one with the most votes, choosing at random for any ties at the top, along with the number of votes for it.

TopVoteString is the same as TopVoteInt but with string-valued votes.

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func TopVoteInt added in v1.1.42

func TopVoteInt(votes []int) (int, int)

TopVoteInt returns the choice with the most votes among a list of votes as integer-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest numbered item).

func TopVoteString added in v1.1.42

func TopVoteString(votes []string) (string, int)

TopVoteString returns the choice with the most votes among a list of votes as string-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest numbered item).

Types

type Sigmoid added in v1.1.44

type Sigmoid struct {
	Lrate    float32                     `def:"0.1" desc:"learning rate"`
	Layers   []emer.Layer                `desc:"layers to decode"`
	NCats    int                         `desc:"number of different categories to decode"`
	Units    []SigmoidUnit               `desc:"unit values -- read this for decoded output"`
	NInputs  int                         `desc:"number of inputs -- total sizes of layer inputs"`
	Inputs   []float32                   `desc:"input values, copied from layers"`
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	Weights  etensor.Float32             `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
}

Sigmoid is a sigmoidal activation function decoder, which is the best choice for factorial, independent categories where any number of them might be active at a time. It learns using the delta rule for each output unit.

func (*Sigmoid) Back added in v1.1.44

func (sm *Sigmoid) Back() float32

Back computes the backward error-propagation pass. It returns the SSE (sum squared error) of the difference between targets and outputs.

func (*Sigmoid) Decode added in v1.1.44

func (sm *Sigmoid) Decode(varNm string)

Decode decodes the given variable name from layers (forward pass). Decoded values are in Units[i].Act -- see also Output to get them into a []float32.

func (*Sigmoid) Forward added in v1.1.44

func (sm *Sigmoid) Forward()

Forward computes the forward pass from the input.

func (*Sigmoid) Init added in v1.1.44

func (sm *Sigmoid) Init(ncats, ninputs int)

Init initializes the decoder with the number of categories and number of inputs.

func (*Sigmoid) InitLayer added in v1.1.44

func (sm *Sigmoid) InitLayer(ncats int, layers []emer.Layer)

InitLayer initializes the decoder with the number of categories and the layers.

func (*Sigmoid) Input added in v1.1.44

func (sm *Sigmoid) Input(varNm string)

Input grabs the input from the given variable in the layers.

func (*Sigmoid) Output added in v1.1.44

func (sm *Sigmoid) Output(acts *[]float32)

Output returns the resulting decoded output activation values in the given slice, which is automatically resized if not of sufficient size.

func (*Sigmoid) Train added in v1.1.44

func (sm *Sigmoid) Train(targs []float32) (float32, error)

Train trains the decoder with the given target correct answers, as []float32 values. It returns the SSE (sum squared error) of the difference between targets and outputs, and also returns and prints an error if the targets are not of sufficient length for NCats.

func (*Sigmoid) ValsTsr added in v1.1.44

func (sm *Sigmoid) ValsTsr(name string) *etensor.Float32

ValsTsr gets the value tensor of the given name, creating it if not yet made.

type SigmoidUnit added in v1.1.44

type SigmoidUnit struct {
	Targ float32 `desc:"target activation value -- typically 0 or 1 but can be within that range too"`
	Act  float32 `desc:"final activation = 1 / (1 + e^-Net) -- this is the decoded output"`
	Net  float32 `desc:"net input = sum x * w"`
}

SigmoidUnit has variables for Sigmoid decoder unit

type SoftMax

type SoftMax struct {
	Lrate    float32                     `def:"0.1" desc:"learning rate"`
	Layers   []emer.Layer                `desc:"layers to decode"`
	NCats    int                         `desc:"number of different categories to decode"`
	Units    []SoftMaxUnit               `desc:"unit values"`
	Sorted   []int                       `desc:"indexes of the current Unit category activations, sorted from highest to lowest -- Sorted[0] is the most likely category"`
	NInputs  int                         `desc:"number of inputs -- total sizes of layer inputs"`
	Inputs   []float32                   `desc:"input values, copied from layers"`
	Targ     int                         `desc:"current target index of correct category"`
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	Weights  etensor.Float32             `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
}

SoftMax is a softmax decoder, which is the best choice for a 1-hot classification using the widely-used SoftMax function: https://en.wikipedia.org/wiki/Softmax_function

func (*SoftMax) Back

func (sm *SoftMax) Back()

Back computes the backward error-propagation pass.

func (*SoftMax) Decode

func (sm *SoftMax) Decode(varNm string) int

Decode decodes the given variable name from layers (forward pass). See the Sorted list of indexes for the decoding output -- i.e., Sorted[0] is the most likely, which is returned here as a convenience.

func (*SoftMax) Forward

func (sm *SoftMax) Forward()

Forward computes the forward pass from the input.

func (*SoftMax) Init

func (sm *SoftMax) Init(ncats, ninputs int)

Init initializes the decoder with the number of categories and number of inputs.

func (*SoftMax) InitLayer

func (sm *SoftMax) InitLayer(ncats int, layers []emer.Layer)

InitLayer initializes the decoder with the number of categories and the layers.

func (*SoftMax) Input

func (sm *SoftMax) Input(varNm string)

Input grabs the input from the given variable in the layers.

func (*SoftMax) Sort

func (sm *SoftMax) Sort()

Sort updates the Sorted indexes of the current Unit category activations, sorted from highest to lowest: the 0-index value has the strongest decoded output category, 1 the next-strongest, etc.

func (*SoftMax) Train

func (sm *SoftMax) Train(targ int)

Train trains the decoder with the given target correct answer (0..NCats-1).

func (*SoftMax) ValsTsr

func (sm *SoftMax) ValsTsr(name string) *etensor.Float32

ValsTsr gets the value tensor of the given name, creating it if not yet made.

type SoftMaxUnit added in v1.1.44

type SoftMaxUnit struct {
	Act float32 `desc:"final activation = e^Ge / sum e^Ge"`
	Net float32 `desc:"net input = sum x * w"`
	Exp float32 `desc:"exp(Net)"`
}

SoftMaxUnit has variables for softmax decoder unit
