Documentation ¶
Index ¶
- func IdentityFunc(x float32) float32
- func LogisticFunc(x float32) float32
- func TopVoteInt(votes []int) (int, int)
- func TopVoteString(votes []string) (string, int)
- type ActivationFunc
- type Layer
- type Linear
- func (dec *Linear) Back() float32
- func (dec *Linear) BackMPI() float32
- func (dec *Linear) Decode(varNm string, di int)
- func (dec *Linear) Forward()
- func (dec *Linear) Init(nOutputs, nInputs int, poolIndex int, activationFn ActivationFunc)
- func (dec *Linear) InitLayer(nOutputs int, layers []Layer, activationFn ActivationFunc)
- func (dec *Linear) InitPool(nOutputs int, layer Layer, poolIndex int, activationFn ActivationFunc)
- func (dec *Linear) Input(varNm string, di int)
- func (dec *Linear) Output(acts *[]float32)
- func (dec *Linear) SetTargets(targs []float32) error
- func (dec *Linear) Train(targs []float32) (float32, error)
- func (dec *Linear) TrainMPI(targs []float32) (float32, error)
- func (dec *Linear) ValsTsr(name string) *etensor.Float32
- type LinearUnit
- type SoftMax
- func (sm *SoftMax) Back()
- func (sm *SoftMax) BackMPI()
- func (sm *SoftMax) Decode(varNm string, di int) int
- func (sm *SoftMax) Forward()
- func (sm *SoftMax) Init(ncats, ninputs int)
- func (sm *SoftMax) InitLayer(ncats int, layers []emer.Layer)
- func (sm *SoftMax) Input(varNm string, di int)
- func (sm *SoftMax) Load(path string) error
- func (sm *SoftMax) Save(path string) error
- func (sm *SoftMax) Sort()
- func (sm *SoftMax) Train(targ int)
- func (sm *SoftMax) TrainMPI(targ int)
- func (sm *SoftMax) ValsTsr(name string) *etensor.Float32
- type SoftMaxUnit
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func IdentityFunc ¶ added in v1.3.36
func IdentityFunc(x float32) float32
IdentityFunc returns its input unchanged. It is the default activation function for the Linear decoder.
func LogisticFunc ¶ added in v1.3.36
func LogisticFunc(x float32) float32
LogisticFunc implements the standard logistic function. Its outputs are in the range (0, 1). Also known as Sigmoid. See https://en.wikipedia.org/wiki/Logistic_function.
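The logistic function can be sketched in standalone Go (an illustrative re-implementation, not this package's code):

```go
package main

import (
	"fmt"
	"math"
)

// logistic computes the standard logistic (sigmoid) function
// 1 / (1 + e^-x), with outputs in the open interval (0, 1).
func logistic(x float32) float32 {
	return 1 / (1 + float32(math.Exp(float64(-x))))
}

func main() {
	fmt.Println(logistic(0))  // exactly 0.5 at the midpoint
	fmt.Println(logistic(4))  // approaches 1 for large positive x
	fmt.Println(logistic(-4)) // approaches 0 for large negative x
}
```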
func TopVoteInt ¶ added in v1.1.42
func TopVoteInt(votes []int) (int, int)
TopVoteInt returns the choice with the most votes among a list of votes as integer-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest-numbered item).
func TopVoteString ¶ added in v1.1.42
func TopVoteString(votes []string) (string, int)
TopVoteString returns the choice with the most votes among a list of votes as string-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest-numbered item).
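The vote-counting logic with a random tie-break can be sketched in standalone Go (illustrative names, not this package's code):

```go
package main

import (
	"fmt"
	"math/rand"
)

// topVote counts the votes, collects all choices tied for the maximum
// count, and picks one of those at random, so ties carry no bias toward
// any particular value.
func topVote(votes []int) (choice, n int) {
	counts := map[int]int{}
	best := 0
	for _, v := range votes {
		counts[v]++
		if counts[v] > best {
			best = counts[v]
		}
	}
	var top []int // all choices tied at the maximum count
	for v, c := range counts {
		if c == best {
			top = append(top, v)
		}
	}
	return top[rand.Intn(len(top))], best
}

func main() {
	choice, n := topVote([]int{2, 2, 3, 2, 1})
	fmt.Println(choice, n) // 2 wins with 3 votes (no tie here)
}
```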
Types ¶
type ActivationFunc ¶ added in v1.3.36
type Layer ¶ added in v1.3.39
type Layer interface {
	Name() string
	UnitValsTensor(tsr etensor.Tensor, varNm string, di int) error
	Shape() *etensor.Shape
}
Layer is the subset of emer.Layer that is used by this code.
type Linear ¶ added in v1.3.36
type Linear struct {
	// learning rate
	LRate float32 `def:"0.1" desc:"learning rate"`
	// layers to decode
	Layers []Layer `desc:"layers to decode"`
	// unit values -- read this for decoded output
	Units []LinearUnit `desc:"unit values -- read this for decoded output"`
	// number of inputs -- total sizes of layer inputs
	NInputs int `desc:"number of inputs -- total sizes of layer inputs"`
	// number of outputs -- total size of decoded output
	NOutputs int `desc:"number of outputs -- total size of decoded output"`
	// input values, copied from layers
	Inputs []float32 `desc:"input values, copied from layers"`
	// for holding layer values
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	// synaptic weights: outer loop is units, inner loop is inputs
	Weights etensor.Float32 `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
	// activation function
	ActivationFn ActivationFunc `desc:"activation function"`
	// which pool to use within a layer
	PoolIndex int `desc:"which pool to use within a layer"`
	// mpi communicator -- MPI users must set this to their comm -- do direct assignment
	Comm *mpi.Comm `view:"-" desc:"mpi communicator -- MPI users must set this to their comm -- do direct assignment"`
	// delta weight changes: only for MPI mode -- outer loop is units, inner loop is inputs
	MPIDWts etensor.Float32 `desc:"delta weight changes: only for MPI mode -- outer loop is units, inner loop is inputs"`
}
Linear is a linear neural network, which can be configured with a custom activation function. By default it will use the identity function. It learns using the delta rule for each output unit.
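The delta rule that Linear uses for each output unit can be sketched in standalone Go (illustrative names and a single-unit simplification, not this package's API): weights move in proportion to the learning rate times the error (target minus activation) times the input.

```go
package main

import "fmt"

// deltaRuleStep performs one training step of the delta rule for a single
// linear output unit with identity activation: forward pass computes
// act = sum(x * w), then each weight is nudged by lrate * (target - act) * x.
// Returns the squared error for this step.
func deltaRuleStep(w, inputs []float32, target, lrate float32) float32 {
	var act float32
	for i, x := range inputs {
		act += x * w[i]
	}
	err := target - act
	for i, x := range inputs {
		w[i] += lrate * err * x
	}
	return err * err
}

func main() {
	w := make([]float32, 2)
	inputs := []float32{1, 0.5}
	for step := 0; step < 50; step++ {
		deltaRuleStep(w, inputs, 1, 0.1) // train toward target = 1
	}
	var act float32
	for i, x := range inputs {
		act += x * w[i]
	}
	fmt.Println(act) // converges toward the target of 1
}
```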
func (*Linear) Back ¶ added in v1.3.36
func (dec *Linear) Back() float32
Back computes the backward error propagation pass. Returns SSE (sum squared error) of the difference between targets and outputs.
func (*Linear) BackMPI ¶ added in v1.4.30
func (dec *Linear) BackMPI() float32
BackMPI computes the backward error propagation pass. Returns SSE (sum squared error) of the difference between targets and outputs.
func (*Linear) Decode ¶ added in v1.3.36
func (dec *Linear) Decode(varNm string, di int)
Decode decodes the given variable name from layers (forward pass). Decoded values are in Units[i].Act -- see also Output to get them into a []float32. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*Linear) Forward ¶ added in v1.3.36
func (dec *Linear) Forward()
Forward computes the forward pass from the input.
func (*Linear) Init ¶ added in v1.3.36
func (dec *Linear) Init(nOutputs, nInputs int, poolIndex int, activationFn ActivationFunc)
Init initializes the decoder with the given number of outputs and inputs.
func (*Linear) InitLayer ¶ added in v1.3.36
func (dec *Linear) InitLayer(nOutputs int, layers []Layer, activationFn ActivationFunc)
InitLayer initializes the decoder with the given number of outputs and the layers to decode from.
func (*Linear) InitPool ¶ added in v1.3.39
func (dec *Linear) InitPool(nOutputs int, layer Layer, poolIndex int, activationFn ActivationFunc)
InitPool initializes the decoder with the given number of outputs, a single layer, and the index of the pool within that layer to decode from.
func (*Linear) Input ¶ added in v1.3.36
func (dec *Linear) Input(varNm string, di int)
Input grabs the input from the given variable in the layers. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*Linear) Output ¶ added in v1.3.36
func (dec *Linear) Output(acts *[]float32)
Output copies the resulting decoded output activation values into the given slice, which is automatically resized if it is not of sufficient size.
func (*Linear) SetTargets ¶ added in v1.4.30
func (dec *Linear) SetTargets(targs []float32) error
SetTargets sets the given target correct answers, as []float32 values. Also returns and prints an error if the targets are not of sufficient length for NOutputs.
func (*Linear) Train ¶ added in v1.3.36
func (dec *Linear) Train(targs []float32) (float32, error)
Train trains the decoder with the given target correct answers, as []float32 values. Returns SSE (sum squared error) of the difference between targets and outputs. Also returns and prints an error if the targets are not of sufficient length for NOutputs.
func (*Linear) TrainMPI ¶ added in v1.4.30
func (dec *Linear) TrainMPI(targs []float32) (float32, error)
TrainMPI trains the decoder with the given target correct answers, as []float32 values. Returns SSE (sum squared error) of the difference between targets and outputs. Also returns and prints an error if the targets are not of sufficient length for NOutputs. The MPI version uses mpi to synchronize weight changes across parallel nodes.
type LinearUnit ¶ added in v1.3.36
type LinearUnit struct {
	// target activation value -- typically 0 or 1 but can be within that range too
	Target float32 `desc:"target activation value -- typically 0 or 1 but can be within that range too"`
	// final activation = sum x * w -- this is the decoded output
	Act float32 `desc:"final activation = sum x * w -- this is the decoded output"`
	// net input = sum x * w
	Net float32 `desc:"net input = sum x * w"`
}
LinearUnit has variables for Linear decoder unit
type SoftMax ¶
type SoftMax struct {
	// learning rate
	Lrate float32 `def:"0.1" desc:"learning rate"`
	// layers to decode
	Layers []emer.Layer `desc:"layers to decode"`
	// number of different categories to decode
	NCats int `desc:"number of different categories to decode"`
	// unit values
	Units []SoftMaxUnit `desc:"unit values"`
	// sorted list of indexes into Units, in descending order from strongest to weakest -- i.e., Sorted[0] has the most likely categorization, and its activity is Units[Sorted[0]].Act
	Sorted []int `desc:"sorted list of indexes into Units, in descending order from strongest to weakest -- i.e., Sorted[0] has the most likely categorization, and its activity is Units[Sorted[0]].Act"`
	// number of inputs -- total sizes of layer inputs
	NInputs int `desc:"number of inputs -- total sizes of layer inputs"`
	// input values, copied from layers
	Inputs []float32 `desc:"input values, copied from layers"`
	// current target index of correct category
	Target int `desc:"current target index of correct category"`
	// for holding layer values
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	// synaptic weights: outer loop is units, inner loop is inputs
	Weights etensor.Float32 `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
	// mpi communicator -- MPI users must set this to their comm -- do direct assignment
	Comm *mpi.Comm `view:"-" desc:"mpi communicator -- MPI users must set this to their comm -- do direct assignment"`
	// delta weight changes: only for MPI mode -- outer loop is units, inner loop is inputs
	MPIDWts etensor.Float32 `desc:"delta weight changes: only for MPI mode -- outer loop is units, inner loop is inputs"`
}
SoftMax is a softmax decoder, which is the best choice for a 1-hot classification using the widely-used SoftMax function: https://en.wikipedia.org/wiki/Softmax_function
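The softmax computation itself can be sketched in standalone Go (an illustrative re-implementation, not this package's code): each unit's activation is its exponentiated net input normalized by the sum of exponentials, yielding a probability distribution over categories.

```go
package main

import (
	"fmt"
	"math"
)

// softmax maps net inputs to activations act[i] = exp(net[i]) / sum(exp(net)),
// which are non-negative and sum to 1.
func softmax(nets []float32) []float32 {
	acts := make([]float32, len(nets))
	var sum float32
	for i, n := range nets {
		acts[i] = float32(math.Exp(float64(n)))
		sum += acts[i]
	}
	for i := range acts {
		acts[i] /= sum
	}
	return acts
}

func main() {
	acts := softmax([]float32{2, 1, 0.5})
	fmt.Println(acts) // largest net input gets the highest probability
}
```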
func (*SoftMax) BackMPI ¶ added in v1.4.30
func (sm *SoftMax) BackMPI()
BackMPI computes the backward error propagation pass. The MPI version shares weight changes across nodes.
func (*SoftMax) Decode ¶
func (sm *SoftMax) Decode(varNm string, di int) int
Decode decodes the given variable name from layers (forward pass). See the Sorted list of indexes for the decoding output -- i.e., Sorted[0] is the most likely, and is returned here as a convenience. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*SoftMax) Input ¶
func (sm *SoftMax) Input(varNm string, di int)
Input grabs the input from the given variable in the layers. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*SoftMax) Load ¶ added in v1.4.31
func (sm *SoftMax) Load(path string) error
Load loads the decoder weights from the given file path. If the shape of the decoder does not match the shape of the saved weights, an error will be returned.
func (*SoftMax) Save ¶ added in v1.4.31
func (sm *SoftMax) Save(path string) error
Save saves the decoder weights to the given file path. If the path ends in .gz, it will be gzipped.
func (*SoftMax) Sort ¶
func (sm *SoftMax) Sort()
Sort updates the Sorted indexes of the current unit category activations, sorted from highest to lowest, i.e., the 0-index value has the strongest decoded output category, 1 the next-strongest, etc.
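The descending-by-activation index ordering that Sort maintains can be sketched in standalone Go (illustrative names, not this package's code):

```go
package main

import (
	"fmt"
	"sort"
)

// sortedByAct returns the unit indexes ordered from strongest to weakest
// activation, so the first index identifies the decoded category.
func sortedByAct(acts []float32) []int {
	idx := make([]int, len(acts))
	for i := range idx {
		idx[i] = i
	}
	sort.Slice(idx, func(a, b int) bool { return acts[idx[a]] > acts[idx[b]] })
	return idx
}

func main() {
	sorted := sortedByAct([]float32{0.1, 0.7, 0.2})
	fmt.Println(sorted) // [1 2 0]
}
```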
type SoftMaxUnit ¶ added in v1.1.44
type SoftMaxUnit struct {
	// final activation = e^Ge / sum e^Ge
	Act float32 `desc:"final activation = e^Ge / sum e^Ge"`
	// net input = sum x * w
	Net float32 `desc:"net input = sum x * w"`
	// exp(Net)
	Exp float32 `desc:"exp(Net)"`
}
SoftMaxUnit has variables for softmax decoder unit