Documentation ¶
Index ¶
- func IdentityFunc(x float32) float32
- func LogisticFunc(x float32) float32
- func TopVoteInt(votes []int) (int, int)
- func TopVoteString(votes []string) (string, int)
- type ActivationFunc
- type Layer
- type Linear
- func (dec *Linear) Back() float32
- func (dec *Linear) BackMPI() float32
- func (dec *Linear) Decode(varNm string, di int)
- func (dec *Linear) Forward()
- func (dec *Linear) Init(nOutputs, nInputs int, poolIndex int, activationFn ActivationFunc)
- func (dec *Linear) InitLayer(nOutputs int, layers []Layer, activationFn ActivationFunc)
- func (dec *Linear) InitPool(nOutputs int, layer Layer, poolIndex int, activationFn ActivationFunc)
- func (dec *Linear) Input(varNm string, di int)
- func (dec *Linear) Output(acts *[]float32)
- func (dec *Linear) SetTargets(targs []float32) error
- func (dec *Linear) Train(targs []float32) (float32, error)
- func (dec *Linear) TrainMPI(targs []float32) (float32, error)
- func (dec *Linear) ValuesTsr(name string) *etensor.Float32
- type LinearUnit
- type SoftMax
- func (sm *SoftMax) Back()
- func (sm *SoftMax) BackMPI()
- func (sm *SoftMax) Decode(varNm string, di int) int
- func (sm *SoftMax) Forward()
- func (sm *SoftMax) Init(ncats, ninputs int)
- func (sm *SoftMax) InitLayer(ncats int, layers []emer.Layer)
- func (sm *SoftMax) Input(varNm string, di int)
- func (sm *SoftMax) Load(path string) error
- func (sm *SoftMax) Save(path string) error
- func (sm *SoftMax) Sort()
- func (sm *SoftMax) Train(targ int)
- func (sm *SoftMax) TrainMPI(targ int)
- func (sm *SoftMax) ValuesTsr(name string) *etensor.Float32
- type SoftMaxUnit
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func IdentityFunc ¶
func LogisticFunc ¶
LogisticFunc implements the standard logistic function. Its outputs are in the range (0, 1). Also known as Sigmoid. See https://en.wikipedia.org/wiki/Logistic_function.
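The computation is a one-liner; a minimal sketch (logisticFunc here is an illustrative stand-in, not the package's exported function):

```go
package main

import (
	"fmt"
	"math"
)

// logisticFunc sketches the standard logistic (sigmoid) function,
// mapping any real input into the range (0, 1).
func logisticFunc(x float32) float32 {
	return 1.0 / (1.0 + float32(math.Exp(float64(-x))))
}

func main() {
	fmt.Println(logisticFunc(0)) // 0.5 at x = 0
}
```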
func TopVoteInt ¶
TopVoteInt returns the choice with the most votes among a list of votes as integer-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest numbered item).
func TopVoteString ¶
TopVoteString returns the choice with the most votes among a list of votes as string-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest numbered item).
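The tie-breaking strategy both functions describe can be sketched as follows (topVoteInt here is a hypothetical reimplementation for illustration, not the package's own code):

```go
package main

import (
	"fmt"
	"math/rand"
)

// topVoteInt sketches counting votes and breaking ties at random,
// so ties do not systematically favor the lowest-numbered choice.
func topVoteInt(votes []int) (int, int) {
	counts := map[int]int{}
	top := 0
	for _, v := range votes {
		counts[v]++
		if counts[v] > top {
			top = counts[v]
		}
	}
	// collect all choices tied at the top count, then pick one at random
	var tied []int
	for v, n := range counts {
		if n == top {
			tied = append(tied, v)
		}
	}
	return tied[rand.Intn(len(tied))], top
}

func main() {
	choice, n := topVoteInt([]int{2, 2, 3, 2, 1})
	fmt.Println(choice, n) // 2 3
}
```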
Types ¶
type ActivationFunc ¶
type Layer ¶
type Layer interface {
	Name() string
	UnitValuesTensor(tsr etensor.Tensor, varNm string, di int) error
	Shape() *etensor.Shape
}
Layer is the subset of emer.Layer that is used by this code
type Linear ¶
type Linear struct {
	// learning rate
	LRate float32 `default:"0.1"`

	// layers to decode
	Layers []Layer

	// unit values -- read this for decoded output
	Units []LinearUnit

	// number of inputs -- total sizes of layer inputs
	NInputs int

	// number of outputs -- the number of categories to decode
	NOutputs int

	// input values, copied from layers
	Inputs []float32

	// for holding layer values
	ValuesTsrs map[string]*etensor.Float32 `view:"-"`

	// synaptic weights: outer loop is units, inner loop is inputs
	Weights etensor.Float32

	// activation function
	ActivationFn ActivationFunc

	// which pool to use within a layer
	PoolIndex int

	// mpi communicator -- MPI users must set this to their comm -- do direct assignment
	Comm *mpi.Comm `view:"-"`

	// delta weight changes: only for MPI mode -- outer loop is units, inner loop is inputs
	MPIDWts etensor.Float32
}
Linear is a linear neural network, which can be configured with a custom activation function. By default it will use the identity function. It learns using the delta rule for each output unit.
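The delta rule adjusts each weight in proportion to the input activity times the error (target minus output). A minimal sketch of one update step for a single output unit, independent of this package's types (deltaRuleStep is a hypothetical name):

```go
package main

import "fmt"

// deltaRuleStep sketches one delta-rule update for a single linear
// output unit: w_i += lrate * (target - output) * input_i.
// It returns the squared error before the update.
func deltaRuleStep(weights, inputs []float32, target, lrate float32) float32 {
	var out float32
	for i, in := range inputs {
		out += weights[i] * in
	}
	err := target - out
	for i, in := range inputs {
		weights[i] += lrate * err * in
	}
	return err * err
}

func main() {
	w := []float32{0, 0}
	in := []float32{1, 0}
	for i := 0; i < 50; i++ {
		deltaRuleStep(w, in, 1, 0.1)
	}
	fmt.Printf("%.2f\n", w[0]) // converges toward 1
}
```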
func (*Linear) Back ¶
Back computes the backward error propagation pass. Returns SSE (sum squared error) of the difference between targets and outputs.
func (*Linear) BackMPI ¶
BackMPI computes the backward error propagation pass. Returns SSE (sum squared error) of the difference between targets and outputs.
func (*Linear) Decode ¶
Decode decodes the given variable name from layers (forward pass). Decoded values are in Units[i].Act -- see also Output to copy them into a []float32. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*Linear) Init ¶
func (dec *Linear) Init(nOutputs, nInputs int, poolIndex int, activationFn ActivationFunc)
Init initializes the decoder with the given number of outputs and inputs, pool index, and activation function.
func (*Linear) InitLayer ¶
func (dec *Linear) InitLayer(nOutputs int, layers []Layer, activationFn ActivationFunc)
InitLayer initializes the decoder with the given number of outputs, the layers to decode, and an activation function.
func (*Linear) InitPool ¶
func (dec *Linear) InitPool(nOutputs int, layer Layer, poolIndex int, activationFn ActivationFunc)
InitPool initializes the decoder with the given number of outputs, a single layer, the pool index within that layer to decode, and an activation function.
func (*Linear) Input ¶
Input grabs the input from the given variable in the layers. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*Linear) Output ¶
Output copies the resulting decoded output activation values into the given slice, which is automatically resized if it is not of sufficient size.
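The resize-if-needed pattern this describes can be sketched as (copyInto is a hypothetical stand-in, not part of this package):

```go
package main

import "fmt"

// copyInto sketches Output's pattern: grow (or reuse) the destination
// slice to fit the source length, then copy the values in.
func copyInto(dst *[]float32, src []float32) {
	if cap(*dst) < len(src) {
		*dst = make([]float32, len(src))
	}
	*dst = (*dst)[:len(src)]
	copy(*dst, src)
}

func main() {
	var outs []float32 // nil slice is fine; it gets allocated to fit
	copyInto(&outs, []float32{0.1, 0.9})
	fmt.Println(len(outs)) // 2
}
```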
func (*Linear) SetTargets ¶
SetTargets sets the given target correct answers, as []float32 values. Returns and prints an error if the targets are not of sufficient length for NOutputs.
func (*Linear) Train ¶
Train trains the decoder with the given target correct answers, as []float32 values. Returns SSE (sum squared error) of the difference between targets and outputs. Also returns and prints an error if the targets are not of sufficient length for NOutputs.
func (*Linear) TrainMPI ¶
TrainMPI trains the decoder with the given target correct answers, as []float32 values. Returns SSE (sum squared error) of the difference between targets and outputs. Also returns and prints an error if the targets are not of sufficient length for NOutputs. The MPI version uses mpi to synchronize weight changes across parallel nodes.
type LinearUnit ¶
type LinearUnit struct {
	// target activation value -- typically 0 or 1 but can be within that range too
	Target float32

	// final activation = ActivationFn(Net) -- this is the decoded output
	Act float32

	// net input = sum x * w
	Net float32
}
LinearUnit has variables for Linear decoder unit
type SoftMax ¶
type SoftMax struct {
	// learning rate
	Lrate float32 `default:"0.1"`

	// layers to decode
	Layers []emer.Layer

	// number of different categories to decode
	NCats int

	// unit values
	Units []SoftMaxUnit

	// sorted list of indexes into Units, in descending order from strongest to weakest -- i.e., Sorted[0] has the most likely categorization, and its activity is Units[Sorted[0]].Act
	Sorted []int

	// number of inputs -- total sizes of layer inputs
	NInputs int

	// input values, copied from layers
	Inputs []float32

	// current target index of correct category
	Target int

	// for holding layer values
	ValuesTsrs map[string]*etensor.Float32 `view:"-"`

	// synaptic weights: outer loop is units, inner loop is inputs
	Weights etensor.Float32

	// mpi communicator -- MPI users must set this to their comm -- do direct assignment
	Comm *mpi.Comm `view:"-"`

	// delta weight changes: only for MPI mode -- outer loop is units, inner loop is inputs
	MPIDWts etensor.Float32
}
SoftMax is a softmax decoder, which is the best choice for a 1-hot classification using the widely-used SoftMax function: https://en.wikipedia.org/wiki/Softmax_function
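The forward computation exponentiates each net input and normalizes so the activations sum to 1. A minimal sketch (with a max-shift for numerical stability, which is a common implementation detail and an assumption here, not necessarily what this package does):

```go
package main

import (
	"fmt"
	"math"
)

// softmax sketches the softmax function: exponentiate each net input
// (shifted by the max so large values do not overflow) and normalize
// the results to sum to 1.
func softmax(net []float32) []float32 {
	max := net[0]
	for _, n := range net {
		if n > max {
			max = n
		}
	}
	out := make([]float32, len(net))
	var sum float32
	for i, n := range net {
		out[i] = float32(math.Exp(float64(n - max)))
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	fmt.Println(softmax([]float32{1, 1, 1})[0]) // uniform inputs: each output is ~1/3
}
```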
func (*SoftMax) BackMPI ¶
func (sm *SoftMax) BackMPI()
BackMPI computes the backward error propagation pass. The MPI version shares weight changes across nodes.
func (*SoftMax) Decode ¶
Decode decodes the given variable name from layers (forward pass). See the Sorted list of indexes for the decoding output -- Sorted[0] is the most likely, and is returned here as a convenience. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*SoftMax) Input ¶
Input grabs the input from the given variable in the layers. di is a data parallel index, for networks capable of processing input patterns in parallel.
func (*SoftMax) Load ¶
Load loads the decoder weights from given file path. If the shape of the decoder does not match the shape of the saved weights, an error will be returned.
func (*SoftMax) Save ¶
Save saves the decoder weights to given file path. If path ends in .gz, it will be gzipped.
func (*SoftMax) Sort ¶
func (sm *SoftMax) Sort()
Sort updates Sorted indexes of the current Unit category activations sorted from highest to lowest. i.e., the 0-index value has the strongest decoded output category, 1 the next-strongest, etc.
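The descending index sort this describes can be sketched as (sortIndexesDesc is an illustrative stand-in, not the package's method):

```go
package main

import (
	"fmt"
	"sort"
)

// sortIndexesDesc sketches what Sort does: build an index slice and
// sort it so acts[idx[0]] is the strongest activation, acts[idx[1]]
// the next-strongest, and so on; acts itself is left untouched.
func sortIndexesDesc(acts []float32) []int {
	idx := make([]int, len(acts))
	for i := range idx {
		idx[i] = i
	}
	sort.Slice(idx, func(a, b int) bool {
		return acts[idx[a]] > acts[idx[b]]
	})
	return idx
}

func main() {
	fmt.Println(sortIndexesDesc([]float32{0.1, 0.7, 0.2})) // [1 2 0]
}
```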
type SoftMaxUnit ¶
type SoftMaxUnit struct {
	// final activation = exp(Net) / sum exp(Net)
	Act float32

	// net input = sum x * w
	Net float32

	// exp(Net)
	Exp float32
}
SoftMaxUnit has variables for softmax decoder unit