Documentation ¶
Index ¶
- func IdentityFunc(x float32) float32
- func LogisticFunc(x float32) float32
- func TopVoteInt(votes []int) (int, int)
- func TopVoteString(votes []string) (string, int)
- type ActivationFunc
- type Linear
- func (dec *Linear) Back() float32
- func (dec *Linear) Decode(varNm string)
- func (dec *Linear) Forward()
- func (dec *Linear) Init(nOutputs, nInputs int, activationFn ActivationFunc)
- func (dec *Linear) InitLayer(nOutputs int, layers []emer.Layer, activationFn ActivationFunc)
- func (dec *Linear) Input(varNm string)
- func (dec *Linear) Output(acts *[]float32)
- func (dec *Linear) Train(targs []float32) (float32, error)
- func (dec *Linear) ValsTsr(name string) *etensor.Float32
- type LinearUnit
- type SoftMax
- func (sm *SoftMax) Back()
- func (sm *SoftMax) Decode(varNm string) int
- func (sm *SoftMax) Forward()
- func (sm *SoftMax) Init(ncats, ninputs int)
- func (sm *SoftMax) InitLayer(ncats int, layers []emer.Layer)
- func (sm *SoftMax) Input(varNm string)
- func (sm *SoftMax) Sort()
- func (sm *SoftMax) Train(targ int)
- func (sm *SoftMax) ValsTsr(name string) *etensor.Float32
- type SoftMaxUnit
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func IdentityFunc ¶ added in v1.3.36
func IdentityFunc(x float32) float32
IdentityFunc implements the identity function, returning its input unchanged. It is the default activation function for Linear.
func LogisticFunc ¶ added in v1.3.36
func LogisticFunc(x float32) float32
LogisticFunc implements the standard logistic function. Its outputs are in the range (0, 1). Also known as Sigmoid. See https://en.wikipedia.org/wiki/Logistic_function.
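For reference, the curve can be sketched in plain Go as follows; this is an illustrative stand-in, not the package's actual implementation:

import "math"

// logistic computes 1 / (1 + e^-x), which maps any real input into (0, 1).
func logistic(x float32) float32 {
	return 1.0 / (1.0 + float32(math.Exp(float64(-x))))
}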
func TopVoteInt ¶ added in v1.1.42
func TopVoteInt(votes []int) (int, int)
TopVoteInt returns the choice with the most votes among a list of votes as integer-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest numbered item).
func TopVoteString ¶ added in v1.1.42
func TopVoteString(votes []string) (string, int)
TopVoteString returns the choice with the most votes among a list of votes as string-valued choices, and also returns the number of votes for that item. In the case of ties, it chooses one at random (otherwise it would have a bias toward the lowest numbered item).
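A quick usage sketch of both vote-counting helpers, assuming the v1 import path github.com/emer/emergent/decoder:

import (
	"fmt"

	"github.com/emer/emergent/decoder"
)

func ExampleTopVote() {
	choice, n := decoder.TopVoteInt([]int{3, 1, 3, 2, 3})
	fmt.Println(choice, n) // 3 3

	name, m := decoder.TopVoteString([]string{"cat", "dog", "cat"})
	fmt.Println(name, m) // cat 2 (an exact tie would be broken at random)
}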
Types ¶
type ActivationFunc ¶ added in v1.3.36
type ActivationFunc func(float32) float32
type Linear ¶ added in v1.3.36
type Linear struct {
	LRate        float32                     `def:"0.1" desc:"learning rate"`
	Layers       []emer.Layer                `desc:"layers to decode"`
	Units        []LinearUnit                `desc:"unit values -- read this for decoded output"`
	NInputs      int                         `desc:"number of inputs -- total sizes of layer inputs"`
	NOutputs     int                         `desc:"number of outputs -- total sizes of layer inputs"`
	Inputs       []float32                   `desc:"input values, copied from layers"`
	ValsTsrs     map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	Weights      etensor.Float32             `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
	ActivationFn ActivationFunc              `desc:"activation function"`
}
Linear is a linear neural network, which can be configured with a custom activation function. By default it will use the identity function. It learns using the delta rule for each output unit.
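A minimal usage sketch. It assumes Init allocates the Inputs slice and that the decoder can be driven by writing Inputs directly; in normal use, Decode copies Inputs from the configured Layers instead:

import "github.com/emer/emergent/decoder"

func trainLinear(pattern, targets []float32) float32 {
	dec := &decoder.Linear{}
	dec.Init(len(targets), len(pattern), decoder.IdentityFunc)

	copy(dec.Inputs, pattern) // normally copied from Layers by Decode
	dec.Forward()             // compute Units[i].Act from Inputs and Weights

	sse, _ := dec.Train(targets) // delta-rule update; returns SSE vs. targets

	var out []float32
	dec.Output(&out) // decoded activations, auto-resized as needed
	return sse
}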
func (*Linear) Back ¶ added in v1.3.36
func (dec *Linear) Back() float32
Back computes the backward error propagation pass. Returns SSE (sum squared error) of the difference between targets and outputs.
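In sketch form, the per-unit update is the standard delta rule (a hypothetical helper, not the package's code):

// deltaRule applies w[i] += lrate * (target - act) * x[i] for one output unit.
func deltaRule(w, x []float32, target, act, lrate float32) {
	err := target - act
	for i := range w {
		w[i] += lrate * err * x[i]
	}
}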
func (*Linear) Decode ¶ added in v1.3.36
func (dec *Linear) Decode(varNm string)
Decode decodes the given variable name from layers (forward pass). Decoded values are in Units[i].Act -- see also Output to copy them into a []float32.
func (*Linear) Forward ¶ added in v1.3.36
func (dec *Linear) Forward()
Forward computes the forward pass from the input.
func (*Linear) Init ¶ added in v1.3.36
func (dec *Linear) Init(nOutputs, nInputs int, activationFn ActivationFunc)
Init initializes the decoder with the number of outputs and the number of inputs.
func (*Linear) InitLayer ¶ added in v1.3.36
func (dec *Linear) InitLayer(nOutputs int, layers []emer.Layer, activationFn ActivationFunc)
InitLayer initializes the decoder with the number of outputs and the layers to decode.
func (*Linear) Output ¶ added in v1.3.36
func (dec *Linear) Output(acts *[]float32)
Output copies the decoded output activation values into the given slice, which is automatically resized if not of sufficient size.
type LinearUnit ¶ added in v1.3.36
type LinearUnit struct {
	Target float32 `desc:"target activation value -- typically 0 or 1 but can be within that range too"`
	Act    float32 `desc:"final activation = sum x * w -- this is the decoded output"`
	Net    float32 `desc:"net input = sum x * w"`
}
LinearUnit has the variables for one Linear decoder unit.
type SoftMax ¶
type SoftMax struct {
	Lrate    float32                     `def:"0.1" desc:"learning rate"`
	Layers   []emer.Layer                `desc:"layers to decode"`
	NCats    int                         `desc:"number of different categories to decode"`
	Units    []SoftMaxUnit               `desc:"unit values"`
	Sorted   []int                       `` /* 183-byte string literal not displayed */
	NInputs  int                         `desc:"number of inputs -- total sizes of layer inputs"`
	Inputs   []float32                   `desc:"input values, copied from layers"`
	Target   int                         `desc:"current target index of correct category"`
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	Weights  etensor.Float32             `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
}
SoftMax is a softmax decoder, which is the best choice for 1-hot classification. It uses the widely used softmax function: https://en.wikipedia.org/wiki/Softmax_function
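A minimal usage sketch, under the same assumption as the Linear example that Inputs can be written directly; in normal use, Decode copies Inputs from the Layers, runs Forward and Sort, and returns Sorted[0] itself:

import "github.com/emer/emergent/decoder"

func classify(pattern []float32, trueCat int) int {
	sm := &decoder.SoftMax{}
	sm.Init(4, len(pattern)) // 4 categories

	copy(sm.Inputs, pattern) // normally copied from Layers by Decode
	sm.Forward()             // softmax over net inputs -> Units[i].Act
	sm.Sort()                // Sorted[0] = most likely category
	sm.Train(trueCat)        // update weights toward the correct category
	return sm.Sorted[0]
}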
func (*SoftMax) Decode ¶
func (sm *SoftMax) Decode(varNm string) int
Decode decodes the given variable name from layers (forward pass). See the Sorted list of indexes for the full decoding output -- Sorted[0] is the most likely category, and is returned here as a convenience.
func (*SoftMax) Sort ¶
func (sm *SoftMax) Sort()
Sort updates the Sorted indexes of the current Unit category activations, sorted from highest to lowest: Sorted[0] is the strongest decoded output category, Sorted[1] the next-strongest, etc.
type SoftMaxUnit ¶ added in v1.1.44
type SoftMaxUnit struct {
	Act float32 `desc:"final activation = e^Ge / sum e^Ge"`
	Net float32 `desc:"net input = sum x * w"`
	Exp float32 `desc:"exp(Net)"`
}
SoftMaxUnit has the variables for one softmax decoder unit.
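The computation these fields describe is the normalized exponential; an illustrative standalone sketch:

import "math"

// softmax computes act[i] = exp(net[i]) / sum_j exp(net[j]).
func softmax(net []float32) []float32 {
	acts := make([]float32, len(net))
	var sum float32
	for i, n := range net {
		e := float32(math.Exp(float64(n)))
		acts[i] = e
		sum += e
	}
	for i := range acts {
		acts[i] /= sum
	}
	return acts
}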