Documentation ¶
Index ¶
- func TopVoteInt(votes []int) (int, int)
- func TopVoteString(votes []string) (string, int)
- type Sigmoid
- func (sm *Sigmoid) Back() float32
- func (sm *Sigmoid) Decode(varNm string)
- func (sm *Sigmoid) Forward()
- func (sm *Sigmoid) Init(ncats, ninputs int)
- func (sm *Sigmoid) InitLayer(ncats int, layers []emer.Layer)
- func (sm *Sigmoid) Input(varNm string)
- func (sm *Sigmoid) Output(acts *[]float32)
- func (sm *Sigmoid) Train(targs []float32) (float32, error)
- func (sm *Sigmoid) ValsTsr(name string) *etensor.Float32
- type SigmoidUnit
- type SoftMax
- func (sm *SoftMax) Back()
- func (sm *SoftMax) Decode(varNm string) int
- func (sm *SoftMax) Forward()
- func (sm *SoftMax) Init(ncats, ninputs int)
- func (sm *SoftMax) InitLayer(ncats int, layers []emer.Layer)
- func (sm *SoftMax) Input(varNm string)
- func (sm *SoftMax) Sort()
- func (sm *SoftMax) Train(targ int)
- func (sm *SoftMax) ValsTsr(name string) *etensor.Float32
- type SoftMaxUnit
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func TopVoteInt ¶ added in v1.1.42
TopVoteInt returns the choice with the most votes among a list of integer-valued votes, along with the number of votes for that choice. Ties are broken at random (otherwise there would be a bias toward the lowest-numbered item).
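The counting and tie-breaking logic can be sketched as follows; `topVoteInt` here is an illustrative stand-alone version, not the package's implementation:

```go
package main

import (
	"fmt"
	"math/rand"
)

// topVoteInt tallies the votes, collects every choice tied for the top
// count, and picks one of those at random to avoid any systematic bias.
func topVoteInt(votes []int) (int, int) {
	counts := map[int]int{}
	for _, v := range votes {
		counts[v]++
	}
	best := 0
	top := []int{}
	for choice, n := range counts {
		switch {
		case n > best:
			best, top = n, []int{choice}
		case n == best:
			top = append(top, choice)
		}
	}
	if len(top) == 0 { // no votes at all
		return 0, 0
	}
	return top[rand.Intn(len(top))], best
}

func main() {
	choice, n := topVoteInt([]int{3, 1, 3, 2, 3})
	fmt.Println(choice, n) // prints "3 3": 3 has a strict majority
}
```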
func TopVoteString ¶ added in v1.1.42
TopVoteString returns the choice with the most votes among a list of string-valued votes, along with the number of votes for that choice. Ties are broken at random (otherwise there would be a bias toward the lowest-numbered item).
Types ¶
type Sigmoid ¶ added in v1.1.44
type Sigmoid struct {
	Lrate    float32                     `def:"0.1" desc:"learning rate"`
	Layers   []emer.Layer                `desc:"layers to decode"`
	NCats    int                         `desc:"number of different categories to decode"`
	Units    []SigmoidUnit               `desc:"unit values -- read this for decoded output"`
	NInputs  int                         `desc:"number of inputs -- total sizes of layer inputs"`
	Inputs   []float32                   `desc:"input values, copied from layers"`
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	Weights  etensor.Float32             `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
}
Sigmoid is a sigmoidal activation function decoder, which is the best choice for factorial, independent categories where any number of them might be active at a time. It learns using the delta rule for each output unit.
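A minimal sketch of what this type describes -- one sigmoid output unit trained with the delta rule -- assuming illustrative names (`sigmoid`, `trainUnit`) rather than the package's actual API:

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid is the unit activation function: Act = 1 / (1 + e^-Net).
func sigmoid(net float64) float64 { return 1 / (1 + math.Exp(-net)) }

// trainUnit runs repeated forward/backward passes for one output unit.
// Forward: net = sum x*w, act = sigmoid(net). Backward (delta rule):
// each weight moves by lrate * (targ - act) * x. Returns the final act.
func trainUnit(inputs []float64, targ, lrate float64, epochs int) float64 {
	weights := make([]float64, len(inputs))
	act := 0.0
	for e := 0; e < epochs; e++ {
		net := 0.0
		for i, x := range inputs {
			net += x * weights[i]
		}
		act = sigmoid(net)
		err := targ - act // delta rule error term
		for i, x := range inputs {
			weights[i] += lrate * err * x
		}
	}
	return act
}

func main() {
	// with target 1 and two active inputs, act climbs toward 1
	act := trainUnit([]float64{1, 0, 1}, 1.0, 0.1, 200)
	fmt.Printf("decoded act after training: %.3f\n", act)
}
```

Because each output unit has its own target and error term, any number of categories can be active at once, which is why this decoder suits factorial, independent categories.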
func (*Sigmoid) Back ¶ added in v1.1.44
func (sm *Sigmoid) Back() float32
Back computes the backward error propagation pass. It returns the SSE (sum of squared errors) of the difference between targets and outputs.
func (*Sigmoid) Decode ¶ added in v1.1.44
func (sm *Sigmoid) Decode(varNm string)
Decode decodes the given variable name from the layers (forward pass). Decoded values are in Units[i].Act -- see also Output to copy them into a []float32.
func (*Sigmoid) Forward ¶ added in v1.1.44
func (sm *Sigmoid) Forward()
Forward computes the forward pass from the input.
func (*Sigmoid) Init ¶ added in v1.1.44
func (sm *Sigmoid) Init(ncats, ninputs int)
Init initializes the decoder with the number of categories and the number of inputs.
func (*Sigmoid) InitLayer ¶ added in v1.1.44
func (sm *Sigmoid) InitLayer(ncats int, layers []emer.Layer)
InitLayer initializes the decoder with the number of categories and the layers to decode.
func (*Sigmoid) Output ¶ added in v1.1.44
func (sm *Sigmoid) Output(acts *[]float32)
Output returns the decoded output activation values in the given slice, which is automatically resized if it is not of sufficient size.
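The resize-if-needed behavior can be sketched like this (a hypothetical `output` helper, not the package's method):

```go
package main

import "fmt"

// output copies the decoded unit activations into *acts, growing the
// destination slice first if it is too small to hold them all.
func output(acts *[]float32, units []float32) {
	if len(*acts) < len(units) {
		*acts = make([]float32, len(units))
	}
	copy(*acts, units)
}

func main() {
	var acts []float32 // nil slice is fine: it gets allocated on demand
	output(&acts, []float32{0.2, 0.9, 0.1})
	fmt.Println(len(acts), acts[1]) // prints "3 0.9"
}
```

Taking a pointer to the slice lets the callee reallocate it while the caller keeps reusing the same variable across calls, avoiding per-call allocations once it is big enough.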
type SigmoidUnit ¶ added in v1.1.44
type SigmoidUnit struct {
	Targ float32 `desc:"target activation value -- typically 0 or 1 but can be within that range too"`
	Act  float32 `desc:"final activation = 1 / (1 + e^-Net) -- this is the decoded output"`
	Net  float32 `desc:"net input = sum x * w"`
}
SigmoidUnit has the variables for a Sigmoid decoder unit.
type SoftMax ¶
type SoftMax struct {
	Lrate    float32                     `def:"0.1" desc:"learning rate"`
	Layers   []emer.Layer                `desc:"layers to decode"`
	NCats    int                         `desc:"number of different categories to decode"`
	Units    []SoftMaxUnit               `desc:"unit values"`
	Sorted   []int                       `` /* 183-byte string literal not displayed */
	NInputs  int                         `desc:"number of inputs -- total sizes of layer inputs"`
	Inputs   []float32                   `desc:"input values, copied from layers"`
	Targ     int                         `desc:"current target index of correct category"`
	ValsTsrs map[string]*etensor.Float32 `view:"-" desc:"for holding layer values"`
	Weights  etensor.Float32             `desc:"synaptic weights: outer loop is units, inner loop is inputs"`
}
SoftMax is a softmax decoder, which is the best choice for a 1-hot classification using the widely-used SoftMax function: https://en.wikipedia.org/wiki/Softmax_function
func (*SoftMax) Decode ¶
func (sm *SoftMax) Decode(varNm string) int
Decode decodes the given variable name from the layers (forward pass). See the Sorted list of indexes for the full decoding output -- Sorted[0] is the most likely category, and is returned here as a convenience.
func (*SoftMax) Sort ¶
func (sm *SoftMax) Sort()
Sort updates the Sorted indexes of the current unit category activations, ordered from highest to lowest: the 0-index value is the strongest decoded output category, index 1 the next strongest, and so on.
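The highest-to-lowest index ordering can be sketched with sort.Slice (a hypothetical helper, not the package's implementation):

```go
package main

import (
	"fmt"
	"sort"
)

// sortedIndexes returns unit indexes ordered by activation, highest
// first, so the 0-index entry names the strongest decoded category.
func sortedIndexes(acts []float32) []int {
	idx := make([]int, len(acts))
	for i := range idx {
		idx[i] = i // start with the identity ordering 0..n-1
	}
	// compare the activations the indexes point at, descending
	sort.Slice(idx, func(a, b int) bool { return acts[idx[a]] > acts[idx[b]] })
	return idx
}

func main() {
	idx := sortedIndexes([]float32{0.1, 0.7, 0.2})
	fmt.Println(idx) // prints "[1 2 0]": unit 1 is strongest
}
```

Sorting indexes rather than the activations themselves leaves the Units slice untouched, so unit i's values stay at position i.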
type SoftMaxUnit ¶ added in v1.1.44
type SoftMaxUnit struct {
	Act float32 `desc:"final activation = e^Ge / sum e^Ge"`
	Net float32 `desc:"net input = sum x * w"`
	Exp float32 `desc:"exp(Net)"`
}
SoftMaxUnit has the variables for a softmax decoder unit.