Documentation ¶
Overview ¶
Package deep provides the DeepLeabra variant of Leabra, which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex). These Pulvinar states are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons, which have strong focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (TRC) neurons.
This package allows you to specify layer types as Deep or TRC (e.g., Pulvinar) which in turn drives specific forms of computation associated with each of those layer types. Standard leabra layer types are all effectively Super.
Wiring diagram:
    Super Layer --BurstTRC--> Pulv
        |  ^                   ^
    BurstCtxt |       - Back -/
        |  [DeepAttn]    /
        v  |            /
    Deep Layer --------/  (typically only for higher->lower)
DeepLeabra captures both the predictive learning and attentional modulation functions of the deep layer and thalamocortical circuitry.
* Super layer neurons reflect the superficial layers of the neocortex, but they also are the basis for directly computing the Burst activation signal that reflects the deep layer 5 IB bursting activation, via thresholding of the superficial layer activations (Bursting is thought to have a higher threshold).
* The alpha-cycle quarter(s) when Burst is updated and broadcast are set in BurstParams.BurstQtr (defaults to Q4; can also be, e.g., Q2 and Q4 for beta-frequency updating). During these quarter(s), the Burst from Super layers is continuously sent via BurstTRC projections to TRC layers (using efficient delta-based computation) to drive plus-phase outcome states in those layers. At the end of the burst quarter(s), BurstCtxt projections convey the Burst signal to Deep layer neurons, where it is integrated into the Ctxt value representing the temporally-delayed context information. Note: Deep layers also compute a Burst value themselves, which can be sent via self projections to reflect the extensive deep-to-deep lateral connectivity that provides more extensive temporal context information.
* Deep layer neurons reflect the layer 6 regular spiking CT corticothalamic neurons that project into the thalamus, and back up to all the other lamina within a microcolumn, where they drive a multiplicative attentional modulation signal. These neurons receive the Burst activation via a BurstCtxt projection type, typically once every 100 msec, and integrate that in the Ctxt value, which is added to other excitatory conductance inputs to drive the overall activation (Act) of these neurons. Due to the bursting nature of the Burst inputs, this causes these Deep layer neurons to reflect what the superficial layers encoded on the *previous* timestep -- thus they represent a temporally-delayed context state.
* Deep layer neurons project to the TRC (Pulvinar) neurons via standard Act-driven projections that integrate into standard Ge excitatory input in TRC neurons, to drive the prediction aspect of predictive learning. They also can project back to the Super layer neurons via a DeepAttn projection to drive attentional modulation of activity there (in `AttnGe`, `DeepAttn`, and `DeepLrn` Neuron vars).
* TRC layer neurons receive a BurstTRC projection from the Super layer (typically a `prjn.OneToOne` projection), which drives the plus-phase "outcome" activation state of these Pulvinar layers (Super actually computes the 5IB Burst activation). These layers also receive regular connections from Deep layers, which drive the prediction of this plus-phase outcome state, based on the temporally-delayed deep layer context information.
* The attentional effects are implemented via DeepAttn projections from Deep to Super layers, which are typically fixed, non-learning, one-to-one projections, that drive the AttnGe excitatory conductance in Super layers. AttnGe then drives the computation of DeepAttn and DeepLrn values that modulate (i.e., multiply) the activation (DeepAttn) or learning rate (DeepLrn) of these superficial neurons.
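The delta-based sending used for BurstTRC projections (supported by the BurstSent neuron variable) can be sketched as follows. The send threshold and update order here are illustrative assumptions, not the library's exact implementation:

```go
// sendNeuron / recvNeuron illustrate the delta-based sending used for
// BurstTRC projections: only the change in Burst since the last send is
// transmitted, and the receiver accumulates deltas into TRCBurstGe.
// Names mirror the Neuron vars (Burst, BurstSent, TRCBurstGe); the send
// threshold and update order are illustrative assumptions.
type sendNeuron struct {
	Burst     float32 // current 5IB burst activation
	BurstSent float32 // last Burst value actually sent
}

type recvNeuron struct {
	TRCBurstGe float32 // accumulated excitatory input from BurstTRC prjns
}

// sendBurstDelta sends the change in Burst (if above threshold) and
// integrates it on the receiver; the receiver's total stays consistent
// with what re-sending the full value every cycle would produce.
func sendBurstDelta(s *sendNeuron, r *recvNeuron, thr float32) {
	delta := s.Burst - s.BurstSent
	if delta > thr || delta < -thr {
		r.TRCBurstGe += delta // integrate only the change
		s.BurstSent = s.Burst
	}
}
```

This avoids re-transmitting full activations on every cycle while keeping receiver state exact up to the send threshold.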
All of the relevant parameters are in the params.go file, in the Deep*Params classes, which are then fields in the deep.Layer.
* BurstParams (layer DeepBurst) has the BurstQtr when Burst is updated, and the thresholding parameters.
* CtxtParams (layer DeepCtxt) has parameters for integrating DeepCtxt input
* TRCParams (layer DeepTRC) has parameters for how to compute TRC plus phase activation states based on the TRCBurstGe excitatory input from the BurstTRC projections.
* AttnParams (layer DeepAttn) has the parameters for computing DeepAttn and DeepLrn from AttnGe
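As a sketch of the temporal-context integration that CtxtParams governs: at the end of a burst quarter, newly received CtxtGe could be mixed with the prior context value. The fmPrv mixing proportion below is a hypothetical parameter for illustration; see CtxtParams in params.go for the actual integration options.

```go
// ctxtFmGe sketches how a Deep neuron's temporal-context conductance could
// be updated at the end of a burst quarter: newly received CtxtGe from
// BurstCtxt projections is mixed with the previous context value. The
// fmPrv mixing proportion is a hypothetical illustration parameter.
func ctxtFmGe(prevCtxt, newCtxtGe, fmPrv float32) float32 {
	return fmPrv*prevCtxt + (1-fmPrv)*newCtxtGe
}
```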
Index ¶
- Constants
- Variables
- func NeuronVarByName(varNm string) (int, error)
- type AttnParams
- type BurstParams
- type DeepLayer
- type DeepPrjn
- type Layer
- func (ly *Layer) ActFmG(ltime *leabra.Time)
- func (ly *Layer) AsDeep() *Layer
- func (ly *Layer) AttnGeInc(ltime *leabra.Time)
- func (ly *Layer) AvgMaxAct(ltime *leabra.Time)
- func (ly *Layer) AvgMaxActNoAttn(ltime *leabra.Time)
- func (ly *Layer) AvgMaxAttnGe(ltime *leabra.Time)
- func (ly *Layer) AvgMaxGe(ltime *leabra.Time)
- func (ly *Layer) AvgMaxTRCBurstGe(ltime *leabra.Time)
- func (ly *Layer) Build() error
- func (ly *Layer) BurstFmAct(ltime *leabra.Time)
- func (ly *Layer) BurstPrv(ltime *leabra.Time)
- func (ly *Layer) Class() string
- func (ly *Layer) CtxtFmGe(ltime *leabra.Time)
- func (ly *Layer) DecayState(decay float32)
- func (ly *Layer) DeepAttnFmG(ltime *leabra.Time)
- func (ly *Layer) Defaults()
- func (ly *Layer) GFmInc(ltime *leabra.Time)
- func (ly *Layer) GScaleFmAvgAct()
- func (ly *Layer) InitActs()
- func (ly *Layer) InitGInc()
- func (ly *Layer) IsSuper() bool
- func (ly *Layer) QuarterFinal(ltime *leabra.Time)
- func (ly *Layer) SendCtxtGe(ltime *leabra.Time)
- func (ly *Layer) SendGDelta(ltime *leabra.Time)
- func (ly *Layer) SendTRCBurstGeDelta(ltime *leabra.Time)
- func (ly *Layer) TRCBurstGeFmInc(ltime *leabra.Time)
- func (ly *Layer) UnitVal1DTry(varNm string, idx int) (float32, error)
- func (ly *Layer) UnitValTry(varNm string, idx []int) (float32, error)
- func (ly *Layer) UnitVals(vals *[]float32, varNm string) error
- func (ly *Layer) UnitValsTensor(tsr etensor.Tensor, varNm string) error
- func (ly *Layer) UnitVarNames() []string
- func (ly *Layer) UpdateParams()
- type LayerType
- type Network
- func (nt *Network) AddInputPulv2D(name string, shapeY, shapeX int) (input, pulv emer.Layer)
- func (nt *Network) AddInputPulv4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (input, pulv emer.Layer)
- func (nt *Network) AddSuperDeep2D(name string, shapeY, shapeX int, pulvLay, attn bool) (super, deep, pulv emer.Layer)
- func (nt *Network) AddSuperDeep4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, pulvLay, attn bool) (super, deep, pulv emer.Layer)
- func (nt *Network) Cycle(ltime *leabra.Time)
- func (nt *Network) DeepBurst(ltime *leabra.Time)
- func (nt *Network) DeepCtxt(ltime *leabra.Time)
- func (nt *Network) Defaults()
- func (nt *Network) NewLayer() emer.Layer
- func (nt *Network) NewPrjn() emer.Prjn
- func (nt *Network) QuarterFinal(ltime *leabra.Time)
- func (nt *Network) UpdateParams()
- type Neuron
- type Pool
- type Prjn
- func (pj *Prjn) Build() error
- func (pj *Prjn) DWt()
- func (pj *Prjn) DWtDeepCtxt()
- func (pj *Prjn) Defaults()
- func (pj *Prjn) InitGInc()
- func (pj *Prjn) PrjnTypeName() string
- func (pj *Prjn) RecvAttnGeInc()
- func (pj *Prjn) RecvCtxtGeInc()
- func (pj *Prjn) RecvTRCBurstGeInc()
- func (pj *Prjn) SendAttnGeDelta(si int, delta float32)
- func (pj *Prjn) SendCtxtGe(si int, dburst float32)
- func (pj *Prjn) SendTRCBurstGeDelta(si int, delta float32)
- func (pj *Prjn) UpdateParams()
- type PrjnType
- type TRCParams
Constants ¶
const (
	// Deep are deep-layer neurons, reflecting activation of layer 6 regular spiking
	// CT corticothalamic neurons, which drive both attention in Super (via DeepAttn
	// projections) and predictions in TRC (Pulvinar) via standard projections.
	Deep emer.LayerType = emer.LayerTypeN + iota

	// TRC are thalamic relay cell neurons, typically in the Pulvinar, which alternately reflect
	// predictions driven by Deep layer projections, and actual outcomes driven by BurstTRC
	// projections from corresponding Super layer neurons that provide strong driving inputs to
	// TRC neurons.
	TRC
)
bool args for greater clarity
const (
	// BurstCtxt are projections from Superficial layers to Deep layers that
	// send Burst activations to drive updating of DeepCtxt excitatory conductance,
	// at end of a DeepBurst quarter. These projections also use a special learning
	// rule that takes into account the temporal delays in the activation states.
	BurstCtxt emer.PrjnType = emer.PrjnTypeN + iota

	// BurstTRC are projections from Superficial layers to TRC (thalamic relay cell)
	// neurons (e.g., in the Pulvinar) that send Burst activation continuously
	// during the DeepBurst quarter(s), driving the TRCBurstGe value, which then drives
	// the plus-phase activation state of the TRC representing the "outcome" against
	// which prior predictions are (implicitly) compared via the temporal difference
	// in TRC activation state.
	BurstTRC

	// DeepAttn are projections from Deep layers (representing layer 6 regular-spiking
	// CT corticothalamic neurons) up to corresponding Superficial layer neurons, that drive
	// the attentional modulation of activations there (i.e., DeepAttn and DeepLrn values).
	// This is sent continuously all the time from deep layers using the standard delta-based
	// Ge computation, and aggregated into the AttnGe variable on Super neurons.
	DeepAttn
)
The DeepLeabra prjn types
Variables ¶
var (
	NeuronVars    = []string{"ActNoAttn", "Burst", "BurstPrv", "CtxtGe", "TRCBurstGe", "BurstSent", "AttnGe", "DeepAttn", "DeepLrn"}
	NeuronVarsMap map[string]int
	NeuronVarsAll []string
)
var KiT_Layer = kit.Types.AddType(&Layer{}, LayerProps)
var KiT_LayerType = kit.Enums.AddEnumExt(emer.KiT_LayerType, LayerTypeN, kit.NotBitFlag, nil)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_Prjn = kit.Types.AddType(&Prjn{}, PrjnProps)
var KiT_PrjnType = kit.Enums.AddEnumExt(emer.KiT_PrjnType, PrjnTypeN, kit.NotBitFlag, nil)
var LayerProps = ki.Props{
	"EnumType:Typ": KiT_LayerType,
	"ToolBar": ki.PropSlice{
		{"Defaults", ki.Props{
			"icon": "reset",
			"desc": "return all parameters to their initial default values",
		}},
		{"InitWts", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's weight values according to prjn parameters, for all *sending* projections out of this layer",
		}},
		{"InitActs", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's activation values",
		}},
		{"sep-act", ki.BlankProp{}},
		{"LesionNeurons", ki.Props{
			"icon": "close",
			"desc": "Lesion (set the Off flag) for given proportion of neurons in the layer (number must be 0 -- 1, NOT percent!)",
			"Args": ki.PropSlice{
				{"Proportion", ki.Props{
					"desc": "proportion (0 -- 1) of neurons to lesion",
				}},
			},
		}},
		{"UnLesionNeurons", ki.Props{
			"icon": "reset",
			"desc": "Un-Lesion (reset the Off flag) for all neurons in the layer",
		}},
	},
}
var NetworkProps = leabra.NetworkProps
var PrjnProps = ki.Props{
	"EnumType:Typ": KiT_PrjnType,
}
Functions ¶
func NeuronVarByName ¶
NeuronVarByName returns the index of the variable in the Neuron, or error
Types ¶
type AttnParams ¶
type AttnParams struct {
	On    bool    `desc:"Enable the computation of DeepAttn, DeepLrn from AttnGe (otherwise, DeepAttn and DeepLrn = 1)"`
	Min   float32 `` /* 239-byte string literal not displayed */
	Thr   float32 `` /* 256-byte string literal not displayed */
	Range float32 `` /* 128-byte string literal not displayed */
}
AttnParams are parameters determining how the DeepAttn and DeepLrn attentional modulation is computed from the AttnGe inputs received via DeepAttn projections
func (*AttnParams) DeepAttnFmG ¶
func (db *AttnParams) DeepAttnFmG(lrn float32) float32
DeepAttnFmG returns the DeepAttn value computed from DeepLrn value
func (*AttnParams) DeepLrnFmG ¶
func (db *AttnParams) DeepLrnFmG(attnG, attnMax float32) float32
DeepLrnFmG returns the DeepLrn value computed from AttnGe and MAX(AttnGe) across the layer -- simply the max-normalized value.
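A minimal sketch of these two computations: DeepLrn as the max-normalized AttnGe per the description above, and DeepAttn as a linear remapping of DeepLrn via the AttnParams Min and Range fields. The linear mapping is an assumption based on the parameter names, not the verified implementation.

```go
// deepLrnFmG returns DeepLrn as the max-normalized AttnGe value
// (AttnGe / MAX(AttnGe) across the layer), per the DeepLrnFmG docs.
func deepLrnFmG(attnG, attnMax float32) float32 {
	if attnMax <= 0 {
		return 1 // no attentional input -- no modulation
	}
	return attnG / attnMax
}

// deepAttnFmG sketches DeepAttn as a linear remapping of DeepLrn into
// [min, min+rng]; this mapping is an assumption based on the AttnParams
// Min and Range field names, not the verified implementation.
func deepAttnFmG(lrn, min, rng float32) float32 {
	return min + rng*lrn
}
```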
func (*AttnParams) Defaults ¶
func (db *AttnParams) Defaults()
func (*AttnParams) Update ¶
func (db *AttnParams) Update()
type BurstParams ¶
type BurstParams struct {
	On          bool            `` /* 149-byte string literal not displayed */
	BurstQtr    leabra.Quarters `` /* 222-byte string literal not displayed */
	FmActNoAttn bool            `` /* 281-byte string literal not displayed */
	ThrRel      float32         `` /* 365-byte string literal not displayed */
	ThrAbs      float32         `` /* 258-byte string literal not displayed */
}
BurstParams are parameters determining how the DeepBurst activation is computed from the superficial layer activation values.
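The thresholding can be sketched as follows, assuming the relative threshold (a proportion of the layer's maximum activation) and the absolute threshold combine via max -- an assumption based on the ThrRel/ThrAbs parameter descriptions:

```go
// burstFmAct sketches the 5IB Burst thresholding: superficial activations
// at or above threshold pass through as Burst, others are zeroed (bursting
// is thought to have a higher threshold than regular spiking).
// Combining the relative threshold (thrRel * actMax, where actMax is the
// layer's maximum activation) with the absolute floor thrAbs via max() is
// an assumption based on the ThrRel/ThrAbs descriptions.
func burstFmAct(act, actMax, thrRel, thrAbs float32) float32 {
	thr := thrRel * actMax
	if thrAbs > thr {
		thr = thrAbs // absolute floor dominates when layer activity is low
	}
	if act < thr {
		return 0
	}
	return act
}
```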
func (*BurstParams) Defaults ¶
func (db *BurstParams) Defaults()
func (*BurstParams) IsBurstQtr ¶
func (db *BurstParams) IsBurstQtr(qtr int) bool
IsBurstQtr returns true if the given quarter (0-3) is set as a Bursting quarter.
func (*BurstParams) NextIsBurstQtr ¶
func (db *BurstParams) NextIsBurstQtr(qtr int) bool
NextIsBurstQtr returns true if the quarter after the given quarter (0-3) is set as a Bursting quarter according to BurstQtr settings. Wraps around: if qtr=3 and qtr=0 is a burst quarter, then it returns true.
func (*BurstParams) PrevIsBurstQtr ¶
func (db *BurstParams) PrevIsBurstQtr(qtr int) bool
PrevIsBurstQtr returns true if the quarter before the given quarter (0-3) is set as a Bursting quarter according to BurstQtr settings. Wraps around: if qtr=0 and qtr=3 is a burst quarter, then it returns true.
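The quarter tests above can be sketched with a bitflag representation of the four alpha-cycle quarters; the exact bitflag encoding is an assumption consistent with how leabra.Quarters is used:

```go
// Quarters is a bitflag set over the 4 alpha-cycle quarters (0-3),
// mirroring how BurstQtr is used; the encoding is an assumption.
type Quarters uint32

// Has reports whether the given quarter (0-3) is in the set.
func (q Quarters) Has(qtr int) bool { return q&(1<<uint(qtr)) != 0 }

// nextIsBurstQtr reports whether the quarter after qtr is in the set,
// wrapping around from 3 back to 0.
func nextIsBurstQtr(q Quarters, qtr int) bool { return q.Has((qtr + 1) % 4) }

// prevIsBurstQtr reports whether the quarter before qtr is in the set,
// wrapping around from 0 back to 3.
func prevIsBurstQtr(q Quarters, qtr int) bool { return q.Has((qtr + 3) % 4) }
```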
func (*BurstParams) SetBurstQtr ¶
func (db *BurstParams) SetBurstQtr(qtr leabra.Quarters)
SetBurstQtr sets given burst quarter (adds to any existing) -- Q4 by default
func (*BurstParams) Update ¶
func (db *BurstParams) Update()
type DeepLayer ¶
type DeepLayer interface {
	leabra.LeabraLayer

	// AsDeep returns this layer as a deep.Layer -- provides direct access to variables
	AsDeep() *Layer

	// AttnGeInc integrates new AttnGe from increments sent during last SendGDelta.
	AttnGeInc(ltime *leabra.Time)

	// AvgMaxAttnGe computes the average and max AttnGe stats
	AvgMaxAttnGe(ltime *leabra.Time)

	// DeepAttnFmG computes DeepAttn and DeepLrn from AttnGe input,
	// and then applies the DeepAttn modulation to the Act activation value.
	DeepAttnFmG(ltime *leabra.Time)

	// AvgMaxActNoAttn computes the average and max ActNoAttn stats
	AvgMaxActNoAttn(ltime *leabra.Time)

	// BurstFmAct updates Burst layer 5 IB bursting value from current Act
	// (superficial activation), subject to thresholding.
	BurstFmAct(ltime *leabra.Time)

	// SendTRCBurstGeDelta sends change in Burst activation since last sent, over BurstTRC
	// projections.
	SendTRCBurstGeDelta(ltime *leabra.Time)

	// TRCBurstGeFmInc computes the TRCBurstGe input from sent values
	TRCBurstGeFmInc(ltime *leabra.Time)

	// AvgMaxTRCBurstGe computes the average and max TRCBurstGe stats
	AvgMaxTRCBurstGe(ltime *leabra.Time)

	// SendCtxtGe sends full Burst activation over BurstCtxt projections to integrate
	// CtxtGe excitatory conductance on deep layers.
	// This must be called at the end of the Burst quarter for this layer.
	SendCtxtGe(ltime *leabra.Time)

	// CtxtFmGe integrates new CtxtGe excitatory conductance from projections, and computes
	// overall Ctxt value. This must be called at the end of the Burst quarter for this layer,
	// after SendCtxtGe.
	CtxtFmGe(ltime *leabra.Time)

	// BurstPrv saves Burst as BurstPrv
	BurstPrv(ltime *leabra.Time)
}
DeepLayer defines the essential algorithmic API for DeepLeabra at the layer level.
type DeepPrjn ¶
type DeepPrjn interface {
	leabra.LeabraPrjn

	// SendCtxtGe sends the full Burst activation from sending neuron index si,
	// to integrate CtxtGe excitatory conductance on receivers
	SendCtxtGe(si int, dburst float32)

	// SendTRCBurstGeDelta sends the delta-Burst activation from sending neuron index si,
	// to integrate TRCBurstGe excitatory conductance on receivers
	SendTRCBurstGeDelta(si int, delta float32)

	// SendAttnGeDelta sends the delta-activation from sending neuron index si,
	// to integrate into AttnGeInc excitatory conductance on receivers
	SendAttnGeDelta(si int, delta float32)

	// RecvCtxtGeInc increments the receiver's CtxtGe from that of all the projections
	RecvCtxtGeInc()

	// RecvTRCBurstGeInc increments the receiver's TRCBurstGe from that of all the projections
	RecvTRCBurstGeInc()

	// RecvAttnGeInc increments the receiver's AttnGe from that of all the projections
	RecvAttnGeInc()

	// DWtDeepCtxt computes the weight change (learning) -- for DeepCtxt projections
	DWtDeepCtxt()
}
DeepPrjn defines the essential algorithmic API for DeepLeabra at the projection level.
type Layer ¶
type Layer struct {
	leabra.Layer // access as .Layer

	DeepBurst BurstParams `` /* 142-byte string literal not displayed */
	DeepTRC   TRCParams   `` /* 145-byte string literal not displayed */
	DeepAttn  AttnParams  `` /* 180-byte string literal not displayed */
	DeepNeurs []Neuron    `` /* 151-byte string literal not displayed */
	DeepPools []Pool      `` /* 247-byte string literal not displayed */
}
deep.Layer is the DeepLeabra layer, based on basic rate-coded leabra.Layer
func (*Layer) ActFmG ¶
ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act
func (*Layer) AttnGeInc ¶
AttnGeInc integrates new AttnGe from increments sent during last SendGDelta. Very low overhead if no DeepAttn prjns.
func (*Layer) AvgMaxAct ¶
AvgMaxAct computes the average and max Act stats, used in inhibition. The Deep version also computes AvgMaxActNoAttn.
func (*Layer) AvgMaxActNoAttn ¶
AvgMaxActNoAttn computes the average and max ActNoAttn stats
func (*Layer) AvgMaxAttnGe ¶
AvgMaxAttnGe computes the average and max AttnGe stats
func (*Layer) AvgMaxGe ¶
AvgMaxGe computes the average and max Ge stats, used in inhibition. The Deep version also computes AttnGe stats.
func (*Layer) AvgMaxTRCBurstGe ¶
AvgMaxTRCBurstGe computes the average and max TRCBurstGe stats
func (*Layer) BurstFmAct ¶
BurstFmAct updates the Burst layer 5 IB bursting value from the current Act (superficial activation), subject to thresholding.
func (*Layer) CtxtFmGe ¶
CtxtFmGe integrates new CtxtGe excitatory conductance from projections, and computes overall Ctxt value, only on Deep layers. This must be called at the end of the DeepBurst quarter for this layer, after SendCtxtGe.
func (*Layer) DecayState ¶
func (*Layer) DeepAttnFmG ¶
DeepAttnFmG computes DeepAttn and DeepLrn from AttnGe input, and then applies the DeepAttn modulation to the Act activation value.
func (*Layer) GFmInc ¶
GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.
func (*Layer) GScaleFmAvgAct ¶
func (ly *Layer) GScaleFmAvgAct()
GScaleFmAvgAct computes the scaling factor for synaptic input conductances G, based on sending layer average activation. This attempts to automatically adjust for overall differences in raw activity coming into the units to achieve a general target of around .5 to 1 for the integrated G values. DeepLeabra version separately normalizes the Deep projection types.
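A simplified sketch of the normalization idea described above (the actual leabra computation is more elaborate; this divide-by-expected-activity form is an illustration only):

```go
// gScaleFmAvgAct sketches the idea behind GScaleFmAvgAct: divide the
// projection's relative weight scale by the expected number of active
// sending units, so integrated Ge lands in a workable range regardless of
// the sending layer's overall activity level. The real leabra formula is
// more elaborate; this is a simplified illustration.
func gScaleFmAvgAct(relWt float32, nConns int, sendAvgAct float32) float32 {
	expActive := float32(nConns) * sendAvgAct // expected # of active inputs
	if expActive < 1 {
		expActive = 1 // avoid divide-by-zero / over-amplification
	}
	return relWt / expActive
}
```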
func (*Layer) InitGInc ¶
func (ly *Layer) InitGInc()
InitGInc initializes the Ge excitatory and Gi inhibitory conductance accumulation states, including ActSent and G*Raw values. Called at start of trial always, and can be called optionally when delta-based Ge computation needs to be updated (e.g., weights might have changed strength).
func (*Layer) IsSuper ¶
IsSuper returns true if layer is not a TRC or Deep type -- all others are Super
func (*Layer) QuarterFinal ¶
QuarterFinal does updating after end of a quarter
func (*Layer) SendCtxtGe ¶
SendCtxtGe sends full Burst activation over BurstCtxt projections to integrate CtxtGe excitatory conductance on deep layers. This must be called at the end of the DeepBurst quarter for this layer.
func (*Layer) SendGDelta ¶
SendGDelta sends change in activation since last sent, if above thresholds. Deep version sends either to standard Ge or AttnGe for DeepAttn projections.
func (*Layer) SendTRCBurstGeDelta ¶
SendTRCBurstGeDelta sends change in Burst activation since last sent, over BurstTRC projections.
func (*Layer) TRCBurstGeFmInc ¶
TRCBurstGeFmInc computes the TRCBurstGe input from sent values
func (*Layer) UnitVal1DTry ¶
UnitVal1DTry returns value of given variable name on given unit, using 1-dimensional index.
func (*Layer) UnitValTry ¶
UnitValTry returns value of given variable name on given unit, using shape-based dimensional index
func (*Layer) UnitVals ¶
UnitVals fills in values of given variable name on unit, for each unit in the layer, into given float32 slice (only resized if not big enough). Returns error on invalid var name.
func (*Layer) UnitValsTensor ¶
UnitValsTensor returns values of given variable name on unit for each unit in the layer, as a float32 tensor in same shape as layer units.
func (*Layer) UnitVarNames ¶
UnitVarNames returns a list of variable names available on the units in this layer
func (*Layer) UpdateParams ¶
func (ly *Layer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer
type LayerType ¶
LayerType has the DeepLeabra extensions to the emer.LayerType types, for gui
const (
	Deep_ LayerType = LayerType(emer.LayerTypeN) + iota
	TRC_
	LayerTypeN
)
gui versions
func StringToLayerType ¶
type Network ¶
deep.Network has parameters for running a DeepLeabra network
func (*Network) AddInputPulv2D ¶
AddInputPulv2D adds an input and corresponding Pulvinar (P suffix) layer with BurstTRC one-to-one projection from Input to Pulvinar. Pulvinar is placed Behind Input.
func (*Network) AddInputPulv4D ¶
func (nt *Network) AddInputPulv4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (input, pulv emer.Layer)
AddInputPulv4D adds an input and corresponding Pulvinar (P suffix) layer with BurstTRC one-to-one projection from Input to Pulvinar. Pulvinar is placed Behind Input.
func (*Network) AddSuperDeep2D ¶
func (nt *Network) AddSuperDeep2D(name string, shapeY, shapeX int, pulvLay, attn bool) (super, deep, pulv emer.Layer)
AddSuperDeep2D adds a superficial (hidden) and corresponding Deep (D suffix) layer with BurstCtxt Full projection from Hidden to Deep. Optionally creates a Pulvinar for Hidden with One-to-One BurstTRC to Pulvinar, and optionally a DeepAttn projection back from Deep to Super (OneToOne). Deep is placed Behind Super, and Pulvinar behind Deep if created.
func (*Network) AddSuperDeep4D ¶
func (nt *Network) AddSuperDeep4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, pulvLay, attn bool) (super, deep, pulv emer.Layer)
AddSuperDeep4D adds a superficial (hidden) and corresponding Deep (D suffix) layer with BurstCtxt Full projection from Hidden to Deep. Optionally creates a Pulvinar for Hidden with One-to-One BurstTRC to Pulvinar, and optionally a DeepAttn projection back from Deep to Super (OneToOne). Deep is placed Behind Super, and Pulvinar behind Deep if created.
func (*Network) Cycle ¶
Cycle runs one cycle of activation updating. The Deep version adds a call to update DeepBurst at the end.
func (*Network) DeepBurst ¶
DeepBurst is called at end of Cycle, computes Burst and sends it to other layers
func (*Network) DeepCtxt ¶
DeepCtxt sends DeepBurst to Deep layers and integrates DeepCtxt on Deep layers
func (*Network) Defaults ¶
func (nt *Network) Defaults()
Defaults sets all the default parameters for all layers and projections
func (*Network) QuarterFinal ¶
QuarterFinal does updating after end of a quarter
func (*Network) UpdateParams ¶
func (nt *Network) UpdateParams()
UpdateParams updates all the derived parameters if any have changed, for all layers and projections
type Neuron ¶
type Neuron struct {
	ActNoAttn  float32 `` /* 316-byte string literal not displayed */
	Burst      float32 `` /* 686-byte string literal not displayed */
	BurstPrv   float32 `desc:"Burst from the previous alpha trial -- this is typically used for learning in the BurstCtxt projection."`
	CtxtGe     float32 `` /* 351-byte string literal not displayed */
	TRCBurstGe float32 `` /* 198-byte string literal not displayed */
	BurstSent  float32 `desc:"Last Burst activation value sent, for computing TRCBurstGe using efficient delta mechanism."`
	AttnGe     float32 `` /* 314-byte string literal not displayed */
	DeepAttn   float32 `` /* 493-byte string literal not displayed */
	DeepLrn    float32 `` /* 246-byte string literal not displayed */
}
deep.Neuron holds the extra neuron (unit) level variables for DeepLeabra computation. DeepLeabra includes both attentional and predictive learning functions of the deep layers and thalamocortical circuitry. These are maintained in a separate parallel slice from the leabra.Neuron variables.
func (*Neuron) VarByIndex ¶
VarByIndex returns variable using index (0 = first variable in NeuronVars list)
type Prjn ¶
type Prjn struct {
	leabra.Prjn // access as .Prjn

	CtxtGeInc     []float32 `desc:"local per-recv unit accumulator for Ctxt excitatory conductance from sending units -- not a delta -- the full value"`
	TRCBurstGeInc []float32 `` /* 133-byte string literal not displayed */
	AttnGeInc     []float32 `` /* 129-byte string literal not displayed */
}
deep.Prjn is the DeepLeabra projection, based on basic rate-coded leabra.Prjn
func (*Prjn) DWt ¶
func (pj *Prjn) DWt()
DWt computes the weight change (learning) on sending projections. The Deep version supports the DeepCtxt temporal learning option.
func (*Prjn) DWtDeepCtxt ¶
func (pj *Prjn) DWtDeepCtxt()
DWtDeepCtxt computes the weight change (learning) -- for DeepCtxt projections
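A heavily simplified sketch of the idea: the sending-side activity entering the weight change is the previous quarter's burst (BurstPrv), matching the temporally-delayed context that the Deep layer actually represented when the outcome occurred. The simple (plus - minus) error term is a generic CHL-style placeholder, not the exact leabra XCAL learning rule.

```go
// dwtDeepCtxt sketches the temporally-delayed learning used for BurstCtxt
// projections: the sending activity is the previous quarter's burst
// (BurstPrv), not the current activation, so credit assignment matches the
// delayed context the Deep layer represented. The (recvPlus - recvMinus)
// error term is a generic CHL-style placeholder for illustration.
func dwtDeepCtxt(lrate, sendBurstPrv, recvPlus, recvMinus float32) float32 {
	return lrate * sendBurstPrv * (recvPlus - recvMinus)
}
```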
func (*Prjn) PrjnTypeName ¶
func (*Prjn) RecvAttnGeInc ¶
func (pj *Prjn) RecvAttnGeInc()
RecvAttnGeInc increments the receiver's AttnGe from that of all the projections
func (*Prjn) RecvCtxtGeInc ¶
func (pj *Prjn) RecvCtxtGeInc()
RecvCtxtGeInc increments the receiver's CtxtGe from that of all the projections
func (*Prjn) RecvTRCBurstGeInc ¶
func (pj *Prjn) RecvTRCBurstGeInc()
RecvTRCBurstGeInc increments the receiver's TRCBurstGe from that of all the projections
func (*Prjn) SendAttnGeDelta ¶
SendAttnGeDelta sends the delta-activation from sending neuron index si, to integrate into AttnGeInc excitatory conductance on receivers
func (*Prjn) SendCtxtGe ¶
SendCtxtGe sends the full Burst activation from sending neuron index si, to integrate CtxtGe excitatory conductance on receivers
func (*Prjn) SendTRCBurstGeDelta ¶
SendTRCBurstGeDelta sends the delta-Burst activation from sending neuron index si, to integrate TRCBurstGe excitatory conductance on receivers
func (*Prjn) UpdateParams ¶
func (pj *Prjn) UpdateParams()
type PrjnType ¶
PrjnType has the DeepLeabra extensions to the emer.PrjnType types, for gui
gui versions
func StringToPrjnType ¶
type TRCParams ¶
type TRCParams struct {
	MaxInhib  float32 `` /* 725-byte string literal not displayed */
	InhibPool bool    `` /* 184-byte string literal not displayed */
	Binarize  bool    `` /* 346-byte string literal not displayed */
	BinThr    float32 `` /* 164-byte string literal not displayed */
	BinOn     float32 `def:"0.3" viewif:"Binarize" desc:"Effective value for units above threshold -- lower value around 0.3 or so seems best."`
	BinOff    float32 `def:"0" viewif:"Binarize" desc:"Effective value for units below threshold -- typically 0."`
}
TRCParams provides parameters for how the plus-phase (outcome) state of thalamic relay cell (e.g., Pulvinar) neurons is computed from the BurstTRC projections that drive TRCBurstGe excitatory conductance.
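The Binarize option can be sketched as a simple threshold on the TRC drive, using the BinThr/BinOn/BinOff fields:

```go
// trcBinarize sketches the Binarize option in TRCParams: TRC plus-phase
// drive at or above BinThr maps to BinOn (default 0.3), below to BinOff
// (default 0), producing a cleaner, more categorical outcome state.
func trcBinarize(ge, binThr, binOn, binOff float32) float32 {
	if ge >= binThr {
		return binOn
	}
	return binOff
}
```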