Documentation ¶
Overview ¶
Package deep provides the DeepLeabra variant of Leabra, which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex). These states are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons, which have strong focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (TRC) neurons.
This package has 3 specialized Layer types:
- SuperLayer: implements the superficial layer neurons, which function just like standard leabra.Layer neurons, while also directly computing the Burst activation signal that reflects the deep layer 5IB bursting activation, via thresholding of the superficial layer activations (Bursting is thought to have a higher threshold).
- CTLayer: implements the layer 6 regular spiking CT corticothalamic neurons that project into the thalamus. They receive the Burst activation via a CTCtxtPrjn projection type, typically once every 100 msec, and integrate it into the CtxtGe value, which is added to other excitatory conductance inputs to drive the overall activation (Act) of these neurons. Due to the bursting nature of the Burst inputs, these CT layer neurons reflect what the superficial layers encoded on the *previous* timestep -- thus they represent a temporally-delayed context state. CTLayer can also send Context via self projections, reflecting the extensive deep-to-deep lateral connectivity that provides more extensive temporal context information.
- TRCLayer: implements the TRC (Pulvinar) neurons, which receive the prediction generated by CTLayer projections in the minus phase. This is computed via standard Act-driven projections that integrate into the standard Ge excitatory input in TRC neurons. The 5IB Burst-driven plus-phase "outcome" activation state is driven by direct access to the corresponding driver SuperLayer (not via standard projection mechanisms).
Wiring diagram:
    SuperLayer --Burst--> TRCLayer
        |                    ^
     CTCtxt                 / Back
        |                  /
        v                 /
     CTLayer ------------/   (typically only for higher->lower)
Timing:
The alpha-cycle quarter(s) when Burst is updated and broadcast are set in BurstQtr (defaults to Q4; can also be, e.g., Q2 and Q4 for beta-frequency updating). During these quarter(s), the Burst value is computed in SuperLayer, and it is continuously accessed by TRCLayer neurons to drive plus-phase outcome states.
At the *end* of the burst quarter(s), in the QuarterFinal method, CTCtxt projections convey the Burst signal from Super to CTLayer neurons, where it is integrated into the Ctxt value representing the temporally-delayed context information.
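The quarter-gated flow above can be sketched in plain Go. The Quarter and QtrSet types below are hypothetical stand-ins for leabra.Quarters (assumed here to be a bitflag over the four alpha-cycle quarters), used only to illustrate the timing logic:

```go
package main

import "fmt"

// Quarter indexes the four 25-msec quarters of a 100-msec alpha cycle.
// Hypothetical stand-in for the leabra.Quarters enum.
type Quarter int

const (
	Q1 Quarter = iota
	Q2
	Q3
	Q4
)

// QtrSet is a bitmask over quarters, like the BurstQtr field.
type QtrSet uint32

// Set builds a QtrSet from the given quarters.
func Set(qtrs ...Quarter) QtrSet {
	var q QtrSet
	for _, qtr := range qtrs {
		q |= 1 << uint(qtr)
	}
	return q
}

// Has reports whether the given quarter is in the set.
func (q QtrSet) Has(qtr Quarter) bool { return q&(1<<uint(qtr)) != 0 }

func main() {
	burstQtr := Set(Q2, Q4) // beta-frequency updating
	for qtr := Q1; qtr <= Q4; qtr++ {
		if burstQtr.Has(qtr) {
			fmt.Printf("Q%d: compute Burst, drive TRC plus-phase; at quarter end, send CtxtGe\n", qtr+1)
		} else {
			fmt.Printf("Q%d: standard activation updating\n", qtr+1)
		}
	}
}
```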
Index ¶
- Constants
- Variables
- func AddDeep2D(nt *leabra.Network, name string, shapeY, shapeX int) (super, ct, trc emer.Layer)
- func AddDeep2DFakeCT(nt *leabra.Network, name string, shapeY, shapeX int) (super, ct, trc emer.Layer)
- func AddDeep2DPy(nt *leabra.Network, name string, shapeY, shapeX int) []emer.Layer
- func AddDeep4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, trc emer.Layer)
- func AddDeep4DFakeCT(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, trc emer.Layer)
- func AddDeep4DPy(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) []emer.Layer
- func AddDeepNoTRC2D(nt *leabra.Network, name string, shapeY, shapeX int) (super, ct emer.Layer)
- func AddDeepNoTRC2DPy(nt *leabra.Network, name string, shapeY, shapeX int) []emer.Layer
- func AddDeepNoTRC4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct emer.Layer)
- func AddDeepNoTRC4DPy(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) []emer.Layer
- func ConnectCtxtToCT(nt *leabra.Network, send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func ConnectCtxtToCTFake(nt *leabra.Network, send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func ConnectSuperToCT(nt *leabra.Network, send, recv emer.Layer) emer.Prjn
- func ConnectSuperToCTFake(nt *leabra.Network, send, recv emer.Layer) emer.Prjn
- func DriveAct(dni int, dly *leabra.Layer, sly *SuperLayer, issuper bool) float32
- func MaxPoolActAvg(ly *leabra.Layer) float32
- func SuperNeuronVarIdxByName(varNm string) (int, error)
- func UnitsSize(ly *leabra.Layer) (x, y int)
- type BurstParams
- type CTCtxtPrjn
- func (pj *CTCtxtPrjn) Build() error
- func (pj *CTCtxtPrjn) DWt()
- func (pj *CTCtxtPrjn) Defaults()
- func (pj *CTCtxtPrjn) InitGInc()
- func (pj *CTCtxtPrjn) PrjnTypeName() string
- func (pj *CTCtxtPrjn) RecvCtxtGeInc()
- func (pj *CTCtxtPrjn) RecvGInc()
- func (pj *CTCtxtPrjn) SendCtxtGe(si int, dburst float32)
- func (pj *CTCtxtPrjn) SendGDelta(si int, delta float32)
- func (pj *CTCtxtPrjn) Type() emer.PrjnType
- func (pj *CTCtxtPrjn) UpdateParams()
- type CTLayer
- func (ly *CTLayer) Build() error
- func (ly *CTLayer) Class() string
- func (ly *CTLayer) CtxtFmGe(ltime *leabra.Time)
- func (ly *CTLayer) Defaults()
- func (ly *CTLayer) GFmInc(ltime *leabra.Time)
- func (ly *CTLayer) InitActs()
- func (ly *CTLayer) SendCtxtGe(ltime *leabra.Time)
- func (ly *CTLayer) UnitVal1D(varIdx int, idx int) float32
- func (ly *CTLayer) UnitVarIdx(varNm string) (int, error)
- func (ly *CTLayer) UnitVarNames() []string
- func (ly *CTLayer) UnitVarNum() int
- type CtxtSender
- type Driver
- type Drivers
- type EPool
- type EPools
- type IPool
- type IPools
- type LayerType
- type Network
- func (nt *Network) AddDeep2D(name string, shapeY, shapeX int) (super, ct, pulv emer.Layer)
- func (nt *Network) AddDeep2DFakeCT(name string, shapeY, shapeX int) (super, ct, pulv emer.Layer)
- func (nt *Network) AddDeep4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, pulv emer.Layer)
- func (nt *Network) AddDeep4DFakeCT(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, pulv emer.Layer)
- func (nt *Network) AddDeepNoTRC2D(name string, shapeY, shapeX int) (super, ct emer.Layer)
- func (nt *Network) AddDeepNoTRC4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct emer.Layer)
- func (nt *Network) CTCtxt(ltime *leabra.Time)
- func (nt *Network) ConnectCtxtToCT(send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func (nt *Network) Defaults()
- func (nt *Network) QuarterFinal(ltime *leabra.Time)
- func (nt *Network) UnitVarNames() []string
- func (nt *Network) UpdateParams()
- type PrjnType
- type SuperLayer
- func (ly *SuperLayer) ActFmG(ltime *leabra.Time)
- func (ly *SuperLayer) Build() error
- func (ly *SuperLayer) BurstFmAct(ltime *leabra.Time)
- func (ly *SuperLayer) BurstPrv()
- func (ly *SuperLayer) CyclePost(ltime *leabra.Time)
- func (ly *SuperLayer) DecayState(decay float32)
- func (ly *SuperLayer) Defaults()
- func (ly *SuperLayer) InitActs()
- func (ly *SuperLayer) QuarterFinal(ltime *leabra.Time)
- func (ly *SuperLayer) SendCtxtGe(ltime *leabra.Time)
- func (ly *SuperLayer) TRCLayer() (*leabra.Layer, error)
- func (ly *SuperLayer) UnitVal1D(varIdx int, idx int) float32
- func (ly *SuperLayer) UnitVarIdx(varNm string) (int, error)
- func (ly *SuperLayer) UnitVarNames() []string
- func (ly *SuperLayer) UnitVarNum() int
- func (ly *SuperLayer) UpdateParams()
- func (ly *SuperLayer) ValidateTRCLayer() error
- type SuperNeuron
- type TRCAttnParams
- type TRCLayer
- func (ly *TRCLayer) Class() string
- func (ly *TRCLayer) Defaults()
- func (ly *TRCLayer) DriverLayer(drv *Driver) (*leabra.Layer, error)
- func (ly *TRCLayer) GFmInc(ltime *leabra.Time)
- func (ly *TRCLayer) InitWts()
- func (ly *TRCLayer) IsTarget() bool
- func (ly *TRCLayer) SetDriverActs()
- func (ly *TRCLayer) SetDriverNeuron(tni int, drvGe, drvInhib float32)
- func (ly *TRCLayer) SetDriverOffs() error
- func (ly *TRCLayer) UpdateParams()
- type TRCParams
- type TRNLayer
- type TopoInhib
- type TopoInhibLayer
Constants ¶
const (
	// CT are layer 6 corticothalamic projecting neurons, which drive predictions
	// in TRC (Pulvinar) via standard projections.
	CT emer.LayerType = emer.LayerTypeN + iota

	// TRC are thalamic relay cell neurons in the Pulvinar / MD thalamus,
	// which alternately reflect predictions driven by Deep layer projections,
	// and actual outcomes driven by Burst activity from corresponding
	// Super layer neurons that provide strong driving inputs to TRC neurons.
	TRC
)
const (
	// CTCtxt are projections from Superficial layers to CT layers that
	// send Burst activations to drive updating of CtxtGe excitatory conductance,
	// at the end of a DeepBurst quarter. These projections also use a special learning
	// rule that takes into account the temporal delays in the activation states.
	// Can also add self context from CT for deeper temporal context.
	CTCtxt emer.PrjnType = emer.PrjnTypeN + iota
)
The DeepLeabra prjn types
Variables ¶
var (
	// NeuronVars are the full list across all deep Layer types
	NeuronVars = []string{"Burst", "BurstPrv", "Attn", "CtxtGe"}

	// SuperNeuronVars are for SuperLayer directly
	SuperNeuronVars = []string{"Burst", "BurstPrv", "Attn"}

	SuperNeuronVarsMap map[string]int

	// NeuronVarsAll is the full integrated list across inherited layers and NeuronVars
	NeuronVarsAll []string
)
var KiT_CTCtxtPrjn = kit.Types.AddType(&CTCtxtPrjn{}, PrjnProps)
var KiT_CTLayer = kit.Types.AddType(&CTLayer{}, LayerProps)
var KiT_LayerType = kit.Enums.AddEnumExt(emer.KiT_LayerType, LayerTypeN, kit.NotBitFlag, nil)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_PrjnType = kit.Enums.AddEnumExt(emer.KiT_PrjnType, PrjnTypeN, kit.NotBitFlag, nil)
var KiT_SuperLayer = kit.Types.AddType(&SuperLayer{}, LayerProps)
var KiT_TRCLayer = kit.Types.AddType(&TRCLayer{}, LayerProps)
var KiT_TRNLayer = kit.Types.AddType(&TRNLayer{}, leabra.LayerProps)
var KiT_TopoInhibLayer = kit.Types.AddType(&TopoInhibLayer{}, LayerProps)
var LayerProps = ki.Props{
	"EnumType:Typ": KiT_LayerType,
	"ToolBar": ki.PropSlice{
		{"Defaults", ki.Props{
			"icon": "reset",
			"desc": "return all parameters to their initial default values",
		}},
		{"InitWts", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's weight values according to prjn parameters, for all *sending* projections out of this layer",
		}},
		{"InitActs", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's activation values",
		}},
		{"sep-act", ki.BlankProp{}},
		{"LesionNeurons", ki.Props{
			"icon": "close",
			"desc": "Lesion (set the Off flag) for given proportion of neurons in the layer (number must be 0 -- 1, NOT percent!)",
			"Args": ki.PropSlice{
				{"Proportion", ki.Props{
					"desc": "proportion (0 -- 1) of neurons to lesion",
				}},
			},
		}},
		{"UnLesionNeurons", ki.Props{
			"icon": "reset",
			"desc": "Un-Lesion (reset the Off flag) for all neurons in the layer",
		}},
	},
}
LayerProps are required to get the extended EnumType
var NetworkProps = leabra.NetworkProps
var PrjnProps = ki.Props{
	"EnumType:Typ": KiT_PrjnType,
}
Functions ¶
func AddDeep2D ¶ added in v1.1.4
AddDeep2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a CTCtxtPrjn OneToOne projection from Super to CT, and a TRC Pulvinar for Super (P suffix). TRC projects back to Super and CT layers (type = Back, class = FmPulv). CT is placed Behind Super, and Pulvinar behind CT. Drivers must be added to the TRC layer, and it must be sized appropriately for those drivers.
func AddDeep2DFakeCT ¶ added in v1.1.26
func AddDeep2DFakeCT(nt *leabra.Network, name string, shapeY, shapeX int) (super, ct, trc emer.Layer)
AddDeep2DFakeCT adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a FAKE CTCtxtPrjn OneToOne projection from Super to CT, and a TRC Pulvinar for Super (P suffix). TRC projects back to Super and CT layers (type = Back, class = FmPulv). CT is placed Behind Super, and Pulvinar behind CT. Drivers must be added to the TRC layer, and it must be sized appropriately for those drivers. This does NOT make a CTCtxtPrjn -- instead it makes a regular leabra.Prjn -- for testing!
func AddDeep2DPy ¶ added in v1.1.15
AddDeep2DPy adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a CTCtxtPrjn Full projection from Super to CT, and a TRC Pulvinar for Super (P suffix). TRC projects back to Super and CT layers (type = Back, class = FmPulv). CT is placed Behind Super, and Pulvinar behind CT. Drivers must be added to the TRC layer, and it must be sized appropriately for those drivers. Py is the Python version; returns layers as a slice.
func AddDeep4D ¶ added in v1.1.4
func AddDeep4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, trc emer.Layer)
AddDeep4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a CTCtxtPrjn OneToOne projection from Super to CT, and a TRC Pulvinar for Super (P suffix). TRC projects back to Super and CT layers, also PoolOneToOne, class = FmPulv. CT is placed Behind Super, and Pulvinar behind CT. Drivers must be added to the TRC layer, and it must be sized appropriately for those drivers.
func AddDeep4DFakeCT ¶ added in v1.1.26
func AddDeep4DFakeCT(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, trc emer.Layer)
AddDeep4DFakeCT adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a FAKE CTCtxtPrjn OneToOne projection from Super to CT, and a TRC Pulvinar for Super (P suffix). TRC projects back to Super and CT layers, also PoolOneToOne, class = FmPulv. CT is placed Behind Super, and Pulvinar behind CT. Drivers must be added to the TRC layer, and it must be sized appropriately for those drivers. This does NOT make a CTCtxtPrjn -- instead it makes a regular leabra.Prjn -- for testing!
func AddDeep4DPy ¶ added in v1.1.15
func AddDeep4DPy(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) []emer.Layer
AddDeep4DPy adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a CTCtxtPrjn PoolOneToOne projection from Super to CT, and a TRC Pulvinar for Super (P suffix). TRC projects back to Super and CT layers, also PoolOneToOne, class = FmPulv. CT is placed Behind Super, and Pulvinar behind CT. Drivers must be added to the TRC layer, and it must be sized appropriately for those drivers. Py is the Python version; returns layers as a slice.
func AddDeepNoTRC2D ¶ added in v1.1.4
AddDeepNoTRC2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn OneToOne projection from Super to CT, and NO TRC Pulvinar. CT is placed Behind Super.
func AddDeepNoTRC2DPy ¶ added in v1.1.15
AddDeepNoTRC2DPy adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with a CTCtxtPrjn Full projection from Super to CT, and NO TRC Pulvinar. CT is placed Behind Super. Py is the Python version; returns layers as a slice.
func AddDeepNoTRC4D ¶ added in v1.1.4
func AddDeepNoTRC4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct emer.Layer)
AddDeepNoTRC4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn OneToOne projection from Super to CT, and NO TRC Pulvinar. CT is placed Behind Super.
func AddDeepNoTRC4DPy ¶ added in v1.1.15
func AddDeepNoTRC4DPy(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) []emer.Layer
AddDeepNoTRC4DPy adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with a CTCtxtPrjn PoolOneToOne projection from Super to CT, and NO TRC Pulvinar. CT is placed Behind Super. Py is the Python version; returns layers as a slice.
func ConnectCtxtToCT ¶ added in v1.1.2
ConnectCtxtToCT adds a CTCtxtPrjn from the given sending layer to a CT layer. Use ConnectSuperToCT for the main projection from the corresponding superficial layer.
func ConnectCtxtToCTFake ¶ added in v1.1.26
ConnectCtxtToCTFake adds a FAKE CTCtxtPrjn from the given sending layer to a CT layer. This does NOT make a CTCtxtPrjn -- instead it makes a regular leabra.Prjn -- for testing!
func ConnectSuperToCT ¶ added in v1.1.21
ConnectSuperToCT adds a CTCtxtPrjn from the given sending Super layer to a CT layer. It automatically sets the FmSuper flag to engage the proper defaults, uses a OneToOne prjn pattern, and sets the class to CTFmSuper.
func ConnectSuperToCTFake ¶ added in v1.1.26
ConnectSuperToCTFake adds a FAKE CTCtxtPrjn from the given sending Super layer to a CT layer. It uses a OneToOne prjn pattern and sets the class to CTFmSuper. This does NOT make a CTCtxtPrjn -- instead it makes a regular leabra.Prjn -- for testing!
func MaxPoolActAvg ¶ added in v1.1.4
MaxPoolActAvg returns the max Inhib.Act.Avg value across pools
func SuperNeuronVarIdxByName ¶ added in v1.1.4
SuperNeuronVarIdxByName returns the index of the variable in the SuperNeuron, or error
Types ¶
type BurstParams ¶ added in v1.0.0
type BurstParams struct {
	BurstQtr leabra.Quarters `` /* 206-byte string literal not displayed */
	ThrRel   float32         `` /* 353-byte string literal not displayed */
	ThrAbs   float32         `` /* 246-byte string literal not displayed */
}
BurstParams determine how the 5IB Burst activation is computed from standard Act activation values in SuperLayer -- thresholded.
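The thresholding can be sketched in plain Go. The reference value for the relative threshold (the layer's maximum activation) is an assumption for illustration -- BurstParams may normalize ThrRel against a different quantity:

```go
package main

import "fmt"

// burstFmAct sketches how Burst could be computed by thresholding
// superficial activations: thrRel is taken relative to the layer's
// maximum activation, and thrAbs is an absolute floor. This reference
// choice is an assumption, not the package's exact rule.
func burstFmAct(acts []float32, thrRel, thrAbs float32) []float32 {
	var maxAct float32
	for _, a := range acts {
		if a > maxAct {
			maxAct = a
		}
	}
	thr := thrRel * maxAct
	if thrAbs > thr {
		thr = thrAbs
	}
	burst := make([]float32, len(acts))
	for i, a := range acts {
		if a >= thr {
			burst[i] = a // super-threshold units burst at their activation level
		} // sub-threshold units have zero Burst
	}
	return burst
}

func main() {
	acts := []float32{0.05, 0.3, 0.6, 0.9}
	// only units at or above max(0.5*0.9, 0.2) = 0.45 burst
	fmt.Println(burstFmAct(acts, 0.5, 0.2))
}
```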
func (*BurstParams) Defaults ¶ added in v1.0.0
func (db *BurstParams) Defaults()
type CTCtxtPrjn ¶ added in v1.1.2
type CTCtxtPrjn struct {
	leabra.Prjn // access as .Prjn

	FmSuper bool `` /* 200-byte string literal not displayed */

	CtxtGeInc []float32 `desc:"local per-recv unit accumulator for Ctxt excitatory conductance from sending units -- not a delta -- the full value"`
}
CTCtxtPrjn is the "context" temporally-delayed projection into CTLayer, (corticothalamic deep layer 6) where the CtxtGe excitatory input is integrated only at end of Burst Quarter. Set FmSuper for the main projection from corresponding Super layer.
func (*CTCtxtPrjn) Build ¶ added in v1.1.2
func (pj *CTCtxtPrjn) Build() error
func (*CTCtxtPrjn) DWt ¶ added in v1.1.2
func (pj *CTCtxtPrjn) DWt()
DWt computes the weight change (learning) for Ctxt projections
func (*CTCtxtPrjn) Defaults ¶ added in v1.1.2
func (pj *CTCtxtPrjn) Defaults()
func (*CTCtxtPrjn) InitGInc ¶ added in v1.1.2
func (pj *CTCtxtPrjn) InitGInc()
func (*CTCtxtPrjn) PrjnTypeName ¶ added in v1.1.2
func (pj *CTCtxtPrjn) PrjnTypeName() string
func (*CTCtxtPrjn) RecvCtxtGeInc ¶ added in v1.1.2
func (pj *CTCtxtPrjn) RecvCtxtGeInc()
RecvCtxtGeInc increments the receiver's CtxtGe from that of all the projections
func (*CTCtxtPrjn) RecvGInc ¶ added in v1.1.2
func (pj *CTCtxtPrjn) RecvGInc()
RecvGInc: disabled for this type
func (*CTCtxtPrjn) SendCtxtGe ¶ added in v1.1.2
func (pj *CTCtxtPrjn) SendCtxtGe(si int, dburst float32)
SendCtxtGe sends the full Burst activation from sending neuron index si, to integrate CtxtGe excitatory conductance on receivers
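The full-value (not delta) send-and-accumulate step can be sketched in plain Go. A dense weight matrix stands in for the real projection data structures, which is an illustrative simplification:

```go
package main

import "fmt"

// sendCtxtGe sketches the Ctxt projection update: at the end of a burst
// quarter, each sending unit's full Burst value (not a delta) is sent
// through the weights and accumulated into a per-receiver total, which
// the CT layer then adopts as its temporally-delayed context conductance.
// wts[ri][si] is a dense receiver-by-sender weight matrix.
func sendCtxtGe(burst []float32, wts [][]float32) []float32 {
	ctxtGeInc := make([]float32, len(wts))
	for ri, rw := range wts {
		var ge float32
		for si, w := range rw {
			ge += w * burst[si]
		}
		ctxtGeInc[ri] = ge
	}
	return ctxtGeInc
}

func main() {
	burst := []float32{0.9, 0, 0.6}
	wts := [][]float32{{1, 0, 0}, {0, 0, 1}} // essentially 1-to-1 weights
	// each receiver ends up reflecting its sender's Burst from this quarter
	fmt.Println(sendCtxtGe(burst, wts))
}
```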
func (*CTCtxtPrjn) SendGDelta ¶ added in v1.1.2
func (pj *CTCtxtPrjn) SendGDelta(si int, delta float32)
SendGDelta: disabled for this type
func (*CTCtxtPrjn) Type ¶ added in v1.1.2
func (pj *CTCtxtPrjn) Type() emer.PrjnType
func (*CTCtxtPrjn) UpdateParams ¶ added in v1.1.2
func (pj *CTCtxtPrjn) UpdateParams()
type CTLayer ¶ added in v1.1.2
type CTLayer struct {
	TopoInhibLayer // access as .TopoInhibLayer

	BurstQtr leabra.Quarters `` /* 206-byte string literal not displayed */

	CtxtGes []float32 `desc:"slice of context (temporally delayed) excitatory conductances."`
}
CTLayer implements the corticothalamic projecting layer 6 deep neurons that project to the TRC pulvinar neurons, to generate the predictions. They receive phasic input representing 5IB bursting via CTCtxtPrjn inputs from SuperLayer and also from self projections.
func AddCTLayer2D ¶ added in v1.1.2
AddCTLayer2D adds a CTLayer of given size, with given name.
func AddCTLayer4D ¶ added in v1.1.2
AddCTLayer4D adds a CTLayer of given size, with given name.
func (*CTLayer) Build ¶ added in v1.1.2
Build constructs the layer state, including calling Build on the projections.
func (*CTLayer) CtxtFmGe ¶ added in v1.1.2
CtxtFmGe integrates new CtxtGe excitatory conductance from projections, and computes overall Ctxt value, only on Deep layers. This must be called at the end of the DeepBurst quarter for this layer, after SendCtxtGe.
func (*CTLayer) GFmInc ¶ added in v1.1.2
GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.
func (*CTLayer) SendCtxtGe ¶ added in v1.1.2
SendCtxtGe sends activation over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This must be called at the end of the Burst quarter for this layer. Satisfies the CtxtSender interface.
func (*CTLayer) UnitVal1D ¶ added in v1.1.2
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*CTLayer) UnitVarIdx ¶ added in v1.1.2
UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.
func (*CTLayer) UnitVarNames ¶ added in v1.1.2
UnitVarNames returns a list of variable names available on the units in this layer
func (*CTLayer) UnitVarNum ¶ added in v1.1.2
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
type CtxtSender ¶ added in v1.1.2
type CtxtSender interface {
	leabra.LeabraLayer

	// SendCtxtGe sends activation over CTCtxtPrjn projections to integrate
	// CtxtGe excitatory conductance on CT layers.
	// This must be called at the end of the Burst quarter for this layer.
	SendCtxtGe(ltime *leabra.Time)
}
CtxtSender is an interface for layers that implement the SendCtxtGe method (SuperLayer, CTLayer)
type Driver ¶ added in v1.1.4
type Driver struct {
	Driver string `desc:"driver layer"`
	Off    int    `inactive:"-" desc:"offset into TRC pool"`
}
Driver describes the source of driver inputs from cortex into TRC (pulvinar)
type Drivers ¶ added in v1.1.4
type Drivers []*Driver
Drivers are a list of drivers
type EPool ¶ added in v1.1.4
type EPool struct {
	LayNm string  `desc:"layer name"`
	Wt    float32 `desc:"general scaling factor for how much excitation from this pool"`
}
EPool are how to gather excitation across pools
type EPools ¶ added in v1.1.4
type EPools []*EPool
EPools is a list of pools
type IPool ¶ added in v1.1.4
type IPool struct {
	LayNm  string     `desc:"layer name"`
	Wt     float32    `desc:"general scaling factor for how much overall inhibition from this pool contributes, in a non-pool-specific manner"`
	PoolWt float32    `` /* 160-byte string literal not displayed */
	SOff   evec.Vec2i `desc:"offset into source, sending layer"`
	ROff   evec.Vec2i `desc:"offset into our own receiving layer"`
}
IPool are how to gather inhibition across pools
type IPools ¶ added in v1.1.4
type IPools []*IPool
IPools is a list of pools
type LayerType ¶ added in v1.0.0
LayerType has the DeepLeabra extensions to the emer.LayerType types, for gui
const (
	CT_ LayerType = LayerType(emer.LayerTypeN) + iota
	TRC_
	LayerTypeN
)
gui versions
func StringToLayerType ¶ added in v1.0.0
type Network ¶
deep.Network has parameters for running a DeepLeabra network
func (*Network) AddDeep2D ¶ added in v1.1.4
AddDeep2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn OneToOne projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.
func (*Network) AddDeep2DFakeCT ¶ added in v1.1.26
AddDeep2DFakeCT adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with FAKE CTCtxtPrjn OneToOne projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.
func (*Network) AddDeep4D ¶ added in v1.1.4
func (nt *Network) AddDeep4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, pulv emer.Layer)
AddDeep4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn OneToOne projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.
func (*Network) AddDeep4DFakeCT ¶ added in v1.1.26
func (nt *Network) AddDeep4DFakeCT(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct, pulv emer.Layer)
AddDeep4DFakeCT adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with FAKE CTCtxtPrjn OneToOne projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.
func (*Network) AddDeepNoTRC2D ¶ added in v1.1.4
AddDeepNoTRC2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn OneToOne projection from Super to CT, and NO TRC Pulvinar. CT is placed Behind Super.
func (*Network) AddDeepNoTRC4D ¶ added in v1.1.4
func (nt *Network) AddDeepNoTRC4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (super, ct emer.Layer)
AddDeepNoTRC4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn PoolOneToOne projection from Super to CT, and NO TRC Pulvinar. CT is placed Behind Super.
func (*Network) CTCtxt ¶ added in v1.1.2
CTCtxt sends context to CT layers and integrates CtxtGe on CT layers
func (*Network) ConnectCtxtToCT ¶ added in v1.1.2
ConnectCtxtToCT adds a CTCtxtPrjn from given sending layer to a CT layer
func (*Network) Defaults ¶
func (nt *Network) Defaults()
Defaults sets all the default parameters for all layers and projections
func (*Network) QuarterFinal ¶
QuarterFinal does updating after end of a quarter
func (*Network) UnitVarNames ¶ added in v1.1.0
UnitVarNames returns a list of variable names available on the units in this layer
func (*Network) UpdateParams ¶
func (nt *Network) UpdateParams()
UpdateParams updates all the derived parameters if any have changed, for all layers and projections
type PrjnType ¶ added in v1.0.0
PrjnType has the DeepLeabra extensions to the emer.PrjnType types, for gui
func StringToPrjnType ¶ added in v1.0.0
type SuperLayer ¶ added in v1.1.2
type SuperLayer struct {
	TopoInhibLayer // access as .TopoInhibLayer

	Burst BurstParams   `` /* 142-byte string literal not displayed */
	Attn  TRCAttnParams `` /* 215-byte string literal not displayed */

	SuperNeurs []SuperNeuron `desc:"slice of super neuron values -- same size as Neurons"`
}
SuperLayer is the DeepLeabra superficial layer, based on basic rate-coded leabra.Layer. Computes the Burst activation from regular activations.
func AddSuperLayer2D ¶ added in v1.1.2
func AddSuperLayer2D(nt *leabra.Network, name string, nNeurY, nNeurX int) *SuperLayer
AddSuperLayer2D adds a SuperLayer of given size, with given name.
func AddSuperLayer4D ¶ added in v1.1.2
func AddSuperLayer4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *SuperLayer
AddSuperLayer4D adds a SuperLayer of given size, with given name.
func (*SuperLayer) ActFmG ¶ added in v1.1.4
func (ly *SuperLayer) ActFmG(ltime *leabra.Time)
func (*SuperLayer) Build ¶ added in v1.1.2
func (ly *SuperLayer) Build() error
Build constructs the layer state, including calling Build on the projections.
func (*SuperLayer) BurstFmAct ¶ added in v1.1.2
func (ly *SuperLayer) BurstFmAct(ltime *leabra.Time)
BurstFmAct updates Burst layer 5IB bursting value from current Act (superficial activation), subject to thresholding.
func (*SuperLayer) BurstPrv ¶ added in v1.1.2
func (ly *SuperLayer) BurstPrv()
BurstPrv saves Burst as BurstPrv
func (*SuperLayer) CyclePost ¶ added in v1.1.2
func (ly *SuperLayer) CyclePost(ltime *leabra.Time)
CyclePost calls BurstFmAct
func (*SuperLayer) DecayState ¶ added in v1.1.2
func (ly *SuperLayer) DecayState(decay float32)
func (*SuperLayer) Defaults ¶ added in v1.1.2
func (ly *SuperLayer) Defaults()
func (*SuperLayer) InitActs ¶ added in v1.1.2
func (ly *SuperLayer) InitActs()
func (*SuperLayer) QuarterFinal ¶ added in v1.1.2
func (ly *SuperLayer) QuarterFinal(ltime *leabra.Time)
QuarterFinal does updating after end of a quarter
func (*SuperLayer) SendCtxtGe ¶ added in v1.1.2
func (ly *SuperLayer) SendCtxtGe(ltime *leabra.Time)
SendCtxtGe sends Burst activation over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This must be called at the end of the Burst quarter for this layer. Satisfies the CtxtSender interface.
func (*SuperLayer) TRCLayer ¶ added in v1.1.4
func (ly *SuperLayer) TRCLayer() (*leabra.Layer, error)
TRCLayer returns the TRC layer for attentional modulation
func (*SuperLayer) UnitVal1D ¶ added in v1.1.2
func (ly *SuperLayer) UnitVal1D(varIdx int, idx int) float32
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*SuperLayer) UnitVarIdx ¶ added in v1.1.2
func (ly *SuperLayer) UnitVarIdx(varNm string) (int, error)
UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.
func (*SuperLayer) UnitVarNames ¶ added in v1.1.2
func (ly *SuperLayer) UnitVarNames() []string
UnitVarNames returns a list of variable names available on the units in this layer
func (*SuperLayer) UnitVarNum ¶ added in v1.1.2
func (ly *SuperLayer) UnitVarNum() int
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
func (*SuperLayer) UpdateParams ¶ added in v1.1.2
func (ly *SuperLayer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer
func (*SuperLayer) ValidateTRCLayer ¶ added in v1.1.4
func (ly *SuperLayer) ValidateTRCLayer() error
type SuperNeuron ¶ added in v1.1.2
type SuperNeuron struct {
	Burst    float32 `desc:"5IB bursting activation value, computed by thresholding regular activation"`
	BurstPrv float32 `desc:"previous bursting activation -- used for context-based learning"`
	Attn     float32 `desc:"attentional signal from TRC layer"`
}
SuperNeuron has the neuron values for SuperLayer
func (*SuperNeuron) VarByIdx ¶ added in v1.1.2
func (sn *SuperNeuron) VarByIdx(idx int) float32
type TRCAttnParams ¶ added in v1.1.4
type TRCAttnParams struct {
	On     bool    `desc:"is attentional modulation active?"`
	Min    float32 `desc:"minimum act multiplier if attention is 0"`
	TRCLay string  `desc:"name of TRC layer -- defaults to layer name + P"`
}
TRCAttnParams determine how the TRCLayer activation modulates SuperLayer activations
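One plausible reading of the Min field ("minimum act multiplier if attention is 0") is a linear interpolation of the multiplier between Min and 1. That interpolation is an assumption here, sketched for illustration only:

```go
package main

import "fmt"

// attnMod sketches how the TRC attention signal could modulate a
// SuperLayer activation: the multiplier runs linearly from min when
// attn == 0 up to 1 when attn == 1. This linear form is an assumption
// based only on the Min field description, not the package's exact rule.
func attnMod(act, attn, min float32) float32 {
	return act * (min + (1-min)*attn)
}

func main() {
	fmt.Println(attnMod(0.8, 0, 0.2)) // unattended: act scaled down to the Min multiplier
	fmt.Println(attnMod(0.8, 1, 0.2)) // fully attended: act essentially unchanged
}
```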
func (*TRCAttnParams) Defaults ¶ added in v1.1.4
func (at *TRCAttnParams) Defaults()
type TRCLayer ¶ added in v1.1.2
type TRCLayer struct {
	TopoInhibLayer // access as .TopoInhibLayer

	TRC     TRCParams `` /* 141-byte string literal not displayed */
	Drivers Drivers   `desc:"name of SuperLayer that sends 5IB Burst driver inputs to this layer"`
}
TRCLayer is the thalamic relay cell layer for DeepLeabra. It has normal activity during the minus phase, as activated by CT etc. inputs, and is then driven by strong 5IB driver inputs in the plus phase. For attentional modulation, TRC maintains pool-level correspondence with CT inputs, which creates challenges for aligning with driver inputs.
- Max operation used to integrate across multiple drivers, where necessary, e.g., multiple driver pools map onto single TRC pool (common feedforward theme), *even when there is no logical connection for the i'th unit in each pool* -- to make this dimensionality reduction more effective, using lateral connectivity between pools that favors this correspondence is beneficial. Overall, this is consistent with typical DCNN max pooling organization.
- Typically, pooled 4D TRC layers should have fewer pools than driver layers, in which case the respective pool geometry is interpolated. Ideally, integer size differences are best (e.g., driver layer has 2x pools vs TRC).
- Pooled 4D TRC layer should in general not predict flat 2D drivers, but if so the drivers are replicated for each pool.
- Similarly, there shouldn't generally be more TRC pools than driver pools, but if so, drivers replicate across pools.
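The max operation described in the first point above can be sketched in plain Go. Pools are given as equal-length activation slices, which is a simplification of the real layer-geometry handling:

```go
package main

import "fmt"

// driverMax sketches the max integration across multiple driver pools
// mapping onto a single TRC pool: the i'th unit of each driver pool
// competes via max for the i'th TRC unit, consistent with typical
// DCNN max-pooling organization.
func driverMax(pools [][]float32) []float32 {
	out := make([]float32, len(pools[0]))
	for _, p := range pools {
		for i, v := range p {
			if v > out[i] {
				out[i] = v
			}
		}
	}
	return out
}

func main() {
	// two driver pools reduce onto one TRC pool, unit by unit
	fmt.Println(driverMax([][]float32{
		{0.1, 0.9, 0.0},
		{0.5, 0.2, 0.3},
	}))
}
```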
func AddTRCLayer2D ¶ added in v1.1.2
AddTRCLayer2D adds a TRCLayer of given size, with given name.
func AddTRCLayer4D ¶ added in v1.1.2
AddTRCLayer4D adds a TRCLayer of given size, with given name.
func (*TRCLayer) DriverLayer ¶ added in v1.1.2
DriverLayer returns the driver layer for given Driver
func (*TRCLayer) GFmInc ¶ added in v1.1.2
GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.
func (*TRCLayer) SetDriverActs ¶ added in v1.1.4
func (ly *TRCLayer) SetDriverActs()
SetDriverActs sets the driver activations, integrating across all the driver layers
func (*TRCLayer) SetDriverNeuron ¶ added in v1.1.4
SetDriverNeuron sets the driver activation for given Neuron, based on given Ge driving value (use DriveFmMaxAvg) from driver layer (Burst or Act)
func (*TRCLayer) SetDriverOffs ¶ added in v1.1.4
SetDriverOffs sets the driver offsets
func (*TRCLayer) UpdateParams ¶ added in v1.1.2
func (ly *TRCLayer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer
type TRCParams ¶ added in v1.0.0
type TRCParams struct {
	DriversOff bool            `def:"false" desc:"Turn off the driver inputs, in which case this layer behaves like a standard layer"`
	BurstQtr   leabra.Quarters `` /* 188-byte string literal not displayed */
	DriveScale float32         `def:"0.3" min:"0.0" desc:"multiplier on driver input strength, multiplies activation of driver layer"`
	MaxInhib   float32         `` /* 662-byte string literal not displayed */
	NoTopo     bool            `` /* 159-byte string literal not displayed */
	AvgMix     float32         `` /* 209-byte string literal not displayed */
	Binarize   bool            `` /* 234-byte string literal not displayed */
	BinThr     float32         `viewif:"Binarize" desc:"Threshold for binarizing in terms of sending Burst activation"`
	BinOn      float32         `` /* 190-byte string literal not displayed */
	BinOff     float32         `def:"0" viewif:"Binarize" desc:"Resulting driver Ge value for units below threshold -- typically 0."`
}
TRCParams provides parameters for how the plus-phase (outcome) state of thalamic relay cell (e.g., Pulvinar) neurons is computed from the corresponding driver neuron Burst activation.
func (*TRCParams) DriveGe ¶ added in v1.1.2
DriveGe returns effective excitatory conductance to use for given driver input Burst activation
func (*TRCParams) GeFmMaxAvg ¶ added in v1.1.6
GeFmMaxAvg returns the drive Ge value as function of max and average
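Putting the TRCParams fields together, the drive computation can be sketched as below. The blend-then-binarize-then-scale ordering is a hedged reconstruction from the field docs (AvgMix mixes average with max, Binarize maps through BinThr/BinOn/BinOff, DriveScale multiplies); the package's exact ordering and edge cases may differ.

```go
package main

import "fmt"

// TRCParams sketch with the subset of fields used here; the DriveScale
// default follows the documented def tag (0.3).
type TRCParams struct {
	DriveScale float32 // multiplier on driver input strength
	AvgMix     float32 // proportion of pool-average drive mixed with the max
	Binarize   bool    // threshold driver drive into BinOn / BinOff
	BinThr     float32 // binarization threshold
	BinOn      float32 // resulting drive for units at/above threshold
	BinOff     float32 // resulting drive for units below threshold (typically 0)
}

// GeFmMaxAvg is a hedged reconstruction of the documented behavior: drive is
// a mix of the max and average driver activation, optionally binarized, then
// scaled by DriveScale.
func (tp *TRCParams) GeFmMaxAvg(max, avg float32) float32 {
	drv := (1-tp.AvgMix)*max + tp.AvgMix*avg
	if tp.Binarize {
		if drv >= tp.BinThr {
			drv = tp.BinOn
		} else {
			drv = tp.BinOff
		}
	}
	return tp.DriveScale * drv
}

func main() {
	tp := TRCParams{DriveScale: 0.3, AvgMix: 0.5}
	fmt.Println(tp.GeFmMaxAvg(0.8, 0.4)) // 0.3 * ((0.5*0.8) + (0.5*0.4))
}
```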
type TRNLayer ¶ added in v1.1.4
type TRNLayer struct {
	leabra.Layer
	ILayers emer.LayNames `desc:"layers that we receive inhibition from"`
}
TRNLayer copies inhibition from pools in CT and TRC layers, and from other TRNLayers, and pools this inhibition using the Max operation
type TopoInhib ¶ added in v1.1.4
type TopoInhib struct {
	On    bool      `desc:"use topographic inhibition"`
	Width int       `desc:"half-width of topographic inhibition within layer"`
	Sigma float32   `desc:"normalized gaussian sigma as proportion of Width, for gaussian weighting"`
	Gi    float32   `desc:"overall inhibition multiplier for topographic inhibition (generally <= 1)"`
	LayGi float32   `` /* 147-byte string literal not displayed */
	Wts   []float32 `inactive:"+" desc:"gaussian weights as function of distance, precomputed. index 0 = dist 1"`
}
TopoInhib provides for topographic gaussian inhibition integrating over a neighborhood of pools. Effective inhibition from each neighboring pool is weighted by the precomputed gaussian Wts as a function of pool distance, scaled by the Gi multiplier.
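The precomputed Wts table can be sketched as below. The gaussian expression (sigma taken as a proportion of Width, per the field doc) is an assumption consistent with the documented fields, not necessarily the package's exact formula:

```go
package main

import (
	"fmt"
	"math"
)

// TopoInhib sketch: precompute gaussian weights as a function of pool
// distance, matching the documented Wts convention ("index 0 = dist 1").
type TopoInhib struct {
	Width int       // half-width of topographic inhibition within layer
	Sigma float32   // normalized gaussian sigma as proportion of Width
	Gi    float32   // overall inhibition multiplier (generally <= 1)
	Wts   []float32 // gaussian weights by distance, precomputed
}

// Update precomputes Wts for distances 1..Width.
func (ti *TopoInhib) Update() {
	ti.Wts = make([]float32, ti.Width)
	sig := float64(ti.Sigma) * float64(ti.Width) // sigma in pool units
	for i := range ti.Wts {
		d := float64(i + 1) // index 0 = dist 1
		ti.Wts[i] = ti.Gi * float32(math.Exp(-0.5*d*d/(sig*sig)))
	}
}

func main() {
	ti := TopoInhib{Width: 4, Sigma: 0.5, Gi: 1}
	ti.Update()
	fmt.Println(ti.Wts) // weights decay monotonically with pool distance
}
```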
type TopoInhibLayer ¶ added in v1.1.4
type TopoInhibLayer struct {
	leabra.Layer // access as .Layer
	TopoInhib TopoInhib `desc:"topographic inhibition parameters for pool-level inhibition (only used for layers with pools)"`
}
TopoInhibLayer is a layer with topographically organized inhibition among pools
func (*TopoInhibLayer) Defaults ¶ added in v1.1.4
func (ly *TopoInhibLayer) Defaults()
func (*TopoInhibLayer) InhibFmGeAct ¶ added in v1.1.4
func (ly *TopoInhibLayer) InhibFmGeAct(ltime *leabra.Time)
InhibFmGeAct computes inhibition Gi from Ge and Act averages within relevant Pools
func (*TopoInhibLayer) TopoGi ¶ added in v1.1.4
func (ly *TopoInhibLayer) TopoGi(ltime *leabra.Time)
TopoGi computes topographic Gi between pools
func (*TopoInhibLayer) TopoGiPos ¶ added in v1.1.4
func (ly *TopoInhibLayer) TopoGiPos(py, px, d int) float32
TopoGiPos returns position-specific Gi contribution
func (*TopoInhibLayer) UpdateParams ¶ added in v1.1.4
func (ly *TopoInhibLayer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer