Documentation ¶
Overview ¶
Package deep provides the DeepAxon variant of Axon, which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex). These states are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons, which have strong focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (Pulv) neurons.
This package has 3 specialized Layer types:
- SuperLayer: implements the superficial layer neurons, which function just like standard axon.Layer neurons, while also directly computing the Burst activation signal that reflects the deep layer 5IB bursting activation, via thresholding of the superficial layer activations (Bursting is thought to have a higher threshold).
- CTLayer: implements the layer 6 regular spiking CT corticothalamic neurons that project into the thalamus. They receive the Burst activation via a CTCtxtPrjn projection type, typically once every 100 msec, and integrate that in the CtxtGe value, which is added to other excitatory conductance inputs to drive the overall activation (Act) of these neurons. Due to the bursting nature of the Burst inputs, this causes these CT layer neurons to reflect what the superficial layers encoded on the *previous* timestep -- thus they represent a temporally-delayed context state.
CTLayer can send Context via self projections to reflect the extensive deep-to-deep lateral connectivity that provides more extensive temporal context information.
- PulvLayer: implements the Pulv (Pulvinar) neurons, upon which the prediction generated by CTLayer projections is projected in the minus phase. This is computed via standard Act-driven projections that integrate into standard Ge excitatory input in Pulv neurons. The 5IB Burst-driven plus-phase "outcome" activation state is driven by direct access to the corresponding driver SuperLayer (not via standard projection mechanisms).
Wiring diagram:
    SuperLayer --Burst--> PulvLayer
        |                    ^
     CTCtxt             /- Back -/
        |              /
        v             /
     CTLayer --------/    (typically only for higher->lower)
Timing:
The alpha-cycle quarter(s) when Burst is updated and broadcast are set in BurstQtr (default is Q4; it can also be, e.g., Q2 and Q4 for beta-frequency updating). During these quarters, the Burst value is computed in SuperLayer, and it is continuously accessed by PulvLayer neurons to drive plus-phase outcome states.
At the *end* of the burst quarter(s), in the QuarterFinal method, CTCtxt projections convey the Burst signal from Super to CTLayer neurons, where it is integrated into the Ctxt value representing the temporally-delayed context information.
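As a rough illustration of how these pieces fit together, the following sketch builds a minimal two-area network using the constructor methods documented below. It is not taken from the package itself: the deep.NewNetwork(name) signature (assumed to mirror axon.NewNetwork), the layer names and sizes, and the choice of prjn.NewFull / prjn.NewOneToOne patterns and emer.Forward are assumptions for illustration; see the examples/deep_fsa and examples/deep_move projects for authoritative usage.

package main

import (
    "github.com/emer/axon/deep"
    "github.com/emer/emergent/emer"
    "github.com/emer/emergent/prjn"
)

func main() {
    // assumed constructor signature, mirroring axon.NewNetwork(name)
    nt := deep.NewNetwork("DeepNet")

    // Input layer paired with a PulvLayer that learns to predict it;
    // the Input layer is set as the Pulv Driver (names and sizes are arbitrary).
    in, inPulv := nt.AddInputPulv2D("V1", 10, 10, 2)

    // Superficial + CT pair with a one-to-one CTCtxtPrjn from Super to CT.
    super, ct := nt.AddSuperCT2D("V2", 20, 20, 2, prjn.NewOneToOne())

    // CT -> Pulv carries the prediction in the minus phase;
    // Pulv projects back to both Super and CT (type Back, class FmPulv).
    nt.ConnectToPulv(super, ct, inPulv, prjn.NewFull(), prjn.NewFull())

    // Standard forward projection from the input into Super.
    nt.ConnectLayers(in, super, prjn.NewFull(), emer.Forward)
}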
Index ¶
- Constants
- Variables
- func AddPulvForSuper(nt *axon.Network, super emer.Layer, space float32) emer.Layer
- func AddSuperCT2D(nt *axon.Network, name string, shapeY, shapeX int, space float32, ...) (super, ct emer.Layer)
- func AddSuperCT4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, ...) (super, ct emer.Layer)
- func ConnectCTSelf(nt *axon.Network, ly emer.Layer, pat prjn.Pattern) (ctxt, maint emer.Prjn)
- func ConnectCtxtToCT(nt *axon.Network, send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func ConnectSuperToCT(nt *axon.Network, send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func ConnectToPulv(nt *axon.Network, super, ct, pulv emer.Layer, ...) (toPulv, toSuper, toCT emer.Prjn)
- func DriveAct(dni int, dly *axon.Layer, sly *SuperLayer, issuper bool) float32
- func LayerSendCtxtGe(ly *axon.Layer, ltime *axon.Time)
- func LogAddPulvCorSimItems(lg *elog.Logs, net *axon.Network, times ...etime.Times)
- func SuperNeuronVarIdxByName(varNm string) (int, error)
- type BurstParams
- type CTCtxtPrjn
- func (pj *CTCtxtPrjn) Build() error
- func (pj *CTCtxtPrjn) DWt(ltime *axon.Time)
- func (pj *CTCtxtPrjn) Defaults()
- func (pj *CTCtxtPrjn) GFmSpike(ltime *axon.Time)
- func (pj *CTCtxtPrjn) InitGBuffs()
- func (pj *CTCtxtPrjn) PrjnTypeName() string
- func (pj *CTCtxtPrjn) RecvCtxtGeInc()
- func (pj *CTCtxtPrjn) RecvSynCa(ltime *axon.Time)
- func (pj *CTCtxtPrjn) SendCtxtGe(si int, burst float32)
- func (pj *CTCtxtPrjn) SendSpike(si int)
- func (pj *CTCtxtPrjn) SendSynCa(ltime *axon.Time)
- func (pj *CTCtxtPrjn) Type() emer.PrjnType
- func (pj *CTCtxtPrjn) UpdateParams()
- type CTLayer
- func (ly *CTLayer) Build() error
- func (ly *CTLayer) Class() string
- func (ly *CTLayer) CtxtFmGe(ltime *axon.Time)
- func (ly *CTLayer) DecayState(decay, glong float32)
- func (ly *CTLayer) Defaults()
- func (ly *CTLayer) GFmSpike(ltime *axon.Time)
- func (ly *CTLayer) InitActs()
- func (ly *CTLayer) SendCtxtGe(ltime *axon.Time)
- func (ly *CTLayer) UnitVal1D(varIdx int, idx int) float32
- func (ly *CTLayer) UnitVarIdx(varNm string) (int, error)
- func (ly *CTLayer) UnitVarNames() []string
- func (ly *CTLayer) UnitVarNum() int
- func (ly *CTLayer) UpdateParams()
- type CTParams
- type CtxtSender
- type EPool
- type EPools
- type IPool
- type IPools
- type LayerType
- type Network
- func (nt *Network) AddInputPulv2D(name string, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
- func (nt *Network) AddInputPulv4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
- func (nt *Network) AddPulvAttnLayer2D(name string, nNeurY, nNeurX int) *PulvAttnLayer
- func (nt *Network) AddPulvAttnLayer4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *PulvAttnLayer
- func (nt *Network) AddPulvForSuper(super emer.Layer, space float32) emer.Layer
- func (nt *Network) AddSuperCT2D(name string, shapeY, shapeX int, space float32, pat prjn.Pattern) (super, ct emer.Layer)
- func (nt *Network) AddSuperCT4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, space float32, ...) (super, ct emer.Layer)
- func (nt *Network) AddSuperLayer2D(name string, nNeurY, nNeurX int) *SuperLayer
- func (nt *Network) AddSuperLayer4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *SuperLayer
- func (nt *Network) CTCtxt(ltime *axon.Time)
- func (nt *Network) ConnectCTSelf(ly emer.Layer, pat prjn.Pattern) (ctxt, maint emer.Prjn)
- func (nt *Network) ConnectCtxtToCT(send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func (nt *Network) ConnectToPulv(super, ct, pulv emer.Layer, toPulvPat, fmPulvPat prjn.Pattern) (toPulv, toSuper, toCT emer.Prjn)
- func (nt *Network) Defaults()
- func (nt *Network) PlusPhaseImpl(ltime *axon.Time)
- func (nt *Network) UnitVarNames() []string
- func (nt *Network) UpdateParams()
- type PrjnType
- type PulvAttnLayer
- func (ly *PulvAttnLayer) AttnFmAct(ltime *axon.Time)
- func (ly *PulvAttnLayer) Class() string
- func (ly *PulvAttnLayer) CyclePost(ltime *axon.Time)
- func (ly *PulvAttnLayer) Defaults()
- func (ly *PulvAttnLayer) IsTarget() bool
- func (ly *PulvAttnLayer) SendAttnLay(tly *axon.Layer, ltime *axon.Time)
- func (ly *PulvAttnLayer) SendAttnLays(ltime *axon.Time)
- func (ly *PulvAttnLayer) UpdateParams()
- type PulvLayer
- func AddInputPulv2D(nt *axon.Network, name string, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
- func AddInputPulv4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, ...) (emer.Layer, *PulvLayer)
- func AddPulvLayer2D(nt *axon.Network, name string, nNeurY, nNeurX int) *PulvLayer
- func AddPulvLayer4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *PulvLayer
- func (ly *PulvLayer) Class() string
- func (ly *PulvLayer) Defaults()
- func (ly *PulvLayer) DriverLayer(drv string) (*axon.Layer, error)
- func (ly *PulvLayer) GFmSpike(ltime *axon.Time)
- func (ly *PulvLayer) GeFmDrivers(ltime *axon.Time)
- func (ly *PulvLayer) IsTarget() bool
- func (ly *PulvLayer) UpdateParams()
- type PulvParams
- type SendAttnParams
- type SuperLayer
- func (ly *SuperLayer) Build() error
- func (ly *SuperLayer) BurstFmCaSpkP(ltime *axon.Time)
- func (ly *SuperLayer) BurstPrv()
- func (ly *SuperLayer) CyclePost(ltime *axon.Time)
- func (ly *SuperLayer) DecayState(decay, glong float32)
- func (ly *SuperLayer) Defaults()
- func (ly *SuperLayer) InitActs()
- func (ly *SuperLayer) NewState()
- func (ly *SuperLayer) SendCtxtGe(ltime *axon.Time)
- func (ly *SuperLayer) UnitVal1D(varIdx int, idx int) float32
- func (ly *SuperLayer) UnitVarIdx(varNm string) (int, error)
- func (ly *SuperLayer) UnitVarNames() []string
- func (ly *SuperLayer) UnitVarNum() int
- func (ly *SuperLayer) UpdateParams()
- type SuperNeuron
- type TRNLayer
Constants ¶
const (
    // CT are layer 6 corticothalamic projecting neurons, which drive predictions
    // in Pulv (Pulvinar) via standard projections.
    CT emer.LayerType = emer.LayerTypeN + iota

    // Pulv are thalamic relay cell neurons in the Pulvinar thalamus,
    // which alternately reflect predictions driven by Deep layer projections,
    // and actual outcomes driven by Burst activity from corresponding
    // Super layer neurons that provide strong driving inputs to Pulv neurons.
    Pulv

    // TRN is thalamic reticular nucleus layer for inhibitory competition
    // within the thalamus.
    TRN
)
const (
    // CTCtxt are projections from Superficial layers to CT layers that
    // send Burst activations to drive updating of CtxtGe excitatory conductance,
    // at the end of a DeepBurst quarter. These projections also use a special
    // learning rule that takes into account the temporal delays in the
    // activation states. Can also add self context from CT for deeper temporal context.
    CTCtxt emer.PrjnType = emer.PrjnTypeN + iota
)
The DeepAxon prjn types
Variables ¶
var (
    // NeuronVars are for full list across all deep Layer types
    NeuronVars = []string{"Burst", "BurstPrv", "CtxtGe"}

    // SuperNeuronVars are for SuperLayer directly
    SuperNeuronVars = []string{"Burst", "BurstPrv"}

    SuperNeuronVarsMap map[string]int

    // NeuronVarsAll is full integrated list across inherited layers and NeuronVars
    NeuronVarsAll []string
)
var KiT_CTCtxtPrjn = kit.Types.AddType(&CTCtxtPrjn{}, PrjnProps)
var KiT_CTLayer = kit.Types.AddType(&CTLayer{}, LayerProps)
var KiT_LayerType = kit.Enums.AddEnumExt(emer.KiT_LayerType, LayerTypeN, kit.NotBitFlag, nil)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_PrjnType = kit.Enums.AddEnumExt(emer.KiT_PrjnType, PrjnTypeN, kit.NotBitFlag, nil)
var KiT_PulvAttnLayer = kit.Types.AddType(&PulvAttnLayer{}, LayerProps)
var KiT_PulvLayer = kit.Types.AddType(&PulvLayer{}, LayerProps)
var KiT_SuperLayer = kit.Types.AddType(&SuperLayer{}, LayerProps)
var KiT_TRNLayer = kit.Types.AddType(&TRNLayer{}, LayerProps)
var LayerProps = ki.Props{
    "EnumType:Typ": KiT_LayerType,
    "ToolBar": ki.PropSlice{
        {"Defaults", ki.Props{
            "icon": "reset",
            "desc": "return all parameters to their initial default values",
        }},
        {"InitWts", ki.Props{
            "icon": "update",
            "desc": "initialize the layer's weight values according to prjn parameters, for all *sending* projections out of this layer",
        }},
        {"InitActs", ki.Props{
            "icon": "update",
            "desc": "initialize the layer's activation values",
        }},
        {"sep-act", ki.BlankProp{}},
        {"LesionNeurons", ki.Props{
            "icon": "close",
            "desc": "Lesion (set the Off flag) for given proportion of neurons in the layer (number must be 0 -- 1, NOT percent!)",
            "Args": ki.PropSlice{
                {"Proportion", ki.Props{
                    "desc": "proportion (0 -- 1) of neurons to lesion",
                }},
            },
        }},
        {"UnLesionNeurons", ki.Props{
            "icon": "reset",
            "desc": "Un-Lesion (reset the Off flag) for all neurons in the layer",
        }},
    },
}
LayerProps are required to get the extended EnumType
var NetworkProps = axon.NetworkProps
var PrjnProps = ki.Props{
    "EnumType:Typ": KiT_PrjnType,
}
Functions ¶
func AddPulvForSuper ¶ added in v1.5.9
AddPulvForSuper adds a Pulvinar for given superficial layer (SuperLayer) with a P suffix. The Pulv.Driver is set to Super. The Pulv layer needs other CT connections from higher up to predict this layer. Pulvinar is positioned behind the CT layer.
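For example, a hedged sketch using the package-level signature shown in the Index (the helper name is hypothetical, the space value of 2 is arbitrary, and the axon / emer / deep imports from the earlier example are assumed):

// addPulv adds a Pulvinar for super (named with a "P" suffix, Driver set to
// super) and returns it; it still needs CT projections from higher areas
// in order to be predicted.
func addPulv(nt *axon.Network, super emer.Layer) emer.Layer {
    return deep.AddPulvForSuper(nt, super, 2)
}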
func AddSuperCT2D ¶ added in v1.2.85
func AddSuperCT2D(nt *axon.Network, name string, shapeY, shapeX int, space float32, pat prjn.Pattern) (super, ct emer.Layer)
AddSuperCT2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a CTCtxtPrjn projection from Super to CT using the given projection pattern. Both super and ct have SetClass(name) called to allow shared params. CT is placed Behind Super.
func AddSuperCT4D ¶ added in v1.2.85
func AddSuperCT4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, space float32, pat prjn.Pattern) (super, ct emer.Layer)
AddSuperCT4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer, with a CTCtxtPrjn projection from Super to CT using the given projection pattern. Both super and ct have SetClass(name) called to allow shared params. CT is placed Behind Super.
func ConnectCTSelf ¶ added in v1.5.4
ConnectCTSelf adds a Self (Lateral) CTCtxtPrjn projection within a CT layer, in addition to a regular lateral projection, which supports active maintenance. The CTCtxtPrjn has a Class label of CTSelfCtxt, and the regular one is CTSelfMaint
func ConnectCtxtToCT ¶ added in v1.2.2
ConnectCtxtToCT adds a CTCtxtPrjn from the given sending layer to a CT layer. Use ConnectSuperToCT for the main projection from the corresponding superficial layer.
func ConnectSuperToCT ¶ added in v1.2.2
ConnectSuperToCT adds a CTCtxtPrjn from the given sending Super layer to a CT layer. This automatically sets the FmSuper flag to engage proper defaults. It uses the given projection pattern -- e.g., Full, OneToOne, or PoolOneToOne.
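A hedged sketch combining ConnectSuperToCT with the ConnectCTSelf call described above (the helper name and the choice of patterns are illustrative, not prescribed by the package; assumes the axon, emer, prjn, and deep imports from the earlier example):

// wireCT wires the main Super -> CT context projection (one-to-one, which fits
// when Super and CT share a shape, and sets FmSuper for proper defaults), plus
// lateral self-context within CT (classes CTSelfCtxt and CTSelfMaint).
func wireCT(nt *axon.Network, super, ct emer.Layer, selfPat prjn.Pattern) {
    deep.ConnectSuperToCT(nt, super, ct, prjn.NewOneToOne())
    deep.ConnectCTSelf(nt, ct, selfPat)
}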
func ConnectToPulv ¶ added in v1.5.9
func ConnectToPulv(nt *axon.Network, super, ct, pulv emer.Layer, toPulvPat, fmPulvPat prjn.Pattern) (toPulv, toSuper, toCT emer.Prjn)
ConnectToPulv connects Super and CT with the given Pulv: CT -> Pulv has class CTToPulv; the projections from Pulv have type Back and class FmPulv. toPulvPat is the prjn.Pattern for CT -> Pulv, and fmPulvPat is for Pulv -> CT and Pulv -> Super. Typically Pulv is a different shape than Super and CT, so use Full or an appropriate topological pattern.
func DriveAct ¶ added in v1.2.2
DriveAct returns the driver activation -- Burst for Super, else CaSpkP
func LayerSendCtxtGe ¶ added in v1.2.91
LayerSendCtxtGe sends activation (CaSpkP) over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This should be called at the end of the 5IB Bursting phase via Network.CTCtxt. Satisfies the CtxtSender interface.
func LogAddPulvCorSimItems ¶ added in v1.5.9
LogAddPulvCorSimItems adds CorSim stats for Pulv / Pulvinar layers aggregated across three time scales, ordered from higher to lower, e.g., Run, Epoch, Trial.
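A hedged usage sketch (the helper name is hypothetical, and it assumes the elog and etime packages from emergent are imported alongside axon and deep):

// addPulvLogs registers Pulvinar CorSim stats at the Run, Epoch, and Trial
// levels, ordered from higher to lower as the doc above requires.
func addPulvLogs(lg *elog.Logs, net *axon.Network) {
    deep.LogAddPulvCorSimItems(lg, net, etime.Run, etime.Epoch, etime.Trial)
}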
func SuperNeuronVarIdxByName ¶ added in v1.2.2
SuperNeuronVarIdxByName returns the index of the variable in the SuperNeuron, or error
Types ¶
type BurstParams ¶ added in v1.2.2
type BurstParams struct {
    ThrRel float32 `` /* 348-byte string literal not displayed */
    ThrAbs float32 `` /* 241-byte string literal not displayed */
}
BurstParams determine how the 5IB Burst activation is computed from standard Act activation values in SuperLayer -- thresholded.
func (*BurstParams) Defaults ¶ added in v1.2.2
func (db *BurstParams) Defaults()
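The exact computation is not shown here, but based on the ThrRel / ThrAbs field names and the Overview description, the per-neuron thresholding amounts to something like the following hypothetical sketch (not the package's actual code):

// burstFromCaSpkP returns the Burst value for one neuron: its CaSpkP if that
// exceeds both the relative threshold (ThrRel times the layer-maximum CaSpkP)
// and the absolute threshold ThrAbs, else 0.
func burstFromCaSpkP(caSpkP, layerMaxCaSpkP, thrRel, thrAbs float32) float32 {
    thr := thrRel * layerMaxCaSpkP
    if thrAbs > thr {
        thr = thrAbs
    }
    if caSpkP < thr {
        return 0
    }
    return caSpkP
}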
type CTCtxtPrjn ¶ added in v1.2.2
type CTCtxtPrjn struct {
    axon.Prjn // access as .Prjn

    FmSuper   bool      `` /* 200-byte string literal not displayed */
    CtxtGeInc []float32 `desc:"local per-recv unit accumulator for Ctxt excitatory conductance from sending units -- not a delta -- the full value"`
}
CTCtxtPrjn is the "context" temporally-delayed projection into CTLayer, (corticothalamic deep layer 6) where the CtxtGe excitatory input is integrated only at end of Burst Quarter. Set FmSuper for the main projection from corresponding Super layer.
func (*CTCtxtPrjn) Build ¶ added in v1.2.2
func (pj *CTCtxtPrjn) Build() error
func (*CTCtxtPrjn) DWt ¶ added in v1.2.2
func (pj *CTCtxtPrjn) DWt(ltime *axon.Time)
DWt computes the weight change (learning) for Ctxt projections
func (*CTCtxtPrjn) Defaults ¶ added in v1.2.2
func (pj *CTCtxtPrjn) Defaults()
func (*CTCtxtPrjn) GFmSpike ¶ added in v1.5.12
func (pj *CTCtxtPrjn) GFmSpike(ltime *axon.Time)
GFmSpike: disabled for this type
func (*CTCtxtPrjn) InitGBuffs ¶ added in v1.5.10
func (pj *CTCtxtPrjn) InitGBuffs()
func (*CTCtxtPrjn) PrjnTypeName ¶ added in v1.2.2
func (pj *CTCtxtPrjn) PrjnTypeName() string
func (*CTCtxtPrjn) RecvCtxtGeInc ¶ added in v1.2.2
func (pj *CTCtxtPrjn) RecvCtxtGeInc()
RecvCtxtGeInc increments the receiver's CtxtGe from that of all the projections
func (*CTCtxtPrjn) RecvSynCa ¶ added in v1.5.5
func (pj *CTCtxtPrjn) RecvSynCa(ltime *axon.Time)
func (*CTCtxtPrjn) SendCtxtGe ¶ added in v1.2.2
func (pj *CTCtxtPrjn) SendCtxtGe(si int, burst float32)
SendCtxtGe sends the full Burst activation from sending neuron index si, to integrate CtxtGe excitatory conductance on receivers
func (*CTCtxtPrjn) SendSpike ¶ added in v1.2.4
func (pj *CTCtxtPrjn) SendSpike(si int)
SendSpike: disabled for this type
func (*CTCtxtPrjn) SendSynCa ¶ added in v1.5.5
func (pj *CTCtxtPrjn) SendSynCa(ltime *axon.Time)
func (*CTCtxtPrjn) Type ¶ added in v1.2.2
func (pj *CTCtxtPrjn) Type() emer.PrjnType
func (*CTCtxtPrjn) UpdateParams ¶ added in v1.2.2
func (pj *CTCtxtPrjn) UpdateParams()
type CTLayer ¶ added in v1.2.2
type CTLayer struct {
    axon.Layer // access as .Layer

    CT      CTParams  `desc:"parameters for CT layer specific functions"`
    CtxtGes []float32 `desc:"slice of context (temporally delayed) excitatory conductances."`
}
CTLayer implements the corticothalamic projecting layer 6 deep neurons that project to the Pulv pulvinar neurons, to generate the predictions. They receive phasic input representing 5IB bursting via CTCtxtPrjn inputs from SuperLayer and also from self projections.
There are two primary modes of behavior: single-step copy and multi-step temporal integration, each of which requires a different parameterization:
- Single-step copy requires NMDA, GABAB Gbar = .15, Tau = 100 (i.e., the standard defaults) and CT.Decay = 0, with a one-to-one projection from Super and no CT self connections. See examples/deep_move for a working example.
- Temporal integration requires NMDA, GABAB Gbar = .3, Tau = 300, CT.Decay = 50, with self connections of both CTCtxtPrjn and standard that support NMDA active maintenance. See examples/deep_fsa and examples/deep_move for working examples.
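As a sketch of switching modes in code: the type assertion below assumes the CT layer returned by AddSuperCT2D is a *deep.CTLayer, and the mapping of "CT.Decay = 50" onto the DecayTau field is an assumption based on the CTParams documentation below; the NMDA / GABAB conductance and tau settings live in the embedded axon.Layer Act params and are normally set via the emergent params system (see the examples above for the authoritative parameter sets).

// setCTTemporalInteg is a hypothetical helper that puts a CT layer into
// the temporal-integration mode described above.
func setCTTemporalInteg(ct emer.Layer) {
    ctl := ct.(*deep.CTLayer)
    ctl.CT.DecayTau = 50 // "CT.Decay = 50" above; see CTParams.DecayTau
    ctl.UpdateParams()   // recompute derived values such as DecayDt (1 / tau)
}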
func AddCTLayer2D ¶ added in v1.2.2
AddCTLayer2D adds a CTLayer of given size, with given name.
func AddCTLayer4D ¶ added in v1.2.2
AddCTLayer4D adds a CTLayer of given size, with given name.
func (*CTLayer) CtxtFmGe ¶ added in v1.2.2
CtxtFmGe integrates new CtxtGe excitatory conductance from projections, and computes overall Ctxt value, only on Deep layers. This should be called at the end of the 5IB Bursting phase via Network.CTCtxt
func (*CTLayer) DecayState ¶ added in v1.5.10
func (*CTLayer) GFmSpike ¶ added in v1.5.12
GFmSpike integrates new synaptic conductances from increments sent during last Spike
func (*CTLayer) SendCtxtGe ¶ added in v1.2.2
SendCtxtGe sends activation (CaSpkP) over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This should be called at the end of the 5IB Bursting phase via Network.CTCtxt. Satisfies the CtxtSender interface.
func (*CTLayer) UnitVal1D ¶ added in v1.2.2
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*CTLayer) UnitVarIdx ¶ added in v1.2.2
UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.
func (*CTLayer) UnitVarNames ¶ added in v1.2.2
UnitVarNames returns a list of variable names available on the units in this layer
func (*CTLayer) UnitVarNum ¶ added in v1.2.2
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
func (*CTLayer) UpdateParams ¶ added in v1.5.3
func (ly *CTLayer) UpdateParams()
type CTParams ¶ added in v1.5.3
type CTParams struct {
    GeGain   float32 `` /* 243-byte string literal not displayed */
    DecayTau float32 `` /* 227-byte string literal not displayed */
    DecayDt  float32 `view:"-" json:"-" xml:"-" desc:"1 / tau"`
}
CTParams control the CT corticothalamic neuron special behavior
type CtxtSender ¶ added in v1.2.2
type CtxtSender interface {
    axon.AxonLayer

    // SendCtxtGe sends activation over CTCtxtPrjn projections to integrate
    // CtxtGe excitatory conductance on CT layers.
    // This must be called at the end of the Burst quarter (plus phase).
    SendCtxtGe(ltime *axon.Time)
}
CtxtSender is an interface for layers that implement the SendCtxtGe method (SuperLayer, CTLayer)
type EPool ¶ added in v1.2.2
type EPool struct {
    LayNm string  `desc:"layer name"`
    Wt    float32 `desc:"general scaling factor for how much excitation from this pool"`
}
EPool are how to gather excitation across pools
type EPools ¶ added in v1.2.2
type EPools []*EPool
EPools is a list of pools
type IPool ¶ added in v1.2.2
type IPool struct {
    LayNm  string     `desc:"layer name"`
    Wt     float32    `desc:"general scaling factor for how much overall inhibition from this pool contributes, in a non-pool-specific manner"`
    PoolWt float32    `` /* 160-byte string literal not displayed */
    SOff   evec.Vec2i `desc:"offset into source, sending layer"`
    ROff   evec.Vec2i `desc:"offset into our own receiving layer"`
}
IPool are how to gather inhibition across pools
type IPools ¶ added in v1.2.2
type IPools []*IPool
IPools is a list of pools
type LayerType ¶ added in v1.2.2
LayerType has the DeepAxon extensions to the emer.LayerType types, for gui
const (
    CT_ LayerType = LayerType(emer.LayerTypeN) + iota
    Pulv_
    TRN_
    LayerTypeN
)
gui versions
func StringToLayerType ¶ added in v1.2.2
type Network ¶
deep.Network runs a Deep version of Axon network, for predictive learning via deep cortical layers interconnected with the thalamus (Pulvinar). It has special computational methods only for PlusPhase where deep layer context updating happens, corresponding to the bursting of deep layer 5IB neurons. It also has methods for creating different specialized layer types.
func NewNetwork ¶ added in v1.2.94
NewNetwork returns a new deep Network
func (*Network) AddInputPulv2D ¶ added in v1.5.9
func (nt *Network) AddInputPulv2D(name string, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
AddInputPulv2D adds an Input and PulvLayer of given size, with given name. The Input layer is set as the Driver of the PulvLayer. Both layers have SetClass(name) called to allow shared params.
func (*Network) AddInputPulv4D ¶ added in v1.5.9
func (nt *Network) AddInputPulv4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
AddInputPulv4D adds an Input and PulvLayer of given size, with given name. The Input layer is set as the Driver of the PulvLayer. Both layers have SetClass(name) called to allow shared params.
func (*Network) AddPulvAttnLayer2D ¶ added in v1.5.9
func (nt *Network) AddPulvAttnLayer2D(name string, nNeurY, nNeurX int) *PulvAttnLayer
AddPulvAttnLayer2D adds a PulvAttnLayer of given size, with given name.
func (*Network) AddPulvAttnLayer4D ¶ added in v1.5.9
func (nt *Network) AddPulvAttnLayer4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *PulvAttnLayer
AddPulvAttnLayer4D adds a PulvAttnLayer of given size, with given name.
func (*Network) AddPulvForSuper ¶ added in v1.5.9
AddPulvForSuper adds a Pulvinar for given superficial layer (SuperLayer) with a P suffix. The Pulv.Driver is set to Super. The Pulv layer needs other CT connections from higher up to predict this layer. Pulvinar is positioned behind the CT layer.
func (*Network) AddSuperCT2D ¶ added in v1.2.85
func (nt *Network) AddSuperCT2D(name string, shapeY, shapeX int, space float32, pat prjn.Pattern) (super, ct emer.Layer)
AddSuperCT2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn projection from Super to CT using given projection pattern, and NO Pulv Pulvinar. CT is placed Behind Super.
func (*Network) AddSuperCT4D ¶ added in v1.2.85
func (nt *Network) AddSuperCT4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, space float32, pat prjn.Pattern) (super, ct emer.Layer)
AddSuperCT4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn projection from Super to CT using given projection pattern, and NO Pulv Pulvinar. CT is placed Behind Super.
func (*Network) AddSuperLayer2D ¶ added in v1.2.85
func (nt *Network) AddSuperLayer2D(name string, nNeurY, nNeurX int) *SuperLayer
AddSuperLayer2D adds a SuperLayer of given size, with given name.
func (*Network) AddSuperLayer4D ¶ added in v1.2.85
func (nt *Network) AddSuperLayer4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *SuperLayer
AddSuperLayer4D adds a SuperLayer of given size, with given name.
func (*Network) CTCtxt ¶ added in v1.2.2
CTCtxt sends context to CT layers and integrates CtxtGe on CT layers
func (*Network) ConnectCTSelf ¶ added in v1.5.4
ConnectCTSelf adds a Self (Lateral) CTCtxtPrjn projection within a CT layer, in addition to a regular lateral projection, which supports active maintenance. The CTCtxtPrjn has a Class label of CTSelfCtxt, and the regular one is CTSelfMaint
func (*Network) ConnectCtxtToCT ¶ added in v1.2.2
ConnectCtxtToCT adds a CTCtxtPrjn from given sending layer to a CT layer
func (*Network) ConnectToPulv ¶ added in v1.5.9
func (nt *Network) ConnectToPulv(super, ct, pulv emer.Layer, toPulvPat, fmPulvPat prjn.Pattern) (toPulv, toSuper, toCT emer.Prjn)
ConnectToPulv connects Super and CT with the given Pulv: CT -> Pulv has class CTToPulv; the projections from Pulv have type Back and class FmPulv. toPulvPat is the prjn.Pattern for CT -> Pulv, and fmPulvPat is for Pulv -> CT and Pulv -> Super. Typically Pulv is a different shape than Super and CT, so use Full or an appropriate topological pattern.
func (*Network) Defaults ¶
func (nt *Network) Defaults()
Defaults sets all the default parameters for all layers and projections
func (*Network) PlusPhaseImpl ¶ added in v1.4.0
PlusPhaseImpl does updating after the end of the plus phase.
func (*Network) UnitVarNames ¶ added in v1.2.2
UnitVarNames returns a list of variable names available on the units in this layer
func (*Network) UpdateParams ¶
func (nt *Network) UpdateParams()
UpdateParams updates all the derived parameters if any have changed, for all layers and projections
type PrjnType ¶ added in v1.2.2
PrjnType has the DeepAxon extensions to the emer.PrjnType types, for gui
func StringToPrjnType ¶ added in v1.2.2
type PulvAttnLayer ¶ added in v1.5.9
type PulvAttnLayer struct {
    axon.Layer // access as .Layer

    SendAttn SendAttnParams `view:"inline" desc:"sending attention parameters"`
}
PulvAttnLayer is the thalamic relay cell layer for Attention in DeepAxon.
func AddPulvAttnLayer2D ¶ added in v1.5.9
func AddPulvAttnLayer2D(nt *axon.Network, name string, nNeurY, nNeurX int) *PulvAttnLayer
AddPulvAttnLayer2D adds a PulvAttnLayer of given size, with given name.
func AddPulvAttnLayer4D ¶ added in v1.5.9
func AddPulvAttnLayer4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *PulvAttnLayer
AddPulvAttnLayer4D adds a PulvAttnLayer of given size, with given name.
func (*PulvAttnLayer) AttnFmAct ¶ added in v1.5.9
func (ly *PulvAttnLayer) AttnFmAct(ltime *axon.Time)
AttnFmAct computes our attention signal from activations
func (*PulvAttnLayer) Class ¶ added in v1.5.9
func (ly *PulvAttnLayer) Class() string
func (*PulvAttnLayer) CyclePost ¶ added in v1.5.9
func (ly *PulvAttnLayer) CyclePost(ltime *axon.Time)
CyclePost is called at the end of Cycle. We use it to send Attn.
func (*PulvAttnLayer) Defaults ¶ added in v1.5.9
func (ly *PulvAttnLayer) Defaults()
func (*PulvAttnLayer) IsTarget ¶ added in v1.5.9
func (ly *PulvAttnLayer) IsTarget() bool
func (*PulvAttnLayer) SendAttnLay ¶ added in v1.5.9
func (ly *PulvAttnLayer) SendAttnLay(tly *axon.Layer, ltime *axon.Time)
SendAttnLay sends attention signal to given layer
func (*PulvAttnLayer) SendAttnLays ¶ added in v1.5.9
func (ly *PulvAttnLayer) SendAttnLays(ltime *axon.Time)
SendAttnLays sends attention signal to all layers
func (*PulvAttnLayer) UpdateParams ¶ added in v1.5.9
func (ly *PulvAttnLayer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer
type PulvLayer ¶ added in v1.5.9
type PulvLayer struct {
    axon.Layer // access as .Layer

    Pulv   PulvParams `` /* 136-byte string literal not displayed */
    Driver string     `desc:"name of SuperLayer that sends 5IB Burst driver inputs to this layer"`
}
PulvLayer is the Pulvinar thalamic relay cell layer for DeepAxon. It has normal activity during the minus phase, as activated by CT etc inputs, and is then driven by strong 5IB driver inputs in the plus phase.
func AddInputPulv2D ¶ added in v1.5.9
func AddInputPulv2D(nt *axon.Network, name string, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
AddInputPulv2D adds an Input and PulvLayer of given size, with given name. The Input layer is set as the Driver of the PulvLayer. Both layers have SetClass(name) called to allow shared params.
func AddInputPulv4D ¶ added in v1.5.9
func AddInputPulv4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, space float32) (emer.Layer, *PulvLayer)
AddInputPulv4D adds an Input and PulvLayer of given size, with given name. The Input layer is set as the Driver of the PulvLayer. Both layers have SetClass(name) called to allow shared params.
func AddPulvLayer2D ¶ added in v1.5.9
AddPulvLayer2D adds a PulvLayer of given size, with given name.
func AddPulvLayer4D ¶ added in v1.5.9
AddPulvLayer4D adds a PulvLayer of given size, with given name.
func (*PulvLayer) DriverLayer ¶ added in v1.5.9
DriverLayer returns the driver layer for given Driver
func (*PulvLayer) GFmSpike ¶ added in v1.5.12
GFmSpike integrates new synaptic conductances from updated Spiking inputs
func (*PulvLayer) GeFmDrivers ¶ added in v1.5.9
GeFmDrivers computes excitatory conductance from driver neurons
func (*PulvLayer) UpdateParams ¶ added in v1.5.9
func (ly *PulvLayer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer
type PulvParams ¶ added in v1.5.9
type PulvParams struct {
    DriversOff   bool    `def:"false" desc:"Turn off the driver inputs, in which case this layer behaves like a standard layer"`
    DriveScale   float32 `` /* 145-byte string literal not displayed */
    FullDriveAct float32 `` /* 352-byte string literal not displayed */
}
PulvParams provides parameters for how the plus-phase (outcome) state of Pulvinar thalamic relay cell neurons is computed from the corresponding driver neuron Burst activation. Drivers are hard clamped using Clamp.Rate.
func (*PulvParams) Defaults ¶ added in v1.5.9
func (tp *PulvParams) Defaults()
func (*PulvParams) DriveGe ¶ added in v1.5.9
func (tp *PulvParams) DriveGe(act float32) float32
DriveGe returns effective excitatory conductance to use for given driver input Burst activation
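Given the DriveScale field in PulvParams, this presumably reduces to a simple scaling of the driver Burst activation; the following is a hypothetical sketch, not the package's code:

// driveGe scales the driver Burst activation by DriveScale to yield the
// excitatory conductance applied to the Pulv neuron in the plus phase.
func driveGe(driveScale, burstAct float32) float32 {
    return driveScale * burstAct
}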
func (*PulvParams) Update ¶ added in v1.5.9
func (tp *PulvParams) Update()
type SendAttnParams ¶ added in v1.2.85
type SendAttnParams struct {
    Thr    float32       `` /* 134-byte string literal not displayed */
    ToLays emer.LayNames `desc:"list of layers to send attentional modulation to"`
}
SendAttnParams parameters for sending attention
func (*SendAttnParams) Defaults ¶ added in v1.2.85
func (ti *SendAttnParams) Defaults()
func (*SendAttnParams) Update ¶ added in v1.2.85
func (ti *SendAttnParams) Update()
type SuperLayer ¶ added in v1.2.2
type SuperLayer struct {
    axon.Layer // access as .Layer

    Burst      BurstParams   `` /* 142-byte string literal not displayed */
    SuperNeurs []SuperNeuron `desc:"slice of super neuron values -- same size as Neurons"`
}
SuperLayer is the DeepAxon superficial layer, based on basic rate-coded axon.Layer. Computes the Burst activation from regular activations.
func AddSuperLayer2D ¶ added in v1.2.2
func AddSuperLayer2D(nt *axon.Network, name string, nNeurY, nNeurX int) *SuperLayer
AddSuperLayer2D adds a SuperLayer of given size, with given name.
func AddSuperLayer4D ¶ added in v1.2.2
func AddSuperLayer4D(nt *axon.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *SuperLayer
AddSuperLayer4D adds a SuperLayer of given size, with given name.
func (*SuperLayer) Build ¶ added in v1.2.2
func (ly *SuperLayer) Build() error
func (*SuperLayer) BurstFmCaSpkP ¶ added in v1.5.7
func (ly *SuperLayer) BurstFmCaSpkP(ltime *axon.Time)
BurstFmCaSpkP updates Burst layer 5IB bursting value from current CaSpkP reflecting a time-integrated spiking value useful in learning, subject to thresholding. Only updated during plus phase.
func (*SuperLayer) BurstPrv ¶ added in v1.2.2
func (ly *SuperLayer) BurstPrv()
BurstPrv saves Burst as BurstPrv
func (*SuperLayer) CyclePost ¶ added in v1.2.2
func (ly *SuperLayer) CyclePost(ltime *axon.Time)
CyclePost calls BurstFmCaSpkP
func (*SuperLayer) DecayState ¶ added in v1.2.2
func (ly *SuperLayer) DecayState(decay, glong float32)
func (*SuperLayer) Defaults ¶ added in v1.2.2
func (ly *SuperLayer) Defaults()
func (*SuperLayer) InitActs ¶ added in v1.2.2
func (ly *SuperLayer) InitActs()
func (*SuperLayer) NewState ¶ added in v1.5.7
func (ly *SuperLayer) NewState()
func (*SuperLayer) SendCtxtGe ¶ added in v1.2.2
func (ly *SuperLayer) SendCtxtGe(ltime *axon.Time)
SendCtxtGe sends Burst activation over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This should be called at the end of the 5IB Bursting phase via Network.CTCtxt. Satisfies the CtxtSender interface.
func (*SuperLayer) UnitVal1D ¶ added in v1.2.2
func (ly *SuperLayer) UnitVal1D(varIdx int, idx int) float32
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*SuperLayer) UnitVarIdx ¶ added in v1.2.2
func (ly *SuperLayer) UnitVarIdx(varNm string) (int, error)
UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.
func (*SuperLayer) UnitVarNames ¶ added in v1.2.2
func (ly *SuperLayer) UnitVarNames() []string
UnitVarNames returns a list of variable names available on the units in this layer
func (*SuperLayer) UnitVarNum ¶ added in v1.2.2
func (ly *SuperLayer) UnitVarNum() int
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
func (*SuperLayer) UpdateParams ¶ added in v1.2.2
func (ly *SuperLayer) UpdateParams()
type SuperNeuron ¶ added in v1.2.2
type SuperNeuron struct {
    Burst    float32 `desc:"5IB bursting activation value, computed by thresholding regular activation"`
    BurstPrv float32 `desc:"previous bursting activation -- used for context-based learning"`
}
SuperNeuron has the neuron values for SuperLayer
func (*SuperNeuron) VarByIdx ¶ added in v1.2.2
func (sn *SuperNeuron) VarByIdx(idx int) float32