Documentation ¶
Overview ¶
Package pbwm provides the prefrontal cortex basal ganglia working memory (PBWM) model of the basal ganglia (BG) and prefrontal cortex (PFC) circuitry that supports dynamic BG gating of PFC robust active maintenance.
In the Go framework, it is version 1 (was version 5 in cemer).
This package builds on the deep package for defining thalamocortical circuits involved in predictive learning -- the BG basically acts to gate these circuits.
It provides a basis for dopamine-modulated processing of all types, and is the base package for the PVLV model package built on top of it.
There are multiple levels of functionality to allow for flexibility in exploring new variants.
Each Layer type defines and manages its own Neuron type, despite some redundancy, so that each layer has exactly the one Neuron type it needs. However, a Network must have a single consistent set of Neuron variables, which is given by the NeuronVars list and the NeurVars enum. In many cases, those "neuron" variables are actually stored in the layer itself rather than at the per-neuron level.
Naming rule: use DA when it stands alone as a singleton; use Da (lowercase a) when CamelCased with something else, as in DaMod.
Basic Level
* pbwm.Layer has DA, ACh, SE -- can be modulated
* ModLayer adds DA-modulated learning on top of basic Leabra learning
* GateLayer has GateStates in 1-to-1 correspondence with Pools, to keep track of gating state -- source gating layers can send updates to other layers.
PBWM specific
* MatrixLayer for dorsal striatum gating of DLPFC areas, with separate D1R = Go and D2R = NoGo layers. Each layer contains Maint and Out GateTypes, as a function of the outer 4D Pool X dimension (Maint on the left, Out on the right).
* GPiThalLayer receives from Matrix Go and GPe NoGo to compute the final WTA gating, and broadcasts GateState info to its SendTo layers. See the Timing params for timing.
* PFCLayer for active maintenance -- reproduces a DeepLeabra-like framework, with update timing according to BurstQtr. Gating is computed in the quarter *before* updating in BurstQtr. At the *end* of BurstQtr, Super Burst -> Deep Ctxt drives maintenance via Ctxt in Deep.
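For orientation, here is a minimal construction sketch using the AddPBWM builder documented below. This is a sketch only: it assumes the usual leabra, pbwm, and log imports; the sizes are arbitrary, and the input / output layers you would normally connect to PFC are omitted:

    nt := &pbwm.Network{}
    nt.InitName(nt, "PBWMDemo") // standard emer naming convention

    // 1 pool in Y; 1 Maint + 1 Out pool in X; 7x7 neurons per BG and PFC pool
    mtxGo, _, _, gpi, _, pfcMnt, _, pfcOut, _ := nt.AddPBWM("", 1, 1, 1, 7, 7, 7, 7)
    _, _, _, _ = mtxGo, gpi, pfcMnt, pfcOut // connect these to your own layers as needed

    nt.Defaults()
    if err := nt.Build(); err != nil {
        log.Println(err)
    }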
Index ¶
- Variables
- func AddDorsalBG(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int) (mtxGo, mtxNoGo, gpe, gpi, cin leabra.LeabraLayer)
- func AddDorsalBGPy(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int) []leabra.LeabraLayer
- func AddPBWM(nt *leabra.Network, prefix string, ...) (...)
- func AddPBWMPy(nt *leabra.Network, prefix string, ...) []leabra.LeabraLayer
- func AddPFC(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int, ...) (pfcMnt, pfcMntD, pfcOut, pfcOutD leabra.LeabraLayer)
- func AddPFCLayer(nt *leabra.Network, name string, nY, nX, nNeurY, nNeurX int, ...) (sp, dp leabra.LeabraLayer)
- func AddPFCPy(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int, ...) []leabra.LeabraLayer
- type CINLayer
- func (ly *CINLayer) ActFmG(ltime *leabra.Time)
- func (ly *CINLayer) Build() error
- func (ly *CINLayer) CyclePost(ltime *leabra.Time)
- func (ly *CINLayer) Defaults()
- func (ly *CINLayer) GetACh() float32
- func (ly *CINLayer) MaxAbsRew() float32
- func (ly *CINLayer) SetACh(ach float32)
- func (ly *CINLayer) UnitVal1D(varIdx int, idx int) float32
- func (ly *CINLayer) UnitVarIdx(varNm string) (int, error)
- func (ly *CINLayer) UnitVarNum() int
- type DaHebbPrjn
- type DaModParams
- type DaReceptors
- type GPiGateParams
- type GPiNeuron
- type GPiThalLayer
- func (ly *GPiThalLayer) AddSendTo(laynm string)
- func (ly *GPiThalLayer) AlphaCycInit()
- func (ly *GPiThalLayer) Build() error
- func (ly *GPiThalLayer) Defaults()
- func (ly *GPiThalLayer) GFmInc(ltime *leabra.Time)
- func (ly *GPiThalLayer) GateFmAct(ltime *leabra.Time)
- func (ly *GPiThalLayer) GateSend(ltime *leabra.Time)
- func (ly *GPiThalLayer) GateType() GateTypes
- func (ly *GPiThalLayer) InitActs()
- func (ly *GPiThalLayer) MatrixPrjns() (goPrjn, nogoPrjn *GPiThalPrjn, err error)
- func (ly *GPiThalLayer) RecGateAct(ltime *leabra.Time)
- func (ly *GPiThalLayer) SendGateShape() error
- func (ly *GPiThalLayer) SendGateStates()
- func (ly *GPiThalLayer) SendToCheck() error
- func (ly *GPiThalLayer) SendToMatrixPFC(prefix string)
- func (ly *GPiThalLayer) UnitValByIdx(vidx NeurVars, idx int) float32
- type GPiThalPrjn
- type GPiTimingParams
- type GateLayer
- func (ly *GateLayer) AsGate() *GateLayer
- func (ly *GateLayer) Build() error
- func (ly *GateLayer) GateShape() *GateShape
- func (ly *GateLayer) GateState(poolIdx int) *GateState
- func (ly *GateLayer) InitActs()
- func (ly *GateLayer) SetGateState(poolIdx int, state *GateState)
- func (ly *GateLayer) SetGateStates(states []GateState, typ GateTypes)
- func (ly *GateLayer) UnitValByIdx(vidx NeurVars, idx int) float32
- type GateLayerer
- type GateShape
- type GateState
- type GateTypes
- type Layer
- func (ly *Layer) AsGate() *GateLayer
- func (ly *Layer) AsPBWM() *Layer
- func (ly *Layer) Defaults()
- func (ly *Layer) DoQuarter2DWt() bool
- func (ly *Layer) GateSend(ltime *leabra.Time)
- func (ly *Layer) GetACh() float32
- func (ly *Layer) GetDA() float32
- func (ly *Layer) GetSE() float32
- func (ly *Layer) InitActs()
- func (ly *Layer) Quarter2DWt()
- func (ly *Layer) QuarterFinal(ltime *leabra.Time)
- func (ly *Layer) RecGateAct(ltime *leabra.Time)
- func (ly *Layer) SendMods(ltime *leabra.Time)
- func (ly *Layer) SetACh(ach float32)
- func (ly *Layer) SetDA(da float32)
- func (ly *Layer) SetSE(se float32)
- func (ly *Layer) UnitVal1D(varIdx int, idx int) float32
- func (ly *Layer) UnitValByIdx(vidx NeurVars, idx int) float32
- func (ly *Layer) UnitVarIdx(varNm string) (int, error)
- func (ly *Layer) UnitVarNames() []string
- func (ly *Layer) UnitVarNum() int
- func (ly *Layer) UpdateParams()
- type MatrixLayer
- func (ly *MatrixLayer) ActFmG(ltime *leabra.Time)
- func (ly *MatrixLayer) Build() error
- func (ly *MatrixLayer) DALrnFmDA(da float32) float32
- func (ly *MatrixLayer) DaAChFmLay(ltime *leabra.Time)
- func (ly *MatrixLayer) Defaults()
- func (ly *MatrixLayer) DoQuarter2DWt() bool
- func (ly *MatrixLayer) GateType() GateTypes
- func (ly *MatrixLayer) InhibFmGeAct(ltime *leabra.Time)
- func (ly *MatrixLayer) InitActs()
- func (ly *MatrixLayer) RecGateAct(ltime *leabra.Time)
- func (ly *MatrixLayer) UnitValByIdx(vidx NeurVars, idx int) float32
- type MatrixNeuron
- type MatrixParams
- type MatrixTracePrjn
- func (pj *MatrixTracePrjn) Build() error
- func (pj *MatrixTracePrjn) ClearTrace()
- func (pj *MatrixTracePrjn) DWt()
- func (pj *MatrixTracePrjn) Defaults()
- func (pj *MatrixTracePrjn) InitWts()
- func (pj *MatrixTracePrjn) SynVal1D(varIdx int, synIdx int) float32
- func (pj *MatrixTracePrjn) SynVarIdx(varNm string) (int, error)
- type ModLayer
- type Network
- func (nt *Network) AddCINLayer(name string) *CINLayer
- func (nt *Network) AddDorsalBG(prefix string, nY, nMaint, nOut, nNeurY, nNeurX int) (mtxGo, mtxNoGo, gpe, gpi, cin leabra.LeabraLayer)
- func (nt *Network) AddGPeLayer(name string, nY, nMaint, nOut int) *Layer
- func (nt *Network) AddGPiThalLayer(name string, nY, nMaint, nOut int) *GPiThalLayer
- func (nt *Network) AddMatrixLayer(name string, nY, nMaint, nOut, nNeurY, nNeurX int, da DaReceptors) *MatrixLayer
- func (nt *Network) AddPBWM(prefix string, nY, nMaint, nOut, nNeurBgY, nNeurBgX, nNeurPfcY, nNeurPfcX int) (...)
- func (nt *Network) AddPFC(prefix string, nY, nMaint, nOut, nNeurY, nNeurX int, dynMaint bool) (pfcMnt, pfcMntD, pfcOut, pfcOutD leabra.LeabraLayer)
- func (nt *Network) AddPFCLayer(name string, nY, nX, nNeurY, nNeurX int, out, dynMaint bool) (sp, dp leabra.LeabraLayer)
- func (nt *Network) CycleImpl(ltime *leabra.Time)
- func (nt *Network) Defaults()
- func (nt *Network) GateSend(ltime *leabra.Time)
- func (nt *Network) NewLayer() emer.Layer
- func (nt *Network) NewPrjn() emer.Prjn
- func (nt *Network) RecGateAct(ltime *leabra.Time)
- func (nt *Network) SendMods(ltime *leabra.Time)
- func (nt *Network) SynVarNames() []string
- func (nt *Network) UnitVarNames() []string
- func (nt *Network) UpdateParams()
- type NeurVars
- type PBWMLayer
- type PFCDeepLayer
- func (ly *PFCDeepLayer) ActFmG(ltime *leabra.Time)
- func (ly *PFCDeepLayer) Build() error
- func (ly *PFCDeepLayer) ClearMaint(pool int)
- func (ly *PFCDeepLayer) DeepMaint(ltime *leabra.Time)
- func (ly *PFCDeepLayer) Defaults()
- func (ly *PFCDeepLayer) DoQuarter2DWt() bool
- func (ly *PFCDeepLayer) GFmInc(ltime *leabra.Time)
- func (ly *PFCDeepLayer) GateType() GateTypes
- func (ly *PFCDeepLayer) Gating(ltime *leabra.Time)
- func (ly *PFCDeepLayer) InitActs()
- func (ly *PFCDeepLayer) MaintPFC() *PFCDeepLayer
- func (ly *PFCDeepLayer) QuarterFinal(ltime *leabra.Time)
- func (ly *PFCDeepLayer) RecGateAct(ltime *leabra.Time)
- func (ly *PFCDeepLayer) SuperPFC() leabra.LeabraLayer
- func (ly *PFCDeepLayer) UnitValByIdx(vidx NeurVars, idx int) float32
- func (ly *PFCDeepLayer) UpdtGateCnt(ltime *leabra.Time)
- type PFCDyn
- type PFCDyns
- type PFCGateParams
- type PFCMaintParams
- type PFCNeuron
- type TraceParams
- type TraceSyn
- type Valences
Constants ¶
This section is empty.
Variables ¶
var (
    // NeuronVars are the pbwm neurons plus some custom variables that sub-types use for their
    // algo-specific cases -- need a consistent set of overall network-level vars for display / generic
    // interface.
    NeuronVars = []string{"DA", "DALrn", "ACh", "SE", "GateAct", "GateNow", "GateCnt", "ActG", "Maint", "MaintGe"}

    NeuronVarsMap map[string]int

    NeuronVarsAll []string
)
var KiT_CINLayer = kit.Types.AddType(&CINLayer{}, leabra.LayerProps)
var KiT_DaHebbPrjn = kit.Types.AddType(&DaHebbPrjn{}, leabra.PrjnProps)
var KiT_DaReceptors = kit.Enums.AddEnum(DaReceptorsN, kit.NotBitFlag, nil)
var KiT_GPiThalLayer = kit.Types.AddType(&GPiThalLayer{}, leabra.LayerProps)
var KiT_GPiThalPrjn = kit.Types.AddType(&GPiThalPrjn{}, leabra.PrjnProps)
var KiT_GateLayer = kit.Types.AddType(&GateLayer{}, leabra.LayerProps)
var KiT_GateTypes = kit.Enums.AddEnum(GateTypesN, kit.NotBitFlag, nil)
var KiT_Layer = kit.Types.AddType(&Layer{}, leabra.LayerProps)
var KiT_MatrixLayer = kit.Types.AddType(&MatrixLayer{}, leabra.LayerProps)
var KiT_ModLayer = kit.Types.AddType(&ModLayer{}, leabra.LayerProps)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_PFCDeepLayer = kit.Types.AddType(&PFCDeepLayer{}, leabra.LayerProps)
var KiT_Valences = kit.Enums.AddEnum(ValencesN, kit.NotBitFlag, nil)
var NetworkProps = leabra.NetworkProps
var SynVarsAll []string
SynVarsAll is the pbwm collection of all synapse-level vars (includes TraceSynVars)
var TraceSynVars = []string{"NTr", "Tr"}
Functions ¶
func AddDorsalBG ¶ added in v1.1.11
func AddDorsalBG(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int) (mtxGo, mtxNoGo, gpe, gpi, cin leabra.LeabraLayer)
AddDorsalBG adds MatrixGo, NoGo, GPe, GPiThal, and CIN layers, with given optional prefix. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. Appropriate PoolOneToOne connections are made to drive GPiThal, with BgFixed class name set so they can be styled appropriately (no learning, WtRnd.Mean=0.8, Var=0)
func AddDorsalBGPy ¶ added in v1.1.15
func AddDorsalBGPy(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int) []leabra.LeabraLayer
AddDorsalBGPy adds MatrixGo, NoGo, GPe, GPiThal, and CIN layers, with given optional prefix. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. Appropriate PoolOneToOne connections are made to drive GPiThal, with BgFixed class name set so they can be styled appropriately (no learning, WtRnd.Mean=0.8, Var=0). Py is the Python version, which returns the layers as a slice.
func AddPBWM ¶ added in v1.1.11
func AddPBWM(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurBgY, nNeurBgX, nNeurPfcY, nNeurPfcX int) (mtxGo, mtxNoGo, gpe, gpi, cin, pfcMnt, pfcMntD, pfcOut, pfcOutD leabra.LeabraLayer)
AddPBWM adds a DorsalBG and PFC with given params. Defaults to the simple case of basic maint dynamics in Deep.
func AddPBWMPy ¶ added in v1.1.15
func AddPBWMPy(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurBgY, nNeurBgX, nNeurPfcY, nNeurPfcX int) []leabra.LeabraLayer
AddPBWMPy adds a DorsalBG and PFC with given params. Defaults to the simple case of basic maint dynamics in Deep. Py is the Python version, which returns the layers as a slice.
func AddPFC ¶ added in v1.1.11
func AddPFC(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int, dynMaint bool) (pfcMnt, pfcMntD, pfcOut, pfcOutD leabra.LeabraLayer)
AddPFC adds paired PFCmnt, PFCout and associated Deep layers, with given optional prefix. nY = number of pools in Y dimension, nMaint, nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. dynMaint is true for maintenance-only dyn, else full set of 5 dynamic maintenance types. Appropriate OneToOne connections are made between PFCmntD -> PFCout.
func AddPFCLayer ¶ added in v1.1.11
func AddPFCLayer(nt *leabra.Network, name string, nY, nX, nNeurY, nNeurX int, out, dynMaint bool) (sp, dp leabra.LeabraLayer)
AddPFCLayer adds a PFCLayer, super and deep, of given size, with given name. nY, nX = number of pools in Y, X dimensions, and each pool has nNeurY, nNeurX neurons. out is true for an output-gating layer, and dynMaint is true for maintenance-only dyn, else the full set of 5 dynamic maintenance types. Both have the class "PFC" set. deep is positioned behind super.
func AddPFCPy ¶ added in v1.1.15
func AddPFCPy(nt *leabra.Network, prefix string, nY, nMaint, nOut, nNeurY, nNeurX int, dynMaint bool) []leabra.LeabraLayer
AddPFCPy adds paired PFCmnt, PFCout and associated Deep layers, with given optional prefix. nY = number of pools in Y dimension, nMaint, nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. dynMaint is true for maintenance-only dyn, else full set of 5 dynamic maintenance types. Appropriate OneToOne connections are made between PFCmntD -> PFCout. Py is the Python version, which returns the layers as a slice.
Types ¶
type CINLayer ¶ added in v1.1.11
type CINLayer struct {
    leabra.Layer
    RewThr  float32       `` /* 164-byte string literal not displayed */
    RewLays emer.LayNames `desc:"Reward-representing layer(s) from which this computes ACh as Max absolute value"`
    SendACh rl.SendACh    `desc:"list of layers to send acetylcholine to"`
    ACh     float32       `desc:"acetylcholine value for this layer"`
}
CINLayer (cholinergic interneuron) reads reward signals from named source layer(s) and sends the Max absolute value of that activity as the positively-rectified, non-prediction-discounted reward signal computed by CINs, sent as an acetylcholine (ACh) signal. To handle positive-only reward signals, one needs to include both a reward prediction and a reward outcome layer.
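For example, a configuration sketch (the "RWPred" and "Rew" layer names are hypothetical stand-ins for your reward prediction and outcome layers):

    cin := nt.AddCINLayer("CIN")
    // ACh is computed as the max |act| across these reward-coding layers
    cin.RewLays = emer.LayNames{"RWPred", "Rew"}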
func AddCINLayer ¶ added in v1.1.11
AddCINLayer adds a CINLayer, with a single neuron.
func (*CINLayer) Build ¶ added in v1.1.11
Build constructs the layer state, including calling Build on the projections.
func (*CINLayer) CyclePost ¶ added in v1.1.11
CyclePost is called at end of Cycle. We use it to send ACh, which will then be active for the next cycle of processing.
func (*CINLayer) MaxAbsRew ¶ added in v1.1.11
MaxAbsRew returns the maximum absolute value of reward layer activations
func (*CINLayer) UnitVal1D ¶ added in v1.1.11
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*CINLayer) UnitVarIdx ¶ added in v1.1.11
UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.
func (*CINLayer) UnitVarNum ¶ added in v1.1.11
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
type DaHebbPrjn ¶
DaHebbPrjn does dopamine-modulated Hebbian learning -- i.e., the 3-factor learning rule: Da * Recv.Act * Send.Act
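As a restatement of that rule in code form (illustrative only, not the package's literal DWt implementation; the lrate factor is the standard learning rate assumed here):

    // per-synapse weight change under the 3-factor dopamine-modulated Hebbian rule
    func daHebbDWt(lrate, da, recvAct, sendAct float32) float32 {
        return lrate * da * recvAct * sendAct
    }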
func (*DaHebbPrjn) DWt ¶
func (pj *DaHebbPrjn) DWt()
DWt computes the weight change (learning) -- on sending projections.
func (*DaHebbPrjn) Defaults ¶
func (pj *DaHebbPrjn) Defaults()
type DaModParams ¶
type DaModParams struct {
    On      bool    `desc:"whether to use dopamine modulation"`
    ModGain bool    `viewif:"On" desc:"modulate gain instead of Ge excitatory synaptic input"`
    Minus   float32 `` /* 145-byte string literal not displayed */
    Plus    float32 `` /* 144-byte string literal not displayed */
    NegGain float32 `` /* 208-byte string literal not displayed */
    PosGain float32 `` /* 208-byte string literal not displayed */
}
Params for effects of dopamine (Da) based modulation, typically adding a Da-based term to the Ge excitatory synaptic input. Plus-phase = learning effects relative to minus-phase "performance" dopamine effects
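One plausible reading of the phase-dependent Ge modulation, as a sketch (the package's actual Ge method may differ in detail; the helper name is hypothetical):

    // add a DA-scaled term to Ge, using the Minus or Plus gain depending on phase
    func daModGe(dm *pbwm.DaModParams, da, ge float32, plusPhase bool) float32 {
        gain := dm.Minus
        if plusPhase {
            gain = dm.Plus
        }
        return ge + gain*da*ge // one plausible form of the Da-based term
    }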
func (*DaModParams) Defaults ¶
func (dm *DaModParams) Defaults()
func (*DaModParams) Gain ¶
func (dm *DaModParams) Gain(da, gain float32, plusPhase bool) float32
Gain returns da-modulated gain value
func (*DaModParams) GainModOn ¶
func (dm *DaModParams) GainModOn() bool
GainModOn returns true if modulating Gain
func (*DaModParams) Ge ¶
func (dm *DaModParams) Ge(da, ge float32, plusPhase bool) float32
Ge returns da-modulated ge value
func (*DaModParams) GeModOn ¶
func (dm *DaModParams) GeModOn() bool
GeModOn returns true if modulating Ge
type DaReceptors ¶
type DaReceptors int
DaReceptors for D1R and D2R dopamine receptors
const (
    // D1R primarily expresses Dopamine D1 Receptors -- dopamine is excitatory and bursts of dopamine
    // lead to increases in synaptic weight, while dips lead to decreases -- direct pathway in dorsal striatum
    D1R DaReceptors = iota

    // D2R primarily expresses Dopamine D2 Receptors -- dopamine is inhibitory and bursts of dopamine
    // lead to decreases in synaptic weight, while dips lead to increases -- indirect pathway in dorsal striatum
    D2R

    DaReceptorsN
)
func (*DaReceptors) FromString ¶
func (i *DaReceptors) FromString(s string) error
func (DaReceptors) MarshalJSON ¶
func (ev DaReceptors) MarshalJSON() ([]byte, error)
func (DaReceptors) String ¶
func (i DaReceptors) String() string
func (*DaReceptors) UnmarshalJSON ¶
func (ev *DaReceptors) UnmarshalJSON(b []byte) error
type GPiGateParams ¶
type GPiGateParams struct {
    GeGain float32 `` /* 217-byte string literal not displayed */
    NoGo   float32 `` /* 178-byte string literal not displayed */
    Thr    float32 `` /* 242-byte string literal not displayed */
    ThrAct bool    `` /* 159-byte string literal not displayed */
}
GPiGateParams has gating parameters for gating in GPiThal layer, including threshold
func (*GPiGateParams) Defaults ¶
func (gp *GPiGateParams) Defaults()
func (*GPiGateParams) GeRaw ¶
func (gp *GPiGateParams) GeRaw(goRaw, nogoRaw float32) float32
GeRaw returns the net GeRaw from go, nogo specific values
type GPiNeuron ¶
type GPiNeuron struct {
ActG float32 `desc:"gating activation -- the activity value when gating occurred in this pool"`
}
GPiNeuron contains extra variables for GPiThalLayer neurons -- stored separately
type GPiThalLayer ¶
type GPiThalLayer struct {
    GateLayer
    Timing   GPiTimingParams `view:"inline" desc:"timing parameters determining when gating happens"`
    Gate     GPiGateParams   `view:"inline" desc:"gating parameters determining threshold for gating etc"`
    SendTo   []string        `desc:"list of layers to send GateState to"`
    GPiNeurs []GPiNeuron     `` /* 144-byte string literal not displayed */
}
GPiThalLayer represents the combined Winner-Take-All dynamic of GPi (SNr) and Thalamus. It is the final arbiter of gating in the BG, weighing Go (direct) and NoGo (indirect) inputs from MatrixLayers (indirectly via GPe layer in case of NoGo). Use 4D structure for this so it matches 4D structure in Matrix layers
func AddGPiThalLayer ¶ added in v1.1.11
func AddGPiThalLayer(nt *leabra.Network, name string, nY, nMaint, nOut int) *GPiThalLayer
AddGPiThalLayer adds a GPiThalLayer of given size, with given name. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has 1x1 neurons.
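When configuring by hand instead of via AddDorsalBG / AddPBWM, the SendTo wiring might look like this sketch (prefix and sizes illustrative; assumes nt is a *pbwm.Network and a log import):

    gpi := nt.AddGPiThalLayer("GPiThal", 1, 1, 1)
    gpi.SendToMatrixPFC("") // adds MatrixGo, MatrixNoGo, PFCmntD, PFCoutD as SendTo targets
    if err := gpi.SendGateShape(); err != nil { // share GateShape with all SendTo layers
        log.Println(err)
    }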
func (*GPiThalLayer) AddSendTo ¶
func (ly *GPiThalLayer) AddSendTo(laynm string)
AddSendTo adds the given layer name to the list of layers to send GateState to.
func (*GPiThalLayer) AlphaCycInit ¶
func (ly *GPiThalLayer) AlphaCycInit()
AlphaCycInit handles all initialization at the start of a new input pattern, including computing input scaling from the running average activation, etc. The external input should already have been presented to the network at this point. Also needs to clear the incrementing GeRaw from the prjns.
func (*GPiThalLayer) Build ¶
func (ly *GPiThalLayer) Build() error
Build constructs the layer state, including calling Build on the projections.
func (*GPiThalLayer) Defaults ¶
func (ly *GPiThalLayer) Defaults()
func (*GPiThalLayer) GFmInc ¶
func (ly *GPiThalLayer) GFmInc(ltime *leabra.Time)
GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.
func (*GPiThalLayer) GateFmAct ¶
func (ly *GPiThalLayer) GateFmAct(ltime *leabra.Time)
GateFmAct updates GateState from current activations, at time of gating
func (*GPiThalLayer) GateSend ¶
func (ly *GPiThalLayer) GateSend(ltime *leabra.Time)
GateSend updates gating state and sends it along to other layers
func (*GPiThalLayer) GateType ¶
func (ly *GPiThalLayer) GateType() GateTypes
func (*GPiThalLayer) InitActs ¶
func (ly *GPiThalLayer) InitActs()
func (*GPiThalLayer) MatrixPrjns ¶
func (ly *GPiThalLayer) MatrixPrjns() (goPrjn, nogoPrjn *GPiThalPrjn, err error)
MatrixPrjns returns the recv prjns from Go and NoGo MatrixLayer pathways -- error if not found or if prjns are not of the GPiThalPrjn type
func (*GPiThalLayer) RecGateAct ¶
func (ly *GPiThalLayer) RecGateAct(ltime *leabra.Time)
RecGateAct records the gating activation from current activation, when gating occurs based on GateState.Now
func (*GPiThalLayer) SendGateShape ¶
func (ly *GPiThalLayer) SendGateShape() error
SendGateShape sends GateShape info to all SendTo layers -- a convenient config-time way to ensure all are consistent -- also checks the validity of the SendTo layers.
func (*GPiThalLayer) SendGateStates ¶
func (ly *GPiThalLayer) SendGateStates()
SendGateStates sends GateStates to other layers
func (*GPiThalLayer) SendToCheck ¶
func (ly *GPiThalLayer) SendToCheck() error
SendToCheck is called during Build to ensure that SendTo layers are valid
func (*GPiThalLayer) SendToMatrixPFC ¶
func (ly *GPiThalLayer) SendToMatrixPFC(prefix string)
SendToMatrixPFC adds the standard SendTo layers for PBWM: MatrixGo, NoGo, PFCmntD, PFCoutD, with optional prefix -- excludes the mnt or out cases if the corresponding shape dimension is 0.
func (*GPiThalLayer) UnitValByIdx ¶
func (ly *GPiThalLayer) UnitValByIdx(vidx NeurVars, idx int) float32
UnitValByIdx returns value of given PBWM-specific variable by variable index and flat neuron index (from layer or neuron-specific one).
type GPiThalPrjn ¶
type GPiThalPrjn struct {
    leabra.Prjn // access as .Prjn

    GeRaw []float32 `desc:"per-recv, per-prjn raw excitatory input"`
}
GPiThalPrjn accumulates per-prjn raw conductance that is needed for separately weighting NoGo vs. Go inputs
func (*GPiThalPrjn) Build ¶
func (pj *GPiThalPrjn) Build() error
func (*GPiThalPrjn) InitGInc ¶
func (pj *GPiThalPrjn) InitGInc()
func (*GPiThalPrjn) RecvGInc ¶
func (pj *GPiThalPrjn) RecvGInc()
RecvGInc increments the receiver's GeInc or GiInc from that of all the projections.
type GPiTimingParams ¶
type GPiTimingParams struct {
    GateQtr leabra.Quarters `` /* 247-byte string literal not displayed */
    Cycle   int             `` /* 139-byte string literal not displayed */
}
GPiTimingParams has timing parameters for gating in the GPiThal layer
func (*GPiTimingParams) Defaults ¶
func (gt *GPiTimingParams) Defaults()
type GateLayer ¶
type GateLayer struct {
    Layer
    GateShp    GateShape   `desc:"shape of overall Maint + Out gating system that this layer is part of"`
    GateStates []GateState `` /* 192-byte string literal not displayed */
}
GateLayer is a layer that cares about thalamic (BG) gating signals, and has slice of GateState fields that a gating layer will update.
func (*GateLayer) Build ¶
Build constructs the layer state, including calling Build on the projections.
func (*GateLayer) GateState ¶
GateState returns the GateState for given pool index (0 based) on this layer
func (*GateLayer) SetGateState ¶
SetGateState sets the GateState for given pool index (individual pools start at 1) on this layer
func (*GateLayer) SetGateStates ¶
SetGateStates sets the GateStates from given source states, of given gating type
type GateLayerer ¶
type GateLayerer interface {
    // AsGate returns the layer as a GateLayer layer, for direct access to fields
    AsGate() *GateLayer

    // GateType returns the type of gating supported by this layer
    GateType() GateTypes

    // GateShape returns the shape of gating system that this layer is part of
    GateShape() *GateShape

    // GateState returns the GateState for given pool index (0-based) on this layer
    GateState(poolIdx int) *GateState

    // SetGateState sets the GateState for given pool index (0-based) on this layer
    SetGateState(poolIdx int, state *GateState)

    // SetGateStates sets the GateStates from given source states, of given gating type
    SetGateStates(states []GateState, typ GateTypes)
}
GateLayerer is an interface for GateLayer layers
type GateShape ¶
type GateShape struct {
    Y      int `desc:"overall shape dimensions for the full set of gating pools, e.g., as present in the Matrix and GPiThal levels"`
    MaintX int `desc:"how many pools in the X dimension are Maint gating pools -- rest are Out"`
    OutX   int `desc:"how many pools in the X dimension are Out gating pools -- comes after Maint"`
}
GateShape defines the shape of the outer pool dimensions of gating layers, organized into Maint and Out subsets which are arrayed along the X axis with Maint first (to the left) then Out. Individual layers may only represent Maint or Out subsets of this overall shape, but all need to have this coordinated shape information to be able to share gating state information. Each layer represents gate state information in their native geometry -- FullIndex1D provides access from a subset to full set.
func (*GateShape) FullIndex1D ¶
FullIndex1D returns the index into full MaintOut GateStates for given 1D pool idx (0-based) *from given GateType*.
func (*GateShape) Index ¶
Index returns the index into GateStates for given 2D pool coords for given GateType. Each type stores gate info in its "native" 2D format.
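A sketch of the index math this implies, assuming row-major pool ordering (illustrative, not the package source; the isOut flag stands in for comparing the layer's GateType against the package's Out type):

    // map a pool index from a layer's native (Maint-only or Out-only) shape
    // into the full MaintOut shape, where Out pools follow Maint pools along X
    func fullIndex1D(gs *pbwm.GateShape, idx int, isOut bool) int {
        nativeX := gs.MaintX // native X width of this layer's gating pools
        offX := 0
        if isOut {
            nativeX = gs.OutX
            offX = gs.MaintX // Out pools come after Maint in the full shape
        }
        y := idx / nativeX
        x := idx % nativeX
        return y*(gs.MaintX+gs.OutX) + offX + x
    }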
type GateState ¶
type GateState struct {
    Act float32 `` /* 203-byte string literal not displayed */
    Now bool    `desc:"gating timing signal -- true if this is the moment when gating takes place"`
    Cnt int     `` /* 307-byte string literal not displayed */
}
GateState holds the gating state values stored in layers that receive thalamic gating signals, including MatrixLayer, PFCLayer, GPiThal layer, etc -- use GateLayer as the base layer to include it.
type GateTypes ¶
type GateTypes int
GateTypes for region of striatum
func (*GateTypes) FromString ¶
func (GateTypes) MarshalJSON ¶
func (*GateTypes) UnmarshalJSON ¶
type Layer ¶
type Layer struct {
    leabra.Layer
    DA  float32 `inactive:"+" desc:"current dopamine level for this layer"`
    ACh float32 `inactive:"+" desc:"current acetylcholine level for this layer"`
    SE  float32 `inactive:"+" desc:"current serotonin level for this layer"`
}
pbwm.Layer is the base layer type for the PBWM framework -- has variables for the layer-level neuromodulatory variables: dopamine, ACh, serotonin. See ModLayer for a version that includes DA-modulated learning parameters.
func AddGPeLayer ¶ added in v1.1.11
AddGPeLayer adds a pbwm.Layer to serve as a GPe layer, with given name. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has 1x1 neurons.
func (*Layer) AsGate ¶ added in v1.1.10
AsGate returns this layer as a pbwm.GateLayer -- nil for Layer
func (*Layer) DoQuarter2DWt ¶ added in v1.1.10
DoQuarter2DWt indicates whether to do optional Q2 DWt
func (*Layer) GateSend ¶ added in v1.1.10
GateSend updates gating state and sends it along to other layers. most layers don't implement -- only gating layers
func (*Layer) Quarter2DWt ¶ added in v1.1.10
func (ly *Layer) Quarter2DWt()
Quarter2DWt is optional Q2 DWt -- define where relevant
func (*Layer) QuarterFinal ¶ added in v1.1.10
QuarterFinal does updating after end of a quarter
func (*Layer) RecGateAct ¶ added in v1.1.10
RecGateAct records the gating activation from current activation, when gating occurs based on GateState.Now -- only for gating layers
func (*Layer) SendMods ¶ added in v1.1.10
SendMods is called at end of Cycle to send modulator signals (DA, etc) which will then be active for the next cycle of processing
func (*Layer) UnitVal1D ¶ added in v1.1.10
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*Layer) UnitValByIdx ¶ added in v1.1.10
UnitValByIdx returns value of given PBWM-specific variable by variable index and flat neuron index (from layer or neuron-specific one). This must be updated for specialized PBWM layer types to return correct variables!
func (*Layer) UnitVarIdx ¶ added in v1.1.10
UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.
func (*Layer) UnitVarNames ¶ added in v1.1.10
UnitVarNames returns a list of variable names available on the units in this layer.
func (*Layer) UnitVarNum ¶ added in v1.1.11
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
func (*Layer) UpdateParams ¶ added in v1.1.10
func (ly *Layer) UpdateParams()
UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer
type MatrixLayer ¶
type MatrixLayer struct {
    GateLayer
    MaintN      int            `desc:"number of Maint Pools in X outer dimension of 4D shape -- Out gating after that"`
    DaR         DaReceptors    `desc:"dominant type of dopamine receptor -- D1R for Go pathway, D2R for NoGo"`
    Matrix      MatrixParams   `view:"inline" desc:"matrix parameters"`
    MatrixNeurs []MatrixNeuron `` /* 147-byte string literal not displayed */
}
MatrixLayer represents the dorsal matrisome MSNs that are the main Go / NoGo gating units in the BG, driving updating of PFC WM in PBWM. D1R = Go, D2R = NoGo, and the outer 4D Pool X dimension determines the GateTypes per MaintN (Maint on the left up to MaintN, Out on the right after).
func AddMatrixLayer ¶ added in v1.1.11
func AddMatrixLayer(nt *leabra.Network, name string, nY, nMaint, nOut, nNeurY, nNeurX int, da DaReceptors) *MatrixLayer
AddMatrixLayer adds a MatrixLayer of given size, with given name. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. da gives the DaReceptor type (D1R = Go, D2R = NoGo)
func (*MatrixLayer) ActFmG ¶
func (ly *MatrixLayer) ActFmG(ltime *leabra.Time)
ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act. The Matrix version extends this to call DaAChFmLay.
func (*MatrixLayer) Build ¶
func (ly *MatrixLayer) Build() error
Build constructs the layer state, including calling Build on the projections. You MUST have properly configured the Inhib.Pool.On setting by this point, to properly allocate Pools for the unit groups if necessary.
func (*MatrixLayer) DALrnFmDA ¶
func (ly *MatrixLayer) DALrnFmDA(da float32) float32
DALrnFmDA returns effective learning dopamine value from given raw DA value applying Burst and Dip Gain factors, and then reversing sign for D2R.
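In code form, that description amounts to the following (a sketch consistent with the doc text, not the verbatim source):

    func daLrnFmDA(da, burstGain, dipGain float32, d2r bool) float32 {
        if da > 0 {
            da *= burstGain // bursts scaled by BurstGain
        } else {
            da *= dipGain // dips scaled by DipGain
        }
        if d2r {
            da = -da // sign reversed for D2R (NoGo) neurons
        }
        return da
    }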
func (*MatrixLayer) DaAChFmLay ¶
func (ly *MatrixLayer) DaAChFmLay(ltime *leabra.Time)
DaAChFmLay computes Da and ACh from layer and Shunt received from PatchLayer units
func (*MatrixLayer) Defaults ¶
func (ly *MatrixLayer) Defaults()
func (*MatrixLayer) DoQuarter2DWt ¶
func (ly *MatrixLayer) DoQuarter2DWt() bool
DoQuarter2DWt indicates whether to do optional Q2 DWt
func (*MatrixLayer) GateType ¶
func (ly *MatrixLayer) GateType() GateTypes
func (*MatrixLayer) InhibFmGeAct ¶
func (ly *MatrixLayer) InhibFmGeAct(ltime *leabra.Time)
InhibFmGeAct computes inhibition Gi from Ge and Act averages within the relevant Pools. The Matrix version applies OutAChInhib to bias output gating on reward trials.
func (*MatrixLayer) InitActs ¶
func (ly *MatrixLayer) InitActs()
func (*MatrixLayer) RecGateAct ¶
func (ly *MatrixLayer) RecGateAct(ltime *leabra.Time)
RecGateAct records the gating activation from current activation, when gating occurs based on GateState.Now
func (*MatrixLayer) UnitValByIdx ¶
func (ly *MatrixLayer) UnitValByIdx(vidx NeurVars, idx int) float32
UnitValByIdx returns value of given PBWM-specific variable by variable index and flat neuron index (from layer or neuron-specific one).
type MatrixNeuron ¶
type MatrixNeuron struct {
    DA    float32 `desc:"per-neuron modulated dopamine level, derived from layer DA and Shunt"`
    DALrn float32 `desc:"per-neuron effective learning dopamine value -- gain modulated and sign reversed for D2R"`
    ACh   float32 `desc:"per-neuron modulated ACh level, derived from layer ACh and Shunt"`
    Shunt float32 `desc:"shunting input received from Patch neurons (in reality flows through SNc DA pathways)"`
    ActG  float32 `desc:"gating activation -- the activity value when gating occurred in this pool"`
}
MatrixNeuron contains extra variables for MatrixLayer neurons -- stored separately
type MatrixParams ¶
type MatrixParams struct {
    LearnQtr    leabra.Quarters `` /* 199-byte string literal not displayed */
    PatchShunt  float32         `` /* 173-byte string literal not displayed */
    ShuntACh    bool            `` /* 269-byte string literal not displayed */
    OutAChInhib float32         `` /* 354-byte string literal not displayed */
    BurstGain   float32         `` /* 237-byte string literal not displayed */
    DipGain     float32         `` /* 237-byte string literal not displayed */
}
MatrixParams has parameters for Dorsal Striatum Matrix computation These are the main Go / NoGo gating units in BG driving updating of PFC WM in PBWM
func (*MatrixParams) Defaults ¶
func (mp *MatrixParams) Defaults()
type MatrixTracePrjn ¶
type MatrixTracePrjn struct {
    leabra.Prjn
    Trace  TraceParams `view:"inline" desc:"special parameters for matrix trace learning"`
    TrSyns []TraceSyn  `desc:"trace synaptic state values, ordered by the sending layer units which owns them -- one-to-one with SConIdx array"`
}
MatrixTracePrjn does dopamine-modulated, gated trace learning, for Matrix learning in PBWM context
func (*MatrixTracePrjn) Build ¶
func (pj *MatrixTracePrjn) Build() error
func (*MatrixTracePrjn) ClearTrace ¶ added in v1.0.4
func (pj *MatrixTracePrjn) ClearTrace()
func (*MatrixTracePrjn) DWt ¶
func (pj *MatrixTracePrjn) DWt()
DWt computes the weight change (learning) -- on sending projections.
func (*MatrixTracePrjn) Defaults ¶
func (pj *MatrixTracePrjn) Defaults()
func (*MatrixTracePrjn) InitWts ¶
func (pj *MatrixTracePrjn) InitWts()
func (*MatrixTracePrjn) SynVal1D ¶ added in v1.1.10
func (pj *MatrixTracePrjn) SynVal1D(varIdx int, synIdx int) float32
SynVal1D returns value of given variable index (from SynVarIdx) on given SynIdx. Returns NaN on invalid index. This is the core synapse var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
type ModLayer ¶
type ModLayer struct {
    Layer
    DaMod DaModParams `` /* 180-byte string literal not displayed */
}
ModLayer provides DA modulated learning to basic Leabra layers.
type Network ¶
pbwm.Network has methods for configuring specialized PBWM network components
func (*Network) AddCINLayer ¶ added in v1.1.11
AddCINLayer adds a CINLayer, with a single neuron.
func (*Network) AddDorsalBG ¶
func (nt *Network) AddDorsalBG(prefix string, nY, nMaint, nOut, nNeurY, nNeurX int) (mtxGo, mtxNoGo, gpe, gpi, cin leabra.LeabraLayer)
AddDorsalBG adds MatrixGo, NoGo, GPe, GPiThal, and CIN layers, with given optional prefix. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. Appropriate PoolOneToOne connections are made to drive GPiThal, with BgFixed class name set so they can be styled appropriately (no learning, WtRnd.Mean=0.8, Var=0)
func (*Network) AddGPeLayer ¶
AddGPeLayer adds a pbwm.Layer to serve as a GPe layer, with given name. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has 1x1 neurons.
func (*Network) AddGPiThalLayer ¶
func (nt *Network) AddGPiThalLayer(name string, nY, nMaint, nOut int) *GPiThalLayer
AddGPiThalLayer adds a GPiThalLayer of given size, with given name. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has 1x1 neurons.
func (*Network) AddMatrixLayer ¶
func (nt *Network) AddMatrixLayer(name string, nY, nMaint, nOut, nNeurY, nNeurX int, da DaReceptors) *MatrixLayer
AddMatrixLayer adds a MatrixLayer of given size, with given name. nY = number of pools in Y dimension, nMaint + nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. da gives the DaReceptor type (D1R = Go, D2R = NoGo)
func (*Network) AddPBWM ¶
func (nt *Network) AddPBWM(prefix string, nY, nMaint, nOut, nNeurBgY, nNeurBgX, nNeurPfcY, nNeurPfcX int) (mtxGo, mtxNoGo, gpe, gpi, cin, pfcMnt, pfcMntD, pfcOut, pfcOutD leabra.LeabraLayer)
AddPBWM adds a DorsalBG and PFC with given params. Defaults to the simple case of basic maint dynamics in Deep.
func (*Network) AddPFC ¶
func (nt *Network) AddPFC(prefix string, nY, nMaint, nOut, nNeurY, nNeurX int, dynMaint bool) (pfcMnt, pfcMntD, pfcOut, pfcOutD leabra.LeabraLayer)
AddPFC adds paired PFCmnt, PFCout and associated Deep layers, with given optional prefix. nY = number of pools in Y dimension, nMaint, nOut are pools in X dimension, and each pool has nNeurY, nNeurX neurons. dynMaint is true for maintenance-only dyn, else full set of 5 dynamic maintenance types. Appropriate OneToOne connections are made between PFCmntD -> PFCout.
func (*Network) AddPFCLayer ¶
func (nt *Network) AddPFCLayer(name string, nY, nX, nNeurY, nNeurX int, out, dynMaint bool) (sp, dp leabra.LeabraLayer)
AddPFCLayer adds a PFCLayer, super and deep, of given size, with given name. nY, nX = number of pools in Y, X dimensions, and each pool has nNeurY, nNeurX neurons. out is true for an output-gating layer, and dynMaint is true for maintenance-only dyn, else the full set of 5 dynamic maintenance types. Both have the class "PFC" set. deep is positioned behind super.
func (*Network) CycleImpl ¶ added in v1.1.10
CycleImpl runs one cycle of activation updating. PBWM calls GateSend after Cycle and before DeepBurst.
func (*Network) Defaults ¶
func (nt *Network) Defaults()
Defaults sets all the default parameters for all layers and projections
func (*Network) GateSend ¶
GateSend is called at end of Cycle, computes Gating and sends to other layers
func (*Network) RecGateAct ¶
RecGateAct is called after GateSend, to record gating activations at time of gating
func (*Network) SendMods ¶
SendMods is called at end of Cycle to send modulator signals (DA, etc) which will then be active for the next cycle of processing
func (*Network) SynVarNames ¶ added in v1.1.10
SynVarNames returns the names of all the variables on the synapses in this network.
func (*Network) UnitVarNames ¶ added in v1.1.10
UnitVarNames returns a list of variable names available on the units in this network
func (*Network) UpdateParams ¶
func (nt *Network) UpdateParams()
UpdateParams updates all the derived parameters if any have changed, for all layers and projections
type NeurVars ¶ added in v1.1.10
type NeurVars int
NeurVars are indexes into extra PBWM neuron-level variables
type PBWMLayer ¶
type PBWMLayer interface {
    leabra.LeabraLayer

    // AsPBWM returns this layer as a pbwm.Layer (base Layer in PBWM)
    AsPBWM() *Layer

    // AsGate returns this layer as a pbwm.GateLayer (gated layer type) -- nil if not impl
    AsGate() *GateLayer

    // UnitValByIdx returns value of given PBWM-specific variable by variable index
    // and flat neuron index (from layer or neuron-specific one).
    UnitValByIdx(vidx NeurVars, idx int) float32

    // GateSend updates gating state and sends it along to other layers.
    // Called after std Cycle methods.
    // Only implemented for gating layers.
    GateSend(ltime *leabra.Time)

    // RecGateAct records the gating activation from current activation, when gating occurs
    // based on GateState.Now
    RecGateAct(ltime *leabra.Time)

    // SendMods is called at end of Cycle to send modulator signals (DA, etc)
    // which will then be active for the next cycle of processing
    SendMods(ltime *leabra.Time)

    // Quarter2DWt is optional Q2 DWt -- PFC and matrix layers can do this as appropriate
    Quarter2DWt()

    // DoQuarter2DWt returns true if this recv layer should have its weights updated
    DoQuarter2DWt() bool
}
PBWMLayer defines the essential algorithmic API for PBWM at the layer level. Builds upon the leabra.LeabraLayer API
type PFCDeepLayer ¶ added in v1.1.11
type PFCDeepLayer struct {
    GateLayer
    Gate     PFCGateParams  `view:"inline" desc:"PFC Gating parameters"`
    Maint    PFCMaintParams `view:"inline" desc:"PFC Maintenance parameters"`
    Dyns     PFCDyns        `` /* 286-byte string literal not displayed */
    PFCNeurs []PFCNeuron    `` /* 144-byte string literal not displayed */
}
PFCDeepLayer is a Prefrontal Cortex BG-gated deep working memory layer. This handles all of the PFC-specific functionality, looking for a corresponding Super layer with the same name except no final D. If Dyns are used, they are represented in extra Y-axis neurons, with the inner-loop being the basic Super Y axis values for each Dyn type, and outer-loop the Dyn types.
func (*PFCDeepLayer) ActFmG ¶ added in v1.1.11
func (ly *PFCDeepLayer) ActFmG(ltime *leabra.Time)
ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act. The PFC version extends this to call Gating.
func (*PFCDeepLayer) Build ¶ added in v1.1.11
func (ly *PFCDeepLayer) Build() error
Build constructs the layer state, including calling Build on the projections.
func (*PFCDeepLayer) ClearMaint ¶ added in v1.1.11
func (ly *PFCDeepLayer) ClearMaint(pool int)
ClearMaint resets maintenance in corresponding pool (0 based) in maintenance layer
func (*PFCDeepLayer) DeepMaint ¶ added in v1.1.11
func (ly *PFCDeepLayer) DeepMaint(ltime *leabra.Time)
DeepMaint updates deep maintenance activations
func (*PFCDeepLayer) Defaults ¶ added in v1.1.11
func (ly *PFCDeepLayer) Defaults()
func (*PFCDeepLayer) DoQuarter2DWt ¶ added in v1.1.11
func (ly *PFCDeepLayer) DoQuarter2DWt() bool
DoQuarter2DWt indicates whether to do optional Q2 DWt
func (*PFCDeepLayer) GFmInc ¶ added in v1.1.11
func (ly *PFCDeepLayer) GFmInc(ltime *leabra.Time)
GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.
func (*PFCDeepLayer) GateType ¶ added in v1.1.11
func (ly *PFCDeepLayer) GateType() GateTypes
func (*PFCDeepLayer) Gating ¶ added in v1.1.11
func (ly *PFCDeepLayer) Gating(ltime *leabra.Time)
Gating updates PFC Gating state
func (*PFCDeepLayer) InitActs ¶ added in v1.1.11
func (ly *PFCDeepLayer) InitActs()
func (*PFCDeepLayer) MaintPFC ¶ added in v1.1.11
func (ly *PFCDeepLayer) MaintPFC() *PFCDeepLayer
MaintPFC returns the corresponding PFCDeep maintenance layer, with the same name but outD -> mntD; could be nil.
func (*PFCDeepLayer) QuarterFinal ¶ added in v1.1.11
func (ly *PFCDeepLayer) QuarterFinal(ltime *leabra.Time)
QuarterFinal does updating after end of a quarter
func (*PFCDeepLayer) RecGateAct ¶ added in v1.1.11
func (ly *PFCDeepLayer) RecGateAct(ltime *leabra.Time)
RecGateAct records the gating activation from current activation, when gating occurs based on GateState.Now
func (*PFCDeepLayer) SuperPFC ¶ added in v1.1.11
func (ly *PFCDeepLayer) SuperPFC() leabra.LeabraLayer
SuperPFC returns the corresponding PFC super layer, with the same name without the final D; should not be nil. Super can be any layer type.
func (*PFCDeepLayer) UnitValByIdx ¶ added in v1.1.11
func (ly *PFCDeepLayer) UnitValByIdx(vidx NeurVars, idx int) float32
UnitValByIdx returns value of given PBWM-specific variable by variable index and flat neuron index (from layer or neuron-specific one).
func (*PFCDeepLayer) UpdtGateCnt ¶ added in v1.1.11
func (ly *PFCDeepLayer) UpdtGateCnt(ltime *leabra.Time)
UpdtGateCnt updates the gate counter
type PFCDyn ¶
type PFCDyn struct {
    Init     float32 `desc:"initial value at point when gating starts -- MUST be > 0 when used."`
    RiseTau  float32 `` /* 161-byte string literal not displayed */
    DecayTau float32 `` /* 162-byte string literal not displayed */
    Desc     string  `desc:"description of this factor"`
}
PFC dynamic behavior element -- defines the dynamic behavior of deep layer PFC units
type PFCDyns ¶
type PFCDyns []*PFCDyn
PFCDyns is a slice of dyns. Provides deterministic control over PFC maintenance dynamics -- the rows of PFC units (along the Y axis) behave according to the corresponding index of Dyns. Ensure that the layer Y dim is an even multiple of len(Dyns).
func (*PFCDyns) FullDyn ¶
FullDyn creates full dynamic Dyn configuration, with 5 different dynamic profiles: stable maint, phasic, rising maint, decaying maint, and up / down maint. tau is the rise / decay base time constant.
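Usage sketch (assumes FullDyn takes the base tau described above; the value 10 is illustrative):

    var dyns pbwm.PFCDyns
    dyns.FullDyn(10) // stable, phasic, rising, decaying, and up/down maint profiles
    // with these 5 dyns, a deep layer whose super pool Y is 2 needs pool Y = 10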
func (*PFCDyns) MaintOnly ¶
func (pd *PFCDyns) MaintOnly()
MaintOnly creates the basic default maintenance dynamic configuration -- every unit just maintains over time. This should be used for the Output gating layer.
type PFCGateParams ¶
type PFCGateParams struct {
    GateQtr   leabra.Quarters `` /* 135-byte string literal not displayed */
    OutGate   bool            `desc:"if true, this PFC layer is an output gate layer, which means that it only has transient activation during gating"`
    OutQ1Only bool            `` /* 344-byte string literal not displayed */
}
PFCGateParams has parameters for PFC gating
func (*PFCGateParams) Defaults ¶
func (gp *PFCGateParams) Defaults()
type PFCMaintParams ¶
type PFCMaintParams struct {
    UseDyn        bool    `` /* 262-byte string literal not displayed */
    MaintGain     float32 `min:"0" def:"0.8" desc:"multiplier on maint current"`
    OutClearMaint bool    `` /* 151-byte string literal not displayed */
    Clear         float32 `` /* 210-byte string literal not displayed */
    MaxMaint      int     `` /* 200-byte string literal not displayed */
}
PFCMaintParams for PFC maintenance functions
func (*PFCMaintParams) Defaults ¶
func (mp *PFCMaintParams) Defaults()
type PFCNeuron ¶
type PFCNeuron struct {
    ActG    float32 `desc:"gating activation -- the activity value when gating occurred in this pool"`
    Maint   float32 `desc:"maintenance value for Deep layers = sending act at time of gating"`
    MaintGe float32 `desc:"maintenance excitatory conductance value for Deep layers"`
}
PFCNeuron contains extra variables for PFCLayer neurons -- stored separately
type TraceParams ¶
type TraceParams struct {
    NotGatedLR    float32 `` /* 351-byte string literal not displayed */
    GateNoGoPosLR float32 `` /* 947-byte string literal not displayed */
    AChDecay      float32 `min:"0" def:"0" desc:"decay driven by receiving unit ACh value, sent by CIN units, for resetting the trace"`
    Decay         float32 `` /* 294-byte string literal not displayed */
    Deriv         bool    `` /* 305-byte string literal not displayed */
}
Params for trace-based learning in the MatrixTracePrjn
func (*TraceParams) Defaults ¶
func (tp *TraceParams) Defaults()
func (*TraceParams) LrateMod ¶
func (tp *TraceParams) LrateMod(gated, d2r, posDa bool) float32
LrateMod returns the learning rate modulator based on gating, d2r, and posDa factors
func (*TraceParams) LrnFactor ¶
func (tp *TraceParams) LrnFactor(act float32) float32
LrnFactor returns the multiplicative factor for the level of MSN activation. If Deriv is true, the factor is 2 * act * (1-act) -- the factor of 2 compensates for the otherwise-reduced learning from these factors. Otherwise it is just act.
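Equivalently, as an illustrative restatement:

    func lrnFactor(act float32, deriv bool) float32 {
        if !deriv {
            return act
        }
        return 2 * act * (1 - act) // peaks at act = 0.5; the 2 restores overall magnitude
    }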
type TraceSyn ¶
type TraceSyn struct {
    NTr float32 `` /* 136-byte string literal not displayed */
    Tr  float32 `` /* 183-byte string literal not displayed */
}
TraceSyn holds extra synaptic state for trace projections
func (*TraceSyn) VarByIndex ¶ added in v1.1.10
VarByIndex returns synapse variable by index
type Valences ¶
type Valences int
Valences for Appetitive and Aversive valence coding