Documentation ¶
Index ¶
- Constants
- Variables
- func ConnectNMDA(nt *leabra.Network, send, recv emer.Layer, pat prjn.Pattern) emer.Prjn
- func NeuronVarIdxByName(varNm string) (int, error)
- type AlphaMaxLayer
- func (ly *AlphaMaxLayer) ActFmG(ltime *leabra.Time)
- func (ly *AlphaMaxLayer) ActLrnFmAlphaMax()
- func (ly *AlphaMaxLayer) AlphaCycInit(updtActAvg bool)
- func (ly *AlphaMaxLayer) AlphaMaxFmAct(ltime *leabra.Time)
- func (ly *AlphaMaxLayer) Build() error
- func (ly *AlphaMaxLayer) Defaults()
- func (ly *AlphaMaxLayer) InitActs()
- func (ly *AlphaMaxLayer) InitAlphaMax()
- func (ly *AlphaMaxLayer) MaxAlphaMax() float32
- func (ly *AlphaMaxLayer) UnitVal1D(varIdx int, idx int) float32
- func (ly *AlphaMaxLayer) UnitVarIdx(varNm string) (int, error)
- func (ly *AlphaMaxLayer) UnitVarNum() int
- type GABABParams
- func (gp *GABABParams) BiExp(g, x float32) (dG, dX float32)
- func (gp *GABABParams) Defaults()
- func (gp *GABABParams) GABAB(gabaB, gabaBx, gi float32) (g, x float32)
- func (gp *GABABParams) GFmS(s float32) float32
- func (gp *GABABParams) GFmV(v float32) float32
- func (gp *GABABParams) GgabaB(gabaB, vm float32) float32
- func (gp *GABABParams) Update()
- type Layer
- func (ly *Layer) ActFmG(ltime *leabra.Time)
- func (ly *Layer) ActLrnFmAlphaMax()
- func (ly *Layer) AlphaCycInit(updtActAvg bool)
- func (ly *Layer) AlphaMaxFmAct(ltime *leabra.Time)
- func (ly *Layer) Build() error
- func (ly *Layer) DecayState(decay float32)
- func (ly *Layer) Defaults()
- func (ly *Layer) GABABFmGi(ltime *leabra.Time)
- func (ly *Layer) GFmInc(ltime *leabra.Time)
- func (ly *Layer) GFmIncNeur(ltime *leabra.Time)
- func (ly *Layer) InitActs()
- func (ly *Layer) InitAlphaMax()
- func (ly *Layer) InitGInc()
- func (ly *Layer) InitGlong()
- func (ly *Layer) MaxAlphaMax() float32
- func (ly *Layer) RecvGInc(ltime *leabra.Time)
- func (ly *Layer) RecvGnmdaPInc(ltime *leabra.Time)
- func (ly *Layer) UnitVal1D(varIdx int, idx int) float32
- func (ly *Layer) UnitVarIdx(varNm string) (int, error)
- func (ly *Layer) UnitVarNum() int
- type NMDAParams
- type NMDAPrjn
- type Network
- type Neuron
- type PrjnType
Constants ¶
const (
    // NMDAPrjn are projections that have strong NMDA channels supporting maintenance
    NMDA emer.PrjnType = emer.PrjnType(emer.PrjnTypeN) + iota
)
The GLong prjn types
Variables ¶
var (
    // NeuronVars are extra neuron variables for glong
    NeuronVars = []string{"AlphaMax", "VmEff", "Gnmda", "NMDA", "NMDASyn", "GgabaB", "GABAB", "GABABx"}

    // NeuronVarsAll is the glong collection of all neuron-level vars
    NeuronVarsAll []string

    NeuronVarsMap map[string]int

    // NeuronVarProps are integrated neuron var props including leabra
    NeuronVarProps = map[string]string{
        "NMDA":   `auto-scale:"+"`,
        "GABAB":  `auto-scale:"+"`,
        "GABABx": `auto-scale:"+"`,
    }
)
var KiT_AlphaMaxLayer = kit.Types.AddType(&AlphaMaxLayer{}, leabra.LayerProps)
var KiT_Layer = kit.Types.AddType(&Layer{}, leabra.LayerProps)
var KiT_NMDAPrjn = kit.Types.AddType(&NMDAPrjn{}, PrjnProps)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_PrjnType = kit.Enums.AddEnumExt(emer.KiT_PrjnType, PrjnTypeN, kit.NotBitFlag, nil)
var NetworkProps = leabra.NetworkProps
var PrjnProps = ki.Props{
    "EnumType:Typ": KiT_PrjnType,
}
Functions ¶
func ConnectNMDA ¶
ConnectNMDA adds an NMDAPrjn between the given layers
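For example (a sketch: assumes an existing *leabra.Network `net` with layers `in` and `maint` already added; prjn.NewFull is from the emergent prjn package):

    pj := glong.ConnectNMDA(net, in, maint, prjn.NewFull())
    pj.SetClass("NMDAPrjn") // the returned emer.Prjn can be configured further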
func NeuronVarIdxByName ¶ added in v1.1.4
NeuronVarIdxByName returns the index of the variable in the Neuron, or error
Types ¶
type AlphaMaxLayer ¶
type AlphaMaxLayer struct {
    leabra.Layer

    AlphaMaxCyc int       `desc:"cycle upon which to start updating AlphaMax value"`
    AlphaMaxs   []float32 `desc:"per-neuron maximum activation value during alpha cycle"`
}
AlphaMaxLayer computes the maximum activation per neuron over the alpha cycle. Needed for recording activations on layers with transient dynamics over alpha.
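For example, after running an alpha cycle one might read out the layer-wide peak activation (a sketch; the layer name and type assertion are hypothetical):

    ly := net.LayerByName("Output").(*glong.AlphaMaxLayer)
    peak := ly.MaxAlphaMax() // largest per-neuron AlphaMax in the layer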
func (*AlphaMaxLayer) ActFmG ¶
func (ly *AlphaMaxLayer) ActFmG(ltime *leabra.Time)
func (*AlphaMaxLayer) ActLrnFmAlphaMax ¶
func (ly *AlphaMaxLayer) ActLrnFmAlphaMax()
ActLrnFmAlphaMax sets ActLrn to AlphaMax
func (*AlphaMaxLayer) AlphaCycInit ¶
func (ly *AlphaMaxLayer) AlphaCycInit(updtActAvg bool)
func (*AlphaMaxLayer) AlphaMaxFmAct ¶
func (ly *AlphaMaxLayer) AlphaMaxFmAct(ltime *leabra.Time)
AlphaMaxFmAct computes AlphaMax from Activation
func (*AlphaMaxLayer) Build ¶
func (ly *AlphaMaxLayer) Build() error
func (*AlphaMaxLayer) Defaults ¶
func (ly *AlphaMaxLayer) Defaults()
func (*AlphaMaxLayer) InitActs ¶
func (ly *AlphaMaxLayer) InitActs()
func (*AlphaMaxLayer) InitAlphaMax ¶
func (ly *AlphaMaxLayer) InitAlphaMax()
InitAlphaMax initializes the AlphaMax to 0
func (*AlphaMaxLayer) MaxAlphaMax ¶
func (ly *AlphaMaxLayer) MaxAlphaMax() float32
MaxAlphaMax returns the maximum AlphaMax across the layer
func (*AlphaMaxLayer) UnitVal1D ¶
func (ly *AlphaMaxLayer) UnitVal1D(varIdx int, idx int) float32
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*AlphaMaxLayer) UnitVarIdx ¶
func (ly *AlphaMaxLayer) UnitVarIdx(varNm string) (int, error)
UnitVarIdx returns the index of the given variable within the Neuron, according to the UnitVarNames() list (using a map to look up the index), or -1 and an error if not found.
func (*AlphaMaxLayer) UnitVarNum ¶
func (ly *AlphaMaxLayer) UnitVarNum() int
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
type GABABParams ¶
type GABABParams struct {
    RiseTau  float32 `def:"45" desc:"rise time for bi-exponential time dynamics of GABA-B"`
    DecayTau float32 `def:"50" desc:"decay time for bi-exponential time dynamics of GABA-B"`
    Gbar     float32 `def:"0.2" desc:"overall strength multiplier of GABA-B current"`
    Gbase    float32 `` /* 130-byte string literal not displayed */
    Smult    float32 `def:"15" desc:"multiplier for converting Gi from FFFB to GABA spikes"`
    MaxTime  float32 `inactive:"+" desc:"time offset when peak conductance occurs, in msec, computed from RiseTau and DecayTau"`
    TauFact  float32 `view:"-" desc:"time constant factor used in integration: (Decay / Rise) ^ (Rise / (Decay - Rise))"`
}
GABABParams control the GABA-B dynamics in PFC Maint neurons, based on Brunel & Wang (2001) parameters. Some adaptations are required to make these dynamics work with rate-code neurons.
func (*GABABParams) BiExp ¶
func (gp *GABABParams) BiExp(g, x float32) (dG, dX float32)
BiExp computes bi-exponential update, returns dG and dX deltas to add to g and x
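The body is not shown in this index; given the RiseTau, DecayTau, and TauFact field docs above, a plausible sketch of the integration (an assumption, not the verified source) is:

    dG = (gp.TauFact*x - g) / gp.RiseTau // g relaxes toward scaled x with the rise time constant
    dX = -x / gp.DecayTau                // x decays with the decay time constant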
func (*GABABParams) Defaults ¶
func (gp *GABABParams) Defaults()
func (*GABABParams) GABAB ¶
func (gp *GABABParams) GABAB(gabaB, gabaBx, gi float32) (g, x float32)
GABAB returns the updated GABA-B / GIRK activation and underlying x value based on current values and gi inhibitory conductance (proxy for GABA spikes)
func (*GABABParams) GFmS ¶
func (gp *GABABParams) GFmS(s float32) float32
GFmS returns the GABA-B conductance as a function of GABA spiking rate, based on normalized spiking factor (i.e., Gi from FFFB etc)
func (*GABABParams) GFmV ¶
func (gp *GABABParams) GFmV(v float32) float32
GFmV returns the GABA-B conductance as a function of normalized membrane potential
func (*GABABParams) GgabaB ¶
func (gp *GABABParams) GgabaB(gabaB, vm float32) float32
GgabaB returns the overall net GABAB / GIRK conductance including Gbar, Gbase, and voltage-gating
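Taken together, a per-cycle update along the lines of what Layer.GABABFmGi presumably does (a sketch using the Neuron fields documented below; `gi` stands for the FFFB inhibition, and this is not the verified call site):

    nrn.GABAB, nrn.GABABx = ly.GABAB.GABAB(nrn.GABAB, nrn.GABABx, gi)
    nrn.GgabaB = ly.GABAB.GgabaB(nrn.GABAB, nrn.VmEff)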
func (*GABABParams) Update ¶
func (gp *GABABParams) Update()
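Update presumably recomputes the derived fields. The TauFact formula is given verbatim in its field doc; the MaxTime expression below is the standard bi-exponential peak time and is an assumption (math32 here refers to github.com/chewxy/math32):

    gp.TauFact = math32.Pow(gp.DecayTau/gp.RiseTau, gp.RiseTau/(gp.DecayTau-gp.RiseTau))
    // standard peak time for a bi-exponential; assumed, not shown in this index
    gp.MaxTime = ((gp.RiseTau * gp.DecayTau) / (gp.DecayTau - gp.RiseTau)) * math32.Log(gp.DecayTau/gp.RiseTau)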
type Layer ¶
type Layer struct {
    leabra.Layer

    NMDA    NMDAParams  `view:"inline" desc:"NMDA channel parameters plus more general params"`
    GABAB   GABABParams `view:"inline" desc:"GABA-B / GIRK channel parameters"`
    GlNeurs []Neuron    `` /* 152-byte string literal not displayed */
}
Layer has GABA-B and NMDA channels with longer time constants, supporting bistable activation dynamics, including active maintenance in frontal cortex. NMDA requires an NMDAPrjn on the relevant projections. The layer also records AlphaMax, the maximum activation within an alpha cycle, which is important given the transient dynamics.
func AddGlongLayer2D ¶
AddGlongLayer2D adds a glong.Layer using 2D shape
func AddGlongLayer4D ¶
AddGlongLayer4D adds a glong.Layer using 4D shape with pools
func (*Layer) ActLrnFmAlphaMax ¶
func (ly *Layer) ActLrnFmAlphaMax()
ActLrnFmAlphaMax sets ActLrn to AlphaMax
func (*Layer) AlphaCycInit ¶
func (ly *Layer) AlphaCycInit(updtActAvg bool)
func (*Layer) AlphaMaxFmAct ¶
func (ly *Layer) AlphaMaxFmAct(ltime *leabra.Time)
AlphaMaxFmAct computes AlphaMax from Activation
func (*Layer) DecayState ¶
func (ly *Layer) DecayState(decay float32)
func (*Layer) GFmInc ¶
func (ly *Layer) GFmInc(ltime *leabra.Time)
GFmInc integrates new synaptic conductances from increments sent during the last SendGDelta.
func (*Layer) GFmIncNeur ¶
func (ly *Layer) GFmIncNeur(ltime *leabra.Time)
GFmIncNeur is the neuron-level code for GFmInc that integrates overall Ge, Gi values from their G*Raw accumulators.
func (*Layer) InitAlphaMax ¶
func (ly *Layer) InitAlphaMax()
InitAlphaMax initializes the AlphaMax to 0
func (*Layer) MaxAlphaMax ¶
func (ly *Layer) MaxAlphaMax() float32
MaxAlphaMax returns the maximum AlphaMax across the layer
func (*Layer) RecvGInc ¶
func (ly *Layer) RecvGInc(ltime *leabra.Time)
RecvGInc calls RecvGInc on receiving projections to collect Neuron-level G*Inc values. This is called by the overall GFmInc method, but is separated out for cases that need to do something different.
func (*Layer) RecvGnmdaPInc ¶
func (ly *Layer) RecvGnmdaPInc(ltime *leabra.Time)
RecvGnmdaPInc increments the recurrent-specific GeInc
func (*Layer) UnitVal1D ¶
func (ly *Layer) UnitVal1D(varIdx int, idx int) float32
UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.
func (*Layer) UnitVarIdx ¶
func (ly *Layer) UnitVarIdx(varNm string) (int, error)
UnitVarIdx returns the index of the given variable within the Neuron, according to the UnitVarNames() list (using a map to look up the index), or -1 and an error if not found.
func (*Layer) UnitVarNum ¶
func (ly *Layer) UnitVarNum() int
UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.
type NMDAParams ¶
type NMDAParams struct {
    ActVm       float32 `` /* 199-byte string literal not displayed */
    AlphaMaxCyc int     `desc:"cycle upon which to start updating AlphaMax value"`
    Tau         float32 `def:"100" desc:"decay time constant for NMDA current -- rise time is 2 msec and not worth extra effort for biexponential"`
    Gbar        float32 `desc:"strength of NMDA current -- 0.02 is just over level sufficient to maintain in face of completely blank input"`
}
NMDAParams control the NMDA dynamics in PFC Maint neurons, based on Brunel & Wang (2001) parameters. Some adaptations are required to make these dynamics work with rate-code neurons.
func (*NMDAParams) Defaults ¶
func (np *NMDAParams) Defaults()
func (*NMDAParams) GFmV ¶
func (np *NMDAParams) GFmV(v float32) float32
GFmV returns the NMDA conductance as a function of normalized membrane potential
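The formula is not shown here; the standard Jahr & Stevens Mg++ block used by Brunel & Wang (2001), with 1 mM Mg++, would look like this (an assumption, including the normalized-to-mV conversion):

    vbio := v*100 - 100 // map normalized Vm onto biological mV (assumed convention)
    return 1 / (1 + 0.28*math32.Exp(-0.062*vbio))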
func (*NMDAParams) Gnmda ¶
func (np *NMDAParams) Gnmda(nmda, vm float32) float32
Gnmda returns the NMDA net conductance from nmda activation and vm
func (*NMDAParams) NMDA ¶
func (np *NMDAParams) NMDA(nmda, nmdaSyn float32) float32
NMDA returns the updated NMDA activation from current NMDA and NMDASyn input
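A plausible single-exponential form, consistent with the Tau field doc treating the 2 msec rise as instantaneous (not verified against the source):

    return nmda + nmdaSyn - nmda/np.Tau // integrate new synaptic input, decay with Tau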
func (*NMDAParams) VmEff ¶
func (np *NMDAParams) VmEff(vm, act float32) float32
VmEff returns the effective Vm value including backpropagating action potentials from ActVm
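The ActVm doc string is truncated in the struct above; one simple illustrative possibility is a weighted contribution of the rate-code activity (purely hypothetical):

    return vm + np.ActVm*act // hypothetical weighting; exact form not shown in this index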
type NMDAPrjn ¶
NMDAPrjn is a projection with NMDA maintenance channels. It marks a projection for special treatment in a MaintLayer which actually does the NMDA computations. Excitatory conductance is aggregated separately for this projection.
func (*NMDAPrjn) PrjnTypeName ¶
func (*NMDAPrjn) UpdateParams ¶
func (pj *NMDAPrjn) UpdateParams()
type Network ¶
glong.Network has methods for configuring specialized Glong network components.
func (*Network) ConnectNMDA ¶ added in v1.1.4
ConnectNMDA adds an NMDAPrjn between the given layers
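An end-to-end configuration sketch (assumes glong.Network embeds leabra.Network, so the standard emergent Add/Connect helpers apply and NewLayer makes AddLayer2D produce glong.Layer; all names are hypothetical; imports: log, emer, prjn):

    net := &glong.Network{}
    net.InitName(net, "Demo")
    in := net.AddLayer2D("Input", 5, 5, emer.Input)
    maint := net.AddLayer2D("Maint", 5, 5, emer.Hidden)
    net.ConnectNMDA(in, maint, prjn.NewFull())
    net.Defaults()
    if err := net.Build(); err != nil {
        log.Println(err)
    }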
func (*Network) Defaults ¶
func (nt *Network) Defaults()
Defaults sets all the default parameters for all layers and projections
func (*Network) NewLayer ¶ added in v1.1.4
NewLayer returns a new layer of the glong.Layer type -- this is the default type for this network
func (*Network) UnitVarNames ¶
UnitVarNames returns a list of variable names available on the units in this layer
func (*Network) UnitVarProps ¶ added in v1.1.4
UnitVarProps returns properties for variables
func (*Network) UpdateParams ¶
func (nt *Network) UpdateParams()
UpdateParams updates all the derived parameters if any have changed, for all layers and projections
type Neuron ¶
type Neuron struct {
    AlphaMax float32 `desc:"Maximum activation over Alpha cycle period"`
    VmEff    float32 `desc:"Effective membrane potential, including simulated backpropagating action potential contribution from activity level."`
    Gnmda    float32 `desc:"net NMDA conductance, after Vm gating and Gbar -- added directly to Ge as it has the same reversal potential."`
    NMDA     float32 `desc:"NMDA channel activation -- underlying time-integrated value with decay"`
    NMDASyn  float32 `desc:"synaptic NMDA activation directly from projection(s)"`
    GgabaB   float32 `desc:"net GABA-B conductance, after Vm gating and Gbar + Gbase -- set to Gk for GIRK, with .1 reversal potential."`
    GABAB    float32 `desc:"GABA-B / GIRK activation -- time-integrated value with rise and decay time constants"`
    GABABx   float32 `desc:"GABA-B / GIRK internal drive variable -- gets the raw activation and decays"`
}
Neuron holds the extra neuron (unit) level variables for glong computation.
func (*Neuron) VarByIndex ¶
VarByIndex returns variable using index (0 = first variable in NeuronVars list)
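For example (index 0 corresponds to "AlphaMax", the first entry in the NeuronVars list above):

    v := nrn.VarByIndex(0) // this neuron's AlphaMax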
type PrjnType ¶
PrjnType has the GLong extensions to the emer.PrjnType types, for gui
func StringToPrjnType ¶
Source Files ¶
Directories ¶
Path | Synopsis
---|---
eqplot | eqplot plots an equation updating over time in an etable.Table and Plot2D. This is a good starting point for any plotting to explore specific equations.