package deep

v1.1.3 Latest
Warning

This package is not in the latest version of its module.

Published: Jul 30, 2020 License: BSD-3-Clause Imports: 11 Imported by: 20

README

DeepLeabra


Package deep provides the DeepLeabra variant of Leabra, which performs predictive learning by attempting to predict the activation states over the Pulvinar nucleus of the thalamus (in posterior sensory cortex), which are driven phasically every 100 msec by deep layer 5 intrinsic bursting (5IB) neurons that have strong focal (essentially 1-to-1) connections onto the Pulvinar Thalamic Relay Cell (TRC) neurons.

This package has 3 specialized Layer types:

  • SuperLayer: implements the superficial layer neurons, which function just like standard leabra.Layer neurons, while also directly computing the Burst activation signal that reflects the deep layer 5IB bursting activation, via thresholding of the superficial layer activations (Bursting is thought to have a higher threshold).

  • CTLayer: implements the layer 6 regular spiking CT corticothalamic neurons that project into the thalamus. They receive the Burst activation via a CTCtxtPrjn projection type, typically once every 100 msec, and integrate that in the CtxtGe value, which is added to other excitatory conductance inputs to drive the overall activation (Act) of these neurons. Due to the bursting nature of the Burst inputs, this causes these CT layer neurons to reflect what the superficial layers encoded on the previous timestep -- thus they represent a temporally-delayed context state.

CTLayer can send Context via self projections to reflect the extensive deep-to-deep lateral connectivity that provides more extensive temporal context information.

  • TRCLayer: implements the TRC (Pulvinar) neurons, onto which the prediction generated by CTLayer projections is projected in the minus phase. This is computed via standard Act-driven projections that integrate into the standard Ge excitatory input of TRC neurons. The 5IB Burst-driven plus-phase "outcome" activation state is driven by direct access to the corresponding driver SuperLayer (not via standard projection mechanisms). Wiring diagram:
  SuperLayer --Burst--> TRCLayer
    |                      ^
 CTCtxt          /- Back -/
   |            /
   v           /
 CTLayer -----/  (typically only for higher->lower)

Timing

The alpha-cycle quarter(s) when Burst is updated and broadcast are set in BurstQtr (default Q4; can also be, e.g., Q2 and Q4 for beta-frequency updating). During these quarters, the Burst value is computed in SuperLayer and is continuously accessed by TRCLayer neurons to drive plus-phase outcome states.
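
The quarter selection can be pictured as a simple bitflag set. Below is a minimal pure-Go sketch using a stand-in Quarters type (the real type is leabra.Quarters; this is illustrative only):

```go
package main

import "fmt"

// Quarters is a hypothetical stand-in for leabra.Quarters: a bitflag set
// of alpha-cycle quarters.
type Quarters uint32

const (
	Q1 Quarters = 1 << iota
	Q2
	Q3
	Q4
)

// Has reports whether the 0-based quarter index qtr is in the set.
func (q Quarters) Has(qtr int) bool {
	return q&(1<<uint(qtr)) != 0
}

func main() {
	alpha := Q4     // default: Burst on the 4th quarter only
	beta := Q2 | Q4 // beta-frequency updating: quarters 2 and 4

	fmt.Println(alpha.Has(3))             // true
	fmt.Println(beta.Has(1), beta.Has(2)) // true false
}
```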

At the end of the burst quarter(s), in the QuarterFinal method, CTCtxt projections convey the Burst signal from Super to CTLayer neurons, where it is integrated into the Ctxt value representing the temporally-delayed context information.
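
The resulting one-trial delay can be sketched with a toy unit. This is illustrative only: the real CTCtxt projections integrate per-synapse weighted conductances rather than copying a single value:

```go
package main

import "fmt"

// ctUnit is a toy stand-in for a CT-layer neuron, holding only the
// temporally-delayed context conductance.
type ctUnit struct {
	CtxtGe float32
}

// quarterFinal mimics the end-of-burst-quarter update: the CT unit's
// context is loaded from the Super unit's current Burst value, so on the
// next trial the CT unit reflects what Super encoded on this one.
func quarterFinal(ct *ctUnit, superBurst float32) {
	ct.CtxtGe = superBurst
}

func main() {
	var ct ctUnit
	superBursts := []float32{0.9, 0.2, 0.7} // Super Burst on trials 0, 1, 2
	for t, b := range superBursts {
		// CT context lags Super by one trial: 0.0, then 0.9, then 0.2
		fmt.Printf("trial %d: CT context = %.1f\n", t, ct.CtxtGe)
		quarterFinal(&ct, b)
	}
}
```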

Extensions

See pbwm for info about the Prefrontal-cortex Basal-ganglia Working Memory (PBWM) model, which builds on this deep framework to support gated working memory.

Documentation

Index

Constants

const (
	// CT are layer 6 corticothalamic projecting neurons, which drive predictions
	// in TRC (Pulvinar) via standard projections.
	CT emer.LayerType = emer.LayerTypeN + iota

	// TRC are thalamic relay cell neurons in the Pulvinar / MD thalamus,
	// which alternately reflect predictions driven by Deep layer projections,
	// and actual outcomes driven by Burst activity from corresponding
	// Super layer neurons that provide strong driving inputs to TRC neurons.
	TRC
)
const (
	AddPulv bool = true
	NoPulv       = false
)

bool args for greater clarity

const (
	// CTCtxt are projections from Superficial layers to CT layers that
	// send Burst activations to drive updating of CtxtGe excitatory conductance,
	// at end of a DeepBurst quarter.  These projections also use a special learning
	// rule that takes into account the temporal delays in the activation states.
	// Can also add self context from CT for deeper temporal context.
	CTCtxt emer.PrjnType = emer.PrjnTypeN + iota
)

The DeepLeabra prjn types

Variables

var (
	NeuronVars    = []string{"Burst", "BurstPrv", "CtxtGe"}
	NeuronVarsMap map[string]int
	NeuronVarsAll []string
)
var KiT_CTCtxtPrjn = kit.Types.AddType(&CTCtxtPrjn{}, PrjnProps)
var KiT_CTLayer = kit.Types.AddType(&CTLayer{}, LayerProps)
var KiT_LayerType = kit.Enums.AddEnumExt(emer.KiT_LayerType, LayerTypeN, kit.NotBitFlag, nil)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_PrjnType = kit.Enums.AddEnumExt(emer.KiT_PrjnType, PrjnTypeN, kit.NotBitFlag, nil)
var KiT_SuperLayer = kit.Types.AddType(&SuperLayer{}, LayerProps)
var KiT_TRCLayer = kit.Types.AddType(&TRCLayer{}, LayerProps)
var LayerProps = ki.Props{
	"EnumType:Typ": KiT_LayerType,
	"ToolBar": ki.PropSlice{
		{"Defaults", ki.Props{
			"icon": "reset",
			"desc": "return all parameters to their initial default values",
		}},
		{"InitWts", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's weight values according to prjn parameters, for all *sending* projections out of this layer",
		}},
		{"InitActs", ki.Props{
			"icon": "update",
			"desc": "initialize the layer's activation values",
		}},
		{"sep-act", ki.BlankProp{}},
		{"LesionNeurons", ki.Props{
			"icon": "close",
			"desc": "Lesion (set the Off flag) for given proportion of neurons in the layer (number must be 0 -- 1, NOT percent!)",
			"Args": ki.PropSlice{
				{"Proportion", ki.Props{
					"desc": "proportion (0 -- 1) of neurons to lesion",
				}},
			},
		}},
		{"UnLesionNeurons", ki.Props{
			"icon": "reset",
			"desc": "Un-Lesion (reset the Off flag) for all neurons in the layer",
		}},
	},
}

LayerProps are required to get the extended EnumType

var NetworkProps = leabra.NetworkProps
var PrjnProps = ki.Props{
	"EnumType:Typ": KiT_PrjnType,
}
var (
	SuperNeuronVars = []string{"Burst", "BurstPrv"}
)

Functions

func AddInputPulv2D added in v1.1.2

func AddInputPulv2D(nt *leabra.Network, name string, shapeY, shapeX int) (input, pulv emer.Layer)

AddInputPulv2D adds an input and corresponding Pulvinar (P suffix) TRC layer. Pulvinar is placed behind Input.

func AddInputPulv4D added in v1.1.2

func AddInputPulv4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (input, pulv emer.Layer)

AddInputPulv4D adds an input and corresponding Pulvinar (P suffix) TRC layer. Pulvinar is placed Behind Input.

func AddSuperCT2D added in v1.1.2

func AddSuperCT2D(nt *leabra.Network, name string, shapeY, shapeX int, pulvLay bool) (super, ct, pulv emer.Layer)

AddSuperCT2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn Full projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.

func AddSuperCT4D added in v1.1.2

func AddSuperCT4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, pulvLay bool) (super, ct, pulv emer.Layer)

AddSuperCT4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn Full projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.

func ConnectCtxtToCT added in v1.1.2

func ConnectCtxtToCT(nt *leabra.Network, send, recv emer.Layer, pat prjn.Pattern) emer.Prjn

ConnectCtxtToCT adds a CTCtxtPrjn from given sending layer to a CT layer

func NeuronVarByName

func NeuronVarByName(varNm string) (int, error)

NeuronVarByName returns the index of the variable in the Neuron, or error

Types

type BurstParams added in v1.0.0

type BurstParams struct {
	BurstQtr leabra.Quarters `` /* 206-byte string literal not displayed */
	ThrRel   float32         `` /* 353-byte string literal not displayed */
	ThrAbs   float32         `` /* 246-byte string literal not displayed */
}

BurstParams are parameters determining how the DeepBurst activation is computed from regular activation values.
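
One plausible reading of the relative/absolute thresholding that BurstParams describes, as a self-contained sketch. Here ThrRel scales the layer's maximum activation and ThrAbs acts as an absolute floor; the exact rule is an assumption inferred from the field names, not copied from the package source:

```go
package main

import "fmt"

// burstFmAct sketches Burst thresholding: units whose activation clears
// max(thrRel * layer-max-activation, thrAbs) pass their activation through
// as Burst; all others burst at 0. Illustrative reading of BurstParams only.
func burstFmAct(acts []float32, thrRel, thrAbs float32) []float32 {
	var actMax float32
	for _, a := range acts {
		if a > actMax {
			actMax = a
		}
	}
	thr := thrRel * actMax
	if thrAbs > thr {
		thr = thrAbs // absolute floor dominates if higher
	}
	burst := make([]float32, len(acts))
	for i, a := range acts {
		if a >= thr {
			burst[i] = a // above threshold: burst = activation
		} // below threshold: burst stays 0
	}
	return burst
}

func main() {
	acts := []float32{0.9, 0.4, 0.1}
	// effective threshold = max(0.5*0.9, 0.2) = 0.45
	fmt.Println(burstFmAct(acts, 0.5, 0.2))
}
```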

func (*BurstParams) Defaults added in v1.0.0

func (db *BurstParams) Defaults()

func (*BurstParams) Update added in v1.0.0

func (db *BurstParams) Update()

type CTCtxtPrjn added in v1.1.2

type CTCtxtPrjn struct {
	leabra.Prjn           // access as .Prjn
	CtxtGeInc   []float32 `desc:"local per-recv unit accumulator for Ctxt excitatory conductance from sending units -- not a delta -- the full value"`
}

CTCtxtPrjn is the "context" temporally-delayed projection into CTLayer, (corticothalamic deep layer 6) where the CtxtGe excitatory input is integrated only at end of Burst Quarter.

func (*CTCtxtPrjn) Build added in v1.1.2

func (pj *CTCtxtPrjn) Build() error

func (*CTCtxtPrjn) DWt added in v1.1.2

func (pj *CTCtxtPrjn) DWt()

DWt computes the weight change (learning) for Ctxt projections

func (*CTCtxtPrjn) Defaults added in v1.1.2

func (pj *CTCtxtPrjn) Defaults()

func (*CTCtxtPrjn) InitGInc added in v1.1.2

func (pj *CTCtxtPrjn) InitGInc()

func (*CTCtxtPrjn) PrjnTypeName added in v1.1.2

func (pj *CTCtxtPrjn) PrjnTypeName() string

func (*CTCtxtPrjn) RecvCtxtGeInc added in v1.1.2

func (pj *CTCtxtPrjn) RecvCtxtGeInc()

RecvCtxtGeInc increments the receiver's CtxtGe from that of all the projections

func (*CTCtxtPrjn) RecvGInc added in v1.1.2

func (pj *CTCtxtPrjn) RecvGInc()

RecvGInc: disabled for this type

func (*CTCtxtPrjn) SendCtxtGe added in v1.1.2

func (pj *CTCtxtPrjn) SendCtxtGe(si int, dburst float32)

SendCtxtGe sends the full Burst activation from sending neuron index si, to integrate CtxtGe excitatory conductance on receivers

func (*CTCtxtPrjn) SendGDelta added in v1.1.2

func (pj *CTCtxtPrjn) SendGDelta(si int, delta float32)

SendGDelta: disabled for this type

func (*CTCtxtPrjn) Type added in v1.1.2

func (pj *CTCtxtPrjn) Type() emer.PrjnType

func (*CTCtxtPrjn) UpdateParams added in v1.1.2

func (pj *CTCtxtPrjn) UpdateParams()

type CTLayer added in v1.1.2

type CTLayer struct {
	leabra.Layer                 // access as .Layer
	BurstQtr     leabra.Quarters `` /* 206-byte string literal not displayed */
	CtxtGes      []float32       `desc:"slice of context (temporally delayed) excitatory conductances."`
}

CTLayer implements the corticothalamic projecting layer 6 deep neurons that project to the TRC pulvinar neurons, to generate the predictions. They receive phasic input representing 5IB bursting via CTCtxtPrjn inputs from SuperLayer and also from self projections.

func AddCTLayer2D added in v1.1.2

func AddCTLayer2D(nt *leabra.Network, name string, nNeurY, nNeurX int) *CTLayer

AddCTLayer2D adds a CTLayer of given size, with given name.

func AddCTLayer4D added in v1.1.2

func AddCTLayer4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *CTLayer

AddCTLayer4D adds a CTLayer of given size, with given name.

func (*CTLayer) Build added in v1.1.2

func (ly *CTLayer) Build() error

Build constructs the layer state, including calling Build on the projections.

func (*CTLayer) Class added in v1.1.2

func (ly *CTLayer) Class() string

func (*CTLayer) CtxtFmGe added in v1.1.2

func (ly *CTLayer) CtxtFmGe(ltime *leabra.Time)

CtxtFmGe integrates new CtxtGe excitatory conductance from projections, and computes overall Ctxt value, only on Deep layers. This must be called at the end of the DeepBurst quarter for this layer, after SendCtxtGe.

func (*CTLayer) Defaults added in v1.1.2

func (ly *CTLayer) Defaults()

func (*CTLayer) GFmInc added in v1.1.2

func (ly *CTLayer) GFmInc(ltime *leabra.Time)

GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.

func (*CTLayer) InitActs added in v1.1.2

func (ly *CTLayer) InitActs()

func (*CTLayer) SendCtxtGe added in v1.1.2

func (ly *CTLayer) SendCtxtGe(ltime *leabra.Time)

SendCtxtGe sends activation over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This must be called at the end of the Burst quarter for this layer. Satisfies the CtxtSender interface.

func (*CTLayer) UnitVal1D added in v1.1.2

func (ly *CTLayer) UnitVal1D(varIdx int, idx int) float32

UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*CTLayer) UnitVarIdx added in v1.1.2

func (ly *CTLayer) UnitVarIdx(varNm string) (int, error)

UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.

func (*CTLayer) UnitVarNames added in v1.1.2

func (ly *CTLayer) UnitVarNames() []string

UnitVarNames returns a list of variable names available on the units in this layer

func (*CTLayer) UnitVarNum added in v1.1.2

func (ly *CTLayer) UnitVarNum() int

UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.

type CtxtSender added in v1.1.2

type CtxtSender interface {
	leabra.LeabraLayer

	// SendCtxtGe sends activation over CTCtxtPrjn projections to integrate
	// CtxtGe excitatory conductance on CT layers.
	// This must be called at the end of the Burst quarter for this layer.
	SendCtxtGe(ltime *leabra.Time)
}

CtxtSender is an interface for layers that implement the SendCtxtGe method (SuperLayer, CTLayer)
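
The interface lets the network dispatch context-sending generically over its layers. A minimal sketch of that pattern with mock layer types (the real interface also embeds leabra.LeabraLayer, omitted here):

```go
package main

import "fmt"

// ctxtSender is a simplified stand-in for the CtxtSender interface.
type ctxtSender interface {
	SendCtxtGe()
}

// superLayer is a mock layer that sends context.
type superLayer struct{ name string }

func (s *superLayer) SendCtxtGe() { fmt.Println(s.name, "sends Burst context") }

// trcLayer is a mock layer that does not send context.
type trcLayer struct{ name string }

func main() {
	layers := []interface{}{&superLayer{"V1"}, &trcLayer{"V1P"}}
	for _, ly := range layers {
		// type assertion selects only the layers that implement the interface
		if cs, ok := ly.(ctxtSender); ok {
			cs.SendCtxtGe()
		}
	}
}
```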

type LayerType added in v1.0.0

type LayerType emer.LayerType

LayerType has the DeepLeabra extensions to the emer.LayerType types, for gui

const (
	CT_ LayerType = LayerType(emer.LayerTypeN) + iota
	TRC_
	LayerTypeN
)

gui versions

func StringToLayerType added in v1.0.0

func StringToLayerType(s string) (LayerType, error)

func (LayerType) String added in v1.0.0

func (i LayerType) String() string

type Network

type Network struct {
	leabra.Network
}

deep.Network has parameters for running a DeepLeabra network

func (*Network) AddInputPulv2D added in v1.0.0

func (nt *Network) AddInputPulv2D(name string, shapeY, shapeX int) (input, pulv emer.Layer)

AddInputPulv2D adds an input and corresponding Pulvinar (P suffix) TRC layer. Pulvinar is placed Behind Input.

func (*Network) AddInputPulv4D added in v1.0.0

func (nt *Network) AddInputPulv4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (input, pulv emer.Layer)

AddInputPulv4D adds an input and corresponding Pulvinar (P suffix) TRC layer. Pulvinar is placed Behind Input.

func (*Network) AddSuperCT2D added in v1.1.2

func (nt *Network) AddSuperCT2D(name string, shapeY, shapeX int, pulvLay bool) (super, ct, pulv emer.Layer)

AddSuperCT2D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn Full projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.

func (*Network) AddSuperCT4D added in v1.1.2

func (nt *Network) AddSuperCT4D(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, pulvLay bool) (super, ct, pulv emer.Layer)

AddSuperCT4D adds a superficial (SuperLayer) and corresponding CT (CT suffix) layer with CTCtxtPrjn Full projection from Super to CT. Optionally creates a TRC Pulvinar for Super. CT is placed Behind Super, and Pulvinar behind CT if created.

func (*Network) CTCtxt added in v1.1.2

func (nt *Network) CTCtxt(ltime *leabra.Time)

CTCtxt sends context to CT layers and integrates CtxtGe on CT layers

func (*Network) ConnectCtxtToCT added in v1.1.2

func (nt *Network) ConnectCtxtToCT(send, recv emer.Layer, pat prjn.Pattern) emer.Prjn

ConnectCtxtToCT adds a CTCtxtPrjn from given sending layer to a CT layer

func (*Network) Defaults

func (nt *Network) Defaults()

Defaults sets all the default parameters for all layers and projections

func (*Network) QuarterFinal

func (nt *Network) QuarterFinal(ltime *leabra.Time)

QuarterFinal does updating after end of a quarter

func (*Network) UnitVarNames added in v1.1.0

func (nt *Network) UnitVarNames() []string

UnitVarNames returns a list of variable names available on the units in this layer

func (*Network) UpdateParams

func (nt *Network) UpdateParams()

UpdateParams updates all the derived parameters if any have changed, for all layers and projections

type NeurVars added in v1.1.2

type NeurVars int32
const (
	BurstVar NeurVars = iota

	BurstPrvVar

	CtxtGeVar
)

type PrjnType added in v1.0.0

type PrjnType emer.PrjnType

PrjnType has the DeepLeabra extensions to the emer.PrjnType types, for gui

const (
	CTCtxt_ PrjnType = PrjnType(emer.PrjnTypeN) + iota
	PrjnTypeN
)

gui versions

func StringToPrjnType added in v1.0.0

func StringToPrjnType(s string) (PrjnType, error)

func (PrjnType) String added in v1.0.0

func (i PrjnType) String() string

type SuperLayer added in v1.1.2

type SuperLayer struct {
	leabra.Layer               // access as .Layer
	Burst        BurstParams   `` /* 142-byte string literal not displayed */
	SuperNeurs   []SuperNeuron `desc:"slice of super neuron values -- same size as Neurons"`
}

SuperLayer is the DeepLeabra superficial layer, based on basic rate-coded leabra.Layer. Computes the Burst activation from regular activations.

func AddSuperLayer2D added in v1.1.2

func AddSuperLayer2D(nt *leabra.Network, name string, nNeurY, nNeurX int) *SuperLayer

AddSuperLayer2D adds a SuperLayer of given size, with given name.

func AddSuperLayer4D added in v1.1.2

func AddSuperLayer4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *SuperLayer

AddSuperLayer4D adds a SuperLayer of given size, with given name.

func (*SuperLayer) Build added in v1.1.2

func (ly *SuperLayer) Build() error

Build constructs the layer state, including calling Build on the projections.

func (*SuperLayer) BurstFmAct added in v1.1.2

func (ly *SuperLayer) BurstFmAct(ltime *leabra.Time)

BurstFmAct updates Burst layer 5IB bursting value from current Act (superficial activation), subject to thresholding.

func (*SuperLayer) BurstPrv added in v1.1.2

func (ly *SuperLayer) BurstPrv()

BurstPrv saves Burst as BurstPrv

func (*SuperLayer) CyclePost added in v1.1.2

func (ly *SuperLayer) CyclePost(ltime *leabra.Time)

CyclePost calls BurstFmAct

func (*SuperLayer) DecayState added in v1.1.2

func (ly *SuperLayer) DecayState(decay float32)

func (*SuperLayer) Defaults added in v1.1.2

func (ly *SuperLayer) Defaults()

func (*SuperLayer) InitActs added in v1.1.2

func (ly *SuperLayer) InitActs()

func (*SuperLayer) QuarterFinal added in v1.1.2

func (ly *SuperLayer) QuarterFinal(ltime *leabra.Time)

QuarterFinal does updating after end of a quarter

func (*SuperLayer) SendCtxtGe added in v1.1.2

func (ly *SuperLayer) SendCtxtGe(ltime *leabra.Time)

SendCtxtGe sends Burst activation over CTCtxtPrjn projections to integrate CtxtGe excitatory conductance on CT layers. This must be called at the end of the Burst quarter for this layer. Satisfies the CtxtSender interface.

func (*SuperLayer) UnitVal1D added in v1.1.2

func (ly *SuperLayer) UnitVal1D(varIdx int, idx int) float32

UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*SuperLayer) UnitVarIdx added in v1.1.2

func (ly *SuperLayer) UnitVarIdx(varNm string) (int, error)

UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.

func (*SuperLayer) UnitVarNames added in v1.1.2

func (ly *SuperLayer) UnitVarNames() []string

UnitVarNames returns a list of variable names available on the units in this layer

func (*SuperLayer) UnitVarNum added in v1.1.2

func (ly *SuperLayer) UnitVarNum() int

UnitVarNum returns the number of Neuron-level variables for this layer. This is needed for extending indexes in derived types.

func (*SuperLayer) UpdateParams added in v1.1.2

func (ly *SuperLayer) UpdateParams()

UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer

type SuperNeuron added in v1.1.2

type SuperNeuron struct {
	Burst    float32 `desc:"5IB bursting activation value, computed by thresholding regular activation"`
	BurstPrv float32 `desc:"previous bursting activation -- used for context-based learning"`
}

SuperNeuron has the neuron values for SuperLayer

func (*SuperNeuron) VarByIdx added in v1.1.2

func (sn *SuperNeuron) VarByIdx(idx int) float32

type TRCLayer added in v1.1.2

type TRCLayer struct {
	leabra.Layer           // access as .Layer
	TRC          TRCParams `` /* 141-byte string literal not displayed */
	DriverLay    string    `desc:"name of SuperLayer that sends 5IB Burst driver inputs to this layer"`
}

TRCLayer is the thalamic relay cell layer for DeepLeabra. It has normal activity during the minus phase, as activated by non-driver inputs and is then driven by strong 5IB driver inputs in the plus phase, which are directly copied from a named DriverLay layer (not using a projection). The DriverLay MUST have the same shape as this TRC layer, including Pools!

func AddTRCLayer2D added in v1.1.2

func AddTRCLayer2D(nt *leabra.Network, name string, nNeurY, nNeurX int) *TRCLayer

AddTRCLayer2D adds a TRCLayer of given size, with given name.

func AddTRCLayer4D added in v1.1.2

func AddTRCLayer4D(nt *leabra.Network, name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *TRCLayer

AddTRCLayer4D adds a TRCLayer of given size, with given name.

func (*TRCLayer) Class added in v1.1.2

func (ly *TRCLayer) Class() string

func (*TRCLayer) Defaults added in v1.1.2

func (ly *TRCLayer) Defaults()

func (*TRCLayer) DriverLayer added in v1.1.2

func (ly *TRCLayer) DriverLayer() (*leabra.Layer, error)

DriverLayer returns the driver SuperLayer based on DriverLay name

func (*TRCLayer) GFmInc added in v1.1.2

func (ly *TRCLayer) GFmInc(ltime *leabra.Time)

GFmInc integrates new synaptic conductances from increments sent during last SendGDelta.

func (*TRCLayer) UpdateParams added in v1.1.2

func (ly *TRCLayer) UpdateParams()

UpdateParams updates all params given any changes that might have been made to individual values including those in the receiving projections of this layer

type TRCParams added in v1.0.0

type TRCParams struct {
	BurstQtr   leabra.Quarters `` /* 188-byte string literal not displayed */
	DriveScale float32         `def:"0.3" min:"0.0" desc:"multiplier on driver input strength, multiplies activation of driver layer"`
	MaxInhib   float32         `` /* 735-byte string literal not displayed */
	InhibPool  bool            `` /* 191-byte string literal not displayed */
	Binarize   bool            `` /* 234-byte string literal not displayed */
	BinThr     float32         `viewif:"Binarize" desc:"Threshold for binarizing in terms of sending Burst activation"`
	BinOn      float32         `` /* 190-byte string literal not displayed */
	BinOff     float32         `def:"0" viewif:"Binarize" desc:"Resulting driver Ge value for units below threshold -- typically 0."`
}

TRCParams provides parameters for how the plus-phase (outcome) state of thalamic relay cell (e.g., Pulvinar) neurons is computed from the corresponding driver neuron Burst activation.

func (*TRCParams) Defaults added in v1.0.0

func (tp *TRCParams) Defaults()

func (*TRCParams) DriveGe added in v1.1.2

func (tp *TRCParams) DriveGe(act float32) float32

DriveGe returns effective excitatory conductance to use for given driver input Burst activation
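
A sketch of how DriveGe might combine DriveScale with the Binarize fields, inferred from the TRCParams field descriptions above (the exact rule is an assumption, not the package's implementation):

```go
package main

import "fmt"

// driveGe sketches one plausible reading of TRCParams.DriveGe: driver
// Burst activation is scaled by driveScale, or, when binarize is on,
// mapped to binOn / binOff around the binThr threshold.
func driveGe(act, driveScale float32, binarize bool, binThr, binOn, binOff float32) float32 {
	if binarize {
		if act >= binThr {
			return binOn
		}
		return binOff
	}
	return driveScale * act
}

func main() {
	fmt.Println(driveGe(0.8, 0.3, false, 0, 0, 0))    // scaled: 0.3 * 0.8
	fmt.Println(driveGe(0.8, 0.3, true, 0.4, 0.5, 0)) // binarized, above BinThr
}
```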

func (*TRCParams) Update added in v1.0.0

func (tp *TRCParams) Update()
