Documentation ¶
Overview ¶
Package emer provides minimal interfaces for the basic structural elements of neural networks, including:
- emer.Network
- emer.Layer
- emer.Unit
- emer.Prjn (projection that interconnects layers)
These interfaces are intended to be just sufficient to support visualization and generic analysis functions, but they explicitly avoid exposing ANY of the algorithmic aspects, so that those can be encoded purely in the implementation structs.
At this point, given the extra complexity it would require, these interfaces do not support the ability to build or modify networks.
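The value of these minimal interfaces is that analysis and visualization code can operate on any algorithm's layers without knowing their concrete struct types. The sketch below illustrates that idea with a cut-down, hypothetical stand-in for emer.Layer (NamedLayer and demoLayer are not emer types):

```go
package main

import "fmt"

// NamedLayer is a cut-down, illustrative stand-in for emer.Layer,
// showing how generic analysis code can work against an interface
// without knowing the algorithm-specific layer struct.
type NamedLayer interface {
	Name() string
	IsOff() bool // lesioned status, as in emer.Layer
}

// demoLayer is a minimal concrete implementation for demonstration.
type demoLayer struct {
	name string
	off  bool
}

func (l *demoLayer) Name() string { return l.name }
func (l *demoLayer) IsOff() bool  { return l.off }

// activeNames lists the names of all non-lesioned layers --
// a generic analysis function that needs only the interface.
func activeNames(layers []NamedLayer) []string {
	var out []string
	for _, ly := range layers {
		if !ly.IsOff() {
			out = append(out, ly.Name())
		}
	}
	return out
}

func main() {
	layers := []NamedLayer{
		&demoLayer{name: "Input"},
		&demoLayer{name: "Hidden", off: true},
	}
	fmt.Println(activeNames(layers)) // only the non-lesioned layer
}
```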
Index ¶
- Constants
- Variables
- type LayNames
- type Layer
- type LayerType
- type Layers
- type Network
- type Prjn
- type PrjnType
- type Prjns
- func (pl *Prjns) Add(p Prjn)
- func (pl *Prjns) ElemLabel(idx int) string
- func (pl *Prjns) Recv(recv Layer) (Prjn, bool)
- func (pl *Prjns) RecvName(recv string) Prjn
- func (pl *Prjns) RecvNameTry(recv string) (Prjn, error)
- func (pl *Prjns) RecvNameTypeTry(recv, typ string) (Prjn, error)
- func (pl *Prjns) Send(send Layer) (Prjn, bool)
- func (pl *Prjns) SendName(sender string) Prjn
- func (pl *Prjns) SendNameTry(sender string) (Prjn, error)
- func (pl *Prjns) SendNameTypeTry(sender, typ string) (Prjn, error)
Constants ¶
const (
	Version     = "v1.1.44"
	GitCommit   = "8f8d288"          // the commit JUST BEFORE the release
	VersionDate = "2021-12-19 22:47" // UTC
)
Variables ¶
var KiT_LayerType = kit.Enums.AddEnum(LayerTypeN, kit.NotBitFlag, nil)
var KiT_PrjnType = kit.Enums.AddEnum(PrjnTypeN, kit.NotBitFlag, nil)
var LayerDimNames2D = []string{"Y", "X"}
LayerDimNames2D provides the standard Shape dimension names for 2D layers.
var LayerDimNames4D = []string{"PoolY", "PoolX", "NeurY", "NeurX"}
LayerDimNames4D provides the standard Shape dimension names for 4D layers which have Pools and then neurons within pools.
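These dimension names follow row-major ordering, outer-most to inner-most, so a 4D flat index nests as pools (Y, X) then neurons within each pool (Y, X). A minimal sketch of that indexing (flat4D is an illustrative stand-in, not the actual etensor.Shape API):

```go
package main

import "fmt"

// flat4D returns the row-major flat index for a 4D pooled layer,
// with dims ordered [PoolY, PoolX, NeurY, NeurX], outer-most to
// inner-most. Illustrative only -- the real code uses etensor.Shape.
func flat4D(shape [4]int, py, px, ny, nx int) int {
	// Horner-style row-major offset: each step multiplies by the
	// size of the next-inner dimension and adds that coordinate.
	return ((py*shape[1]+px)*shape[2]+ny)*shape[3] + nx
}

func main() {
	shape := [4]int{2, 3, 4, 5} // 2x3 pools, 4x5 neurons per pool
	// Last unit of last pool: total units - 1 = 2*3*4*5 - 1 = 119.
	fmt.Println(flat4D(shape, 1, 2, 3, 4)) // 119
}
```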
Functions ¶
This section is empty.
Types ¶
type LayNames ¶ added in v1.0.7
type LayNames []string
LayNames is a list of layer names, with convenience methods for adding and validating.
func (*LayNames) AddAllBut ¶ added in v1.0.7
AddAllBut adds all layers in the network except those in the exclude list.
func (*LayNames) AddOne ¶ added in v1.1.13
AddOne adds one layer name to the list. This is the Python-friendly version, since Python does not support varargs.
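The behavior of these two methods can be sketched as follows. LayNamesSketch mimics emer.LayNames ([]string) for illustration only; the real AddAllBut takes an emer.Network, which is replaced here by a plain slice of layer names:

```go
package main

import "fmt"

// LayNamesSketch is an illustrative stand-in for emer.LayNames.
type LayNamesSketch []string

// AddOne adds one layer name to the list.
func (ln *LayNamesSketch) AddOne(name string) {
	*ln = append(*ln, name)
}

// AddAllBut adds all given layer names except those in the exclude
// list (the real method gets the full list from an emer.Network).
func (ln *LayNamesSketch) AddAllBut(all, exclude []string) {
	for _, nm := range all {
		skip := false
		for _, ex := range exclude {
			if nm == ex {
				skip = true
				break
			}
		}
		if !skip {
			*ln = append(*ln, nm)
		}
	}
}

func main() {
	var ln LayNamesSketch
	ln.AddAllBut([]string{"Input", "Hidden", "Output"}, []string{"Input"})
	ln.AddOne("Extra")
	fmt.Println(ln) // [Hidden Output Extra]
}
```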
type Layer ¶
type Layer interface {
	params.Styler // TypeName, Name, and Class methods for parameter styling

	// InitName MUST be called to initialize the layer's pointer to itself as an emer.Layer,
	// which enables the proper interface methods to be called. Also sets the name, and
	// the parent network that this layer belongs to (which layers may want to retain).
	InitName(lay Layer, name string, net Network)

	// Label satisfies the gi.Labeler interface for getting the name of objects generically
	Label() string

	// SetName sets the name of the layer
	SetName(nm string)

	// SetClass sets CSS-style class name(s) for this layer (space-separated if multiple)
	SetClass(cls string)

	// IsOff returns true if layer has been turned Off (lesioned) -- for experimentation
	IsOff() bool

	// SetOff sets the "off" (lesioned) status of layer
	SetOff(off bool)

	// Shape returns the organization of units in the layer, in terms of an array of dimensions.
	// Row-major ordering is default (Y then X), outer-most to inner-most.
	// If 2D, then it is a simple Y,X layer with no sub-structure (pools).
	// If 4D, then it is the number of pools Y, X and then the number of units per pool Y, X.
	Shape() *etensor.Shape

	// Is2D returns true if this is a 2D layer (no Pools)
	Is2D() bool

	// Is4D returns true if this is a 4D layer (has Pools as inner 2 dimensions)
	Is4D() bool

	// Idx4DFrom2D returns the 4D index from 2D coordinates
	// within which inner dims are interleaved. Returns false if 2D coords are invalid.
	Idx4DFrom2D(x, y int) ([]int, bool)

	// Type returns the functional type of layer according to LayerType (extensible in
	// more specialized algorithms)
	Type() LayerType

	// SetType sets the functional type of layer
	SetType(typ LayerType)

	// Config configures the basic parameters of the layer
	Config(shape []int, typ LayerType)

	// Thread returns the thread number (go worker thread) to use in updating this layer.
	// The user is responsible for allocating layers to threads, trying to maintain an even
	// distribution across layers and establishing good break-points.
	Thread() int

	// SetThread sets the thread number (go worker thread) to use in updating this layer.
	SetThread(thr int)

	// RelPos returns the relative 3D position specification for this layer
	// for display in the 3D NetView -- see Pos() for display conventions.
	RelPos() relpos.Rel

	// SetRelPos sets the relative 3D position specification for this layer
	SetRelPos(r relpos.Rel)

	// Pos returns the 3D position of the lower-left-hand corner of the layer.
	// The 3D view has layers arranged in X-Y planes stacked vertically along the Z axis.
	// Somewhat confusingly, this differs from the standard 3D graphics convention,
	// where the vertical dimension is Y and Z is the depth dimension. However, in the
	// more "layer-centric" way of thinking about it, it is natural for the width & height
	// to map onto X and Y, and then Z is left over for stacking vertically.
	Pos() mat32.Vec3

	// SetPos sets the 3D position of this layer -- will generally be overwritten by
	// automatic RelPos setting, unless that doesn't specify a valid relative position.
	SetPos(pos mat32.Vec3)

	// Size returns the display size of this layer for the 3D view -- see Pos() for general info.
	// This is multiplied by the RelPos.Scale factor to rescale layer sizes, and takes
	// into account 2D and 4D layer structures.
	Size() mat32.Vec2

	// Index returns a 0..n-1 index of the position of the layer within the list of layers
	// in the network. For backprop networks, index position has computational significance.
	// For Leabra networks, it only has significance in determining who gets which weights for
	// enforcing initial weight symmetry -- higher layers get weights from lower layers.
	Index() int

	// SetIndex sets the layer index
	SetIndex(idx int)

	// UnitVarNames returns a list of variable names available on the units in this layer.
	// This is typically a global list so do not modify!
	UnitVarNames() []string

	// UnitVarProps returns a map of unit variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// Note: this is a global list so do not modify!
	UnitVarProps() map[string]string

	// UnitVarIdx returns the index of given variable within the Neuron,
	// according to *this layer's* UnitVarNames() list (using a map to lookup index),
	// or -1 and an error message if not found.
	UnitVarIdx(varNm string) (int, error)

	// UnitVarNum returns the number of Neuron-level variables
	// for this layer. This is needed for extending indexes in derived types.
	UnitVarNum() int

	// UnitVal1D returns value of given variable index on given unit, using 1-dimensional index.
	// Returns NaN on invalid index.
	// This is the core unit var access method used by other methods,
	// so it is the only one that needs to be updated for derived layer types.
	UnitVal1D(varIdx int, idx int) float32

	// UnitVals fills in values of given variable name,
	// for each unit in the layer, into given float32 slice (only resized if not big enough).
	// Returns error on invalid var name.
	UnitVals(vals *[]float32, varNm string) error

	// UnitValsTensor fills in values of given variable name,
	// for each unit in the layer, into given tensor.
	// If tensor is not already big enough to hold the values, it is
	// set to the same shape as the layer.
	// Returns error on invalid var name.
	UnitValsTensor(tsr etensor.Tensor, varNm string) error

	// UnitVal returns value of given variable name on given unit,
	// using shape-based dimensional index.
	// Returns NaN on invalid var name or index.
	UnitVal(varNm string, idx []int) float32

	// RecvPrjns returns the full list of receiving projections
	RecvPrjns() *Prjns

	// NRecvPrjns returns the number of receiving projections
	NRecvPrjns() int

	// RecvPrjn returns a specific receiving projection
	RecvPrjn(idx int) Prjn

	// SendPrjns returns the full list of sending projections
	SendPrjns() *Prjns

	// NSendPrjns returns the number of sending projections
	NSendPrjns() int

	// SendPrjn returns a specific sending projection
	SendPrjn(idx int) Prjn

	// RecvPrjnVals fills in values of given synapse variable name,
	// for projection from given sending layer and neuron 1D index,
	// for all receiving neurons in this layer,
	// into given float32 slice (only resized if not big enough).
	// prjnType is the string representation of the prjn type -- used if non-empty,
	// useful when there are multiple projections between two layers.
	// If the receiving neuron is not connected to the given sending layer or neuron
	// then the value is set to mat32.NaN().
	// Returns error on invalid var name or lack of recv prjn (vals always set to NaN on prjn err).
	RecvPrjnVals(vals *[]float32, varNm string, sendLay Layer, sendIdx1D int, prjnType string) error

	// SendPrjnVals fills in values of given synapse variable name,
	// for projection into given receiving layer and neuron 1D index,
	// for all sending neurons in this layer,
	// into given float32 slice (only resized if not big enough).
	// prjnType is the string representation of the prjn type -- used if non-empty,
	// useful when there are multiple projections between two layers.
	// If the sending neuron is not connected to the given receiving layer or neuron
	// then the value is set to mat32.NaN().
	// Returns error on invalid var name or lack of recv prjn (vals always set to NaN on prjn err).
	SendPrjnVals(vals *[]float32, varNm string, recvLay Layer, recvIdx1D int, prjnType string) error

	// Defaults sets default parameter values for all Layer and recv projection parameters
	Defaults()

	// UpdateParams updates parameter values for all Layer and recv projection parameters,
	// based on any other params that might have changed.
	UpdateParams()

	// ApplyParams applies given parameter style Sheet to this layer and its recv projections.
	// Calls UpdateParams on anything set to ensure derived parameters are all updated.
	// If setMsg is true, then a message is printed to confirm each parameter that is set.
	// It always prints a message if a parameter fails to be set.
	// Returns true if any params were set, and error if there were any errors.
	ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

	// NonDefaultParams returns a listing of all parameters in the Layer that
	// are not at their default values -- useful for setting param styles etc.
	NonDefaultParams() string

	// AllParams returns a listing of all parameters in the Layer
	AllParams() string

	// WriteWtsJSON writes the weights from this layer from the receiver-side perspective
	// in a JSON text format. We build in the indentation logic to make it much faster and
	// more efficient.
	WriteWtsJSON(w io.Writer, depth int)

	// ReadWtsJSON reads the weights for this layer from the receiver-side perspective
	// in a JSON text format. This is for a set of weights that were saved *for one layer only*
	// and is not used for the network-level ReadWtsJSON, which reads into a separate
	// structure -- see SetWts method.
	ReadWtsJSON(r io.Reader) error

	// SetWts sets the weights for this layer from weights.Layer decoded values
	SetWts(lw *weights.Layer) error

	// Build constructs the layer and projection state based on the layer shapes
	// and patterns of interconnectivity
	Build() error

	// VarRange returns the min / max values for given variable
	// over the layer
	VarRange(varNm string) (min, max float32, err error)
}
Layer defines the basic interface for neural network layers, used for managing the structural elements of a network, and for visualization, I/O, etc. Interfaces are automatically pointers -- think of this as a pointer to your specific layer type, with a very basic interface for accessing general structural properties. Nothing algorithm-specific is implemented here -- all of that goes in your specific layer struct.
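Several of the accessor methods above (UnitVals, RecvPrjnVals, SendPrjnVals) share a "only resized if not big enough" convention for their output slice, which lets callers reuse one buffer across many calls. A minimal sketch of that pattern (fillVals is a hypothetical helper, not part of the emer API):

```go
package main

import "fmt"

// fillVals copies src into *vals, reallocating only when the existing
// capacity is too small -- the same slice-reuse convention used by
// UnitVals and the PrjnVals methods.
func fillVals(vals *[]float32, src []float32) {
	if cap(*vals) < len(src) {
		*vals = make([]float32, len(src))
	}
	*vals = (*vals)[:len(src)] // reslice to exact length
	copy(*vals, src)
}

func main() {
	var vals []float32 // nil slice is fine; first call allocates
	fillVals(&vals, []float32{0.1, 0.5, 0.9})
	fmt.Println(len(vals)) // 3
	fillVals(&vals, []float32{1, 2}) // reuses the same backing array
	fmt.Println(len(vals)) // 2
}
```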
type LayerType ¶
type LayerType int32
LayerType is the type of the layer: Input, Hidden, Target, Compare. Class parameter styles automatically key off of these types. Specialized algorithms can extend this to other types, but these types encompass most standard neural network models.
const (
	// Hidden is an internal representational layer that does not receive direct input / targets
	Hidden LayerType = iota

	// Input is a layer that receives direct external input in its Ext inputs
	Input

	// Target is a layer that receives direct external target inputs used for driving plus-phase learning
	Target

	// Compare is a layer that receives external comparison inputs, which drive statistics but
	// do NOT drive activation or learning directly
	Compare

	LayerTypeN
)
The layer types
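Since class parameter styles and algorithm behavior key off of these types, code typically switches on them. A sketch of that usage (LayerTypeSketch and IsExternal are illustrative, not part of the emer API):

```go
package main

import "fmt"

// LayerTypeSketch mirrors the emer.LayerType enum for illustration.
type LayerTypeSketch int32

const (
	Hidden LayerTypeSketch = iota // no direct input / targets
	Input                         // direct external input
	Target                        // external targets for plus-phase learning
	Compare                       // external comparison inputs, stats only
)

// IsExternal reports whether a layer of this type receives external
// input of some kind -- a hypothetical helper showing the typical
// switch on layer type.
func (lt LayerTypeSketch) IsExternal() bool {
	switch lt {
	case Input, Target, Compare:
		return true
	default:
		return false
	}
}

func main() {
	fmt.Println(Hidden.IsExternal(), Target.IsExternal()) // false true
}
```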
func (*LayerType) FromString ¶
func (LayerType) MarshalJSON ¶
func (*LayerType) UnmarshalJSON ¶
type Network ¶
type Network interface {
	// InitName MUST be called to initialize the network's pointer to itself as an emer.Network,
	// which enables the proper interface methods to be called. Also sets the name.
	InitName(net Network, name string)

	// Name returns the name of the network
	Name() string

	// Label satisfies the gi.Labeler interface for getting the name of objects generically
	Label() string

	// NLayers returns the number of layers in the network
	NLayers() int

	// Layer returns layer (as emer.Layer interface) at given index -- does not
	// do extra bounds checking
	Layer(idx int) Layer

	// LayerByName returns layer of given name, nil if not found.
	// Layer names must be unique and a map is used so this is a fast operation
	LayerByName(name string) Layer

	// LayerByNameTry returns layer of given name,
	// returns error and emits a log message if not found.
	// Layer names must be unique and a map is used so this is a fast operation
	LayerByNameTry(name string) (Layer, error)

	// Defaults sets default parameter values for everything in the Network
	Defaults()

	// UpdateParams updates parameter values for all Network parameters,
	// based on any other params that might have changed.
	UpdateParams()

	// ApplyParams applies given parameter style Sheet to layers and prjns in this network.
	// Calls UpdateParams on anything set to ensure derived parameters are all updated.
	// If setMsg is true, then a message is printed to confirm each parameter that is set.
	// It always prints a message if a parameter fails to be set.
	// Returns true if any params were set, and error if there were any errors.
	ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

	// NonDefaultParams returns a listing of all parameters in the Network that
	// are not at their default values -- useful for setting param styles etc.
	NonDefaultParams() string

	// AllParams returns a listing of all parameters in the Network
	AllParams() string

	// UnitVarNames returns a list of variable names available on the units in this network.
	// This list determines what is shown in the NetView (and the order of vars list).
	// Not all layers need to support all variables, but must safely return mat32.NaN() for
	// unsupported ones.
	// This is typically a global list so do not modify!
	UnitVarNames() []string

	// UnitVarProps returns a map of unit variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// Note: this is typically a global list so do not modify!
	UnitVarProps() map[string]string

	// SynVarNames returns the names of all the variables on the synapses in this network.
	// This list determines what is shown in the NetView (and the order of vars list).
	// Not all projections need to support all variables, but must safely return mat32.NaN() for
	// unsupported ones.
	// This is typically a global list so do not modify!
	SynVarNames() []string

	// SynVarProps returns a map of synapse variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// Note: this is typically a global list so do not modify!
	SynVarProps() map[string]string

	// WriteWtsJSON writes network weights (and any other state that adapts with learning)
	// to JSON-formatted output.
	WriteWtsJSON(w io.Writer) error

	// ReadWtsJSON reads network weights (and any other state that adapts with learning)
	// from JSON-formatted input. Reads into a temporary weights.Network structure that
	// is then passed to SetWts to actually set the weights.
	ReadWtsJSON(r io.Reader) error

	// SetWts sets the weights for this network from weights.Network decoded values
	SetWts(nw *weights.Network) error

	// SaveWtsJSON saves network weights (and any other state that adapts with learning)
	// to a JSON-formatted file. If filename has .gz extension, then file is gzip compressed.
	SaveWtsJSON(filename gi.FileName) error

	// OpenWtsJSON opens network weights (and any other state that adapts with learning)
	// from a JSON-formatted file. If filename has .gz extension, then file is gzip uncompressed.
	OpenWtsJSON(filename gi.FileName) error

	// NewLayer creates a new concrete layer of appropriate type for this network
	NewLayer() Layer

	// NewPrjn creates a new concrete projection of appropriate type for this network
	NewPrjn() Prjn

	// ConnectLayerNames establishes a projection between two layers, referenced by name,
	// adding to the recv and send projection lists on each side of the connection.
	// Returns error if not successful.
	// Does not yet actually connect the units within the layers -- that requires Build.
	ConnectLayerNames(send, recv string, pat prjn.Pattern, typ PrjnType) (rlay, slay Layer, pj Prjn, err error)

	// ConnectLayers establishes a projection between two layers,
	// adding to the recv and send projection lists on each side of the connection.
	// Returns nil if not successful. Does not yet actually connect the units within
	// the layers -- that requires Build.
	ConnectLayers(send, recv Layer, pat prjn.Pattern, typ PrjnType) Prjn

	// Bounds returns the minimum and maximum display coordinates of the network for 3D display
	Bounds() (min, max mat32.Vec3)

	// VarRange returns the min / max values for given variable
	VarRange(varNm string) (min, max float32, err error)
}
Network defines the basic interface for a neural network, used for managing the structural elements of a network, and for visualization, I/O, etc.
type Prjn ¶
type Prjn interface {
	params.Styler // TypeName, Name, and Class methods for parameter styling

	// Init MUST be called to initialize the prjn's pointer to itself as an emer.Prjn,
	// which enables the proper interface methods to be called.
	Init(prjn Prjn)

	// SendLay returns the sending layer for this projection
	SendLay() Layer

	// RecvLay returns the receiving layer for this projection
	RecvLay() Layer

	// Pattern returns the pattern of connectivity for interconnecting the layers
	Pattern() prjn.Pattern

	// SetPattern sets the pattern of connectivity for interconnecting the layers.
	// Returns Prjn so it can be chained to set other properties too
	SetPattern(pat prjn.Pattern) Prjn

	// Type returns the functional type of projection according to PrjnType (extensible in
	// more specialized algorithms)
	Type() PrjnType

	// SetType sets the functional type of projection according to PrjnType.
	// Returns Prjn so it can be chained to set other properties too
	SetType(typ PrjnType) Prjn

	// PrjnTypeName returns the string rep of functional type of projection
	// according to PrjnType (extensible in more specialized algorithms, by
	// redefining this method as needed).
	PrjnTypeName() string

	// Connect sets the basic connection parameters for this projection (send, recv, pattern, and type)
	Connect(send, recv Layer, pat prjn.Pattern, typ PrjnType)

	// SetClass sets CSS-style class name(s) for this projection (space-separated if multiple).
	// Returns Prjn so it can be chained to set other properties too
	SetClass(cls string) Prjn

	// Label satisfies the gi.Labeler interface for getting the name of objects generically
	Label() string

	// IsOff returns true if projection or either send or recv layer has been turned Off.
	// Useful for experimentation
	IsOff() bool

	// SetOff sets the projection Off status (i.e., lesioned)
	SetOff(off bool)

	// SynVarNames returns the names of all the variables on the synapse.
	// This is typically a global list so do not modify!
	SynVarNames() []string

	// SynVarProps returns a map of synapse variable properties, with the key being the
	// name of the variable, and the value gives a space-separated list of
	// go-tag-style properties for that variable.
	// The NetView recognizes the following properties:
	// range:"##" = +- range around 0 for default display scaling
	// min:"##" max:"##" = min, max display range
	// auto-scale:"+" or "-" = use automatic scaling instead of fixed range or not.
	// zeroctr:"+" or "-" = control whether zero-centering is used
	// Note: this is a global list so do not modify!
	SynVarProps() map[string]string

	// SynIdx returns the index of the synapse between given send, recv unit indexes
	// (1D, flat indexes). Returns -1 if synapse not found between these two neurons.
	// This requires searching within connections for receiving unit (a bit slow).
	SynIdx(sidx, ridx int) int

	// SynVarIdx returns the index of given variable within the synapse,
	// according to *this prjn's* SynVarNames() list (using a map to lookup index),
	// or -1 and an error message if not found.
	SynVarIdx(varNm string) (int, error)

	// SynVarNum returns the number of synapse-level variables
	// for this prjn. This is needed for extending indexes in derived types.
	SynVarNum() int

	// SynVal1D returns value of given variable index (from SynVarIdx) on given SynIdx.
	// Returns NaN on invalid index.
	// This is the core synapse var access method used by other methods,
	// so it is the only one that needs to be updated for derived layer types.
	SynVal1D(varIdx int, synIdx int) float32

	// SynVals sets values of given variable name for each synapse, using the natural ordering
	// of the synapses (sender based for Leabra),
	// into given float32 slice (only resized if not big enough).
	// Returns error on invalid var name.
	SynVals(vals *[]float32, varNm string) error

	// SynVal returns value of given variable name on the synapse
	// between given send, recv unit indexes (1D, flat indexes).
	// Returns mat32.NaN() for access errors.
	SynVal(varNm string, sidx, ridx int) float32

	// SetSynVal sets value of given variable name on the synapse
	// between given send, recv unit indexes (1D, flat indexes).
	// Typically only supports base synapse variables and is not extended
	// for derived types.
	// Returns error for access errors.
	SetSynVal(varNm string, sidx, ridx int, val float32) error

	// Defaults sets default parameter values for all Prjn parameters
	Defaults()

	// UpdateParams updates parameter values for all Prjn parameters,
	// based on any other params that might have changed.
	UpdateParams()

	// ApplyParams applies given parameter style Sheet to this projection.
	// Calls UpdateParams if anything set to ensure derived parameters are all updated.
	// If setMsg is true, then a message is printed to confirm each parameter that is set.
	// It always prints a message if a parameter fails to be set.
	// Returns true if any params were set, and error if there were any errors.
	ApplyParams(pars *params.Sheet, setMsg bool) (bool, error)

	// NonDefaultParams returns a listing of all parameters in the Projection that
	// are not at their default values -- useful for setting param styles etc.
	NonDefaultParams() string

	// AllParams returns a listing of all parameters in the Projection
	AllParams() string

	// WriteWtsJSON writes the weights from this projection from the receiver-side perspective
	// in a JSON text format. We build in the indentation logic to make it much faster and
	// more efficient.
	WriteWtsJSON(w io.Writer, depth int)

	// ReadWtsJSON reads the weights for this projection from the receiver-side perspective
	// in a JSON text format. This is for a set of weights that were saved *for one prjn only*
	// and is not used for the network-level ReadWtsJSON, which reads into a separate
	// structure -- see SetWts method.
	ReadWtsJSON(r io.Reader) error

	// SetWts sets the weights for this projection from weights.Prjn decoded values
	SetWts(pw *weights.Prjn) error

	// Build constructs the full connectivity among the layers as specified in this projection.
	Build() error
}
Prjn defines the basic interface for a projection which connects two layers. Name is set automatically to: SendLay().Name() + "To" + RecvLay().Name()
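The automatic naming convention can be sketched directly; prjnName below is a hypothetical helper that uses plain strings where the real code calls SendLay().Name() and RecvLay().Name():

```go
package main

import "fmt"

// prjnName builds a projection name following the emer convention
// SendLay().Name() + "To" + RecvLay().Name(). Plain strings stand
// in for the Layer interface here.
func prjnName(send, recv string) string {
	return send + "To" + recv
}

func main() {
	fmt.Println(prjnName("Input", "Hidden")) // InputToHidden
}
```

This convention is why layer names must be unique within a network: the projection name is derived entirely from the two layer names.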
type PrjnType ¶
type PrjnType int32
PrjnType is the type of the projection (extensible for more specialized algorithms). Class parameter styles automatically key off of these types.
const (
	// Forward is a feedforward, bottom-up projection from sensory inputs to higher layers
	Forward PrjnType = iota

	// Back is a feedback, top-down projection from higher layers back to lower layers
	Back

	// Lateral is a lateral projection within the same layer / area
	Lateral

	// Inhib is an inhibitory projection that drives inhibitory synaptic inputs instead of excitatory
	Inhib

	PrjnTypeN
)
The projection types
func (*PrjnType) FromString ¶
func (PrjnType) MarshalJSON ¶
func (*PrjnType) UnmarshalJSON ¶
type Prjns ¶
type Prjns []Prjn
Prjns is a slice of projections
func (*Prjns) ElemLabel ¶
ElemLabel satisfies the gi.SliceLabeler interface to provide labels for slice elements
func (*Prjns) RecvName ¶
RecvName finds the projection with the given recv layer name, returning nil if not found; see the Try version for error checking.
func (*Prjns) RecvNameTry ¶
RecvNameTry finds the projection with the given recv layer name. Returns an error if not found.
func (*Prjns) RecvNameTypeTry ¶ added in v1.0.0
RecvNameTypeTry finds the projection with the given recv layer name and Type string. Returns an error if not found.
func (*Prjns) SendName ¶
SendName finds the projection with the given send layer name, returning nil if not found; see the Try version for error checking.
func (*Prjns) SendNameTry ¶
SendNameTry finds the projection with the given send layer name. Returns an error if not found.
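The Name / NameTry pairing is a recurring pattern in this package: the Try variant returns an error for callers that want to handle failure, while the plain variant returns the zero value (nil) for convenience. A sketch of that pattern (sendNameTry is a hypothetical stand-in using a map of sender name to projection name, not the real Prjns slice search):

```go
package main

import "fmt"

// sendNameTry illustrates the Try variant: it returns an error when
// no projection has the given sending layer name.
func sendNameTry(prjns map[string]string, sender string) (string, error) {
	p, ok := prjns[sender]
	if !ok {
		return "", fmt.Errorf("sending layer %q not found in projections", sender)
	}
	return p, nil
}

// sendName illustrates the plain variant: zero value on failure,
// implemented on top of the Try version.
func sendName(prjns map[string]string, sender string) string {
	p, _ := sendNameTry(prjns, sender)
	return p
}

func main() {
	prjns := map[string]string{"Input": "InputToHidden"}
	fmt.Println(sendName(prjns, "Input"))   // InputToHidden
	fmt.Println(sendName(prjns, "Missing")) // empty string, no error surfaced
}
```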