Documentation ¶
Overview ¶
bench runs a benchmark model with 5 layers (3 hidden, Input, Output), all of the same size, for benchmarking networks of different sizes. These are not particularly realistic models for actual applications (e.g., large models tend to have more topographic patterns of connectivity and larger layers with fewer connections), but they are easy to run.
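Below is a minimal sketch of how this package might be driven from a standard Go benchmark. It is an illustration, not part of the package: the import paths are assumed to be the emer v1 ones and may differ by version, the table sizes and shapes are arbitrary, and the ConfigNet call is left as a comment because its full signature is truncated in this doc.

package bench

import (
	"testing"

	"github.com/emer/axon/axon"
	"github.com/emer/etable/etable"
)

// BenchmarkTrainSmall is a hypothetical benchmark (not part of this package).
func BenchmarkTrainSmall(b *testing.B) {
	net := &axon.Network{}
	// ConfigNet(b, net, ...) // configure the network here; full signature elided in this doc

	pats := &etable.Table{}
	ConfigPats(pats, 10, [2]int{5, 5}, [2]int{5, 5}) // 10 illustrative 5x5 input/output patterns

	epcLog := &etable.Table{}
	ConfigEpcLog(epcLog)

	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		TrainNet(net, pats, epcLog, 1, false) // 1 epoch per iteration, non-verbose
	}
}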
Index ¶
- Variables
- func CenterPoolIdxs(ly emer.Layer, n int) []int
- func ConfigEpcLog(dt *etable.Table)
- func ConfigNet(b *testing.B, net *axon.Network, ...)
- func ConfigPats(pats *etable.Table, numPats int, inputShape [2]int, outputShape [2]int)
- func TrainNet(net *axon.Network, pats, epcLog *etable.Table, epcs int, verbose bool)
Constants ¶
This section is empty.
Variables ¶
var ParamSets = params.Sets{
	{Name: "Base", Desc: "these are the best params", Sheets: params.Sheets{
		"Network": &params.Sheet{
			{Sel: "Prjn", Desc: "", Params: params.Params{
				"Prjn.Learn.LRate.Base": "0.1",
				"Prjn.SWt.Adapt.LRate":  "0.1",
				"Prjn.SWt.Init.SPct":    "0.5",
			}},
			{Sel: "Layer", Desc: "", Params: params.Params{
				"Layer.Inhib.ActAvg.Nominal": "0.08",
				"Layer.Inhib.Layer.Gi":       "1.05",
				"Layer.Act.Gbar.L":           "0.2",
			}},
			{Sel: "#Input", Desc: "", Params: params.Params{
				"Layer.Inhib.Layer.Gi": "0.9",
				"Layer.Act.Clamp.Ge":   "1.5",
			}},
			{Sel: "#Output", Desc: "", Params: params.Params{
				"Layer.Inhib.Layer.Gi": "0.70",
				"Layer.Act.Clamp.Ge":   "0.8",
			}},
			{Sel: ".BackPrjn", Desc: "top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates", Params: params.Params{
				"Prjn.PrjnScale.Rel": "0.2",
			}},
		},
	}},
}
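For illustration, a hypothetical helper (shown in package bench for brevity, not part of the package) that reads one value out of the "Base" set, treating ParamSets as plain data whose layout is inferred from the literal above:

package bench

import "fmt"

// printBaseLRate prints the Prjn learning rate configured in the "Base" set.
func printBaseLRate() {
	netSheet := ParamSets[0].Sheets["Network"] // *params.Sheet of the "Base" set
	for _, sel := range *netSheet {
		if sel.Sel == "Prjn" {
			fmt.Println("Prjn.Learn.LRate.Base =", sel.Params["Prjn.Learn.LRate.Base"])
		}
	}
}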
Functions ¶
func CenterPoolIdxs ¶
func CenterPoolIdxs(ly emer.Layer, n int) []int

CenterPoolIdxs returns the unit indexes for the 2x2 center pools; if sub-pools are present, then only the first such subpool is used. TODO: Figure out what this is doing.
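A hypothetical usage sketch (not part of this package): the import path, the LayerByName lookup, the layer name "Hidden1", and the choice n=2 (guessed from the "2x2" doc comment) are all assumptions.

package bench

import (
	"fmt"

	"github.com/emer/axon/axon"
)

// reportCenterPools prints the center-pool unit indexes of one layer of an
// already-configured benchmark network.
func reportCenterPools(net *axon.Network) {
	ly := net.LayerByName("Hidden1") // assumes the benchmark net names a hidden layer "Hidden1"
	idxs := CenterPoolIdxs(ly, 2)
	fmt.Println("center pool unit indexes:", idxs)
}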
func ConfigEpcLog ¶

func ConfigEpcLog(dt *etable.Table)
func ConfigPats ¶

func ConfigPats(pats *etable.Table, numPats int, inputShape [2]int, outputShape [2]int)
Types ¶
This section is empty.