Documentation ¶
Overview ¶
Package neural provides structure for creating and modifying neural networks.
Index ¶
- Variables
- func AbsFitness(nn *Network, inputs, expected [][]float64) int
- func BentIdentity(x float64) float64
- func GeneratePopulation(opt interface{}, popSize int) []pop.Individual
- func GraphPrintActivator(a ActivatorFunc)
- func Identity(x float64) float64
- func Init(newNgo NetworkGenerationOptions, newCrossover NeuralCrossover, f FitnessFunc)
- func OptInit(opt interface{})
- func Rectifier(x float64) float64
- func Sinc(x float64) float64
- func Softplus(x float64) float64
- func Softsign(x float64) float64
- func Softstep(x float64) float64
- type ActivatorFunc
- type ActivatorMutationOptions
- type AverageCrossover
- type Body
- type ColumnGenerationOptions
- type FitnessFunc
- type FloatMutationOptions
- type InitOptions
- type Network
- func (nn *Network) CanCrossover(other pop.Individual) bool
- func (nn *Network) Copy() *Network
- func (nn *Network) Crossover(other pop.Individual) pop.Individual
- func (n *Network) Fitness(inputs, expected [][]float64) int
- func (nn *Network) Mutate()
- func (nn *Network) MutateOpts(mOpt NetworkMutationOptions) *Network
- func (nn Network) Print()
- func (modNet_p *Network) Run(inputs []float64) []float64
- type NetworkGenerationOptions
- type NetworkMutationOptions
- type NeuralCrossover
- type Neuron
- type PointCrossover
- type UniformCrossover
Constants ¶
This section is empty.
Variables ¶
var (
    AllActivators = ActivatorMutationOptions{
        Rectifier,
        Identity,
        BentIdentity,
        Softplus,
        Softstep,
        Softsign,
        Sinc,
        Perceptron_Threshold(0.5),
        Rectifier_Exponential(1.5),
        Rectifier_Exponential(4),
    }
)
Functions ¶
func AbsFitness ¶
func AbsFitness(nn *Network, inputs, expected [][]float64) int
func BentIdentity ¶
func BentIdentity(x float64) float64
func GeneratePopulation ¶
func GeneratePopulation(opt interface{}, popSize int) []pop.Individual
func GraphPrintActivator ¶
func GraphPrintActivator(a ActivatorFunc)
Print the activator function as an ASCII graph. Uses a statically defined range.
func Init ¶
func Init(newNgo NetworkGenerationOptions, newCrossover NeuralCrossover, f FitnessFunc)
Types ¶
type ActivatorFunc ¶
An activator function maps float values to other float values. The function can be as simple or as complicated as desired; eventually a set of common activators will be collected.
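For illustration, here is a minimal sketch of a custom activator, assuming ActivatorFunc has the same shape as the built-in activators (a plain func(float64) float64); the name Sigmoid is hypothetical and not part of the package:

    // Sigmoid is a hypothetical custom activator that squashes any float
    // into the range (0, 1). Requires the standard library "math" package.
    func Sigmoid(x float64) float64 {
        return 1 / (1 + math.Exp(-x))
    }

    // A custom activator can then sit alongside the built-ins, for example
    // inside an ActivatorMutationOptions slice:
    var myActivators = ActivatorMutationOptions{Identity, Rectifier, Sigmoid}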
func MutateActivator ¶
func MutateActivator(mOpt ActivatorMutationOptions) ActivatorFunc
Mutate an activator function. All activators are currently weighted the same in this mutation. A future implementation could also take in how highly each activator should be weighed, and could avoid mutating into the current activator (although this only has the effect of slightly reducing the real mutation chance).
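A rough sketch of what uniform selection over the options could look like (illustrative only, not the package source; assumes the standard library "math/rand"):

    // mutateActivatorSketch picks one of the available activators with
    // equal probability, mirroring the "weighted the same" behavior above.
    func mutateActivatorSketch(mOpt ActivatorMutationOptions) ActivatorFunc {
        return mOpt[rand.Intn(len(mOpt))]
    }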
func Perceptron_Threshold ¶
func Perceptron_Threshold(t float64) ActivatorFunc
func Rectifier_Exponential ¶
func Rectifier_Exponential(a float64) ActivatorFunc
func Rectifier_Parametric ¶
func Rectifier_Parametric(a float64) ActivatorFunc
type ActivatorMutationOptions ¶
type ActivatorMutationOptions []ActivatorFunc
type AverageCrossover ¶
type AverageCrossover struct {
    // This weight is applied to all weights in the first
    // network selected, before the average of the networks
    // is calculated. A weight more distant from 1 will
    // swing the averaged networks toward more closely
    // emulating one network or the other. Cannot be negative.
    //
    // This might need to be modified into two weightMods
    // if crossover pairings are determined non-randomly.
    WeightMod float64
}
For every neuron in the two networks, take the weights that neuron has and average them for a new network. They'll be averaged by ((weight1 * weightMod) + weight2) / (weightMod + 1)
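As a sketch, the per-weight averaging described above could be written as follows (illustrative, not the package source):

    // averageWeight blends a single weight from each parent network.
    // weightMod > 1 pulls the result toward w1; weightMod < 1 pulls it toward w2.
    func averageWeight(w1, w2, weightMod float64) float64 {
        return ((w1 * weightMod) + w2) / (weightMod + 1)
    }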
func (AverageCrossover) Crossover ¶
func (ac AverageCrossover) Crossover(a, b *Network) *Network
type Body ¶
type Body [][]Neuron
A Body is what we would like to call the actual network -- it's just a 2d slice of neurons.
func (*Body) Mutate ¶
func (b *Body) Mutate(mOpt NetworkMutationOptions)
Mutate this network body.
type ColumnGenerationOptions ¶
type FitnessFunc ¶
func MatchFitness ¶
func MatchFitness(tolerance float64) FitnessFunc
type FloatMutationOptions ¶
type InitOptions ¶
type InitOptions struct {
    Ngo     NetworkGenerationOptions
    Cross   NeuralCrossover
    Fitness FitnessFunc
}
type Network ¶
type Network struct {
    Activator ActivatorFunc
    Body      Body
}
A Neural Network has a body which it runs values through and an activator function which is used at each neuron to process those values.
func GenerateNetwork ¶
func GenerateNetwork(nnOpt NetworkGenerationOptions) *Network
Convert generation options into a new neural network.
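A hypothetical usage sketch; the option values below are placeholders, and the embedded NetworkMutationOptions (weight, column, and activator sub-options) would need to be filled in as well:

    // Requires "fmt" for the final print.
    opts := NetworkGenerationOptions{
        MinColumns:    2,
        MaxColumns:    4,
        MaxInputs:     3,
        MaxOutputs:    1,
        BaseMutations: 10,
        // NetworkMutationOptions omitted here for brevity.
    }
    nn := GenerateNetwork(opts)
    out := nn.Run([]float64{0.5, 0.25, 0.75})
    fmt.Println(out)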
func (*Network) CanCrossover ¶
func (nn *Network) CanCrossover(other pop.Individual) bool
func (*Network) Crossover ¶
func (nn *Network) Crossover(other pop.Individual) pop.Individual
func (*Network) Fitness ¶
func (n *Network) Fitness(inputs, expected [][]float64) int
Evaluate the fitness of a network. Low fitness is good, high fitness is bad.
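For example, two networks can be compared on a small input/expected set, with the lower score winning (a hedged sketch; the data is arbitrary and a and b are assumed to be existing *Network values):

    inputs := [][]float64{{0, 0}, {0, 1}, {1, 0}, {1, 1}}
    expected := [][]float64{{0}, {1}, {1}, {0}}
    if a.Fitness(inputs, expected) < b.Fitness(inputs, expected) {
        // a performs better than b on this data set
    }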
func (*Network) MutateOpts ¶
func (nn *Network) MutateOpts(mOpt NetworkMutationOptions) *Network
Mutate this network.
type NetworkGenerationOptions ¶
type NetworkGenerationOptions struct {
    NetworkMutationOptions
    MinColumns    int
    MaxColumns    int
    MaxInputs     int
    MaxOutputs    int
    BaseMutations int
}
func (NetworkGenerationOptions) Mutate ¶
func (genOpt NetworkGenerationOptions) Mutate(n *Network) *Network
type NetworkMutationOptions ¶
type NetworkMutationOptions struct {
    WeightOptions    FloatMutationOptions
    ColumnOptions    ColumnGenerationOptions
    ActivatorOptions ActivatorMutationOptions

    // checked per column
    NeuronReplacementChance float64
    NeuronAdditionChance    float64
    WeightSwapChance        float64

    // checked per network
    ColumnRemovalChance     float64
    ColumnAdditionChance    float64
    NeuronMutationChance    float64
    ActivatorMutationChance float64
}
type NeuralCrossover ¶
type Neuron ¶
type Neuron []float64
A Neuron is a list of weights. Classically, the weights on a neuron would represent what that neuron multiplies its inputs by to obtain its value.
These weights do not represent that. These weights represent what this neuron should multiply its input by before sending it to the next column, for each element in the next column.
Effectively, each neuron receives pre-weighted values. There's no difference in how the neurons function: if you prefer, you can interpret a neuron's weights as the classical incoming weights of the following column, where the index within each previous neuron's weight list matches the index of the target neuron in the following column.
All Neurons connect to all Neurons in the following column. A weight of 0.0 represents what would classically be no connection.
There probably isn't a significant difference in performance between these two representations. The significant implementation difference is where the delay happens on channel sending-- does it happen as signals are sent, or does it happen as they are received?
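A minimal sketch of the forwarding scheme described above, assuming each neuron's incoming value has already been summed from the pre-weighted outputs of the previous column (the function name and the placement of the summation are assumptions, not the package's actual code):

    // propagateColumn applies the activator to each neuron's (already
    // pre-weighted and summed) input, then pre-weights the result for
    // every neuron in the next column. A weight of 0.0 severs that link.
    func propagateColumn(column []Neuron, inputs []float64, act ActivatorFunc, nextSize int) []float64 {
        next := make([]float64, nextSize)
        for i, neuron := range column {
            v := act(inputs[i])
            for j, w := range neuron {
                next[j] += v * w
            }
        }
        return next
    }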
type PointCrossover ¶
type PointCrossover struct {
    NumPoints int
}
Randomly determine NumPoints points at which to stitch two networks together. For each point, a position at a similar relative location along both networks is chosen to split at. This will be more consistent if neural networks cannot expand or reduce in size.
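One way to picture the point selection (an illustrative sketch only; the real stitching of columns is more involved and assumes "math/rand" and "sort"):

    // Pick NumPoints relative positions in [0, 1); a position p corresponds
    // to column int(p * float64(len(a.Body))) in one parent and
    // int(p * float64(len(b.Body))) in the other, so the split points line
    // up even if the two bodies differ in length.
    points := make([]float64, pc.NumPoints)
    for i := range points {
        points[i] = rand.Float64()
    }
    sort.Float64s(points)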
func (PointCrossover) Crossover ¶
func (pc PointCrossover) Crossover(a, b *Network) *Network
type UniformCrossover ¶
type UniformCrossover struct {
    // The proportion of neurons that are chosen
    // from the first network selected.
    // The remaining proportion, 1 - ChosenProportion,
    // comes from the other network.
    // Cannot be negative.
    ChosenProportion float64
}
Choose a bunch of random neurons from each network and make a new network out of them. I don't think this is a very good idea for neural networks, but we'll see.
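Sketched per-neuron selection under ChosenProportion (illustrative only; assumes both parent bodies share the same shape, which the real implementation may not require, and that child is a pre-sized *Network):

    // For each neuron position, take the neuron from parent a with
    // probability ChosenProportion, otherwise take it from parent b.
    // A real implementation would copy the neuron's weights rather than
    // share the underlying slice.
    for i := range child.Body {
        for j := range child.Body[i] {
            if rand.Float64() < uc.ChosenProportion {
                child.Body[i][j] = a.Body[i][j]
            } else {
                child.Body[i][j] = b.Body[i][j]
            }
        }
    }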
func (*UniformCrossover) Crossover ¶
func (uc *UniformCrossover) Crossover(a, b *Network) *Network