Published: Nov 21, 2019 License: MIT

NeuroNet

Go Concurrent Neural Network

How to run

Preparation
  • Put MNISTTranslate, NeuroNet, and the Bash and/or Batch file into your GOROOT directory
  • Download the MNIST Dataset
  • Put the files into the ./MNISTTranslate directory
Quick run
  • Run the program
    • Linux: ./BakkBash
    • Windows: ./BakkBatch.cmd
Custom
  • Edit ./NeuroNet/Batchfile/Batch to your specifications
  • Define what your network should look like (example in ./NeuroNet/Data/Networks/Numbers)
  • The first time (and after any changes) you need to build the binaries
    • MNISTTranslate: go build ./MNISTTranslate/MNISTTranslate.go
    • NeuroNet: go build ./NeuroNet/NeuroNet.go
  • Run MNISTTranslate
  • Put the translated files into your specified folders
    • Standard Test folder: ./NeuroNet/Data/Test
    • Standard Train folder: ./NeuroNet/Data/Train
  • Run NeuroNet

Batchfile

Each line in the batch file represents one action that the software will execute.

  • ResultFile: → Path to the result output
  • NetworkFile: → Path to the network creation file
  • PersistenceFile: → Path to the persistence file
  • TrainFile: → Path to the training data
  • TestFile: → Path to the test data
  • PreProcessing: → None / MeanSubstraction / Proportional
  • Parallel: → Number of goroutines
  • WorkerBatch: → Batch size for each worker before result merge
  • LearningRate: → Rate of learning
  • Lambda: → Lambda value for elastic net regularization
  • MinWeight: → Minimum weight for connectome initialization
  • MaxWeight: → Maximum weight for connectome initialization
  • TargetColumnsStart: → First field that is a target value
  • TargetColumnsEnd: → First field that isn’t a target value
  • Train: → Train the network x times
  • Test: → Run a test x times
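Putting the keys above together, a complete batch file might look like the following. The keys come from the list above, but every path and value here is a made-up example, not the project's shipped configuration (the real template is ./NeuroNet/Batchfile/Batch):

```
ResultFile: ./NeuroNet/Data/Results/numbers.txt
NetworkFile: ./NeuroNet/Data/Networks/Numbers
PersistenceFile: ./NeuroNet/Data/Persistence/numbers
TrainFile: ./NeuroNet/Data/Train/train.txt
TestFile: ./NeuroNet/Data/Test/test.txt
PreProcessing: Proportional
Parallel: 4
WorkerBatch: 100
LearningRate: 0.01
Lambda: 0.1
MinWeight: -0.5
MaxWeight: 0.5
TargetColumnsStart: 0
TargetColumnsEnd: 10
Train: 60000
Test: 10000
```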

Network

Each line represents one layer in the network.

Schema: Activation,Neurons

Current activation functions:

  • Identity
  • Logistic
  • TanH
  • ReLU
  • LeakyReLU
  • ELU
  • SoftMax
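As a rough sketch of what these functions compute, here are minimal Go definitions. These are my own illustrations, not NeuroNet's actual implementation; the LeakyReLU slope of 0.01 and the ELU alpha of 1 are assumed defaults:

```go
package main

import (
	"fmt"
	"math"
)

// Per-neuron activations (sketch; parameters are assumed defaults).
func identity(x float64) float64 { return x }
func logistic(x float64) float64 { return 1 / (1 + math.Exp(-x)) }
func tanH(x float64) float64     { return math.Tanh(x) }
func relu(x float64) float64     { return math.Max(0, x) }

func leakyReLU(x float64) float64 {
	if x > 0 {
		return x
	}
	return 0.01 * x // slope 0.01 is a common default, assumed here
}

func elu(x float64) float64 {
	if x > 0 {
		return x
	}
	return math.Exp(x) - 1 // alpha = 1, assumed here
}

// softMax differs from the others: it normalizes a whole layer
// into a probability distribution rather than mapping one value.
func softMax(xs []float64) []float64 {
	max := xs[0]
	for _, x := range xs {
		if x > max {
			max = x
		}
	}
	sum := 0.0
	out := make([]float64, len(xs))
	for i, x := range xs {
		out[i] = math.Exp(x - max) // subtract max for numeric stability
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	fmt.Println(relu(-2))                 // 0
	fmt.Println(leakyReLU(-2))            // -0.02
	fmt.Println(softMax([]float64{1, 1})) // [0.5 0.5]
}
```

Because SoftMax operates on the whole layer, it is typically only useful as the final (output) layer, as in the example below.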

For example, "SoftMax,10" (without quotation marks) creates one SoftMax layer with 10 neurons. Currently, a bias neuron is added to every layer except the last.
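An MNIST-sized network file could therefore look like this. The layer sizes are illustrative (784 input pixels, 10 digit classes), and the hidden-layer choice is my own; see ./NeuroNet/Data/Networks/Numbers for the shipped example:

```
Identity,784
LeakyReLU,100
SoftMax,10
```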

Directories

  • Package main contains the main function
  • Core/Brain: Package Network contains the functionality of and sets up the neural network
  • Core/Persistence: Package Persistence to save and load progress or results
