# gone

A simple neural network library in Go, written from scratch. 0 dependencies\*

\*There are no neural-network-related dependencies; the only dependencies are stretchr/testify (for testing) and golang/protobuf (for model persistence).

## Example

### Getting started

```go
g := gone.New(
	0.1,        // learning rate
	gone.MSE(), // loss function
	gone.Layer{
		Nodes: 2, // input layer
	},
	gone.Layer{
		Nodes:     4, // hidden layer
		Activator: gone.Sigmoid(),
	},
	gone.Layer{
		Nodes: 1, // output layer
	},
)

// Train on the XOR truth table for 5000 epochs.
g.Train(gone.SGD(), gone.DataSet{
	{
		Inputs:  []float64{1, 0},
		Targets: []float64{1},
	},
	{
		Inputs:  []float64{0, 1},
		Targets: []float64{1},
	},
	{
		Inputs:  []float64{1, 1},
		Targets: []float64{0},
	},
	{
		Inputs:  []float64{0, 0},
		Targets: []float64{0},
	},
}, 5000)

g.Predict([]float64{1, 1})
```
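Training with `gone.SGD()` means each sample nudges every weight against the gradient of the loss, scaled by the learning rate. A minimal sketch of that update rule in plain Go (illustrative only, not the library's internals):

```go
package main

import "fmt"

// sgdStep applies one stochastic-gradient-descent update in place:
// w <- w - lr * grad, for a single flat weight vector.
func sgdStep(weights, grads []float64, lr float64) {
	for i := range weights {
		weights[i] -= lr * grads[i]
	}
}

func main() {
	w := []float64{0.5, -0.3}
	g := []float64{0.2, -0.1} // gradient of the loss w.r.t. each weight
	sgdStep(w, g, 0.1)        // learning rate 0.1, as in the example above
	fmt.Println(w)            // each weight moves against its gradient
}
```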

### Saving the model to disk

```go
g.Save("test.gone")
```

### Loading the model back into memory

```go
g, err := gone.Load("test.gone")
```

## TODO

### gone/

- Types of task:
  - Classification
  - Regression
- Bias
  - Matrix, rather than a single number
- Feedforward (Predict)
- Train
  - Support shuffling the data
  - Epochs
  - Backpropagation
  - Batching
  - Different loss functions
    - Mean Squared Error
    - Cross-Entropy Error
- Saving data - done thanks to protobuf
- Loading data
- Adam optimizer
- Nesterov + Momentum for GD
- Fix MSE computation in debug mode (not used in actual backpropagation)
- Somehow persist configurations for the Activation, Loss, and Optimizer functions in the protobuf messages (if we want to do it like TensorFlow, we'd have to use `interface{}` and type assertions)
- Convolutional Layers
  - Flatten layer

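The two loss functions on the list above penalize errors differently: MSE squares the distance to the target, while cross-entropy penalizes confident wrong probabilities. A plain-Go sketch of both (function names here are illustrative, not the library's API):

```go
package main

import (
	"fmt"
	"math"
)

// mse returns the mean squared error between predictions and targets.
func mse(pred, target []float64) float64 {
	var sum float64
	for i := range pred {
		d := pred[i] - target[i]
		sum += d * d
	}
	return sum / float64(len(pred))
}

// crossEntropy returns the cross-entropy error; targets are assumed
// one-hot and predictions are assumed to be probabilities in (0, 1).
func crossEntropy(pred, target []float64) float64 {
	var sum float64
	for i := range pred {
		sum -= target[i] * math.Log(pred[i])
	}
	return sum
}

func main() {
	fmt.Println(mse([]float64{0.9, 0.1}, []float64{1, 0}))          // ≈ 0.01
	fmt.Println(crossEntropy([]float64{0.9, 0.1}, []float64{1, 0})) // -ln(0.9) ≈ 0.105
}
```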
### matrix/

- Randomize
- Transpose
- Scale
- AddMatrix
- Add
- SubtractMatrix
- Subtract
- Multiply
- Flatten
- Unflatten
- NewFromArray - makes a single row
- Map
- Fold
- Methods to support chaining

  ```go
  n.Weights[i].
  	Multiply(output).                         // weighted sum of the previous layer
  	Add(n.Layers[i+1].Bias).                  // bias
  	Map(func(val float64, x, y int) float64 { // activation
  		return n.Layers[i+1].Activator.F(val)
  	})
  ```
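The chaining style above works because every method returns the matrix it produced or mutated. A minimal self-contained sketch of that pattern, using a toy `Matrix` type that is not the library's:

```go
package main

import "fmt"

// Matrix is a toy row-major matrix used only to illustrate method chaining.
type Matrix struct {
	Rows, Cols int
	Data       []float64
}

// Scale multiplies every entry by s and returns the matrix for chaining.
func (m *Matrix) Scale(s float64) *Matrix {
	for i := range m.Data {
		m.Data[i] *= s
	}
	return m
}

// Map applies f to every entry (passing its x, y coordinates) and
// returns the matrix, mirroring the Map step in the snippet above.
func (m *Matrix) Map(f func(v float64, x, y int) float64) *Matrix {
	for i := range m.Data {
		m.Data[i] = f(m.Data[i], i%m.Cols, i/m.Cols)
	}
	return m
}

func main() {
	m := &Matrix{Rows: 1, Cols: 2, Data: []float64{1, 2}}
	m.Scale(3).Map(func(v float64, x, y int) float64 { return v + 1 })
	fmt.Println(m.Data) // [4 7]
}
```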

## Research

- Derivatives ~
- Partial Derivatives ~
- Linear vs non-linear problems (activation function)
- Gradient Descent
  - (Batch) Gradient Descent (GD)
  - Stochastic Gradient Descent (SGD)
  - Mini-Batch Gradient Descent (MBGD?)
- Softmax (needed for multi-class classification!)
- Mean Squared Error
- Cross-Entropy Error (needed for multi-class classification!)
- How to determine how many layers and nodes to use
- One Hot Encoding
- Convolutional Layers
- Reinforcement learning
- Genetic Algorithms
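Softmax, flagged above as needed for multi-class classification, turns raw scores into a probability distribution. A plain-Go sketch of the idea (again illustrative, not part of gone):

```go
package main

import (
	"fmt"
	"math"
)

// softmax maps raw scores to probabilities that sum to 1.
// Subtracting the max score first keeps the exponentials from overflowing.
func softmax(scores []float64) []float64 {
	max := scores[0]
	for _, s := range scores[1:] {
		if s > max {
			max = s
		}
	}
	out := make([]float64, len(scores))
	var sum float64
	for i, s := range scores {
		out[i] = math.Exp(s - max)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	p := softmax([]float64{2, 1, 0})
	fmt.Println(p) // probabilities sum to 1; the largest score gets the largest share
}
```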

## Examples

## Shoutouts

- David Josephs - was of HUGE help with algebra and other ML-related questions; he also helped me spot some nasty bugs!

## Resources used
