BrabeNetz is a supervised neural network written in C++, aiming to be as fast as possible. It can effectively multithread on the CPU where needed, allocate and free memory quickly (via `malloc`/`free`), access values faster (pointer arrays instead of `std::vector`), and is well documented.
I've written two examples of using BrabeNetz in the Trainer class: training a XOR (`{0,0}=0`, `{0,1}=1`, ...) and recognizing handwritten digits.
In my XOR example, I'm using a `{2,3,1}` topology (2 input, 3 hidden, and 1 output neurons), but BrabeNetz is scalable until the hardware reaches its limits. The digit recognizer uses a `{784,500,100,10}` network to train on handwritten digits from the MNIST database.
Be sure to read the network description, and check out my digit recognizer written in Qt (using a BrabeNetz network trained on the MNIST dataset)
Build: Release x64 | Windows 10 64bit
CPU: Intel i7 6700k @ 4.0GHz x 8 cores
RAM: HyperX Fury DDR4 32GB CL14 2400MHz
SSD: Samsung 850 EVO 540MB/s
Commit: 53328c3
Training a XOR 1000 times takes just 0.49ms
Actual prediction of the digit recognizer network
Effectively using all available cores (24/24, 100% workload)
BrabeNetz running on Linux (Debian 9, Linux 4.9.62, KDE Plasma)
Task Resource viewer (htop) on Linux (Debian 9, Linux 4.9.62, KDE Plasma)
- Faster algorithms via `malloc`/`free` instead of `new`/`delete`, and pointers instead of `std::vector`
- Smart multithreading by OpenMP where it is worth the spawn overhead
- Scalability (neuron size, layer count) - only limited by hardware
- Easy to use (inputs, outputs)
- Randomly generated values to begin with
- Easy binary save/load with `network::save(string)`/`network::load(string)` (`state.nn` file)
- Sigmoid squashing function
- Biases for each neuron
- `network_topology` helper objects for loading/saving state and inspecting the network
- **Build library**
  - Download/clone from GitHub and change custom definitions (see this for more info)
  - Open a Developer Command Prompt for Visual Studio and navigate to the `BrabeNetz\BrabeNetz` folder
  - Run `msbuild BrabeNetz.vcxproj /p:Configuration=Release /p:Platform=x64` (use the configuration and platform you need)
  - Link the library (in `BrabeNetz\BrabeNetz\x64\Release`) to your project
  - Add the headers to your project (every file ending with `.h` in `BrabeNetz\BrabeNetz`)
- **Constructors**
  - `network(initializer_list<int>)`: Create a new neural network with the given topology vector and fill it with random values (`{ 2, 3, 4, 1 }` = 2 input, 3 hidden, 4 hidden, 1 output neurons - 4 layers in total)
  - `network(network_topology&)`: Create a new neural network with the given network topology and load its values
  - `network(string)`: Create a new neural network with the given path to a `state.nn` file and load it
- **Functions**
  - `double* feed(double* input_values)`: Feed the network `input_values` and return an array of output values (the array's length equals the size of the output layer in the topology)
  - `double* train(double* input_values, double* expected_output, double& out_total_error)`: Feed the network `input_values` and backwards-propagate to adjust the weights/biases and reduce the error. Returns the output layer's values; `out_total_error` is set to the total error of the output layer (this can be used to check whether more training is needed)
  - `void save(string path)`: Save the current network state (topology, weights, biases) to disk (to the given path, or `state.nn` by default)
  - `void set_learnrate(double value)`: Set the learn rate of the network (used by the `train(..)` function). Should be either a constant (e.g. `0.5`) or `1 / (total train times + 1)`
  - `network_topology& build_topology()`: Build and set the network topology object of the current network's state (can be used for network visualization or similar)
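Putting the constructors and functions above together, a typical XOR training session might look like the sketch below. This assumes the BrabeNetz headers are added to your project; the `network.h` header name is an assumption, so include whichever header declares the `network` class in your copy:

```cpp
#include "network.h"  // assumed header name - use the actual BrabeNetz header

int main() {
    network net { 2, 3, 1 };           // {2,3,1} topology, randomly initialized
    net.set_learnrate(0.5);            // constant learn rate, as suggested above

    double inputs[4][2]   = { {0,0}, {0,1}, {1,0}, {1,1} };
    double expected[4][1] = { {0},   {1},   {1},   {0}   };

    double total_error = 0.0;
    for (int epoch = 0; epoch < 1000; epoch++)
        for (int i = 0; i < 4; i++)
            net.train(inputs[i], expected[i], total_error);

    double* out = net.feed(inputs[1]); // should approach 1 for {0,1}
    net.save("state.nn");              // persist topology, weights, biases
    return 0;
}
```

`total_error` can be checked between epochs to stop training early once it falls below a threshold.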