Neural networks improvements (#89)
* MultilayerPerceptron interface changes

- Signature closer to other algorithms
- New predict method
- Remove desired error
- Move maxIterations to constructor

* MLP tests for multiple hidden layers and multi-class

* Update all MLP-related tests

* coding style fixes

* Backpropagation included in multilayer-perceptron
dmonllao authored and akondas committed May 17, 2017
1 parent 7ab80b6 commit 4af8449
Showing 18 changed files with 369 additions and 343 deletions.
3 changes: 1 addition & 2 deletions README.md
@@ -76,8 +76,7 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* Workflow
* [Pipeline](http://php-ml.readthedocs.io/en/latest/machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/backpropagation/)
* [Multilayer Perceptron Classifier](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/multilayer-perceptron-classifier/)
* Cross Validation
* [Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/random-split/)
* [Stratified Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/stratified-random-split/)
3 changes: 1 addition & 2 deletions docs/index.md
@@ -65,8 +65,7 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* Workflow
* [Pipeline](machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](machine-learning/neural-network/backpropagation/)
* [Multilayer Perceptron Classifier](machine-learning/neural-network/multilayer-perceptron-classifier/)
* Cross Validation
* [Random Split](machine-learning/cross-validation/random-split/)
* [Stratified Random Split](machine-learning/cross-validation/stratified-random-split/)
30 changes: 0 additions & 30 deletions docs/machine-learning/neural-network/backpropagation.md

This file was deleted.

50 changes: 50 additions & 0 deletions docs/machine-learning/neural-network/multilayer-perceptron-classifier.md
@@ -0,0 +1,50 @@
# MLPClassifier

A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs.

## Constructor Parameters

* $inputLayerFeatures (int) - the number of input layer features
* $hiddenLayers (array) - array with the hidden layers configuration; each value represents the number of neurons in the corresponding layer
* $classes (array) - array with the different training set classes (array keys are ignored)
* $iterations (int) - number of training iterations
* $theta (int) - network theta parameter
* $activationFunction (ActivationFunction) - neuron activation function

```
use Phpml\Classification\MLPClassifier;
$mlp = new MLPClassifier(4, [2], ['a', 'b', 'c']);
// 4 nodes in the input layer, 2 nodes in the first hidden layer, and 3 possible labels.
```
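
For illustration, a hedged variant with two hidden layers (sizes chosen arbitrarily):

```
use Phpml\Classification\MLPClassifier;

$mlp = new MLPClassifier(4, [5, 3], ['a', 'b', 'c']);
// 4 input nodes, hidden layers of 5 and 3 neurons, and 3 output neurons (one per class).
```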

## Train

To train an MLP, simply provide training samples and labels (as arrays). Example:

```
$mlp->train(
$samples = [[1, 0, 0, 0], [0, 1, 1, 0], [1, 1, 1, 1], [0, 0, 0, 0]],
$targets = ['a', 'a', 'b', 'c']
);
```

## Predict

To predict a sample's label, use the predict method. You can provide a single sample or an array of samples:

```
$mlp->predict([[1, 1, 1, 1], [0, 0, 0, 0]]);
// returns ['b', 'c']
```
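
The predict method also accepts a single sample; a hedged example:

```
$mlp->predict([1, 1, 1, 1]);
// returns a single label, e.g. 'b'
```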

## Activation Functions

* BinaryStep
* Gaussian
* HyperbolicTangent
* Sigmoid (default)
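
The activation function is passed through the constructor; a hedged example, assuming the classes listed above live in the Phpml\NeuralNetwork\ActivationFunction namespace:

```
use Phpml\Classification\MLPClassifier;
use Phpml\NeuralNetwork\ActivationFunction\HyperbolicTangent;

// 4 input features, one hidden layer of 3 neurons, 2 classes,
// 5000 iterations, and tanh activation instead of the default Sigmoid.
$mlp = new MLPClassifier(4, [3], ['a', 'b'], 5000, new HyperbolicTangent());
```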
29 changes: 0 additions & 29 deletions docs/machine-learning/neural-network/multilayer-perceptron.md

This file was deleted.

3 changes: 1 addition & 2 deletions mkdocs.yml
@@ -21,8 +21,7 @@ pages:
- Workflow:
- Pipeline: machine-learning/workflow/pipeline.md
- Neural Network:
- Multilayer Perceptron: machine-learning/neural-network/multilayer-perceptron.md
- Backpropagation training: machine-learning/neural-network/backpropagation.md
- Multilayer Perceptron Classifier: machine-learning/neural-network/multilayer-perceptron-classifier.md
- Cross Validation:
- RandomSplit: machine-learning/cross-validation/random-split.md
- Stratified Random Split: machine-learning/cross-validation/stratified-random-split.md
67 changes: 67 additions & 0 deletions src/Phpml/Classification/MLPClassifier.php
@@ -0,0 +1,67 @@
<?php

declare(strict_types=1);

namespace Phpml\Classification;

use Phpml\Classification\Classifier;
use Phpml\Exception\InvalidArgumentException;
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
use Phpml\NeuralNetwork\ActivationFunction;
use Phpml\NeuralNetwork\Layer;
use Phpml\NeuralNetwork\Node\Bias;
use Phpml\NeuralNetwork\Node\Input;
use Phpml\NeuralNetwork\Node\Neuron;
use Phpml\NeuralNetwork\Node\Neuron\Synapse;
use Phpml\Helper\Predictable;

class MLPClassifier extends MultilayerPerceptron implements Classifier
{

/**
* @param mixed $target
* @return int
*/
public function getTargetClass($target): int
{
if (!in_array($target, $this->classes)) {
throw InvalidArgumentException::invalidTarget($target);
}
return array_search($target, $this->classes);
}

/**
* @param array $sample
*
* @return mixed
*/
protected function predictSample(array $sample)
{
$output = $this->setInput($sample)->getOutput();

$predictedClass = null;
$max = 0;
foreach ($output as $class => $value) {
if ($value > $max) {
$predictedClass = $class;
$max = $value;
}
}
return $this->classes[$predictedClass];
}

/**
* @param array $sample
* @param mixed $target
*/
protected function trainSample(array $sample, $target)
{

// Feed-forward.
$this->setInput($sample)->getOutput();

// Back-propagate.
$this->backpropagation->backpropagate($this->getLayers(), $this->getTargetClass($target));
}
}
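
For reference, a small usage sketch of the new classifier (assuming php-ml's autoloader is available); a label outside the configured classes is rejected via the new invalidTarget() exception:

```
use Phpml\Classification\MLPClassifier;
use Phpml\Exception\InvalidArgumentException;

$mlp = new MLPClassifier(2, [2], ['a', 'b']);

try {
    // 'z' is not one of the configured classes, so getTargetClass() throws.
    $mlp->train([[0, 1]], ['z']);
} catch (InvalidArgumentException $e) {
    echo $e->getMessage(); // Target with value z is not part of the accepted classes
}
```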
19 changes: 18 additions & 1 deletion src/Phpml/Exception/InvalidArgumentException.php
@@ -66,6 +66,14 @@ public static function invalidClustersNumber()
return new self('Invalid clusters number');
}

/**
* @return InvalidArgumentException
*/
public static function invalidTarget($target)
{
return new self('Target with value ' . $target . ' is not part of the accepted classes');
}

/**
* @param string $language
*
@@ -89,6 +97,15 @@ public static function invalidLayerNodeClass()
*/
public static function invalidLayersNumber()
{
return new self('Provide at least 2 layers: 1 input and 1 output');
return new self('Provide at least 1 hidden layer');
}

/**
* @return InvalidArgumentException
*/
public static function invalidClassesNumber()
{
return new self('Provide at least 2 different classes');
}

}
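
The two new constructor checks surface as exceptions; a hedged illustration of both failure modes (example values only):

```
use Phpml\Classification\MLPClassifier;
use Phpml\Exception\InvalidArgumentException;

try {
    new MLPClassifier(2, [], ['a', 'b']); // no hidden layers
} catch (InvalidArgumentException $e) {
    echo $e->getMessage(); // Provide at least 1 hidden layer
}

try {
    new MLPClassifier(2, [2], ['a']); // only one class
} catch (InvalidArgumentException $e) {
    echo $e->getMessage(); // Provide at least 2 different classes
}
```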
2 changes: 1 addition & 1 deletion src/Phpml/NeuralNetwork/Network/LayeredNetwork.php
@@ -71,7 +71,7 @@ public function setInput($input)
foreach ($this->getLayers() as $layer) {
foreach ($layer->getNodes() as $node) {
if ($node instanceof Neuron) {
$node->refresh();
$node->reset();
}
}
}
82 changes: 76 additions & 6 deletions src/Phpml/NeuralNetwork/Network/MultilayerPerceptron.php
@@ -4,34 +4,93 @@

namespace Phpml\NeuralNetwork\Network;

use Phpml\Estimator;
use Phpml\Exception\InvalidArgumentException;
use Phpml\NeuralNetwork\Training\Backpropagation;
use Phpml\NeuralNetwork\ActivationFunction;
use Phpml\NeuralNetwork\Layer;
use Phpml\NeuralNetwork\Node\Bias;
use Phpml\NeuralNetwork\Node\Input;
use Phpml\NeuralNetwork\Node\Neuron;
use Phpml\NeuralNetwork\Node\Neuron\Synapse;
use Phpml\Helper\Predictable;

class MultilayerPerceptron extends LayeredNetwork
abstract class MultilayerPerceptron extends LayeredNetwork implements Estimator
{
use Predictable;

/**
* @param array $layers
* @var array
*/
protected $classes = [];

/**
* @var int
*/
private $iterations;

/**
* @var Backpropagation
*/
protected $backpropagation = null;

/**
* @param int $inputLayerFeatures
* @param array $hiddenLayers
* @param array $classes
* @param int $iterations
* @param ActivationFunction|null $activationFunction
* @param int $theta
*
* @throws InvalidArgumentException
*/
public function __construct(array $layers, ActivationFunction $activationFunction = null)
public function __construct(int $inputLayerFeatures, array $hiddenLayers, array $classes, int $iterations = 10000, ActivationFunction $activationFunction = null, int $theta = 1)
{
if (count($layers) < 2) {
if (empty($hiddenLayers)) {
throw InvalidArgumentException::invalidLayersNumber();
}

$this->addInputLayer(array_shift($layers));
$this->addNeuronLayers($layers, $activationFunction);
$nClasses = count($classes);
if ($nClasses < 2) {
throw InvalidArgumentException::invalidClassesNumber();
}
$this->classes = array_values($classes);

$this->iterations = $iterations;

$this->addInputLayer($inputLayerFeatures);
$this->addNeuronLayers($hiddenLayers, $activationFunction);
$this->addNeuronLayers([$nClasses], $activationFunction);

$this->addBiasNodes();
$this->generateSynapses();

$this->backpropagation = new Backpropagation($theta);
}

/**
* @param array $samples
* @param array $targets
*/
public function train(array $samples, array $targets)
{
for ($i = 0; $i < $this->iterations; ++$i) {
$this->trainSamples($samples, $targets);
}
}

/**
* @param array $sample
* @param mixed $target
*/
protected abstract function trainSample(array $sample, $target);

/**
* @param array $sample
* @return mixed
*/
protected abstract function predictSample(array $sample);

/**
* @param int $nodes
*/
@@ -92,4 +151,15 @@ private function generateNeuronSynapses(Layer $currentLayer, Neuron $nextNeuron)
$nextNeuron->addSynapse(new Synapse($currentNeuron));
}
}

/**
* @param array $samples
* @param array $targets
*/
private function trainSamples(array $samples, array $targets)
{
foreach ($targets as $key => $target) {
$this->trainSample($samples[$key], $target);
}
}
}
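
The base network is now an abstract Estimator that asks subclasses only for the two per-sample hooks. A hypothetical minimal subclass (sketch only; MLPClassifier above is the concrete implementation shipped in this commit, and here the targets are assumed to already be output-neuron indices):

```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;

class IndexOutputPerceptron extends MultilayerPerceptron
{
    protected function trainSample(array $sample, $target)
    {
        // Forward pass, then adjust weights toward the expected output neuron index.
        $this->setInput($sample)->getOutput();
        $this->backpropagation->backpropagate($this->getLayers(), $target);
    }

    protected function predictSample(array $sample)
    {
        // Return the index of the strongest output neuron.
        $output = $this->setInput($sample)->getOutput();
        arsort($output);

        return key($output);
    }
}
```

With, say, new IndexOutputPerceptron(2, [2], [0, 1]), the targets passed to train() would simply be 0 or 1.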
2 changes: 1 addition & 1 deletion src/Phpml/NeuralNetwork/Node/Neuron.php
@@ -68,7 +68,7 @@ public function getOutput(): float
return $this->output;
}

public function refresh()
public function reset()
{
$this->output = 0;
}
4 changes: 1 addition & 3 deletions src/Phpml/NeuralNetwork/Training.php
@@ -9,8 +9,6 @@ interface Training
/**
* @param array $samples
* @param array $targets
* @param float $desiredError
* @param int $maxIterations
*/
public function train(array $samples, array $targets, float $desiredError = 0.001, int $maxIterations = 10000);
public function train(array $samples, array $targets);
}
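
With $desiredError and $maxIterations removed from the interface, stopping criteria become an implementation detail, mirroring how MultilayerPerceptron now takes $iterations in its constructor. A hypothetical implementor sketch:

```
use Phpml\NeuralNetwork\Training;

class FixedIterationsTraining implements Training
{
    /**
     * @var int
     */
    private $iterations;

    public function __construct(int $iterations = 1000)
    {
        $this->iterations = $iterations;
    }

    public function train(array $samples, array $targets)
    {
        // The iteration budget lives in the constructor, not in train().
        for ($i = 0; $i < $this->iterations; ++$i) {
            foreach ($targets as $key => $target) {
                // A real implementation would update its model from $samples[$key] and $target here.
            }
        }
    }
}
```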