Tutorials and implementations for "Self-Normalizing Neural Networks" (SNNs) as suggested by Klambauer et al. (arXiv pre-print).
- Python 3.5 and TensorFlow 1.1

Note: TensorFlow 1.4 already ships the functions "tf.nn.selu" and "tf.contrib.nn.alpha_dropout", which implement the SELU activation function and the suggested dropout variant ("alpha dropout").
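With TensorFlow >= 1.4 these built-ins can be used directly. Below is a minimal sketch of a self-normalizing dense network using them; the layer sizes, placeholder shapes, and keep probability are illustrative assumptions, not code from this repository.

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])
keep_prob = tf.placeholder(tf.float32)  # e.g. 0.95 for training, 1.0 at test time

# SNNs need zero-mean weights with variance 1/fan_in ("LeCun normal").
initializer = tf.variance_scaling_initializer(scale=1.0, mode="fan_in",
                                              distribution="normal")

h = tf.layers.dense(x, 256, activation=tf.nn.selu,
                    kernel_initializer=initializer)
# Alpha dropout keeps mean and variance at the fixed point, unlike standard dropout.
h = tf.contrib.nn.alpha_dropout(h, keep_prob)
logits = tf.layers.dense(h, 10, kernel_initializer=initializer)
```

The following tutorials are provided: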
- Multilayer Perceptron (notebook)
- Convolutional Neural Network on MNIST (notebook)
- Convolutional Neural Network on CIFAR10 (notebook)
- KERAS: Convolutional Neural Network on MNIST (python script; a minimal Keras sketch follows this list)
- KERAS: Convolutional Neural Network on CIFAR10 (python script)
- How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (notebook; a numerical sketch follows this list)
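As a taste of the Keras scripts, here is a minimal sketch of a SELU CNN on MNIST. It is not the repository's script; the architecture and hyperparameters are illustrative. It requires Keras >= 2.0.6, where the "selu" activation, the "lecun_normal" initializer, and the AlphaDropout layer are available.

```python
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import AlphaDropout, Conv2D, Dense, Flatten

# Load MNIST and bring it into (samples, 28, 28, 1) float format.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype("float32") / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype("float32") / 255.0
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# SELU units with LeCun-normal initialization and alpha dropout
# replace the usual ReLU + batch normalization + standard dropout.
model = Sequential([
    Conv2D(32, (3, 3), activation="selu",
           kernel_initializer="lecun_normal", input_shape=(28, 28, 1)),
    Conv2D(64, (3, 3), activation="selu",
           kernel_initializer="lecun_normal"),
    Flatten(),
    AlphaDropout(0.1),
    Dense(128, activation="selu", kernel_initializer="lecun_normal"),
    Dense(10, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="sgd",
              metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=128, epochs=1,
          validation_data=(x_test, y_test))
```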
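The parameter notebook rests on the paper's fixed-point conditions for the mean and second moment of the activations. A minimal numerical sketch of that idea is given below, assuming zero-mean weights with unit second moment, so that pre-activations of inputs with mean 0 and variance nu are approximately N(0, nu); the function and variable names are my own, not the notebook's.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

def selu(z, lam, alpha):
    # lam * z for z > 0, lam * alpha * (exp(z) - 1) for z <= 0;
    # np.minimum caps the exponent in the unused branch of np.where.
    return np.where(z > 0.0, lam * z, lam * alpha * np.expm1(np.minimum(z, 0.0)))

def fixed_point_equations(params, nu=1.0):
    """Residuals of the fixed-point conditions for activations with
    mean 0 and variance nu (pre-activations approximately N(0, nu))."""
    lam, alpha = params
    gauss = lambda z: np.exp(-z * z / (2.0 * nu)) / np.sqrt(2.0 * np.pi * nu)
    mean = quad(lambda z: selu(z, lam, alpha) * gauss(z), -np.inf, np.inf)[0]
    second = quad(lambda z: selu(z, lam, alpha) ** 2 * gauss(z), -np.inf, np.inf)[0]
    return [mean, second - nu]  # zero mean; second moment equals variance nu

lam, alpha = fsolve(fixed_point_equations, x0=[1.0, 1.7])
print(lam, alpha)  # approx. 1.0507 and 1.6733 for the (0, 1) fixed point
```

For nu = 1 this recovers the standard parameters lambda ~ 1.0507 and alpha ~ 1.6733; other values of nu yield SELU parameters for other fixed points.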
Basic Python functions to implement SNNs are provided as code chunks here: selu.py (a minimal sketch of the core chunk is shown below).
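For illustration, the central chunk, the SELU activation itself with the standard parameters, might look roughly as follows; selu.py is the authoritative version and also covers the dropout variant.

```python
import tensorflow as tf

def selu(x):
    """SELU activation with the parameters of the (mean 0, variance 1) fixed point."""
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    # Linear for x >= 0, scaled ELU (alpha * (exp(x) - 1)) for x < 0.
    return scale * tf.where(x >= 0.0, x, alpha * tf.nn.elu(x))
```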
Notebooks and code to reproduce Figure 1 of the paper are provided here: Figure1
Calculations and numeric checks of the theorems are provided as Mathematica notebooks here: