tinynn is a lightweight deep learning framework written in Python 3, built on NumPy.
pip install tinynn
# clone the repository to get the example scripts
git clone https://github.com/borgwang/tinynn.git
cd tinynn/examples
# MNIST classification
python mnist/run.py
# a toy regression task
python nn_paint/run.py
# reinforcement learning demo (requires the gym package)
python rl/run.py
# define a model
# (import paths below are assumptions taken from the example scripts and may differ by version)
from tinynn.core.layer import Dense, ReLU
from tinynn.core.loss import MSE
from tinynn.core.model import Model
from tinynn.core.net import Net
from tinynn.core.optimizer import Adam

net = Net([Dense(50), ReLU(), Dense(100), ReLU(), Dense(10)])
model = Model(net=net, loss=MSE(), optimizer=Adam(lr=1e-3))

# train: forward pass, back-propagation, then a parameter update for every mini-batch
for batch in iterator(train_x, train_y):
    preds = model.forward(batch.inputs)
    loss, grads = model.backward(preds, batch.targets)
    model.apply_grads(grads)
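
The loop above assumes train_x and train_y are NumPy arrays and that iterator yields batches exposing inputs and targets. A minimal sketch of how that could be wired up with the BatchIterator utility used by the example scripts (its import path and batch_size argument are assumptions), and of running predictions on a held-out test_x array once training is done:

import numpy as np
from tinynn.utils.data_iterator import BatchIterator  # assumed module path

iterator = BatchIterator(batch_size=32)  # yields mini-batches of (inputs, targets)

# inference is just a forward pass without back-propagation
test_preds = model.forward(test_x)
test_labels = np.argmax(test_preds, axis=1)  # predicted class per test sample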
- layers: Dense, Conv2D, ConvTranspose2D, RNN, MaxPool2D, Dropout, BatchNormalization
- activations: ReLU, LeakyReLU, Sigmoid, Tanh, Softplus
- losses: SoftmaxCrossEntropy, SigmoidCrossEntropy, MAE, MSE, Huber
- optimizers: RAdam, Adam, SGD, Momentum, RMSProp, Adagrad, Adadelta (see the sketch after this list for composing these)
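
These components are interchangeable: a net is an ordered list of layers and activations, and any loss can be paired with any optimizer. Below is a minimal sketch of a 10-class classifier built from the components listed above; the import paths, the keep_prob argument of Dropout, and the lr keyword of SGD are assumptions, so check the corresponding docstrings.

from tinynn.core.layer import Dense, Dropout, ReLU  # assumed module paths
from tinynn.core.loss import SoftmaxCrossEntropy
from tinynn.core.model import Model
from tinynn.core.net import Net
from tinynn.core.optimizer import SGD

net = Net([
    Dense(200), ReLU(),
    Dropout(keep_prob=0.5),  # assumed argument name
    Dense(100), ReLU(),
    Dense(10),               # one output unit per class
])
model = Model(net=net, loss=SoftmaxCrossEntropy(), optimizer=SGD(lr=1e-2))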
Please follow the Google Python Style Guide when contributing code.
tinynn is released under the MIT License.