
Multi-layer-Perceptron

A multi-layer perceptron implemented from scratch with NumPy.


The MLP in model.py implements backpropagation, regularization, and activation functions such as ReLU and softmax.

It uses no frameworks other than NumPy.
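
The sketch below illustrates the general approach (one hidden layer, ReLU, softmax, cross-entropy loss, L2 regularization, plain gradient descent) in pure NumPy. The class name `TinyMLP` and all hyperparameters are illustrative assumptions, not the actual API of model.py.

```python
import numpy as np

# Illustrative sketch only -- not the repo's model.py API.

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class TinyMLP:
    def __init__(self, n_in, n_hidden, n_out, reg=1e-4, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # He initialization for ReLU layers
        self.W1 = rng.normal(0, np.sqrt(2.0 / n_in), (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, np.sqrt(2.0 / n_hidden), (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.reg, self.lr = reg, lr

    def forward(self, X):
        self.z1 = X @ self.W1 + self.b1
        self.a1 = relu(self.z1)
        self.probs = softmax(self.a1 @ self.W2 + self.b2)
        return self.probs

    def backward(self, X, y):
        # y: integer class labels; softmax + cross-entropy gradient is (p - onehot) / N
        n = X.shape[0]
        d_logits = self.probs.copy()
        d_logits[np.arange(n), y] -= 1.0
        d_logits /= n

        dW2 = self.a1.T @ d_logits + self.reg * self.W2    # L2 regularization term
        db2 = d_logits.sum(axis=0)
        d_hidden = (d_logits @ self.W2.T) * (self.z1 > 0)  # ReLU gradient
        dW1 = X.T @ d_hidden + self.reg * self.W1
        db1 = d_hidden.sum(axis=0)

        # Plain gradient-descent update
        self.W1 -= self.lr * dW1; self.b1 -= self.lr * db1
        self.W2 -= self.lr * dW2; self.b2 -= self.lr * db2

if __name__ == "__main__":
    # Toy batch with MNIST-like shapes (784 inputs, 10 classes)
    X = np.random.rand(64, 784)
    y = np.random.randint(0, 10, size=64)
    mlp = TinyMLP(784, 128, 10)
    for _ in range(10):
        mlp.forward(X)
        mlp.backward(X, y)
    print("accuracy on toy batch:", (mlp.forward(X).argmax(1) == y).mean())
```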

It reaches around 95% accuracy on MNIST.

Enjoy the code.