Neural Network training with Adam optimizer from scratch
Full code for training and testing a simple neural network on the MNIST data set for recognition of single digits between 0 and 9 (accuracy around 98%). Everything is implemented from scratch, including the Adam optimizer. Make sure all the files are in your current folder and run "train.m".
Check out http://neuralnetworksanddeeplearning.com/index.html to learn about the theory of neural networks, and https://arxiv.org/abs/1412.6980 to understand the Adam optimizer.
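The Adam paper linked above boils down to a short per-parameter update rule: exponential moving averages of the gradient and its square, bias-corrected, then a scaled step. Below is a minimal NumPy sketch of that rule for illustration; the function name `adam_step` and its defaults (taken from the paper) are my own choices, and the repository's actual implementation is in MATLAB inside "train.m".

```python
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters theta given gradient grad at step t (t >= 1)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grad**2    # second moment: running mean of squared gradients
    m_hat = m / (1 - beta1**t)               # bias correction (moments start at zero)
    v_hat = v / (1 - beta2**t)
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

For example, repeatedly calling `adam_step` with the gradient of a simple quadratic loss drives the parameter toward its minimum, which is the same loop the network training performs over its weight matrices.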
Cite As
Johannes Langelaar (2026). Neural Network training with Adam optimizer from scratch (https://www.mathworks.com/matlabcentral/fileexchange/90461-neural-network-training-with-adam-optimizer-from-scratch), MATLAB Central File Exchange. Retrieved .
| Version | Published | Release Notes |
|---|---|---|
| 1.0.0 | | |
