This post lists materials that I have found useful while learning neural networks from scratch.
Victor Zhou’s Neural Networks From Scratch
Neural Networks From Scratch is a 4-post series that introduces classic feedforward neural networks, recurrent neural networks (RNNs), and convolutional neural networks (CNNs). It doesn’t require any prior knowledge beyond some math. One good thing about this series is that it’s very hands-on: you learn, step by step, how to write a simple neural network from scratch, using only numpy, to solve a real problem.
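To give a flavor of what "from scratch with only numpy" looks like, here is a minimal sketch of my own: a one-hidden-layer network trained with plain gradient descent on the XOR toy problem. The problem, the layer sizes, and the learning rate are my choices for illustration, not code from the series.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
# XOR: a tiny toy dataset (my choice, not the problem used in the series)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)         # hidden activations
    return h, sigmoid(h @ W2 + b2)   # network output

initial_loss = np.mean((forward(X)[1] - y) ** 2)

lr = 1.0
for _ in range(10000):
    h, out = forward(X)
    # backpropagation for mean squared error with sigmoid activations
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

final_loss = np.mean((forward(X)[1] - y) ** 2)
```

The whole thing fits in a few dozen lines, which is exactly why the from-scratch approach is such a good way to internalize forward and backward passes before reaching for a framework.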
Michael Nielsen’s Visual proof that neural nets can compute any function
Visual proof that neural nets can compute any function gives a visual explanation of the universal approximation theorem: a feedforward network with a single hidden layer can approximate any continuous function to any desired precision.
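The core of the visual argument is that steep sigmoids act like step functions, and a sum of weighted steps can trace any continuous curve. The sketch below demonstrates this with a hand-built (not trained) single-hidden-layer network; the target function sin(2πx), the number of steps, and the steepness k are my choices for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

target = lambda x: np.sin(2 * np.pi * x)   # any continuous function works

n_steps, k = 100, 10000.0                  # many steps, very steep sigmoids
edges = np.linspace(0.0, 1.0, n_steps + 1)
heights = np.diff(target(edges))           # the rise each hidden unit adds

def approx(x):
    # One steep sigmoid per step edge: each is ~0 left of its edge and
    # ~1 right of it, so the weighted sum forms a staircase that
    # follows the target function.
    acts = sigmoid(k * (x[:, None] - edges[:-1]))  # shape (len(x), n_steps)
    return target(edges[0]) + acts @ heights

xs = np.linspace(0.01, 0.99, 200)
max_err = np.max(np.abs(approx(xs) - target(xs)))
```

Increasing `n_steps` shrinks the error without bound, which is the "to any desired precision" part of the theorem made concrete.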