What's Next?

You have implemented backpropagation, multi-layer networks, weight initialization, regularization, and Adam from scratch. Here are the natural next steps:

  • MicroGPT — Build a transformer language model on top of an autograd engine. The Value class and backprop you just mastered are exactly what drive it
  • Convolutional Networks — 2D convolution, pooling layers, and feature maps for image classification
  • Recurrent Networks — LSTMs and GRUs for sequences, using the same backprop-through-time principle
  • PyTorch — Now that you understand what autograd does internally, using PyTorch will feel natural rather than magical
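As a reminder of the core idea all of these directions build on, here is a minimal sketch of a scalar autograd Value class in the micrograd style referenced above. This is an illustrative reconstruction, not the course's exact code: the class name `Value` matches the text, but the specific methods shown (`__add__`, `__mul__`, `backward`) are assumptions about a typical implementation.

```python
class Value:
    """A scalar that records the operations producing it, so gradients
    can flow backward through the resulting computation graph."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: each input's gradient scales by the other's value
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule
        # from the output back toward the leaves
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# usage: c = a*b + a, so dc/da = b + 1 and dc/db = a
a, b = Value(2.0), Value(3.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

PyTorch's `requires_grad` tensors and `.backward()` follow this same pattern, just vectorized and in C++, which is why the transition tends to feel natural after building it by hand.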

Further Reading

  • Deep Learning by Goodfellow, Bengio & Courville — The comprehensive textbook, freely available online
  • Neural Networks and Deep Learning by Michael Nielsen — Excellent visual explanations of backprop
  • Andrej Karpathy's backprop video — Builds micrograd live, using the same approach as this course
  • The Elements of Statistical Learning — Chapter 11 covers neural networks in the statistical framework