Learn how to code Neural Networks from scratch with one of the greats


Andrej Karpathy was the director of artificial intelligence at Tesla and is a founding member of the research lab OpenAI.

Relatively few people know that he recently started a YouTube channel filled with interesting and important material.

Today, I'd like to highlight this video:

What is it?

Andrej walks you through building micrograd, a small library designed to mimic the API of PyTorch, a popular deep learning framework. This is not a theoretical exercise: by the time you've finished, you'll have built a library that lets you construct neural networks and perform backpropagation.
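
To give you a feel for where the video ends up, here is a minimal sketch of the kind of API you build (the names follow the micrograd repository; the details are assembled step by step over the course of the lecture):

```python
from micrograd.engine import Value

# Scalar values that remember how they were produced
a = Value(2.0)
b = Value(-3.0)
c = a * b + b**2        # builds a small computation graph
d = c.relu()

# Back-propagation: fills in .grad for every Value in the graph
d.backward()
print(a.grad, b.grad)   # gradients of d with respect to a and b
```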

Who is this for?

It is useful for anybody who wants to better understand how deep learning works. This might not be for beginners: if you're just starting out, I'd recommend you focus more on practical exercises that can produce results.

What problem does it solve?

It helps you understand what PyTorch is doing under the hood (see the short comparison after the list below). Personally, this:

  • Improved my intuition for solving problems with PyTorch
  • Helped me understand how to implement research papers in PyTorch
  • Helped me troubleshoot issues I run into with PyTorch
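
As a rough illustration (assuming PyTorch is installed), the same scalar gradient computation from the micrograd sketch above looks like this in PyTorch itself, which is exactly the parallel the video helps you see:

```python
import torch

# The same computation as the micrograd sketch, using PyTorch autograd
a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(-3.0, requires_grad=True)
c = a * b + b**2
d = torch.relu(c)

d.backward()            # back-propagate through the graph PyTorch recorded
print(a.grad, b.grad)   # tensor(-3.), tensor(-4.)
```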