English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 55 lectures (9h 39m) | 2.29 GB

Get An Intuitive Understanding of Deep Learning

Are you interested in Artificial Intelligence (AI), Machine Learning, and Artificial Neural Networks?

Are you afraid of getting started with Deep Learning because it sounds too technical?

Have you been watching Deep Learning videos, but still don’t feel like you “get” it?

I’ve been there myself! I don’t have an engineering background. I learned to code on my own. But AI still seemed completely out of reach.

This course was built to save you many months of frustration trying to decipher Deep Learning. After taking this course, you’ll feel ready to tackle more advanced, cutting-edge topics in AI.

In this course:

We assume as little prior knowledge as possible. No engineering or computer science background required (except for basic Python knowledge). You don’t know all the math needed for Deep Learning? That’s OK. We’ll go through it all together – step by step.

We’ll “reinvent” a deep neural network so you’ll have an intimate knowledge of the underlying mechanics. This will make you feel more comfortable with Deep Learning and give you an intuitive feel for the subject.
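To give a taste of what "reinventing" the underlying mechanics looks like, here is a hypothetical sketch (not the course's actual code): fitting a one-weight linear model to data by gradient descent on MSE loss, in plain Python with no libraries.

```python
# Toy example: learn w in y = w * x by gradient descent on MSE loss.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # true relationship: y = 2x

def mse(w):
    # mean squared error between predictions w*x and targets y
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w = 0.0                      # start from an arbitrary weight
lr = 0.01                    # learning rate
for _ in range(200):
    # analytic gradient of MSE with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad           # one gradient descent step

print(round(w, 3))  # → 2.0
```

A full deep network adds more weights, more layers, and nonlinearities, but the training loop is this same loop at heart.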

We’ll also build a basic neural network from scratch in PyTorch and PyTorch Lightning and train an MNIST model for handwritten digit recognition.
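As a rough idea of where the hands-on section ends up (a hedged sketch, not the course's actual code), an MNIST-style classifier and a single training step in plain PyTorch might look like this; the model name and layer sizes here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MNISTClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),         # 28x28 image -> 784-dim vector
            nn.Linear(784, 128),  # hidden layer
            nn.ReLU(),            # activation function (nonlinearity)
            nn.Linear(128, 10),   # one logit per digit class 0-9
        )

    def forward(self, x):
        return self.net(x)

model = MNISTClassifier()
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# One training step on a random batch standing in for real MNIST data.
images = torch.randn(32, 1, 28, 28)
labels = torch.randint(0, 10, (32,))
logits = model(images)          # forward pass
loss = loss_fn(logits, labels)  # how wrong are we?
optimizer.zero_grad()
loss.backward()                 # backward pass: compute gradients
optimizer.step()                # update the weights
```

PyTorch Lightning then wraps this same forward/loss/backward/step pattern into a `LightningModule` so the boilerplate is handled for you.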

After taking this course:

You’ll finally feel you have an “intuitive” understanding of Deep Learning and feel confident expanding your knowledge further.

If you go back to the popular courses you had trouble understanding before (like Andrew Ng’s courses or Jeremy Howard’s fast.ai course), you’ll be pleasantly surprised at how much more you can understand.

You’ll be able to understand what experts like Geoffrey Hinton say in articles, or what Andrej Karpathy said during Tesla’s Autonomy Day.

You’ll be well equipped, both practically and theoretically, to start exploring more advanced neural network architectures like Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Transformers, to dig deeper into supervised and unsupervised learning, and to begin your journey toward the cutting edge of AI.

You can start experimenting with your own AI projects using PyTorch and supervised learning.

What you’ll learn

- Develop an intuitive understanding of Deep Learning
- Visual and intuitive understanding of core math concepts behind Deep Learning
- Detailed view of how exactly deep neural networks work under the hood
- Computational graphs (which libraries like PyTorch and TensorFlow are built on)
- Build neural networks from scratch using PyTorch and PyTorch Lightning
- You’ll be ready to explore the cutting edge of AI and more advanced neural networks like CNNs, RNNs and Transformers
- You’ll be able to understand what deep learning experts are talking about in articles and interviews
- You’ll be able to start experimenting with your own AI projects using PyTorch
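The computational-graphs point above deserves a concrete taste: reverse-mode automatic differentiation, the idea underneath PyTorch's autograd and TensorFlow's gradient tape, can be sketched in a few dozen lines of plain Python. This is toy teaching code with hypothetical names, not a real library.

```python
class Value:
    """A node in a computational graph that records how to backpropagate."""
    def __init__(self, data):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # pushes this node's grad to parents
        self._parents = ()

    def __add__(self, other):
        out = Value(self.data + other.data)
        out._parents = (self, other)
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data)
        out._parents = (self, other)
        def _backward():
            self.grad += other.data * out.grad  # chain rule: d(ab)/da = b
            other.grad += self.data * out.grad  # d(ab)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topological order so each node's grad is complete before it is used.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Forward pass builds the graph; backward pass fills in the gradients.
a, b = Value(3.0), Value(4.0)
y = a * b + a      # y = 15.0
y.backward()
print(a.grad, b.grad)  # dy/da = b + 1 = 5.0, dy/db = a = 3.0
```

Real frameworks vectorize this over tensors and fuse operations, but the bookkeeping is the same: a forward pass records the graph, and a backward pass walks it in reverse applying the chain rule.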

## Table of Contents

**Deep learning – the big picture**

1 Introduction

2 What is Machine Learning exactly

3 Different types of machine learning: supervised, unsupervised, and reinforcement

4 The big picture

5 Deep neural network as features and weights

6 Loss functions and training vs inference

7 Why deep learning is unintuitive and how to get good at it

8 How to make neural networks feel intuitive

9 Course overview

**Reinventing deep neural network from scratch**

10 Linear regression and MSE loss

11 Numerical analysis – a.k.a. “trial-and-error”

12 Network view

13 Perceptrons

14 The “Deep” in deep learning

15 Activation Function

16 Overparameterization and overfitting

17 Linear Algebra detour

18 Vectorization (= parallelization)

19 Scalability and emergent properties

20 Recap of the forward pass and brief introduction to backward pass

**How the model learns on its own – Back Propagation algorithm deep-dive**

21 The back propagation algorithm

22 Calculus detour

23 Calculus detour II

24 Gradient descent

25 Calculus detour – partial derivatives and gradient descent

26 Calculus detour – the Chain Rule

27 Calculus detour – the Chain Rule II

28 Computational graph I – forward pass

29 Computational graph II – backward pass

30 Computational graph III – backward pass II

31 Computational graph IV – backward pass III

32 Forward and backward pass recap and wrap up

**How to make neural networks work in reality**

33 Vanishing gradient problem

34 Vanishing gradient solutions I

35 Vanishing gradient solutions II

36 Stochastic and mini-batch gradient descent

37 Other optimizers I

38 Other optimizers II

39 Hyperparameter tuning strategies

40 Batch normalization

41 Overfitting I – problem and solution overview

42 Overfitting II – regularization and drop out

43 Softmax activation

44 Loss functions

45 Cross entropy loss

**Coding deep neural networks in PyTorch and PyTorch Lightning**

46 Setting up a coding environment using Anaconda and Jupyter Notebook in VS Code

47 Train an MNIST model from scratch in plain PyTorch I

48 Train an MNIST model from scratch in plain PyTorch II

49 Train an MNIST model from scratch in plain PyTorch III

50 Train an MNIST model from scratch in plain PyTorch IV

51 Train an MNIST model using PyTorch’s nn module I

52 Train an MNIST model using PyTorch’s nn module II

53 Train an MNIST model using PyTorch Lightning I

54 Train an MNIST model using PyTorch Lightning II

55 Next steps
