The Complete Machine Learning Course with Python

English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 18.5 Hours | 2.93 GB

Build a Portfolio of 12 Machine Learning Projects with Python, SVM, Regression, Unsupervised Machine Learning & More!

Build Powerful Machine Learning Models to Solve Any Problem

You’ll go from beginner to advanced, and your instructor will build each algorithm with you step by step on screen.

By the end of the course, you will have trained machine learning algorithms to classify flowers, predict house prices, recognise handwritten digits, identify staff who are most likely to leave prematurely, detect cancer cells and much more!

Inside the course, you’ll learn how to:

  • Set up a Python development environment correctly
  • Gain a complete machine learning toolset to tackle most real-world problems
  • Understand the performance metrics for regression, classification and other ML algorithms, such as R-squared, MSE, accuracy, the confusion matrix, precision and recall, and know when to use each
  • Combine multiple models by bagging, boosting or stacking
  • Make use of unsupervised machine learning (ML) algorithms such as hierarchical clustering and k-means clustering to understand your data
  • Develop in Jupyter (IPython) notebooks, Spyder and other IDEs
  • Communicate visually and effectively with Matplotlib and Seaborn
  • Engineer new features to improve algorithm predictions
  • Make use of train/test splits, K-fold and stratified K-fold cross-validation to select the correct model and estimate how it will perform on unseen data (see the sketch after this list)
  • Use SVM for handwriting recognition and classification problems in general
  • Use decision trees to predict staff attrition
  • Apply association rules to retail shopping datasets
  • And much much more!
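
The workflow these outcomes describe is essentially the standard scikit-learn loop. As a rough illustration (not the course's own code), here is a minimal sketch using the bundled iris data: a stratified train/test split, an accuracy score and confusion matrix, and stratified K-fold cross-validation to estimate performance on unseen data.

```python
# Illustrative sketch only: the iris data and LogisticRegression are stand-ins
# for the many datasets and models covered in the course.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score, train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

print("Hold-out accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))

# Stratified k-fold cross-validation estimates performance on unseen data.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
print("CV accuracy:", cross_val_score(model, X, y, cv=cv).mean())
```
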
Table of Contents

Introduction
1 What Does the Course Cover
2 How to Succeed in This Course
3 Project Files

Getting Started with Anaconda
4 Windows OS Downloading Installing Anaconda
5 Windows OS Managing Environment
6 Mac OS Instructions on Installing Anaconda and Managing Environment
7 Practice Activity Create a New Environment
8 Navigating the Spyder and Jupyter Notebook Interface
9 Downloading the IRIS Datasets
10 Data Exploration and Analysis
11 Presenting Your Data
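
Lectures 9–11 revolve around loading, exploring and presenting the IRIS data. A minimal sketch of that kind of exploration, assuming the copy bundled with a recent scikit-learn (the course downloads its own files) and using pandas, Matplotlib and Seaborn:

```python
# Illustrative sketch: summary statistics and a pair plot of the iris data.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris(as_frame=True)   # requires scikit-learn >= 0.23
df = iris.frame                   # features plus a numeric 'target' column

print(df.describe())                 # per-feature summary statistics
print(df.groupby("target").mean())   # per-species feature means

sns.pairplot(df, hue="target")       # pairwise scatter plots coloured by species
plt.show()
```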

Regression
12 Introduction
13 Categories of Machine Learning
14 Working with Scikit-Learn
15 Boston Housing Data – EDA
16 Correlation Analysis and Feature Selection
17 Simple Linear Regression Modelling with Boston Housing Data
18 Robust Regression
19 Evaluate Model Performance
20 Multiple Regression with statsmodels
21 Multiple Regression and Feature Importance
22 Ordinary Least Square Regression and Gradient Descent
23 Regularised Method for Regression
24 Polynomial Regression
25 Dealing with Non-linear relationships
26 Feature Importance Revisited
27 Data Pre-Processing 1
28 Data Pre-Processing 2
29 Variance Bias Trade Off – Validation Curve
30 Variance Bias Trade Off – Learning Curve
31 Cross Validation
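
As a rough companion to the regression lectures above: the Boston Housing data has been removed from recent scikit-learn releases, so this sketch substitutes the California housing data to show the same ideas (fitting a linear model, R-squared and MSE, a regularised model, and cross-validation).

```python
# Illustrative sketch only; California housing stands in for the Boston data.
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import mean_squared_error, r2_score

X, y = fetch_california_housing(return_X_y=True)   # downloads and caches on first use
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

lin = LinearRegression().fit(X_train, y_train)
pred = lin.predict(X_test)
print("R-squared:", r2_score(y_test, pred))
print("MSE:", mean_squared_error(y_test, pred))

# A regularised model (Ridge) evaluated with k-fold cross-validation.
print("Ridge CV R-squared:", cross_val_score(Ridge(alpha=1.0), X, y, cv=5).mean())
```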

Classification
32 Introduction
33 Logistic Regression 1
34 Logistic Regression 2
35 MNIST Project 1 – Introduction
36 MNIST Project 2 – SGDClassifier
37 MNIST Project 3 – Performance Measures
38 MNIST Project 4 – Confusion Matrix Precision Recall and F1 Score
39 MNIST Project 5 – Precision and Recall Tradeoff
40 MNIST Project 6 – The ROC Curve
41 MNIST Exercise
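
The MNIST project above pairs an SGDClassifier with the usual performance measures. This sketch is illustrative only: it uses scikit-learn's small digits data rather than the full MNIST download, and frames a binary "is it a 5?" detector, which is one common way to demonstrate precision, recall and F1.

```python
# Illustrative sketch: binary digit detection with SGDClassifier and its metrics.
from sklearn.datasets import load_digits
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, precision_score, recall_score, f1_score

X, y = load_digits(return_X_y=True)
y_is_5 = (y == 5)                      # binary task: "is this digit a 5?"

clf = SGDClassifier(random_state=42)
y_pred = cross_val_predict(clf, X, y_is_5, cv=3)   # out-of-fold predictions

print("Confusion matrix:\n", confusion_matrix(y_is_5, y_pred))
print("Precision:", precision_score(y_is_5, y_pred))
print("Recall:   ", recall_score(y_is_5, y_pred))
print("F1 score: ", f1_score(y_is_5, y_pred))
```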

Support Vector Machine (SVM)
42 Introduction
43 Support Vector Machine (SVM) Concepts
44 Linear SVM Classification
45 Polynomial Kernel
46 Gaussian Radial Basis Function
47 Support Vector Regression
48 Advantages and Disadvantages of SVM
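
For the SVM section above, a minimal sketch comparing the linear, polynomial and RBF (Gaussian radial basis function) kernels on the bundled digits data, which stands in here for the handwriting-recognition project:

```python
# Illustrative sketch: SVC with three kernels, scaled features, cross-validated accuracy.
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

for kernel in ("linear", "poly", "rbf"):
    # Scaling matters for SVMs because the kernel works on distances between samples.
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, gamma="scale"))
    score = cross_val_score(clf, X, y, cv=3).mean()
    print(f"{kernel:>6} kernel accuracy: {score:.3f}")
```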

Tree
49 Introduction
50 What is a Decision Tree
51 Training a Decision Tree
52 Visualising a Decision Tree
53 Decision Tree Learning Algorithm
54 Decision Tree Regression
55 Overfitting and Grid Search
56 Where to From Here
57 Project HR – Loading and Preprocessing Data
58 Project HR – Modelling
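
For the decision-tree section above, an illustrative sketch of training a tree and using a grid search to keep it from overfitting. The HR attrition data used in the project is a separate download, so scikit-learn's breast cancer data stands in as a binary-classification example:

```python
# Illustrative sketch: decision tree with GridSearchCV over depth and leaf size.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

# Deeper trees fit the training data more closely but are more likely to overfit.
params = {"max_depth": [2, 4, 6, None], "min_samples_leaf": [1, 5, 20]}
search = GridSearchCV(DecisionTreeClassifier(random_state=42), params, cv=5)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
print("Test accuracy:  ", search.score(X_test, y_test))
```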

Ensemble Machine Learning
59 Introduction
60 Ensemble Learning Methods Introduction
61 Bagging Part 1
62 Bagging Part 2
63 Random Forests
64 Extra-Trees
65 AdaBoost
66 Gradient Boosting Machine
67 XGBoost
68 Project HR – Human Resources Analytics
69 Ensemble of ensembles Part 1
70 Ensemble of ensembles Part 2
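
For the ensemble section above, a rough sketch comparing bagging, random forests, AdaBoost and gradient boosting on one stand-in dataset, then combining them by majority vote as a simple "ensemble of ensembles". XGBoost is a separate package and is left out to keep the sketch dependency-free.

```python
# Illustrative sketch: four scikit-learn ensembles plus a hard-voting combination.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

models = {
    "bagging": BaggingClassifier(random_state=42),
    "random_forest": RandomForestClassifier(random_state=42),
    "adaboost": AdaBoostClassifier(random_state=42),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
}
for name, model in models.items():
    print(f"{name:>17}: {cross_val_score(model, X, y, cv=5).mean():.3f}")

# A simple "ensemble of ensembles": combine the four models by majority vote.
voter = VotingClassifier(estimators=list(models.items()), voting="hard")
print(f"{'voting_ensemble':>17}: {cross_val_score(voter, X, y, cv=5).mean():.3f}")
```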

k-Nearest Neighbours (kNN)
71 kNN Introduction
72 kNN Concepts
73 kNN and Iris Dataset Demo
74 Distance Metric
75 Project Cancer Detection Part 1
76 Project Cancer Detection Part 2
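
For the kNN section above, a minimal sketch on scikit-learn's breast cancer data (a stand-in for the cancer-detection project). Features are scaled first, since kNN depends on a distance metric, and a few values of k are compared:

```python
# Illustrative sketch: scaled kNN with several neighbourhood sizes.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

for k in (1, 5, 15):
    clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
    print(f"k={k:>2}: accuracy {cross_val_score(clf, X, y, cv=5).mean():.3f}")
```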

Dimensionality Reduction
77 Introduction
78 Dimensionality Reduction Concept
79 PCA Introduction
80 Dimensionality Reduction Demo
81 Project Wine 1 Dimensionality Reduction with PCA
82 Project Abalone
83 Project Wine 2 Choosing the Number of Components
84 Kernel PCA
85 Kernel PCA Demo
86 LDA – Comparison between LDA and PCA
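
For the dimensionality-reduction section above, a short sketch of PCA on the bundled wine data, using the explained-variance ratio to choose the number of components:

```python
# Illustrative sketch: PCA with the explained-variance ratio guiding component choice.
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_scaled = StandardScaler().fit_transform(X)   # PCA is sensitive to feature scale

pca = PCA().fit(X_scaled)
print("Explained variance per component:", pca.explained_variance_ratio_.round(3))

# Keep just enough components to explain 95% of the variance.
pca_95 = PCA(n_components=0.95).fit(X_scaled)
print("Components kept for 95% variance:", pca_95.n_components_)
```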

Unsupervised Learning Clustering
87 Introduction
88 Clustering Concepts
89 MLxtend
90 Ward's Agglomerative Hierarchical Clustering
91 Truncating Dendrogram
92 k-Means Clustering
93 Elbow Method
94 Silhouette Analysis
95 Mean Shift
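
For the clustering section above, a minimal sketch of k-means on the iris features (labels ignored, since clustering is unsupervised), with the elbow method (inertia) and silhouette analysis used to pick the number of clusters:

```python
# Illustrative sketch: k-means with inertia and silhouette scores for several k.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)              # labels are ignored
X_scaled = StandardScaler().fit_transform(X)

for k in range(2, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=42).fit(X_scaled)
    print(f"k={k}: inertia={km.inertia_:.1f}, "
          f"silhouette={silhouette_score(X_scaled, km.labels_):.3f}")
```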