Machine Learning: Natural Language Processing in Python (V2)

English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 152 lectures (22h 5m) | 6.53 GB

NLP: Use Markov Models, NLTK, Artificial Intelligence, Deep Learning, Machine Learning, and Data Science in Python

Welcome to Machine Learning: Natural Language Processing in Python (Version 2).

This is a massive 4-in-1 course covering:

1) Vector models and text preprocessing methods

2) Probability models and Markov models

3) Machine learning methods

4) Deep learning and neural network methods

In part 1, which covers vector models and text preprocessing methods, you will learn why vectors are so essential in data science and artificial intelligence. You will learn various techniques for converting text into vectors, such as CountVectorizer and TF-IDF, and you'll learn the basics of neural embedding methods like word2vec and GloVe.
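
To make that concrete, here is a minimal sketch of CountVectorizer and TF-IDF using scikit-learn (the three toy documents are illustrative assumptions, not the course's data):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

# Toy documents (illustrative only)
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs can be friends",
]

# CountVectorizer: one row per document, one column per vocabulary word, raw counts
counts = CountVectorizer().fit_transform(docs)
print(counts.toarray())

# TF-IDF: downweights words that appear in many documents (like "the")
tfidf = TfidfVectorizer().fit_transform(docs)
print(tfidf.toarray().round(2))
```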

You’ll then apply what you’ve learned to various tasks, such as:

  • Text classification
  • Document retrieval / search engine
  • Text summarization

Along the way, you’ll also learn important text preprocessing steps, such as tokenization, stemming, and lemmatization.
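
As a quick illustration, here is a minimal sketch of tokenization, stemming, and lemmatization with NLTK (the sample sentence is an illustrative assumption, and the resource names in the download calls may vary by NLTK version):

```python
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer
from nltk.tokenize import word_tokenize

# One-time resource downloads (names can differ across NLTK versions)
nltk.download("punkt")
nltk.download("wordnet")

sentence = "The striped bats were hanging on their feet"
tokens = word_tokenize(sentence.lower())  # tokenization

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

for tok in tokens:
    # Stemming chops suffixes heuristically; lemmatization maps to a dictionary form
    print(tok, stemmer.stem(tok), lemmatizer.lemmatize(tok))
```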

You’ll be introduced briefly to classic NLP tasks such as parts-of-speech tagging.

In part 2, which covers probability models and Markov models, you’ll learn about one of the most important models in data science and machine learning of the past 100 years. It has been applied in many areas beyond NLP, such as finance, bioinformatics, and reinforcement learning.

In this course, you’ll see how such probability models can be used in various ways, such as:

  • Building a text classifier
  • Article spinning
  • Text generation (generating poetry)

Importantly, these methods are an essential prerequisite for understanding how the latest Transformer (attention) models, such as BERT and GPT-3, work. Specifically, we’ll learn about two important tasks that correspond to the pre-training objectives of BERT and GPT.
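
For a flavor of what part 2 covers, here is a minimal sketch of a bigram Markov language model with add-one smoothing and log-probabilities (the toy corpus is an illustrative assumption; the course trains on real text files):

```python
import math
from collections import defaultdict

# Toy training corpus (illustrative only)
corpus = [
    "i like cats",
    "i like dogs",
    "dogs like cats",
]

# Count bigram and context (unigram) occurrences, with start/end markers
bigram_counts = defaultdict(lambda: defaultdict(int))
context_counts = defaultdict(int)
vocab = set()
for line in corpus:
    tokens = ["<s>"] + line.split() + ["</s>"]
    vocab.update(tokens)
    for prev, curr in zip(tokens, tokens[1:]):
        bigram_counts[prev][curr] += 1
        context_counts[prev] += 1

V = len(vocab)

def log_prob(sentence):
    """Log-probability of a sentence under the bigram model with add-one smoothing."""
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    total = 0.0
    for prev, curr in zip(tokens, tokens[1:]):
        # Add-one (Laplace) smoothing: unseen bigrams get small but nonzero probability
        p = (bigram_counts[prev][curr] + 1) / (context_counts[prev] + V)
        total += math.log(p)
    return total

print(log_prob("i like cats"))  # higher: all bigrams seen in training
print(log_prob("cats like i"))  # lower: mostly unseen bigrams
```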

In part 3, which covers machine learning methods, you’ll learn about more of the classic NLP tasks, such as:

  • Spam detection
  • Sentiment analysis
  • Latent semantic analysis (also known as latent semantic indexing)
  • Topic modeling

This section is application-focused rather than theory-focused: instead of spending most of our effort on the details of various ML algorithms, you’ll focus on how they can be applied to the tasks above.

Of course, you’ll still need to learn something about those algorithms in order to understand what’s going on. The following algorithms will be used:

  • Naive Bayes
  • Logistic Regression
  • Principal Components Analysis (PCA) / Singular Value Decomposition (SVD)
  • Latent Dirichlet Allocation (LDA)

These are not just “any” machine learning / artificial intelligence algorithms, but ones that have been staples of NLP and are thus an essential part of any NLP course.
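
As a preview of the workflow, here is a minimal sketch of spam detection with scikit-learn’s CountVectorizer and Multinomial Naive Bayes (the toy messages are illustrative assumptions; the course uses a real dataset):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labeled data (illustrative only)
texts = [
    "win a free prize now", "limited offer click here",
    "meeting at noon tomorrow", "can you review my report",
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = ham

# Bag-of-words features feeding a Naive Bayes classifier
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["free prize offer"]))        # likely [1]
print(model.predict(["see you at the meeting"]))  # likely [0]
```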

In part 4, which covers deep learning methods, you’ll learn about modern neural network architectures that can be applied to solve NLP tasks. Thanks to their great power and flexibility, neural networks can be used to solve any of the aforementioned tasks in the course.

You’ll learn about:

  • Feedforward Artificial Neural Networks (ANNs)
  • Embeddings
  • Convolutional Neural Networks (CNNs)
  • Recurrent Neural Networks (RNNs)

The study of RNNs will involve modern architectures such as the LSTM and GRU, which have been widely used by Google, Amazon, Apple, Facebook, etc. for difficult tasks such as language translation, speech recognition, and text-to-speech.

Since the latest Transformers (such as BERT and GPT-3) are themselves examples of deep neural networks, this part of the course is an essential prerequisite for understanding Transformers.
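
To give a sense of what these models look like in code, here is a minimal sketch of an Embedding + LSTM text classifier in TensorFlow/Keras (the layer sizes and the tiny dataset are illustrative assumptions):

```python
import tensorflow as tf

# Tiny illustrative dataset; the course trains on real corpora
texts = tf.constant(["great movie", "terrible film", "loved it", "waste of time"])
labels = tf.constant([1, 0, 1, 0])

# Map raw strings to fixed-length integer token sequences
vectorize = tf.keras.layers.TextVectorization(max_tokens=1000, output_sequence_length=8)
vectorize.adapt(texts)
X = vectorize(texts)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=1000, output_dim=16),  # learned word vectors
    tf.keras.layers.LSTM(16),                                  # recurrent layer
    tf.keras.layers.Dense(1, activation="sigmoid"),            # binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=10, verbose=0)

print(model.predict(vectorize(tf.constant(["great film"]))))
```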

What you’ll learn

  • How to convert text into vectors using CountVectorizer, TF-IDF, word2vec, and GloVe
  • How to implement a document retrieval system / search engine / similarity search / vector similarity (see the sketch after this list)
  • Probability models, language models and Markov models (prerequisite for Transformers, BERT, and GPT-3)
  • How to implement a cipher decryption algorithm using genetic algorithms and language modeling
  • How to implement spam detection
  • How to implement sentiment analysis
  • How to implement an article spinner
  • How to implement text summarization
  • How to implement latent semantic indexing
  • How to implement topic modeling with LDA, NMF, and SVD
  • Machine learning (Naive Bayes, Logistic Regression, PCA, SVD, Latent Dirichlet Allocation)
  • Deep learning (ANNs, CNNs, RNNs, LSTM, GRU) (more important prerequisites for BERT and GPT-3)
  • Hugging Face Transformers (VIP only)
  • How to use Python, Scikit-Learn, TensorFlow, and more for NLP
  • Text preprocessing, tokenization, stopwords, lemmatization, and stemming
  • Parts-of-speech (POS) tagging and named entity recognition (NER)
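
For the similarity-search bullet above, here is a minimal sketch of TF-IDF retrieval ranked by cosine similarity (the documents and query are illustrative assumptions):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy document collection (illustrative only)
docs = [
    "how to train a neural network",
    "markov models for text generation",
    "spam detection with naive bayes",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)

# Embed the query in the same vector space, then rank documents by cosine similarity
query_vec = vectorizer.transform(["generating text with markov models"])
scores = cosine_similarity(query_vec, doc_vectors)[0]
best = scores.argmax()
print(docs[best], scores[best])
```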

Table of Contents

Introduction
1 Introduction and Outline
2 Are You Beginner, Intermediate, or Advanced? All are OK!

Getting Set Up
3 Where to get the Code
4 How to use Github & Extra Coding Tips (Optional)

Vector Models and Text Preprocessing
5 Vector Models & Text Preprocessing Intro
6 Basic Definitions for NLP
7 What is a Vector?
8 Bag of Words
9 Count Vectorizer (Theory)
10 Tokenization
11 Stopwords
12 Stemming and Lemmatization
13 Stemming and Lemmatization Demo
14 Count Vectorizer (Code)
15 Vector Similarity
16 TF-IDF (Theory)
17 (Interactive) Recommender Exercise Prompt
18 TF-IDF (Code)
19 Word-to-Index Mapping
20 How to Build TF-IDF From Scratch
21 Neural Word Embeddings
22 Neural Word Embeddings Demo
23 Vector Models & Text Preprocessing Summary
24 Text Summarization Preview
25 How To Do NLP In Other Languages
26 Suggestion Box

Probabilistic Models (Introduction)
27 Probabilistic Models (Introduction)

Markov Models (Intermediate)
28 Markov Models Section Introduction
29 The Markov Property
30 The Markov Model
31 Probability Smoothing and Log-Probabilities
32 Building a Text Classifier (Theory)
33 Building a Text Classifier (Exercise Prompt)
34 Building a Text Classifier (Code pt 1)
35 Building a Text Classifier (Code pt 2)
36 Language Model (Theory)
37 Language Model (Exercise Prompt)
38 Language Model (Code pt 1)
39 Language Model (Code pt 2)
40 Markov Models Section Summary

Article Spinner (Intermediate)
41 Article Spinning – Problem Description
42 Article Spinning – N-Gram Approach
43 Article Spinner Exercise Prompt
44 Article Spinner in Python (pt 1)
45 Article Spinner in Python (pt 2)
46 Case Study: Article Spinning Gone Wrong

Cipher Decryption (Advanced)
47 Section Introduction
48 Ciphers
49 Language Models (Review)
50 Genetic Algorithms
51 Code Preparation
52 Code pt 1
53 Code pt 2
54 Code pt 3
55 Code pt 4
56 Code pt 5
57 Code pt 6
58 Cipher Decryption – Additional Discussion
59 Section Conclusion

Machine Learning Models (Introduction)
60 Machine Learning Models (Introduction)

Spam Detection
61 Spam Detection – Problem Description
62 Naive Bayes Intuition
63 Spam Detection – Exercise Prompt
64 Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 1)
65 Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 2)
66 Spam Detection in Python

Sentiment Analysis
67 Sentiment Analysis – Problem Description
68 Logistic Regression Intuition (pt 1)
69 Multiclass Logistic Regression (pt 2)
70 Logistic Regression Training and Interpretation (pt 3)
71 Sentiment Analysis – Exercise Prompt
72 Sentiment Analysis in Python (pt 1)
73 Sentiment Analysis in Python (pt 2)

Text Summarization
74 Text Summarization Section Introduction
75 Text Summarization Using Vectors
76 Text Summarization Exercise Prompt
77 Text Summarization in Python
78 TextRank Intuition
79 TextRank – How It Really Works (Advanced)
80 TextRank Exercise Prompt (Advanced)
81 TextRank in Python (Advanced)
82 Text Summarization in Python – The Easy Way (Beginner)
83 Text Summarization Section Summary

Topic Modeling
84 Topic Modeling Section Introduction
85 Latent Dirichlet Allocation (LDA) – Essentials
86 LDA – Code Preparation
87 LDA – Maybe Useful Picture (Optional)
88 Latent Dirichlet Allocation (LDA) – Intuition (Advanced)
89 Topic Modeling with Latent Dirichlet Allocation (LDA) in Python
90 Non-Negative Matrix Factorization (NMF) Intuition
91 Topic Modeling with Non-Negative Matrix Factorization (NMF) in Python
92 Topic Modeling Section Summary

Latent Semantic Analysis (Latent Semantic Indexing)
93 LSA / LSI Section Introduction
94 SVD (Singular Value Decomposition) Intuition
95 LSA / LSI: Applying SVD to NLP
96 Latent Semantic Analysis / Latent Semantic Indexing in Python
97 LSA / LSI Exercises

Deep Learning (Introduction)
98 Deep Learning Introduction (Intermediate-Advanced)

The Neuron
99 The Neuron – Section Introduction
100 Fitting a Line
101 Classification Code Preparation
102 Text Classification in Tensorflow
103 The Neuron
104 How does a model learn?
105 The Neuron – Section Summary

Feedforward Artificial Neural Networks
106 ANN – Section Introduction
107 Forward Propagation
108 The Geometrical Picture
109 Activation Functions
110 Multiclass Classification
111 ANN Code Preparation
112 Text Classification ANN in Tensorflow
113 Text Preprocessing Code Preparation
114 Text Preprocessing in Tensorflow
115 Embeddings
116 CBOW (Advanced)
117 CBOW Exercise Prompt
118 CBOW in Tensorflow (Advanced)
119 ANN – Section Summary
120 Aside: How to Choose Hyperparameters (Optional)

Convolutional Neural Networks
121 CNN – Section Introduction
122 What is Convolution?
123 What is Convolution? (Pattern Matching)
124 What is Convolution? (Weight Sharing)
125 Convolution on Color Images
126 CNN Architecture
127 CNNs for Text
128 Convolutional Neural Network for NLP in Tensorflow
129 CNN – Section Summary

Recurrent Neural Networks
130 RNN – Section Introduction
131 Simple RNN / Elman Unit (pt 1)
132 Simple RNN / Elman Unit (pt 2)
133 RNN Code Preparation
134 RNNs: Paying Attention to Shapes
135 GRU and LSTM (pt 1)
136 GRU and LSTM (pt 2)
137 RNN for Text Classification in Tensorflow
138 Parts-of-Speech (POS) Tagging in Tensorflow
139 Named Entity Recognition (NER) in Tensorflow
140 Exercise: Return to CNNs (Advanced)
141 RNN – Section Summary

Setting Up Your Environment FAQ
142 Anaconda Environment Setup
143 How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow

Extra Help With Python Coding for Beginners FAQ
144 How to Code by Yourself (part 1)
145 How to Code by Yourself (part 2)
146 Proof that using Jupyter Notebook is the same as not using it

Effective Learning Strategies for Machine Learning FAQ
147 How to Succeed in this Course (Long Version)
148 Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced?
149 Machine Learning and AI Prerequisite Roadmap (pt 1)
150 Machine Learning and AI Prerequisite Roadmap (pt 2)

Appendix FAQ Finale
151 What is the Appendix?
152 BONUS

Homepage