Learning Neural Networks with TensorFlow

English | MP4 | AVC 1920×1080 | AAC 44KHz 2ch | 3h 34m | 673 MB

Dive into Neural Networks by working through real-world datasets using TensorFlow

Neural Networks are used all around us: they index photos into categories, translate text, suggest replies to emails, and beat the best human players at games. Many people are eager to apply these techniques to their own data, but few achieve the results they expect.

In this course, we’ll start by building a simple flower recognition program that makes you comfortable with TensorFlow and teaches you several important Neural Network concepts. Next, you’ll work with high-dimensional data to predict a single output: 1,275 molecular features used to predict the atomization energy of a molecule. The next program we’ll create is a handwritten digit recognition system trained on the famous MNIST dataset, working our way up from a simple multilayer perceptron to a state-of-the-art Deep Convolutional Neural Network.
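As a taste of the first project, here is a minimal sketch of an Iris-style classifier; it assumes TensorFlow 2.x with the bundled Keras API plus scikit-learn for the data, and is illustrative rather than the course’s exact code.

    # Illustrative sketch, not the course's exact code: a small feed-forward
    # classifier for the Iris flowers (4 features, 3 species).
    # Assumes TensorFlow 2.x and scikit-learn are installed.
    import tensorflow as tf
    from sklearn.datasets import load_iris
    from sklearn.utils import shuffle

    # Shuffle, because the Iris samples come ordered by species.
    X, y = shuffle(*load_iris(return_X_y=True), random_state=0)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(3, activation="softmax"),  # one unit per species
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=50, validation_split=0.2, verbose=0)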

In the final program, you’ll estimate attributes of celebrity faces: for each new picture, the network checks whether the celebrity is attractive, wears a hat, has lipstick on, and many more properties that are difficult to estimate with “traditional” computer vision techniques.
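Each of these properties is an independent yes/no question, which makes this a multi-label classification problem: one sigmoid output per attribute with a binary cross-entropy loss, rather than a single softmax over classes. The sketch below shows this setup; the architecture is an assumption for illustration, not the course’s exact network.

    # Illustrative multi-label setup: one sigmoid output per attribute and a
    # binary cross-entropy loss, instead of a single softmax over classes.
    # The conv layers are an assumption, not the course's architecture.
    import tensorflow as tf

    NUM_ATTRIBUTES = 40  # CelebA annotates 40 yes/no attributes per image

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu",
                               input_shape=(218, 178, 3)),  # aligned CelebA size
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(NUM_ATTRIBUTES, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["binary_accuracy"])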

After the course, you’ll not only be able to build a Neural Network for your own dataset, you’ll also be able to reason about which techniques will improve it.

What You Will Learn

  • Work with the Iris Dataset by downloading and visualizing it
  • Install and use Docker
  • Download the molecular data and visualize it
  • Predict the ground state energy of molecules
  • Improve the network by understanding the activation function
  • Work with the MNIST dataset
  • Add pooling layers to reduce your trainable parameters
  • Explore batch normalization (pooling and batch normalization are both illustrated in the sketch after this list)
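
The following minimal sketch ties two of these bullets together: a pooling layer shrinks the feature maps (and with them the trainable parameter count of later dense layers), and batch normalization sits between a layer and its activation. It assumes MNIST-shaped 28×28 grayscale inputs and is illustrative, not the course’s exact code.

    # Illustrative sketch, assuming 28x28 grayscale inputs as in MNIST.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, use_bias=False, input_shape=(28, 28, 1)),
        tf.keras.layers.BatchNormalization(),  # normalize the pre-activations
        tf.keras.layers.Activation("relu"),
        tf.keras.layers.MaxPooling2D(),  # 26x26 -> 13x13 feature maps
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    # summary() shows the effect: pooling quarters the flattened size, cutting
    # the final Dense layer from ~216k to ~54k trainable parameters.
    model.summary()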

Table of Contents

01 The Course Overview
02 Solving Public Datasets
03 Why We Use Docker and Installation Instructions
04 Our Code, in a Jupyter Notebook
05 Understanding TensorFlow
06 The Iris Dataset
07 The Human Brain and How to Formalize It
08 Backpropagation
09 Overfitting – Why We Split Our Train and Test Data
10 Ground State Energies of 16,242 Molecules
11 First Approach – Easy Layer Building
12 Preprocessing Data
13 Understanding the Activation Function
14 The Importance of Hyperparameters
15 Images of Written Digits
16 Dense Layer Approach
17 Convolution and Pooling Layers
18 Convolution and Pooling Layers (Continued)
19 From Activations to Probabilities – the Softmax Function
20 Optimization and Loss Functions
21 Large-Scale CelebFaces Attributes (CelebA) Dataset
22 Building an Input Pipeline in TensorFlow
23 Building a Convolutional Neural Network
24 Batch Normalization
25 Understanding What Your Network Learned – Visualizing Activations