Regression Analysis for Statistics and Machine Learning in R

English | MP4 | AVC 1920×1080 | AAC 44 kHz 2ch | 7h 18m | 1.28 GB

Learn complete hands-on Regression Analysis for practical Statistical Modelling and Machine Learning in R

With so many R Statistics and Machine Learning courses around, why enroll for this?

Regression analysis is one of the central pillars of both statistical and machine learning-based analysis. This course teaches you regression analysis for statistical data analysis and machine learning in R in a practical, hands-on way, exploring the relevant concepts from basic to expert level. It can help you achieve better grades, give you new analysis tools for your academic career, let you apply your knowledge in a work setting, and support business forecasting decisions. You will go all the way from implementing and interpreting simple OLS (Ordinary Least Squares) regression models, to dealing with multicollinearity in regression, to machine learning-based regression models.

Become a Regression Analysis Expert and Harness the Power of R for Your Analysis

  • Get started with R and RStudio: install them on your system, learn to load packages, and read different types of data into R
  • Carry out data cleaning and data visualization using R
  • Implement Ordinary Least Squares (OLS) regression in R and learn how to interpret the results (a minimal example follows this list)
  • Learn how to deal with multicollinearity, both through variable selection and through regularization techniques such as ridge regression
  • Carry out variable and regression model selection using both statistical and machine learning techniques, including cross-validation methods
  • Evaluate regression model accuracy
  • Implement Generalized Linear Models (GLMs) such as logistic regression and Poisson regression. Use logistic regression as a binary classifier to distinguish between male and female voices
  • Use non-parametric techniques such as Generalized Additive Models (GAMs) to work with non-linear and non-parametric data
  • Work with tree-based machine learning models
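
As a taste of the OLS material, here is a minimal R sketch of fitting and inspecting a simple linear model with base R. The built-in mtcars data set and the variables are purely illustrative and are not the data used in the course.

    # Fit a simple OLS model: fuel economy explained by car weight (built-in mtcars data)
    fit <- lm(mpg ~ wt, data = mtcars)

    # Coefficient estimates, standard errors, t-tests, and R-squared
    summary(fit)

    # 95% confidence intervals for the intercept and the slope
    confint(fit, level = 0.95)

    # Predictions with prediction intervals for new observations
    predict(fit, newdata = data.frame(wt = c(2.5, 3.0)), interval = "prediction")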

Learn

  • Implement and interpret Ordinary Least Squares (OLS) regression using R
  • Apply statistical and machine learning-based regression models to deal with problems such as multicollinearity
  • Carry out variable selection and assess model accuracy using techniques such as cross-validation
  • Implement and interpret Generalized Linear Models (GLMs), including using logistic regression as a binary classifier (see the sketch after this list)
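
To illustrate the last point, a minimal sketch of logistic regression as a binary classifier using base R's glm(). The course uses a voice data set to separate male and female speakers; that data is not reproduced here, so the built-in mtcars data and a 0.5 probability cut-off stand in as illustrative assumptions.

    # Logistic regression as a binary classifier: transmission type (am = 0/1) from weight and horsepower
    fit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
    summary(fit)

    # Predicted probabilities and a 0.5 cut-off for class labels
    probs <- predict(fit, type = "response")
    pred  <- ifelse(probs > 0.5, 1, 0)

    # Confusion matrix to gauge classification accuracy
    table(predicted = pred, actual = mtcars$am)
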
Table of Contents

Get Started with Practical Regression Analysis in R
1 INTRODUCTION TO THE COURSE – The Key Concepts and Software Tools
2 Difference Between Statistical Analysis & Machine Learning
3 Getting Started with R and R Studio
4 Reading in Data with R
5 Data Cleaning with R
6 Some More Data Cleaning with R
7 Basic Exploratory Data Analysis in R
8 Conclusion to Section 1
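
The lectures in this section cover reading, cleaning, and exploring data in R. A minimal base-R sketch of that workflow, assuming a hypothetical data.csv file with illustrative column names:

    # Read a CSV file (the file name and columns are hypothetical)
    df <- read.csv("data.csv", stringsAsFactors = FALSE)

    # Basic cleaning: drop rows with missing values, convert a text column to a factor
    df <- na.omit(df)
    df$group <- as.factor(df$group)   # 'group' is an illustrative column name

    # Quick exploratory checks
    str(df)
    summary(df)
    hist(df$value)                    # 'value' is an illustrative numeric column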

Ordinary Least Squares Regression Modelling
9 OLS Regression – Theory
10 OLS – Implementation
11 More on Result Interpretations
12 Confidence Interval – Theory
13 Calculate the Confidence Interval in R
14 Confidence Interval and OLS Regressions
15 Linear Regression without Intercept
16 Implement ANOVA on OLS Regression
17 Multiple Linear Regression
18 Multiple Linear Regression with Interaction and Dummy Variables
19 Some Basic Conditions that OLS Models Have to Fulfill
20 Conclusions to Section 2
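
A compact sketch of the model types this section walks through (multiple regression, interactions, dummy variables, no-intercept models, ANOVA), again using the built-in mtcars data purely for illustration:

    # Multiple linear regression with an interaction term
    fit1 <- lm(mpg ~ wt + hp + wt:hp, data = mtcars)
    summary(fit1)

    # Dummy (factor) predictor: transmission type treated as categorical
    fit2 <- lm(mpg ~ wt + factor(am), data = mtcars)

    # Regression through the origin (no intercept)
    fit0 <- lm(mpg ~ wt - 1, data = mtcars)

    # ANOVA table for a fitted OLS model
    anova(fit1)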

Deal with Multicollinearity in OLS Regression Models
21 Identify Multicollinearity
22 Doing Regression Analyses with Correlated Predictor Variables
23 Principal Component Regression in R
24 Partial Least Squares Regression in R
25 Ridge Regression in R
26 LASSO Regression
27 Conclusion to Section 3
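
A minimal sketch of diagnosing multicollinearity and fitting penalized regressions. It assumes the car and glmnet packages, which may or may not be the exact tools used in the lectures; data and predictors are illustrative.

    library(car)      # vif() for variance inflation factors
    library(glmnet)   # ridge and LASSO regression

    # Variance inflation factors flag collinear predictors (rule of thumb: VIF above ~5-10)
    fit <- lm(mpg ~ wt + hp + disp, data = mtcars)
    vif(fit)

    # glmnet expects a numeric predictor matrix and a response vector
    x <- as.matrix(mtcars[, c("wt", "hp", "disp")])
    y <- mtcars$mpg

    # Ridge regression (alpha = 0) with cross-validated choice of lambda
    ridge <- cv.glmnet(x, y, alpha = 0)
    coef(ridge, s = "lambda.min")

    # LASSO (alpha = 1) shrinks some coefficients exactly to zero
    lasso <- cv.glmnet(x, y, alpha = 1)
    coef(lasso, s = "lambda.min")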

Variable & Model Selection
28 Why Do Any Kind of Selection
29 Select the Most Suitable OLS Regression Model
30 Select Model Subsets
31 A Machine Learning Perspective on Evaluating Regression Model Accuracy
32 Evaluate Regression Model Performance
33 LASSO Regression for Variable Selection
34 Identify the Contribution of Predictors in Explaining the Variation in Y
35 Conclusions to Section 4
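
A short sketch of comparing candidate models by cross-validated error, assuming the caret package; the 10-fold setup, models, and data are illustrative choices, not necessarily those used in the course.

    library(caret)

    # 10-fold cross-validation as the resampling scheme
    ctrl <- trainControl(method = "cv", number = 10)

    # Compare two candidate linear models on cross-validated RMSE
    m1 <- train(mpg ~ wt,      data = mtcars, method = "lm", trControl = ctrl)
    m2 <- train(mpg ~ wt + hp, data = mtcars, method = "lm", trControl = ctrl)

    m1$results$RMSE
    m2$results$RMSE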

Dealing with Other Violations of the OLS Regression Assumptions
36 Data Transformations
37 Robust Regression – Dealing with Outliers
38 Dealing with Heteroscedasticity
39 Conclusions to Section 5
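
A minimal sketch of two remedies covered here: robust regression and a formal test for heteroscedasticity. The MASS and lmtest packages are assumed; they are common choices but not necessarily the ones used in the lectures.

    library(MASS)     # rlm() for robust regression
    library(lmtest)   # bptest() for the Breusch-Pagan test

    # Robust regression down-weights outlying observations
    rfit <- rlm(mpg ~ wt + hp, data = mtcars)
    summary(rfit)

    # Breusch-Pagan test of the constant-variance (homoscedasticity) assumption
    ofit <- lm(mpg ~ wt + hp, data = mtcars)
    bptest(ofit)

    # A common variance-stabilizing transformation of the response
    lfit <- lm(log(mpg) ~ wt + hp, data = mtcars)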

Generalized Linear Models (GLMs)
40 What Are GLMs?
41 Logistic Regression
42 Logistic Regression for Binary Response Variable
43 Multinomial Logistic Regression
44 Regression for Count Data
45 Goodness-of-Fit Testing
46 Conclusions to Section 6
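
To complement the logistic example earlier, a minimal sketch of Poisson regression for count data with base R's glm(); the built-in warpbreaks data and the deviance-based goodness-of-fit check are illustrative, not the course's own examples.

    # Poisson regression for count data: warp breaks per loom (built-in warpbreaks data)
    fit <- glm(breaks ~ wool + tension, data = warpbreaks, family = poisson)
    summary(fit)

    # Rate ratios: exponentiated coefficients
    exp(coef(fit))

    # Rough goodness-of-fit check: residual deviance against its degrees of freedom
    pchisq(deviance(fit), df.residual(fit), lower.tail = FALSE)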

Working with Non-Parametric and Non-Linear Data
47 Polynomial and Non-Linear Regression
48 Generalized Additive Models (GAMs) in R
49 Boosted GAM Regression
50 Multivariate Adaptive Regression Splines (MARS)
51 CART – Regression Trees in R
52 Conditional Inference Trees
53 Random Forest (RF)
54 Gradient Boosting Regression
55 ML Model Selection
56 Conclusions to Section 7
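
A brief sketch of two of the non-parametric and tree-based models named above, assuming the mgcv and randomForest packages; data and settings are illustrative.

    library(mgcv)          # gam() with smooth terms
    library(randomForest)  # random forest regression

    # Generalized Additive Model: a smooth, non-linear effect of weight on fuel economy
    gfit <- gam(mpg ~ s(wt) + hp, data = mtcars)
    summary(gfit)

    # Random forest regression with an out-of-bag error estimate
    set.seed(1)
    rf <- randomForest(mpg ~ ., data = mtcars, ntree = 500)
    print(rf)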