
Course Description

When working with real-world datasets, a single model is often not enough to capture the complexity of the data. Ensemble methods address this by combining simpler models, capturing patterns that any one model would miss and thereby improving predictive power.

In this course, you’ll discover how to use two ensemble methods: random forests and boosted decision trees. You’ll practice these methods on datasets in R and apply them to build robust predictive models. You’ll improve decision tree performance with random forest models and practice interpreting those models. You’ll then apply a second technique, boosting, which reduces error by fitting decision trees sequentially and aggregating their predictions.
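As a concrete preview of this workflow, the following is a minimal R sketch, assuming the randomForest and gbm packages and R’s built-in iris data (the course’s actual packages and datasets are not specified here). It fits a random forest and then a boosted classification model.

    # A minimal sketch, assuming the randomForest and gbm packages and the
    # built-in iris data; the course's own packages and datasets may differ.
    library(randomForest)
    library(gbm)

    set.seed(1)

    # Random forest: many decision trees grown on bootstrap samples, each split
    # chosen from a random subset of predictors; predictions are aggregated by vote.
    rf_fit <- randomForest(Species ~ ., data = iris, ntree = 500, importance = TRUE)
    print(rf_fit)  # out-of-bag error estimate and confusion matrix

    # Boosting: trees are added sequentially, each correcting the current
    # ensemble's errors. gbm's bernoulli loss needs a 0/1 response, so a
    # two-class subset of iris is used purely for illustration.
    iris2 <- droplevels(subset(iris, Species != "setosa"))
    iris2$is_virginica <- as.numeric(iris2$Species == "virginica")
    iris2$Species <- NULL

    gbm_fit <- gbm(is_virginica ~ ., data = iris2, distribution = "bernoulli",
                   n.trees = 500, interaction.depth = 2, shrinkage = 0.05)
    head(predict(gbm_fit, newdata = iris2, n.trees = 500, type = "response"))

In both cases the final prediction aggregates many trees, which is what gives these ensembles their robustness relative to a single decision tree.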

You are required to have completed the following courses or have equivalent experience before taking this course:

  • Nonlinear Regression Models
  • Modeling Interactions Between Predictors
  • Foundations of Predictive Modeling

Faculty Author

Sumanta Basu

Benefits to the Learner

  • Identify the limitations of decision trees
  • Fit random forest models and practice implementing them in R
  • Use mean decrease in accuracy (MDA), mean decrease in impurity (MDI), and partial dependence plots to interpret "black box" ensemble methods
  • Apply the basics of boosting to classification trees and implement it with datasets in R (a brief sketch of these interpretation tools follows this list)
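For the interpretation tools listed above, here is a minimal sketch, again assuming the randomForest package: importance type 1 corresponds to MDA, type 2 to MDI (mean decrease in Gini impurity), and partialPlot produces a partial dependence plot for a single predictor.

    # A minimal sketch of interpreting a random forest, assuming the
    # randomForest package and the built-in iris data.
    library(randomForest)

    set.seed(1)
    rf_fit <- randomForest(Species ~ ., data = iris, ntree = 500, importance = TRUE)

    # Mean decrease in accuracy (MDA): drop in out-of-bag accuracy when each
    # predictor is randomly permuted.
    importance(rf_fit, type = 1)

    # Mean decrease in impurity (MDI): total reduction in Gini impurity
    # attributable to each predictor, summed over all trees.
    importance(rf_fit, type = 2)

    varImpPlot(rf_fit)  # both measures side by side

    # Partial dependence plot: marginal effect of Petal.Length on the predicted
    # probability of class "virginica", averaging over the other predictors.
    partialPlot(rf_fit, iris, Petal.Length, "virginica")

These diagnostics are what make an otherwise "black box" ensemble interpretable: the importance measures rank predictors, and the partial dependence plot shows how a single predictor drives the model's predictions.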

Target Audience

  • Current and aspiring data scientists and analysts
  • Business decision makers
  • Marketing analysts
  • Consultants
  • Executives
  • Anyone seeking to gain deeper exposure to data science

Applies Towards the Following Certificates

Enroll Now - Select a section to enroll in

  • Type: 2 week; Dates: Sep 18, 2024 to Oct 01, 2024; Total Number of Hours: 16.0; Course Fee(s): Contract Fee $100.00
  • Type: 2 week; Dates: Dec 11, 2024 to Dec 24, 2024; Total Number of Hours: 16.0; Course Fee(s): Contract Fee $100.00