MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz, 2 Ch
Genre: eLearning | Language: English + .srt | Duration: 21 lectures (2h 58m) | Size: 1.36 GB
Learn to build decision trees for applied machine learning from scratch in Python.
The most common decision tree algorithms
Understand the core idea behind decision trees
Developing code from scratch
Applying ML for practical problems
Bagging and Boosting
Random Forest, Gradient Boosting
Decision trees are one of the hottest topics in Machine Learning, and tree-based models dominate many Kaggle competitions today. Empower yourself for these challenges.
This course covers both the fundamentals of decision tree algorithms such as CHAID, ID3, C4.5, CART, and regression trees, and their hands-on practical applications. We will also cover bagging and boosting methods such as Random Forest and Gradient Boosting, which increase decision tree accuracy. Finally, we will look at tree-based frameworks such as LightGBM, XGBoost, and Chefboost.
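To give a taste of what is under the hood of algorithms like ID3 and C4.5, here is a minimal sketch of their split criterion: entropy and information gain, computed with the standard library only. The weather-style toy data and function names are illustrative assumptions, not the course's own code.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels (the ID3/C4.5 impurity measure)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature_index):
    """Entropy reduction obtained by splitting on one categorical feature."""
    n = len(labels)
    # Partition the labels by the feature's value
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[feature_index], []).append(label)
    weighted = sum(len(part) / n * entropy(part) for part in partitions.values())
    return entropy(labels) - weighted

# Toy data (assumed for illustration): (outlook, windy) -> play?
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
labels = ["yes", "yes", "no", "no"]
print(information_gain(rows, labels, 0))  # splitting on outlook separates the classes perfectly
print(information_gain(rows, labels, 1))  # splitting on windy gains nothing here
```

ID3 picks the feature with the highest gain at each node; C4.5 refines this with the gain ratio to avoid favoring many-valued features.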
We will build our own decision tree framework from scratch in Python, and step-by-step exercises will guide you to a clear understanding of the concepts.
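As a sketch of what "from scratch" means, the following is a tiny CART-style classifier: it greedily searches for the (feature, threshold) split that minimizes weighted Gini impurity, recurses, and predicts by walking the tree. All names and the toy dataset are assumptions for illustration, not the framework built in the course.

```python
from collections import Counter

def gini(labels):
    """Gini impurity, the split criterion used by CART."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y):
    """Exhaustive search for the (feature, threshold) pair with lowest weighted Gini."""
    best, best_score = None, gini(y)
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [lab for row, lab in zip(X, y) if row[f] <= t]
            right = [lab for row, lab in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best_score:
                best, best_score = (f, t), score
    return best

def build(X, y, depth=0, max_depth=3):
    """Recursively grow the tree; a leaf stores the majority class."""
    split = best_split(X, y) if depth < max_depth else None
    if split is None:
        return Counter(y).most_common(1)[0][0]
    f, t = split
    left = [(row, lab) for row, lab in zip(X, y) if row[f] <= t]
    right = [(row, lab) for row, lab in zip(X, y) if row[f] > t]
    return (f, t,
            build([r for r, _ in left], [l for _, l in left], depth + 1, max_depth),
            build([r for r, _ in right], [l for _, l in right], depth + 1, max_depth))

def predict(node, row):
    """Walk from the root to a leaf by answering one threshold test per level."""
    while isinstance(node, tuple):
        f, t, lo, hi = node
        node = lo if row[f] <= t else hi
    return node

# Toy data (assumed): class depends only on whether the first feature exceeds 2
X = [[1, 0], [2, 1], [3, 0], [4, 1]]
y = ["a", "a", "b", "b"]
tree = build(X, y)
print(predict(tree, [1.5, 0]))  # a point with small x0 lands in the "a" leaf
```

The same skeleton extends naturally to the course's later topics: bagging trains many such trees on bootstrap samples, and Random Forest additionally restricts each split search to a random subset of features.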
This course appeals to anyone interested in Machine Learning, Data Science, and Data Mining.
Interested in Machine Learning
Curious about Data Mining