The following topics are covered in this lesson:
- Downloading a real-world dataset from a Kaggle competition
- Performing feature engineering and preparing the dataset for training
- Training and interpreting a gradient boosting model using XGBoost
- Training with KFold cross-validation and ensembling results
- Configuring the gradient boosting model and tuning hyperparameters
Please share your feedback via this link to help us improve the course experience.
Ask questions and get help on the discussion forum.
Attend weekly study hours on the Jovian Discord Server.