Implementing the extreme gradient boosting method for glass identification using XGBoost with scikit-learn 

XGBoost stands for extreme gradient boosting. It is a variant of the gradient boosting machine that aims to improve performance and speed. The XGBoost library in Python implements the gradient boosting decision tree algorithm. The name gradient boosting comes from its use of the gradient descent algorithm to minimize loss when adding new models. XGBoost can handle both regression and classification tasks.
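As a minimal sketch of how this looks with the scikit-learn wrapper (the synthetic data and the parameter values here are placeholders for illustration, not the glass dataset used in this recipe):

    # Minimal sketch: training XGBoost's scikit-learn wrapper on synthetic
    # multi-class data (the glass dataset itself is loaded later in the recipe).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=500, n_features=9, n_informative=6,
                               n_classes=3, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                        random_state=42)

    model = XGBClassifier(n_estimators=100, max_depth=6, learning_rate=0.3)
    model.fit(X_train, y_train)
    print("Test accuracy:", model.score(X_test, y_test))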

XGBoost is an algorithm of choice among Kaggle competitors because of its predictive performance and execution speed on difficult machine learning problems.

Some of the important parameters that are used in XGBoost are as follows (a sketch of setting their scikit-learn equivalents appears after the list):

  • n_estimators/ntrees: This specifies the number of trees to build. The default value is 50.
  • max_depth: This specifies the maximum tree depth. The default value is 6. Higher values will make the model more complex and may lead to overfitting. Setting this value to 0 specifies no limit. 
  • min_rows: This specifies the minimum number of observations for a leaf. The default value is 1.
  • learn_rate: This specifies the learning rate by which to shrink the feature weights. Shrinking feature weights after each boosting step makes the boosting process more conservative and prevents overfitting. The range is 0.0 to 1.0. The default value is 0.3.
  • sample_rate: This specifies the row sampling ratio of the training instances (the x-axis). For example, setting this value to 0.5 tells XGBoost to randomly sample half of the data instances to grow trees. The default value is 1.0 and the range is 0.0 to 1.0. Higher values may improve training accuracy.
  • col_sample_rate: This specifies the column sampling rate (the y-axis) for each split in each level. The default value is 1.0 and the range is 0.0 to 1.0. Higher values may improve training accuracy.
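When using the xgboost scikit-learn wrapper, the closest equivalents to the names above appear to be n_estimators, max_depth, min_child_weight, learning_rate, subsample, and colsample_bylevel. The following sketch sets them explicitly; the mapping and the values shown are illustrative assumptions, not part of the recipe:

    # Hedged sketch: the parameters above expressed with the names used by
    # the xgboost scikit-learn wrapper (this mapping is an assumption):
    #   ntrees          -> n_estimators
    #   max_depth       -> max_depth
    #   min_rows        -> min_child_weight (closest equivalent)
    #   learn_rate      -> learning_rate
    #   sample_rate     -> subsample
    #   col_sample_rate -> colsample_bylevel
    from xgboost import XGBClassifier

    model = XGBClassifier(
        n_estimators=100,       # number of trees to build
        max_depth=6,            # maximum tree depth; larger values risk overfitting
        min_child_weight=1,     # roughly analogous to a minimum-observations-per-leaf constraint
        learning_rate=0.3,      # shrinkage applied to the weights at each boosting step
        subsample=0.8,          # row sampling ratio per tree
        colsample_bylevel=0.8,  # column sampling ratio per tree level
        random_state=42,
    )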
