Performance of machine learning models
| Model | Data set | AUC | Recall | Accuracy | F1 score | Precision |
|---|---|---|---|---|---|---|
| **Base models** | | | | | | |
| AdaBoost | | 0.83 | 0.80 | 0.73 | 0.65 | 0.55 |
| LightGBM | | 0.88 | 0.67 | 0.82 | 0.69 | 0.71 |
| XGBoost | | 0.88 | 0.93 | 0.76 | 0.70 | 0.56 |
| Gradient boosting | | 0.85 | 0.80 | 0.82 | 0.73 | 0.67 |
| Extra trees | | 0.86 | 0.80 | 0.76 | 0.67 | 0.57 |
| Random forest | | 0.90 | 0.80 | 0.82 | 0.73 | 0.67 |
| CatBoost | | 0.89 | 0.80 | 0.80 | 0.71 | 0.63 |
| **Final model** | | | | | | |
| PFCML-MT | Test set | 0.87 | 0.80 | 0.82 | 0.73 | 0.67 |
| PFCML-MT | Temporal validation set | 0.84 | 0.75 | 0.78 | 0.71 | 0.68 |
AdaBoost, adaptive boosting; AUC, area under the receiver operating characteristic curve; CatBoost, categorical boosting; LightGBM, light gradient boosting machine; XGBoost, eXtreme gradient boosting.
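For readers unfamiliar with how the tabulated metrics relate to one another, the sketch below computes each one from a confusion matrix on toy data. The labels, scores, and 0.5 decision threshold are illustrative assumptions, not the study's data; AUC is computed via its rank-statistic (Mann–Whitney) equivalence.

```python
# Toy ground-truth labels and predicted scores (illustrative only).
y_true  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_score = [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.2, 0.1, 0.1]

# Binarise scores at an assumed 0.5 threshold.
y_pred = [1 if s >= 0.5 else 0 for s in y_score]

# Confusion-matrix counts.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

precision = tp / (tp + fp)                 # of predicted positives, how many are real
recall    = tp / (tp + fn)                 # of real positives, how many are caught
accuracy  = (tp + tn) / len(y_true)        # overall fraction correct
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R

# AUC as the probability a random positive outscores a random negative
# (ties count half) -- threshold-free, unlike the four metrics above.
pos = [s for t, s in zip(y_true, y_score) if t == 1]
neg = [s for t, s in zip(y_true, y_score) if t == 0]
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
```

Note that AUC is the only threshold-independent metric in the table, which is why a model can rank well by AUC (e.g. random forest at 0.90) while its thresholded metrics resemble those of other models.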