PyCaret Model Training Report

Setup & Best Model
Best Model Plots
Feature Importance
Explainer

Setup Parameters

Parameter    Value
target MPG
session_id 42
index False
For the full list of experiment setup parameters, see the PyCaret documentation for the classification/regression experiment setup function.
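A minimal sketch of the corresponding setup call, assuming PyCaret 3.x and a pandas DataFrame named data (hypothetical name) that contains the MPG target column:

    from pycaret.regression import setup

    # Assumption: `data` is a pandas DataFrame holding the features and the "MPG" target.
    exp = setup(
        data=data,
        target="MPG",    # column to predict
        session_id=42,   # fixed seed so the run is reproducible
        index=False,     # do not treat the dataframe index as a feature/identifier
    )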

Best Model: GradientBoostingRegressor

Parameter    Value
alpha 0.9
ccp_alpha 0.0
criterion friedman_mse
init None
learning_rate 0.1
loss squared_error
max_depth 3
max_features None
max_leaf_nodes None
min_impurity_decrease 0.0
min_samples_leaf 1
min_samples_split 2
min_weight_fraction_leaf 0.0
n_estimators 100
n_iter_no_change None
random_state 42
subsample 1.0
tol 0.0001
validation_fraction 0.1
verbose 0
warm_start False
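Apart from random_state=42, these values are the scikit-learn defaults for GradientBoostingRegressor (scikit-learn >= 1.0), so the same estimator can be rebuilt directly outside PyCaret; a sketch:

    from sklearn.ensemble import GradientBoostingRegressor

    # Only random_state differs from the defaults; a few defaults are spelled out for clarity.
    model = GradientBoostingRegressor(
        loss="squared_error",
        learning_rate=0.1,
        n_estimators=100,
        subsample=1.0,
        criterion="friedman_mse",
        max_depth=3,
        random_state=42,
    )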

Comparison Results on the Cross-Validation Set

Model MAE MSE RMSE R2 RMSLE MAPE TT (Sec)
Gradient Boosting Regressor 2.2775 9.8743 3.0921 0.8383 0.1197 0.0980 0.681
Extra Trees Regressor 2.2119 10.2477 3.1304 0.8323 0.1220 0.0949 2.212
Light Gradient Boosting Machine 2.3218 10.4931 3.1818 0.8282 0.1252 0.1011 0.263
CatBoost Regressor 2.3204 10.5063 3.1906 0.8270 0.1256 0.1011 8.883
Random Forest Regressor 2.3161 11.0170 3.2515 0.8210 0.1252 0.0990 1.916
Extreme Gradient Boosting 2.4277 11.9887 3.3949 0.8045 0.1336 0.1057 0.497
Elastic Net 2.6119 12.1337 3.4462 0.8029 0.1426 0.1168 0.116
Lasso Regression 2.6238 12.2869 3.4649 0.8011 0.1438 0.1172 0.134
Lasso Least Angle Regression 2.6238 12.2868 3.4649 0.8011 0.1438 0.1172 0.157
AdaBoost Regressor 2.5949 12.5846 3.4968 0.7939 0.1378 0.1153 2.469
Bayesian Ridge 2.6494 12.5149 3.5121 0.7920 0.1433 0.1194 0.268
Ridge Regression 2.6852 12.7684 3.5480 0.7872 0.1448 0.1212 0.108
Linear Regression 2.6893 12.7997 3.5523 0.7866 0.1450 0.1214 0.122
Least Angle Regression 2.7583 13.3766 3.6327 0.7759 0.1489 0.1249 0.165
Huber Regressor 2.6780 14.2077 3.7197 0.7699 0.1404 0.1138 1.508
Decision Tree Regressor 2.6552 15.5784 3.8636 0.7507 0.1470 0.1108 0.253
Orthogonal Matching Pursuit 3.3731 20.2491 4.4464 0.6709 0.1767 0.1475 0.418
K Neighbors Regressor 3.4315 21.1052 4.5405 0.6546 0.1692 0.1448 0.858
Dummy Regressor 6.6547 62.8366 7.8973 -0.0391 0.3303 0.3219 0.129
Passive Aggressive Regressor 7.5227 84.7568 9.0993 -0.4762 0.4067 0.3652 0.420
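A sketch of how such a cross-validation leaderboard is produced with PyCaret's compare_models, assuming the setup call shown earlier has already been run:

    from pycaret.regression import compare_models, pull

    best = compare_models()  # cross-validates the candidate regressors; the default sort metric for regression is R2
    leaderboard = pull()     # the comparison table above, as a pandas DataFrame
    print(leaderboard)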

Results on the Test Set for the Best Model

Model MAE MSE RMSE R2 RMSLE MAPE
Gradient Boosting Regressor 2.2015 9.9110 3.1482 0.8273 0.1198 0.0940
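A sketch of scoring the hold-out test split with the selected model; when no data is passed, PyCaret's predict_model scores the hold-out set created during setup:

    from pycaret.regression import predict_model, pull

    holdout_predictions = predict_model(best)  # predictions on the hold-out/test split
    test_metrics = pull()                      # the metrics table shown above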

Best Model Plots on the Test Set

Residuals
[plot: residuals]

Prediction Error
[plot: error]

Cook's Distance
[plot: cooks]

Learning Curve
[plot: learning]

Validation Curve
[plot: vc]

Manifold Learning
[plot: manifold]

Recursive Feature Elimination (RFE)
[plot: rfe]

Feature Importance
[plot: feature]

Feature Importance (All Features)
[plot: feature_all]
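The plot keys above map onto PyCaret's plot_model names; a sketch of regenerating them, assuming best is the fitted Gradient Boosting model returned by compare_models:

    from pycaret.regression import plot_model

    for plot in ["residuals", "error", "cooks", "learning", "vc",
                 "manifold", "rfe", "feature", "feature_all"]:
        plot_model(best, plot=plot, save=True)  # save=True writes each figure to a PNG file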

PyCaret Feature Importance Report

Feature importance analysis from a trained Random Forest.

Feature importance is computed from Gini impurity for classification and from variance reduction for regression.

[plot: tree_importance]
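A minimal sketch of the impurity-based importance described above, using scikit-learn directly; X_train and y_train are hypothetical names for the transformed training features and the MPG target:

    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    # Assumption: X_train is a pandas DataFrame of transformed features, y_train the MPG target.
    rf = RandomForestRegressor(random_state=42)
    rf.fit(X_train, y_train)

    # For regression, impurity-based importance reflects variance reduction at each split.
    importances = pd.Series(rf.feature_importances_, index=X_train.columns)
    print(importances.sort_values(ascending=False))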

SHAP Summary from a trained LightGBM model

[plot: shap_summary]
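A sketch of producing such a SHAP summary for a LightGBM model, assuming the lightgbm and shap packages are installed and the same hypothetical X_train / y_train as above:

    import shap
    from lightgbm import LGBMRegressor

    lgbm = LGBMRegressor(random_state=42).fit(X_train, y_train)

    explainer = shap.TreeExplainer(lgbm)          # tree-specific SHAP explainer
    shap_values = explainer.shap_values(X_train)  # per-feature contribution for each row
    shap.summary_plot(shap_values, X_train)       # beeswarm summary of feature impact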