ML with XGBoost & GBM: Training XGBoost and GBM on the Higgs Boson dataset (Kaggle competition) for binary classification, with a head-to-head performance comparison
Contents

Output

Design Approach

Core Code

Output


finish loading from csv 
weight statistics: wpos=1522.37, wneg=904200, ratio=593.94

loading data end, start to boost trees
training GBM from sklearn
      Iter       Train Loss   Remaining Time 
         1           1.2069           49.52s
         2           1.1437           43.51s
         3           1.0909           37.43s
         4           1.0471           30.96s
         5           1.0096           25.09s
         6           0.9775           19.90s
         7           0.9505           15.22s
         8           0.9264            9.94s
         9           0.9058            4.88s
        10           0.8878            0.00s
sklearn.GBM total costs: 50.88141202926636 seconds
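The sklearn run above can be reproduced with a sketch like the following. The parameter values (n_estimators=10, max_depth=6) are assumptions inferred from the 10-iteration log, since the original script is not shown, and a small synthetic dataset stands in for the real CSV:

```python
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for the Higgs Boson features; the original run
# loads the much larger Kaggle training CSV.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

start = time.time()
# verbose=1 prints the per-iteration "Train Loss / Remaining Time" table
gbm = GradientBoostingClassifier(n_estimators=10, max_depth=6, verbose=1)
gbm.fit(X, y)
print("sklearn.GBM total costs:", time.time() - start, "seconds")
```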


training xgboost
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 1 thread costs: 24.5108642578125 seconds
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 2 thread costs: 11.449955940246582 seconds
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 4 thread costs: 8.809934616088867 seconds
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 8 thread costs: 7.875434875488281 seconds
XGBoost total costs: 52.64618968963623 seconds


Design Approach

(Figure: design flowchart of the XGBoost vs. GBM training pipeline)


Core Code
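The original core code is not included in this excerpt. As a hedged sketch, the weight statistics printed at the start of the log (wpos, wneg, ratio) would be computed roughly as below; the `label` and `weight` arrays are hypothetical stand-ins for the 'Label' ('s'/'b') and 'Weight' columns of the Kaggle CSV:

```python
import numpy as np

# Hypothetical stand-ins for the columns loaded from the Kaggle training CSV
rng = np.random.default_rng(0)
label = (rng.random(1000) < 0.3).astype(int)  # 1 = signal ('s'), 0 = background ('b')
weight = rng.random(1000)

# Total event weight per class; the class imbalance ratio wneg / wpos is
# typically passed to xgboost as scale_pos_weight
wpos = weight[label == 1].sum()
wneg = weight[label == 0].sum()
print("weight statistics: wpos=%.2f, wneg=%.2f, ratio=%.2f"
      % (wpos, wneg, wneg / wpos))
```

On the real dataset this yields the heavily imbalanced figures shown in the log (wpos=1522.37, wneg=904200, ratio=593.94), which is why reweighting matters for this task.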