[ECC DS Week 9] Classification 1: Kaggle Santander Customer Satisfaction Prediction
0. Competition Overview
- The task is to predict customer satisfaction from a dataset with 370 features.
- All feature names are anonymized, so the meaning of an attribute cannot be inferred from its name.
- Class label: TARGET
  - 1: unsatisfied
  - 0: satisfied
- Evaluation metric: ROC-AUC
  - Chosen because most customers are expected to be satisfied and only a small fraction unsatisfied, making the dataset highly imbalanced.
- Data source: https://www.kaggle.com/competitions/santander-customer-satisfaction/data
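ROC-AUC measures how well a model ranks positives above negatives, so it stays informative even when one class dominates. A minimal sketch with toy labels and scores (not from the competition data):

```python
from sklearn.metrics import roc_auc_score

# Toy example: 2 negatives, 2 positives, with predicted scores
y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

# AUC equals the fraction of (positive, negative) pairs where the
# positive is scored higher; 3 of the 4 pairs here are ordered correctly
auc = roc_auc_score(y_true, y_score)
print(auc)  # 0.75
```

Because it depends only on the ranking of scores, ROC-AUC is unaffected by the classification threshold, unlike accuracy.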
1. Data Preprocessing
1-1. Importing Libraries & Loading the Data
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import matplotlib
cust_df = pd.read_csv("/content/drive/MyDrive/Colab Notebooks/ESAA 8기/YB/10주차/data/train_santander.csv",encoding = 'latin-1')
print('dataset shape:', cust_df.shape)
cust_df.head(3)
dataset shape: (76020, 371)
ID | var3 | var15 | imp_ent_var16_ult1 | imp_op_var39_comer_ult1 | imp_op_var39_comer_ult3 | imp_op_var40_comer_ult1 | imp_op_var40_comer_ult3 | imp_op_var40_efect_ult1 | imp_op_var40_efect_ult3 | ... | saldo_medio_var33_hace2 | saldo_medio_var33_hace3 | saldo_medio_var33_ult1 | saldo_medio_var33_ult3 | saldo_medio_var44_hace2 | saldo_medio_var44_hace3 | saldo_medio_var44_ult1 | saldo_medio_var44_ult3 | var38 | TARGET | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | 2 | 23 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 39205.17 | 0 |
1 | 3 | 2 | 34 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 49278.03 | 0 |
2 | 4 | 2 | 23 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 67333.77 | 0 |
3 rows × 371 columns
- The dataset has 371 columns in total: 370 features plus the TARGET class label.
1-2. EDA
### Checking dataset info
cust_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 76020 entries, 0 to 76019
Columns: 371 entries, ID to TARGET
dtypes: float64(111), int64(260)
memory usage: 215.2 MB
- 111 features are float64 and 260 are int64.
- All features are numeric.
- There are no null values.
### Distribution of the TARGET values
print(cust_df['TARGET'].value_counts())
unsatisfied_cnt = cust_df[cust_df['TARGET'] == 1].TARGET.count()
total_cnt = cust_df.TARGET.count()
print()
print('unsatisfied ratio: {0:.2f}'.format((unsatisfied_cnt / total_cnt)))
0    73012
1     3008
Name: TARGET, dtype: int64

unsatisfied ratio: 0.04
- Most customers are satisfied (0); unsatisfied customers account for only about 4% of the data.
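With a 96:4 split, a model that always predicts "satisfied" already reaches 96% accuracy while having no discriminative power, which is exactly why ROC-AUC is the metric here. A sketch with synthetic labels mimicking the imbalance:

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

# Synthetic labels mimicking the ~4% unsatisfied rate
y = np.array([0] * 96 + [1] * 4)
X = np.zeros((100, 1))  # features are irrelevant for this baseline

# Baseline that always predicts the majority class
clf = DummyClassifier(strategy='most_frequent').fit(X, y)

acc = accuracy_score(y, clf.predict(X))             # 0.96 -- looks great
auc = roc_auc_score(y, clf.predict_proba(X)[:, 1])  # 0.5  -- no ranking skill
print(acc, auc)
```

The constant-score baseline gets AUC 0.5 (random ranking), exposing what the high accuracy hides.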
### Value distribution of each feature
cust_df.describe()
ID | var3 | var15 | imp_ent_var16_ult1 | imp_op_var39_comer_ult1 | imp_op_var39_comer_ult3 | imp_op_var40_comer_ult1 | imp_op_var40_comer_ult3 | imp_op_var40_efect_ult1 | imp_op_var40_efect_ult3 | ... | saldo_medio_var33_hace2 | saldo_medio_var33_hace3 | saldo_medio_var33_ult1 | saldo_medio_var33_ult3 | saldo_medio_var44_hace2 | saldo_medio_var44_hace3 | saldo_medio_var44_ult1 | saldo_medio_var44_ult3 | var38 | TARGET | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
count | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | ... | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 7.602000e+04 | 76020.000000 |
mean | 75964.050723 | -1523.199277 | 33.212865 | 86.208265 | 72.363067 | 119.529632 | 3.559130 | 6.472698 | 0.412946 | 0.567352 | ... | 7.935824 | 1.365146 | 12.215580 | 8.784074 | 31.505324 | 1.858575 | 76.026165 | 56.614351 | 1.172358e+05 | 0.039569 |
std | 43781.947379 | 39033.462364 | 12.956486 | 1614.757313 | 339.315831 | 546.266294 | 93.155749 | 153.737066 | 30.604864 | 36.513513 | ... | 455.887218 | 113.959637 | 783.207399 | 538.439211 | 2013.125393 | 147.786584 | 4040.337842 | 2852.579397 | 1.826646e+05 | 0.194945 |
min | 1.000000 | -999999.000000 | 5.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 5.163750e+03 | 0.000000 |
25% | 38104.750000 | 2.000000 | 23.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 6.787061e+04 | 0.000000 |
50% | 76043.000000 | 2.000000 | 28.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.064092e+05 | 0.000000 |
75% | 113748.750000 | 2.000000 | 40.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.187563e+05 | 0.000000 |
max | 151838.000000 | 238.000000 | 105.000000 | 210000.000000 | 12888.030000 | 21024.810000 | 8237.820000 | 11073.570000 | 6600.000000 | 6600.000000 | ... | 50003.880000 | 20385.720000 | 138831.630000 | 91778.730000 | 438329.220000 | 24650.010000 | 681462.900000 | 397884.300000 | 2.203474e+07 | 1.000000 |
8 rows × 371 columns
- The var3 column has a min value of -999999.
  - It is likely that NaN or some other exceptional value was encoded as -999999.
- Inspecting with `cust_df['var3'].value_counts()[:10]` shows that -999999 appears 116 times.
- Replace the var3 outlier with the mode, 2.
cust_df['var3'].replace(-999999, 2, inplace = True) # replace the outlier value
cust_df.drop('ID', axis = 1, inplace = True) # drop the unneeded ID column
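Rather than hard-coding 2, the mode can be looked up programmatically before the replacement. A small sketch on a toy series (toy values, not the real var3 distribution):

```python
import pandas as pd

# Toy column with a -999999 sentinel standing in for missing values
s = pd.Series([2, 2, 2, 8, -999999, -999999])

mode_val = s.mode()[0]               # most frequent value -> 2
cleaned = s.replace(-999999, mode_val)
print(cleaned.tolist())  # [2, 2, 2, 8, 2, 2]
```

Looking up the mode this way keeps the cleaning step correct even if the column's distribution changes.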
1-3. Building the Datasets
### Splitting features and labels
# The label column is the last column of the DataFrame, so slice it off at position -1
X_features = cust_df.iloc[:, :-1]
y_labels = cust_df.iloc[:, -1]
print('Feature data shape:{0}'.format(X_features.shape))
Feature data shape:(76020, 369)
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X_features, y_labels,
test_size = 0.2, random_state = 0)
train_cnt = y_train.count()
test_cnt = y_test.count()
print('Train set shape:{0}, test set shape:{1}'.format(X_train.shape , X_test.shape))
Train set shape:(60816, 369), test set shape:(15204, 369)
# The dataset is imbalanced, so check that the TARGET class distribution is similar in train and test
print('Label distribution ratio in the train set')
print(y_train.value_counts()/train_cnt)
print('\n Label distribution ratio in the test set')
print(y_test.value_counts()/test_cnt)
Label distribution ratio in the train set
0    0.960964
1    0.039036
Name: TARGET, dtype: float64

 Label distribution ratio in the test set
0    0.9583
1    0.0417
Name: TARGET, dtype: float64
- In both train and test, the TARGET distribution is similar to the original data, with roughly 4% unsatisfied (= 1) labels.
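The ratios came out similar here thanks to the shuffle; passing `stratify` makes the split preserve the class ratio exactly rather than approximately. A sketch with synthetic labels (the notebook's split above did not use stratify):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic imbalanced labels: 960 zeros, 40 ones (~4% positives)
y = np.array([0] * 960 + [1] * 40)
X = np.arange(1000).reshape(-1, 1)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Both splits keep the 4% positive rate exactly
print(y_tr.mean(), y_te.mean())  # 0.04 0.04
```

For highly imbalanced data, stratifying also guards against the rare class being underrepresented in a small test set.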
2. Modeling
Using XGBoost and LightGBM
2-1. XGBoost
from xgboost import XGBClassifier
from sklearn.metrics import roc_auc_score
※ Note: using the test set as XGBoost's evaluation set for early stopping can increase the risk of overfitting to the test set.
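To avoid that leakage, a validation set can be carved out of the training data and passed as `eval_set`, leaving the test set untouched until the final evaluation. A sketch of the split only, with small stand-in arrays (the names X_tr/X_val are illustrative, and the xgboost fit call itself would be unchanged apart from its `eval_set`):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Small stand-in for the real training features and labels
X_train = np.random.rand(1000, 10)
y_train = np.random.randint(0, 2, 1000)

# Hold out 10% of the training data purely for early stopping
X_tr, X_val, y_tr, y_val = train_test_split(
    X_train, y_train, test_size=0.1, random_state=0)

# eval_set=[(X_tr, y_tr), (X_val, y_val)] would then replace the test set
print(X_tr.shape, X_val.shape)  # (900, 10) (100, 10)
```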
a) Baseline model: build / train / predict
### Creating the model object
xgb_clf = XGBClassifier(n_estimators = 500, random_state = 156)
### Training the model
# Use AUC as the evaluation metric and set early stopping to 100 rounds
xgb_clf.fit(X_train, y_train, early_stopping_rounds = 100,
eval_metric = "auc", eval_set = [(X_train, y_train), (X_test, y_test)])
### Evaluating the model
xgb_roc_score = roc_auc_score(y_test, xgb_clf.predict_proba(X_test)[:,1], average = 'macro')
print('ROC AUC: {0:.4f}'.format(xgb_roc_score))
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.82005 validation_1-auc:0.81157 [1] validation_0-auc:0.83400 validation_1-auc:0.82452 [2] validation_0-auc:0.83870 validation_1-auc:0.82745 [3] validation_0-auc:0.84419 validation_1-auc:0.82922 [4] validation_0-auc:0.84783 validation_1-auc:0.83298 [5] validation_0-auc:0.85125 validation_1-auc:0.83500 [6] validation_0-auc:0.85501 validation_1-auc:0.83653 [7] validation_0-auc:0.85831 validation_1-auc:0.83782 [8] validation_0-auc:0.86143 validation_1-auc:0.83802 [9] validation_0-auc:0.86452 validation_1-auc:0.83914 [10] validation_0-auc:0.86717 validation_1-auc:0.83954 [11] validation_0-auc:0.87013 validation_1-auc:0.83983 [12] validation_0-auc:0.87369 validation_1-auc:0.84033 [13] validation_0-auc:0.87620 validation_1-auc:0.84055 [14] validation_0-auc:0.87799 validation_1-auc:0.84135 [15] validation_0-auc:0.88071 validation_1-auc:0.84117 [16] validation_0-auc:0.88237 validation_1-auc:0.84101 [17] validation_0-auc:0.88352 validation_1-auc:0.84071 [18] validation_0-auc:0.88457 validation_1-auc:0.84052 [19] validation_0-auc:0.88592 validation_1-auc:0.84023 [20] validation_0-auc:0.88788 validation_1-auc:0.84012 [21] validation_0-auc:0.88845 validation_1-auc:0.84022 [22] validation_0-auc:0.88980 validation_1-auc:0.84007 [23] validation_0-auc:0.89019 validation_1-auc:0.84009 [24] validation_0-auc:0.89193 validation_1-auc:0.83974 [25] validation_0-auc:0.89253 validation_1-auc:0.84015 [26] validation_0-auc:0.89329 validation_1-auc:0.84101 [27] validation_0-auc:0.89386 validation_1-auc:0.84087 [28] validation_0-auc:0.89416 validation_1-auc:0.84074 [29] validation_0-auc:0.89660 validation_1-auc:0.83999 [30] validation_0-auc:0.89737 validation_1-auc:0.83959 [31] validation_0-auc:0.89911 validation_1-auc:0.83952 [32] validation_0-auc:0.90103 validation_1-auc:0.83901 [33] validation_0-auc:0.90250 validation_1-auc:0.83885 [34] validation_0-auc:0.90275 validation_1-auc:0.83887 [35] validation_0-auc:0.90290 validation_1-auc:0.83864 [36] validation_0-auc:0.90460 
validation_1-auc:0.83834 [37] validation_0-auc:0.90497 validation_1-auc:0.83810 [38] validation_0-auc:0.90515 validation_1-auc:0.83811 [39] validation_0-auc:0.90533 validation_1-auc:0.83813 [40] validation_0-auc:0.90574 validation_1-auc:0.83776 [41] validation_0-auc:0.90690 validation_1-auc:0.83720 [42] validation_0-auc:0.90715 validation_1-auc:0.83684 [43] validation_0-auc:0.90736 validation_1-auc:0.83672 [44] validation_0-auc:0.90758 validation_1-auc:0.83674 [45] validation_0-auc:0.90767 validation_1-auc:0.83694 [46] validation_0-auc:0.90778 validation_1-auc:0.83687 [47] validation_0-auc:0.90791 validation_1-auc:0.83678 [48] validation_0-auc:0.90829 validation_1-auc:0.83694 [49] validation_0-auc:0.90869 validation_1-auc:0.83676 [50] validation_0-auc:0.90890 validation_1-auc:0.83655 [51] validation_0-auc:0.91067 validation_1-auc:0.83669 [52] validation_0-auc:0.91238 validation_1-auc:0.83641 [53] validation_0-auc:0.91352 validation_1-auc:0.83690 [54] validation_0-auc:0.91386 validation_1-auc:0.83693 [55] validation_0-auc:0.91406 validation_1-auc:0.83681 [56] validation_0-auc:0.91545 validation_1-auc:0.83680 [57] validation_0-auc:0.91556 validation_1-auc:0.83667 [58] validation_0-auc:0.91628 validation_1-auc:0.83665 [59] validation_0-auc:0.91725 validation_1-auc:0.83591 [60] validation_0-auc:0.91762 validation_1-auc:0.83576 [61] validation_0-auc:0.91784 validation_1-auc:0.83534 [62] validation_0-auc:0.91872 validation_1-auc:0.83513 [63] validation_0-auc:0.91892 validation_1-auc:0.83510 [64] validation_0-auc:0.91896 validation_1-auc:0.83508 [65] validation_0-auc:0.91907 validation_1-auc:0.83519 [66] validation_0-auc:0.91970 validation_1-auc:0.83510 [67] validation_0-auc:0.91982 validation_1-auc:0.83523 [68] validation_0-auc:0.92007 validation_1-auc:0.83457 [69] validation_0-auc:0.92015 validation_1-auc:0.83460 [70] validation_0-auc:0.92024 validation_1-auc:0.83446 [71] validation_0-auc:0.92037 validation_1-auc:0.83462 [72] validation_0-auc:0.92087 
validation_1-auc:0.83394 [73] validation_0-auc:0.92094 validation_1-auc:0.83410 [74] validation_0-auc:0.92133 validation_1-auc:0.83394 [75] validation_0-auc:0.92141 validation_1-auc:0.83368 [76] validation_0-auc:0.92321 validation_1-auc:0.83413 [77] validation_0-auc:0.92415 validation_1-auc:0.83359 [78] validation_0-auc:0.92503 validation_1-auc:0.83353 [79] validation_0-auc:0.92539 validation_1-auc:0.83293 [80] validation_0-auc:0.92577 validation_1-auc:0.83253 [81] validation_0-auc:0.92677 validation_1-auc:0.83187 [82] validation_0-auc:0.92706 validation_1-auc:0.83230 [83] validation_0-auc:0.92800 validation_1-auc:0.83216 [84] validation_0-auc:0.92822 validation_1-auc:0.83206 [85] validation_0-auc:0.92870 validation_1-auc:0.83196 [86] validation_0-auc:0.92875 validation_1-auc:0.83200 [87] validation_0-auc:0.92881 validation_1-auc:0.83208 [88] validation_0-auc:0.92919 validation_1-auc:0.83174 [89] validation_0-auc:0.92940 validation_1-auc:0.83160 [90] validation_0-auc:0.92948 validation_1-auc:0.83155 [91] validation_0-auc:0.92959 validation_1-auc:0.83165 [92] validation_0-auc:0.92964 validation_1-auc:0.83172 [93] validation_0-auc:0.93031 validation_1-auc:0.83160 [94] validation_0-auc:0.93032 validation_1-auc:0.83150 [95] validation_0-auc:0.93037 validation_1-auc:0.83132 [96] validation_0-auc:0.93084 validation_1-auc:0.83090 [97] validation_0-auc:0.93091 validation_1-auc:0.83091 [98] validation_0-auc:0.93168 validation_1-auc:0.83066 [99] validation_0-auc:0.93245 validation_1-auc:0.83058 [100] validation_0-auc:0.93286 validation_1-auc:0.83029 [101] validation_0-auc:0.93361 validation_1-auc:0.82955 [102] validation_0-auc:0.93359 validation_1-auc:0.82962 [103] validation_0-auc:0.93435 validation_1-auc:0.82893 [104] validation_0-auc:0.93446 validation_1-auc:0.82837 [105] validation_0-auc:0.93480 validation_1-auc:0.82815 [106] validation_0-auc:0.93579 validation_1-auc:0.82744 [107] validation_0-auc:0.93583 validation_1-auc:0.82728 [108] validation_0-auc:0.93610 
validation_1-auc:0.82651 [109] validation_0-auc:0.93617 validation_1-auc:0.82650 [110] validation_0-auc:0.93659 validation_1-auc:0.82621 [111] validation_0-auc:0.93663 validation_1-auc:0.82620 [112] validation_0-auc:0.93710 validation_1-auc:0.82591 [113] validation_0-auc:0.93781 validation_1-auc:0.82498 ROC AUC: 0.8413
b) Hyperparameter Tuning
- With this many columns, there is a risk of overfitting.
- Tune only max_depth, min_child_weight, and colsample_bytree as the first-round targets.
- For ML models that take a long time to train, one practical approach is to first tune 2-3 parameters together, then fix those values and tune another 1-2 parameters on top of the result.
- The grid below covers 8 hyperparameter combinations, so it takes a while to run.
  - Set n_estimators = 100 and early_stopping_rounds = 30 for these test runs, then raise them again once tuning is done.
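The grid is 2 × 2 × 2 = 8 combinations; `ParameterGrid` can enumerate them to sanity-check the search size before paying for cv=3 × 8 = 24 model fits. A quick sketch:

```python
from sklearn.model_selection import ParameterGrid

params = {'max_depth': [5, 7],
          'min_child_weight': [1, 3],
          'colsample_bytree': [0.5, 0.75]}

# GridSearchCV tries every combination; enumerate them up front
grid = list(ParameterGrid(params))
print(len(grid))  # 8 combinations; with cv=3 that means 24 model fits
print(grid[0])
```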
from sklearn.model_selection import GridSearchCV
# Reduce n_estimators to 100 to speed up the hyperparameter search
xgb_clf = XGBClassifier(n_estimators = 100)
### Hyperparameter tuning
params = {'max_depth':[5, 7] ,
'min_child_weight':[1,3] ,
'colsample_bytree':[0.5, 0.75] }
### Cross validation
# cv is set to 3
gridcv = GridSearchCV(xgb_clf, param_grid = params, cv = 3)
gridcv.fit(X_train, y_train, early_stopping_rounds = 30,
eval_metric = "auc", eval_set = [(X_train, y_train), (X_test, y_test)])
print('GridSearchCV best parameters:', gridcv.best_params_)
### Performance evaluation
xgb_roc_score = roc_auc_score(y_test, gridcv.predict_proba(X_test)[:,1], average = 'macro')
print('ROC AUC: {0:.4f}'.format(xgb_roc_score))
[0] validation_0-auc:0.80289 validation_1-auc:0.80420 [1] validation_0-auc:0.80677 validation_1-auc:0.80892 [2] validation_0-auc:0.82161 validation_1-auc:0.81979 [3] validation_0-auc:0.82877 validation_1-auc:0.82509 [4] validation_0-auc:0.83227 validation_1-auc:0.82868 [5] validation_0-auc:0.83179 validation_1-auc:0.82649 [6] validation_0-auc:0.82662 validation_1-auc:0.81879 [7] validation_0-auc:0.83687 validation_1-auc:0.82901 [8] validation_0-auc:0.84136 validation_1-auc:0.83545 [9] validation_0-auc:0.84304 validation_1-auc:0.83301 [10] validation_0-auc:0.84221 validation_1-auc:0.83210 [11] validation_0-auc:0.84723 validation_1-auc:0.83457 [12] validation_0-auc:0.85187 validation_1-auc:0.83740 [13] validation_0-auc:0.85373 validation_1-auc:0.83434 [14] validation_0-auc:0.85631 validation_1-auc:0.83641 [15] validation_0-auc:0.85719 validation_1-auc:0.83617 [16] validation_0-auc:0.85906 validation_1-auc:0.83443 [17] validation_0-auc:0.86100 validation_1-auc:0.83417 [18] validation_0-auc:0.86275 validation_1-auc:0.83579 [19] validation_0-auc:0.86439 validation_1-auc:0.83680 [20] validation_0-auc:0.86511 validation_1-auc:0.83633 [21] validation_0-auc:0.86619 validation_1-auc:0.83515 [22] validation_0-auc:0.86671 validation_1-auc:0.83481 [23] validation_0-auc:0.86748 validation_1-auc:0.83488 [24] validation_0-auc:0.86861 validation_1-auc:0.83440 [25] validation_0-auc:0.86928 validation_1-auc:0.83477 [26] validation_0-auc:0.86955 validation_1-auc:0.83447 [27] validation_0-auc:0.86969 validation_1-auc:0.83420 [28] validation_0-auc:0.87033 validation_1-auc:0.83437 [29] validation_0-auc:0.87109 validation_1-auc:0.83385 [30] validation_0-auc:0.87112 validation_1-auc:0.83374 [31] validation_0-auc:0.87144 validation_1-auc:0.83318 [32] validation_0-auc:0.87162 validation_1-auc:0.83293 [33] validation_0-auc:0.87226 validation_1-auc:0.83267 [34] validation_0-auc:0.87256 validation_1-auc:0.83229 [35] validation_0-auc:0.87290 validation_1-auc:0.83248 [36] validation_0-auc:0.87326 
validation_1-auc:0.83255 [37] validation_0-auc:0.87348 validation_1-auc:0.83267 [38] validation_0-auc:0.87418 validation_1-auc:0.83279 [39] validation_0-auc:0.87460 validation_1-auc:0.83252 [40] validation_0-auc:0.87483 validation_1-auc:0.83230 [41] validation_0-auc:0.87490 validation_1-auc:0.83221
[0] validation_0-auc:0.80699 validation_1-auc:0.80599 [1] validation_0-auc:0.81029 validation_1-auc:0.81309 [2] validation_0-auc:0.82504 validation_1-auc:0.82148 [3] validation_0-auc:0.83183 validation_1-auc:0.82497 [4] validation_0-auc:0.83585 validation_1-auc:0.82786 [5] validation_0-auc:0.83864 validation_1-auc:0.83222 [6] validation_0-auc:0.83483 validation_1-auc:0.83054 [7] validation_0-auc:0.84061 validation_1-auc:0.83365 [8] validation_0-auc:0.84451 validation_1-auc:0.83729 [9] validation_0-auc:0.84411 validation_1-auc:0.83520 [10] validation_0-auc:0.84329 validation_1-auc:0.83367 [11] validation_0-auc:0.84914 validation_1-auc:0.83676 [12] validation_0-auc:0.85414 validation_1-auc:0.83983 [13] validation_0-auc:0.85569 validation_1-auc:0.83826 [14] validation_0-auc:0.85887 validation_1-auc:0.83916 [15] validation_0-auc:0.85987 validation_1-auc:0.84004 [16] validation_0-auc:0.86042 validation_1-auc:0.84034 [17] validation_0-auc:0.86156 validation_1-auc:0.84030 [18] validation_0-auc:0.86302 validation_1-auc:0.84172 [19] validation_0-auc:0.86434 validation_1-auc:0.84223 [20] validation_0-auc:0.86568 validation_1-auc:0.84250 [21] validation_0-auc:0.86605 validation_1-auc:0.84272 [22] validation_0-auc:0.86705 validation_1-auc:0.84268 [23] validation_0-auc:0.86850 validation_1-auc:0.84327 [24] validation_0-auc:0.86954 validation_1-auc:0.84271 [25] validation_0-auc:0.87027 validation_1-auc:0.84226 [26] validation_0-auc:0.87095 validation_1-auc:0.84241 [27] validation_0-auc:0.87150 validation_1-auc:0.84239 [28] validation_0-auc:0.87169 validation_1-auc:0.84245 [29] validation_0-auc:0.87285 validation_1-auc:0.84238 [30] validation_0-auc:0.87363 validation_1-auc:0.84227 [31] validation_0-auc:0.87384 validation_1-auc:0.84226 [32] validation_0-auc:0.87479 validation_1-auc:0.84213 [33] validation_0-auc:0.87543 validation_1-auc:0.84204 [34] validation_0-auc:0.87578 validation_1-auc:0.84186 [35] validation_0-auc:0.87612 validation_1-auc:0.84176 [36] validation_0-auc:0.87654 
validation_1-auc:0.84157 [37] validation_0-auc:0.87698 validation_1-auc:0.84108 [38] validation_0-auc:0.87713 validation_1-auc:0.84137 [39] validation_0-auc:0.87756 validation_1-auc:0.84143 [40] validation_0-auc:0.87780 validation_1-auc:0.84092 [41] validation_0-auc:0.87789 validation_1-auc:0.84109 [42] validation_0-auc:0.87835 validation_1-auc:0.84125 [43] validation_0-auc:0.87964 validation_1-auc:0.84079 [44] validation_0-auc:0.87993 validation_1-auc:0.84053 [45] validation_0-auc:0.88044 validation_1-auc:0.84021 [46] validation_0-auc:0.88062 validation_1-auc:0.83972 [47] validation_0-auc:0.88088 validation_1-auc:0.83934 [48] validation_0-auc:0.88092 validation_1-auc:0.83930 [49] validation_0-auc:0.88111 validation_1-auc:0.83903 [50] validation_0-auc:0.88121 validation_1-auc:0.83876 [51] validation_0-auc:0.88145 validation_1-auc:0.83816 [52] validation_0-auc:0.88161 validation_1-auc:0.83800
[0] validation_0-auc:0.81203 validation_1-auc:0.80951 [1] validation_0-auc:0.81450 validation_1-auc:0.81585 [2] validation_0-auc:0.82678 validation_1-auc:0.82141 [3] validation_0-auc:0.82958 validation_1-auc:0.82431 [4] validation_0-auc:0.83590 validation_1-auc:0.82659 [5] validation_0-auc:0.83743 validation_1-auc:0.82696 [6] validation_0-auc:0.83508 validation_1-auc:0.82420 [7] validation_0-auc:0.84130 validation_1-auc:0.82966 [8] validation_0-auc:0.84500 validation_1-auc:0.83477 [9] validation_0-auc:0.84551 validation_1-auc:0.83453 [10] validation_0-auc:0.84494 validation_1-auc:0.83201 [11] validation_0-auc:0.85031 validation_1-auc:0.83706 [12] validation_0-auc:0.85489 validation_1-auc:0.83832 [13] validation_0-auc:0.85636 validation_1-auc:0.83727 [14] validation_0-auc:0.85873 validation_1-auc:0.83913 [15] validation_0-auc:0.85979 validation_1-auc:0.83969 [16] validation_0-auc:0.86059 validation_1-auc:0.83876 [17] validation_0-auc:0.86171 validation_1-auc:0.83857 [18] validation_0-auc:0.86319 validation_1-auc:0.83998 [19] validation_0-auc:0.86468 validation_1-auc:0.84007 [20] validation_0-auc:0.86530 validation_1-auc:0.83958 [21] validation_0-auc:0.86518 validation_1-auc:0.83915 [22] validation_0-auc:0.86705 validation_1-auc:0.84024 [23] validation_0-auc:0.86779 validation_1-auc:0.84019 [24] validation_0-auc:0.86845 validation_1-auc:0.83978 [25] validation_0-auc:0.86932 validation_1-auc:0.83934 [26] validation_0-auc:0.87042 validation_1-auc:0.83894 [27] validation_0-auc:0.87131 validation_1-auc:0.83872 [28] validation_0-auc:0.87186 validation_1-auc:0.83869 [29] validation_0-auc:0.87253 validation_1-auc:0.83839 [30] validation_0-auc:0.87319 validation_1-auc:0.83821 [31] validation_0-auc:0.87335 validation_1-auc:0.83813 [32] validation_0-auc:0.87371 validation_1-auc:0.83779 [33] validation_0-auc:0.87403 validation_1-auc:0.83776 [34] validation_0-auc:0.87413 validation_1-auc:0.83795 [35] validation_0-auc:0.87489 validation_1-auc:0.83786 [36] validation_0-auc:0.87566 
validation_1-auc:0.83761 [37] validation_0-auc:0.87603 validation_1-auc:0.83823 [38] validation_0-auc:0.87664 validation_1-auc:0.83826 [39] validation_0-auc:0.87714 validation_1-auc:0.83797 [40] validation_0-auc:0.87757 validation_1-auc:0.83761 [41] validation_0-auc:0.87773 validation_1-auc:0.83762 [42] validation_0-auc:0.87793 validation_1-auc:0.83730 [43] validation_0-auc:0.87820 validation_1-auc:0.83743 [44] validation_0-auc:0.87859 validation_1-auc:0.83740 [45] validation_0-auc:0.87926 validation_1-auc:0.83654 [46] validation_0-auc:0.87939 validation_1-auc:0.83637 [47] validation_0-auc:0.87958 validation_1-auc:0.83568 [48] validation_0-auc:0.87991 validation_1-auc:0.83543 [49] validation_0-auc:0.88008 validation_1-auc:0.83578 [50] validation_0-auc:0.88098 validation_1-auc:0.83606 [51] validation_0-auc:0.88118 validation_1-auc:0.83609 [52] validation_0-auc:0.88161 validation_1-auc:0.83581
[0] validation_0-auc:0.80246 validation_1-auc:0.80463 [1] validation_0-auc:0.80769 validation_1-auc:0.80995 [2] validation_0-auc:0.82216 validation_1-auc:0.82059 [3] validation_0-auc:0.82983 validation_1-auc:0.82797 [4] validation_0-auc:0.83294 validation_1-auc:0.82964 [5] validation_0-auc:0.83289 validation_1-auc:0.82807 [6] validation_0-auc:0.82913 validation_1-auc:0.82282 [7] validation_0-auc:0.83623 validation_1-auc:0.82949 [8] validation_0-auc:0.84169 validation_1-auc:0.83424 [9] validation_0-auc:0.84276 validation_1-auc:0.83206 [10] validation_0-auc:0.84290 validation_1-auc:0.83209 [11] validation_0-auc:0.84726 validation_1-auc:0.83490 [12] validation_0-auc:0.85185 validation_1-auc:0.83765 [13] validation_0-auc:0.85354 validation_1-auc:0.83497 [14] validation_0-auc:0.85519 validation_1-auc:0.83661 [15] validation_0-auc:0.85551 validation_1-auc:0.83619 [16] validation_0-auc:0.85658 validation_1-auc:0.83520 [17] validation_0-auc:0.85731 validation_1-auc:0.83493 [18] validation_0-auc:0.85941 validation_1-auc:0.83637 [19] validation_0-auc:0.86073 validation_1-auc:0.83734 [20] validation_0-auc:0.86157 validation_1-auc:0.83716 [21] validation_0-auc:0.86194 validation_1-auc:0.83656 [22] validation_0-auc:0.86311 validation_1-auc:0.83636 [23] validation_0-auc:0.86348 validation_1-auc:0.83664 [24] validation_0-auc:0.86439 validation_1-auc:0.83728 [25] validation_0-auc:0.86552 validation_1-auc:0.83601 [26] validation_0-auc:0.86573 validation_1-auc:0.83583 [27] validation_0-auc:0.86633 validation_1-auc:0.83558 [28] validation_0-auc:0.86674 validation_1-auc:0.83562 [29] validation_0-auc:0.86736 validation_1-auc:0.83573 [30] validation_0-auc:0.86746 validation_1-auc:0.83562 [31] validation_0-auc:0.86771 validation_1-auc:0.83550 [32] validation_0-auc:0.86803 validation_1-auc:0.83440 [33] validation_0-auc:0.86825 validation_1-auc:0.83457 [34] validation_0-auc:0.86835 validation_1-auc:0.83427 [35] validation_0-auc:0.86835 validation_1-auc:0.83423 [36] validation_0-auc:0.86872 
validation_1-auc:0.83430 [37] validation_0-auc:0.86910 validation_1-auc:0.83374 [38] validation_0-auc:0.86924 validation_1-auc:0.83396 [39] validation_0-auc:0.86930 validation_1-auc:0.83367 [40] validation_0-auc:0.86965 validation_1-auc:0.83269 [41] validation_0-auc:0.86998 validation_1-auc:0.83297
[0] validation_0-auc:0.80418 validation_1-auc:0.81169 [1] validation_0-auc:0.81174 validation_1-auc:0.81122 [2] validation_0-auc:0.82946 validation_1-auc:0.82272 [3] validation_0-auc:0.83495 validation_1-auc:0.82851 [4] validation_0-auc:0.83721 validation_1-auc:0.82887 [5] validation_0-auc:0.83847 validation_1-auc:0.83206 [6] validation_0-auc:0.83511 validation_1-auc:0.83097 [7] validation_0-auc:0.84116 validation_1-auc:0.83427 [8] validation_0-auc:0.84533 validation_1-auc:0.83959 [9] validation_0-auc:0.84475 validation_1-auc:0.83715 [10] validation_0-auc:0.84396 validation_1-auc:0.83646 [11] validation_0-auc:0.84838 validation_1-auc:0.83860 [12] validation_0-auc:0.85328 validation_1-auc:0.84141 [13] validation_0-auc:0.85385 validation_1-auc:0.83952 [14] validation_0-auc:0.85744 validation_1-auc:0.84166 [15] validation_0-auc:0.85825 validation_1-auc:0.84222 [16] validation_0-auc:0.85867 validation_1-auc:0.84194 [17] validation_0-auc:0.85911 validation_1-auc:0.84171 [18] validation_0-auc:0.86072 validation_1-auc:0.84333 [19] validation_0-auc:0.86168 validation_1-auc:0.84403 [20] validation_0-auc:0.86273 validation_1-auc:0.84473 [21] validation_0-auc:0.86330 validation_1-auc:0.84493 [22] validation_0-auc:0.86370 validation_1-auc:0.84488 [23] validation_0-auc:0.86444 validation_1-auc:0.84467 [24] validation_0-auc:0.86512 validation_1-auc:0.84399 [25] validation_0-auc:0.86602 validation_1-auc:0.84429 [26] validation_0-auc:0.86702 validation_1-auc:0.84386 [27] validation_0-auc:0.86718 validation_1-auc:0.84389 [28] validation_0-auc:0.86744 validation_1-auc:0.84382 [29] validation_0-auc:0.86878 validation_1-auc:0.84331 [30] validation_0-auc:0.86953 validation_1-auc:0.84300 [31] validation_0-auc:0.86968 validation_1-auc:0.84289 [32] validation_0-auc:0.87033 validation_1-auc:0.84275 [33] validation_0-auc:0.87049 validation_1-auc:0.84252 [34] validation_0-auc:0.87068 validation_1-auc:0.84259 [35] validation_0-auc:0.87119 validation_1-auc:0.84231 [36] validation_0-auc:0.87249 
validation_1-auc:0.84221 [37] validation_0-auc:0.87261 validation_1-auc:0.84213 [38] validation_0-auc:0.87335 validation_1-auc:0.84278 [39] validation_0-auc:0.87373 validation_1-auc:0.84275 [40] validation_0-auc:0.87403 validation_1-auc:0.84327 [41] validation_0-auc:0.87425 validation_1-auc:0.84291 [42] validation_0-auc:0.87434 validation_1-auc:0.84252 [43] validation_0-auc:0.87496 validation_1-auc:0.84230 [44] validation_0-auc:0.87507 validation_1-auc:0.84221 [45] validation_0-auc:0.87596 validation_1-auc:0.84185 [46] validation_0-auc:0.87641 validation_1-auc:0.84240 [47] validation_0-auc:0.87666 validation_1-auc:0.84239 [48] validation_0-auc:0.87674 validation_1-auc:0.84242 [49] validation_0-auc:0.87750 validation_1-auc:0.84146 [50] validation_0-auc:0.87773 validation_1-auc:0.84133
[0]	validation_0-auc:0.81098	validation_1-auc:0.80916
[1]	validation_0-auc:0.81358	validation_1-auc:0.81313
...
[54]	validation_0-auc:0.87878	validation_1-auc:0.83966
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or `set_params` instead.
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or `set_params` instead.
[0]	validation_0-auc:0.80902	validation_1-auc:0.80594
[1]	validation_0-auc:0.80814	validation_1-auc:0.80463
...
[42]	validation_0-auc:0.89215	validation_1-auc:0.82918
[0]	validation_0-auc:0.80941	validation_1-auc:0.80686
[1]	validation_0-auc:0.81556	validation_1-auc:0.81361
...
[44]	validation_0-auc:0.89704	validation_1-auc:0.83620
[0]	validation_0-auc:0.81702	validation_1-auc:0.81220
[1]	validation_0-auc:0.82046	validation_1-auc:0.81514
...
[43]	validation_0-auc:0.89601	validation_1-auc:0.83272
[0]	validation_0-auc:0.80956	validation_1-auc:0.80714
[1]	validation_0-auc:0.81193	validation_1-auc:0.80854
...
[49]	validation_0-auc:0.88880	validation_1-auc:0.83001
[0]	validation_0-auc:0.80761	validation_1-auc:0.81297
[1]	validation_0-auc:0.81426	validation_1-auc:0.81464
...
[50]	validation_0-auc:0.89304	validation_1-auc:0.83772
[0]	validation_0-auc:0.81604	validation_1-auc:0.81235
[1]	validation_0-auc:0.82253	validation_1-auc:0.81400
...
[45]	validation_0-auc:0.89108	validation_1-auc:0.83254
[0]	validation_0-auc:0.80593	validation_1-auc:0.80898
[1]	validation_0-auc:0.80943	validation_1-auc:0.80802
...
[40]	validation_0-auc:0.87481	validation_1-auc:0.83193
[0]	validation_0-auc:0.80863	validation_1-auc:0.80011
[1]	validation_0-auc:0.80571	validation_1-auc:0.80437
...
[59]	validation_0-auc:0.88062	validation_1-auc:0.84035
[0]	validation_0-auc:0.82051	validation_1-auc:0.81706
[1]	validation_0-auc:0.81489	validation_1-auc:0.81463
...
[51]	validation_0-auc:0.88012	validation_1-auc:0.83836
[0]	validation_0-auc:0.80803	validation_1-auc:0.80806
[1]	validation_0-auc:0.80888	validation_1-auc:0.80894
...
[43]	validation_0-auc:0.87117	validation_1-auc:0.83317
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.80906 validation_1-auc:0.80435 [1] validation_0-auc:0.80610 validation_1-auc:0.80703 [2] validation_0-auc:0.82829 validation_1-auc:0.82098 [3] validation_0-auc:0.83657 validation_1-auc:0.82669 [4] validation_0-auc:0.83911 validation_1-auc:0.82849 [5] validation_0-auc:0.84163 validation_1-auc:0.83122 [6] validation_0-auc:0.84566 validation_1-auc:0.83619 [7] validation_0-auc:0.84720 validation_1-auc:0.83653 [8] validation_0-auc:0.84844 validation_1-auc:0.83526 [9] validation_0-auc:0.84960 validation_1-auc:0.83672 [10] validation_0-auc:0.85155 validation_1-auc:0.83775 [11] validation_0-auc:0.85323 validation_1-auc:0.83779 [12] validation_0-auc:0.85457 validation_1-auc:0.83858 [13] validation_0-auc:0.85554 validation_1-auc:0.83864 [14] validation_0-auc:0.85653 validation_1-auc:0.83865 [15] validation_0-auc:0.85860 validation_1-auc:0.83784 [16] validation_0-auc:0.85972 validation_1-auc:0.83823 [17] validation_0-auc:0.86067 validation_1-auc:0.83797 [18] validation_0-auc:0.86144 validation_1-auc:0.83898 [19] validation_0-auc:0.86230 validation_1-auc:0.83909 [20] validation_0-auc:0.86313 validation_1-auc:0.83899 [21] validation_0-auc:0.86356 validation_1-auc:0.83833 [22] validation_0-auc:0.86385 validation_1-auc:0.83846 [23] validation_0-auc:0.86472 validation_1-auc:0.83755 [24] validation_0-auc:0.86502 validation_1-auc:0.83757 [25] validation_0-auc:0.86617 validation_1-auc:0.83725 [26] validation_0-auc:0.86736 validation_1-auc:0.83663 [27] validation_0-auc:0.86768 validation_1-auc:0.83683 [28] validation_0-auc:0.86785 validation_1-auc:0.83707 [29] validation_0-auc:0.86868 validation_1-auc:0.83690 [30] validation_0-auc:0.86924 validation_1-auc:0.83645 [31] validation_0-auc:0.86947 validation_1-auc:0.83698 [32] validation_0-auc:0.86992 validation_1-auc:0.83699 [33] validation_0-auc:0.87008 validation_1-auc:0.83722 [34] validation_0-auc:0.87030 validation_1-auc:0.83671 [35] validation_0-auc:0.87040 validation_1-auc:0.83635 [36] validation_0-auc:0.87083 
validation_1-auc:0.83606 [37] validation_0-auc:0.87100 validation_1-auc:0.83600 [38] validation_0-auc:0.87123 validation_1-auc:0.83581 [39] validation_0-auc:0.87191 validation_1-auc:0.83518 [40] validation_0-auc:0.87261 validation_1-auc:0.83526 [41] validation_0-auc:0.87289 validation_1-auc:0.83530 [42] validation_0-auc:0.87363 validation_1-auc:0.83549 [43] validation_0-auc:0.87382 validation_1-auc:0.83527 [44] validation_0-auc:0.87395 validation_1-auc:0.83509 [45] validation_0-auc:0.87422 validation_1-auc:0.83512 [46] validation_0-auc:0.87461 validation_1-auc:0.83508 [47] validation_0-auc:0.87471 validation_1-auc:0.83492 [48] validation_0-auc:0.87482 validation_1-auc:0.83531 [49] validation_0-auc:0.87589 validation_1-auc:0.83511
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.81882 validation_1-auc:0.81739 [1] validation_0-auc:0.81386 validation_1-auc:0.81337 [2] validation_0-auc:0.82661 validation_1-auc:0.82334 [3] validation_0-auc:0.83557 validation_1-auc:0.82907 [4] validation_0-auc:0.83892 validation_1-auc:0.83167 [5] validation_0-auc:0.84144 validation_1-auc:0.83105 [6] validation_0-auc:0.84403 validation_1-auc:0.83360 [7] validation_0-auc:0.84648 validation_1-auc:0.83417 [8] validation_0-auc:0.84841 validation_1-auc:0.83581 [9] validation_0-auc:0.84876 validation_1-auc:0.83592 [10] validation_0-auc:0.85065 validation_1-auc:0.83624 [11] validation_0-auc:0.85204 validation_1-auc:0.83676 [12] validation_0-auc:0.85449 validation_1-auc:0.83495 [13] validation_0-auc:0.85519 validation_1-auc:0.83583 [14] validation_0-auc:0.85682 validation_1-auc:0.83720 [15] validation_0-auc:0.85847 validation_1-auc:0.83743 [16] validation_0-auc:0.85930 validation_1-auc:0.83756 [17] validation_0-auc:0.86089 validation_1-auc:0.83841 [18] validation_0-auc:0.86188 validation_1-auc:0.83752 [19] validation_0-auc:0.86281 validation_1-auc:0.83698 [20] validation_0-auc:0.86303 validation_1-auc:0.83774 [21] validation_0-auc:0.86344 validation_1-auc:0.83706 [22] validation_0-auc:0.86436 validation_1-auc:0.83797 [23] validation_0-auc:0.86510 validation_1-auc:0.83779 [24] validation_0-auc:0.86599 validation_1-auc:0.83699 [25] validation_0-auc:0.86622 validation_1-auc:0.83717 [26] validation_0-auc:0.86666 validation_1-auc:0.83686 [27] validation_0-auc:0.86676 validation_1-auc:0.83677 [28] validation_0-auc:0.86725 validation_1-auc:0.83654 [29] validation_0-auc:0.86801 validation_1-auc:0.83721 [30] validation_0-auc:0.86835 validation_1-auc:0.83702 [31] validation_0-auc:0.86855 validation_1-auc:0.83695 [32] validation_0-auc:0.86923 validation_1-auc:0.83718 [33] validation_0-auc:0.86939 validation_1-auc:0.83721 [34] validation_0-auc:0.86989 validation_1-auc:0.83775 [35] validation_0-auc:0.87036 validation_1-auc:0.83726 [36] validation_0-auc:0.87093 
validation_1-auc:0.83715 [37] validation_0-auc:0.87206 validation_1-auc:0.83741 [38] validation_0-auc:0.87238 validation_1-auc:0.83684 [39] validation_0-auc:0.87238 validation_1-auc:0.83715 [40] validation_0-auc:0.87273 validation_1-auc:0.83752 [41] validation_0-auc:0.87301 validation_1-auc:0.83752 [42] validation_0-auc:0.87315 validation_1-auc:0.83757 [43] validation_0-auc:0.87336 validation_1-auc:0.83705 [44] validation_0-auc:0.87348 validation_1-auc:0.83684 [45] validation_0-auc:0.87353 validation_1-auc:0.83677 [46] validation_0-auc:0.87404 validation_1-auc:0.83655 [47] validation_0-auc:0.87446 validation_1-auc:0.83642
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.81573 validation_1-auc:0.81067 [1] validation_0-auc:0.81402 validation_1-auc:0.80774 [2] validation_0-auc:0.82912 validation_1-auc:0.81864 [3] validation_0-auc:0.83834 validation_1-auc:0.82490 [4] validation_0-auc:0.84340 validation_1-auc:0.82924 [5] validation_0-auc:0.84800 validation_1-auc:0.83252 [6] validation_0-auc:0.85152 validation_1-auc:0.83472 [7] validation_0-auc:0.85737 validation_1-auc:0.83466 [8] validation_0-auc:0.86002 validation_1-auc:0.83343 [9] validation_0-auc:0.86333 validation_1-auc:0.83595 [10] validation_0-auc:0.86764 validation_1-auc:0.83618 [11] validation_0-auc:0.87055 validation_1-auc:0.83591 [12] validation_0-auc:0.87303 validation_1-auc:0.83511 [13] validation_0-auc:0.87531 validation_1-auc:0.83573 [14] validation_0-auc:0.87644 validation_1-auc:0.83532 [15] validation_0-auc:0.87864 validation_1-auc:0.83346 [16] validation_0-auc:0.87995 validation_1-auc:0.83380 [17] validation_0-auc:0.88099 validation_1-auc:0.83336 [18] validation_0-auc:0.88257 validation_1-auc:0.83356 [19] validation_0-auc:0.88371 validation_1-auc:0.83321 [20] validation_0-auc:0.88439 validation_1-auc:0.83356 [21] validation_0-auc:0.88586 validation_1-auc:0.83317 [22] validation_0-auc:0.88709 validation_1-auc:0.83316 [23] validation_0-auc:0.88830 validation_1-auc:0.83373 [24] validation_0-auc:0.88935 validation_1-auc:0.83351 [25] validation_0-auc:0.88952 validation_1-auc:0.83307 [26] validation_0-auc:0.89008 validation_1-auc:0.83344 [27] validation_0-auc:0.89029 validation_1-auc:0.83254 [28] validation_0-auc:0.89092 validation_1-auc:0.83229 [29] validation_0-auc:0.89101 validation_1-auc:0.83245 [30] validation_0-auc:0.89101 validation_1-auc:0.83259 [31] validation_0-auc:0.89140 validation_1-auc:0.83219 [32] validation_0-auc:0.89238 validation_1-auc:0.83116 [33] validation_0-auc:0.89233 validation_1-auc:0.83080 [34] validation_0-auc:0.89242 validation_1-auc:0.83077 [35] validation_0-auc:0.89254 validation_1-auc:0.83114 [36] validation_0-auc:0.89271 
validation_1-auc:0.83110 [37] validation_0-auc:0.89290 validation_1-auc:0.83083 [38] validation_0-auc:0.89336 validation_1-auc:0.83049 [39] validation_0-auc:0.89530 validation_1-auc:0.82890
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.81428 validation_1-auc:0.80541 [1] validation_0-auc:0.81563 validation_1-auc:0.81177 [2] validation_0-auc:0.83592 validation_1-auc:0.82066 [3] validation_0-auc:0.84379 validation_1-auc:0.82740 [4] validation_0-auc:0.85017 validation_1-auc:0.83233 [5] validation_0-auc:0.85384 validation_1-auc:0.83517 [6] validation_0-auc:0.85918 validation_1-auc:0.83931 [7] validation_0-auc:0.86245 validation_1-auc:0.83779 [8] validation_0-auc:0.86419 validation_1-auc:0.83818 [9] validation_0-auc:0.86613 validation_1-auc:0.83712 [10] validation_0-auc:0.86933 validation_1-auc:0.83600 [11] validation_0-auc:0.87184 validation_1-auc:0.83711 [12] validation_0-auc:0.87415 validation_1-auc:0.83710 [13] validation_0-auc:0.87716 validation_1-auc:0.83639 [14] validation_0-auc:0.87865 validation_1-auc:0.83795 [15] validation_0-auc:0.88110 validation_1-auc:0.83841 [16] validation_0-auc:0.88358 validation_1-auc:0.83779 [17] validation_0-auc:0.88461 validation_1-auc:0.83794 [18] validation_0-auc:0.88528 validation_1-auc:0.83693 [19] validation_0-auc:0.88665 validation_1-auc:0.83657 [20] validation_0-auc:0.88726 validation_1-auc:0.83669 [21] validation_0-auc:0.88784 validation_1-auc:0.83639 [22] validation_0-auc:0.88877 validation_1-auc:0.83675 [23] validation_0-auc:0.88977 validation_1-auc:0.83683 [24] validation_0-auc:0.89106 validation_1-auc:0.83613 [25] validation_0-auc:0.89126 validation_1-auc:0.83660 [26] validation_0-auc:0.89187 validation_1-auc:0.83647 [27] validation_0-auc:0.89194 validation_1-auc:0.83668 [28] validation_0-auc:0.89241 validation_1-auc:0.83639 [29] validation_0-auc:0.89287 validation_1-auc:0.83627 [30] validation_0-auc:0.89301 validation_1-auc:0.83632 [31] validation_0-auc:0.89311 validation_1-auc:0.83618 [32] validation_0-auc:0.89323 validation_1-auc:0.83561 [33] validation_0-auc:0.89324 validation_1-auc:0.83535 [34] validation_0-auc:0.89379 validation_1-auc:0.83478 [35] validation_0-auc:0.89558 validation_1-auc:0.83419 [36] validation_0-auc:0.89575 
validation_1-auc:0.83435
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.82448 validation_1-auc:0.81823 [1] validation_0-auc:0.82212 validation_1-auc:0.81592 [2] validation_0-auc:0.83464 validation_1-auc:0.82639 [3] validation_0-auc:0.84121 validation_1-auc:0.82994 [4] validation_0-auc:0.84804 validation_1-auc:0.83439 [5] validation_0-auc:0.85362 validation_1-auc:0.83314 [6] validation_0-auc:0.85620 validation_1-auc:0.83629 [7] validation_0-auc:0.86025 validation_1-auc:0.83581 [8] validation_0-auc:0.86252 validation_1-auc:0.83743 [9] validation_0-auc:0.86649 validation_1-auc:0.83646 [10] validation_0-auc:0.87101 validation_1-auc:0.83606 [11] validation_0-auc:0.87345 validation_1-auc:0.83743 [12] validation_0-auc:0.87701 validation_1-auc:0.83809 [13] validation_0-auc:0.88044 validation_1-auc:0.83779 [14] validation_0-auc:0.88136 validation_1-auc:0.83869 [15] validation_0-auc:0.88292 validation_1-auc:0.83977 [16] validation_0-auc:0.88460 validation_1-auc:0.83955 [17] validation_0-auc:0.88560 validation_1-auc:0.83952 [18] validation_0-auc:0.88737 validation_1-auc:0.83966 [19] validation_0-auc:0.88791 validation_1-auc:0.83902 [20] validation_0-auc:0.88921 validation_1-auc:0.83812 [21] validation_0-auc:0.88990 validation_1-auc:0.83808 [22] validation_0-auc:0.89119 validation_1-auc:0.83848 [23] validation_0-auc:0.89132 validation_1-auc:0.83821 [24] validation_0-auc:0.89151 validation_1-auc:0.83836 [25] validation_0-auc:0.89218 validation_1-auc:0.83825 [26] validation_0-auc:0.89259 validation_1-auc:0.83789 [27] validation_0-auc:0.89286 validation_1-auc:0.83822 [28] validation_0-auc:0.89339 validation_1-auc:0.83884 [29] validation_0-auc:0.89368 validation_1-auc:0.83923 [30] validation_0-auc:0.89400 validation_1-auc:0.83906 [31] validation_0-auc:0.89461 validation_1-auc:0.83875 [32] validation_0-auc:0.89599 validation_1-auc:0.83829 [33] validation_0-auc:0.89637 validation_1-auc:0.83835 [34] validation_0-auc:0.89658 validation_1-auc:0.83826 [35] validation_0-auc:0.89678 validation_1-auc:0.83787 [36] validation_0-auc:0.89722 
validation_1-auc:0.83755 [37] validation_0-auc:0.89733 validation_1-auc:0.83735 [38] validation_0-auc:0.89809 validation_1-auc:0.83710 [39] validation_0-auc:0.89978 validation_1-auc:0.83646 [40] validation_0-auc:0.89994 validation_1-auc:0.83517 [41] validation_0-auc:0.90029 validation_1-auc:0.83466 [42] validation_0-auc:0.90106 validation_1-auc:0.83394 [43] validation_0-auc:0.90121 validation_1-auc:0.83358 [44] validation_0-auc:0.90189 validation_1-auc:0.83272
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.81642 validation_1-auc:0.81216 [1] validation_0-auc:0.81467 validation_1-auc:0.81133 [2] validation_0-auc:0.82902 validation_1-auc:0.82239 [3] validation_0-auc:0.83775 validation_1-auc:0.82795 [4] validation_0-auc:0.84354 validation_1-auc:0.83096 [5] validation_0-auc:0.84790 validation_1-auc:0.83441 [6] validation_0-auc:0.85080 validation_1-auc:0.83508 [7] validation_0-auc:0.85454 validation_1-auc:0.83765 [8] validation_0-auc:0.85680 validation_1-auc:0.83712 [9] validation_0-auc:0.85991 validation_1-auc:0.83639 [10] validation_0-auc:0.86257 validation_1-auc:0.83788 [11] validation_0-auc:0.86486 validation_1-auc:0.83696 [12] validation_0-auc:0.86742 validation_1-auc:0.83751 [13] validation_0-auc:0.86908 validation_1-auc:0.83692 [14] validation_0-auc:0.87015 validation_1-auc:0.83647 [15] validation_0-auc:0.87203 validation_1-auc:0.83610 [16] validation_0-auc:0.87312 validation_1-auc:0.83591 [17] validation_0-auc:0.87401 validation_1-auc:0.83522 [18] validation_0-auc:0.87530 validation_1-auc:0.83620 [19] validation_0-auc:0.87557 validation_1-auc:0.83593 [20] validation_0-auc:0.87615 validation_1-auc:0.83572 [21] validation_0-auc:0.87654 validation_1-auc:0.83491 [22] validation_0-auc:0.87686 validation_1-auc:0.83470 [23] validation_0-auc:0.87802 validation_1-auc:0.83387 [24] validation_0-auc:0.87833 validation_1-auc:0.83389 [25] validation_0-auc:0.87844 validation_1-auc:0.83418 [26] validation_0-auc:0.87823 validation_1-auc:0.83352 [27] validation_0-auc:0.87835 validation_1-auc:0.83349 [28] validation_0-auc:0.87917 validation_1-auc:0.83296 [29] validation_0-auc:0.87943 validation_1-auc:0.83221 [30] validation_0-auc:0.88058 validation_1-auc:0.83157 [31] validation_0-auc:0.88095 validation_1-auc:0.83166 [32] validation_0-auc:0.88121 validation_1-auc:0.83150 [33] validation_0-auc:0.88151 validation_1-auc:0.83233 [34] validation_0-auc:0.88173 validation_1-auc:0.83203 [35] validation_0-auc:0.88230 validation_1-auc:0.83163 [36] validation_0-auc:0.88239 
validation_1-auc:0.83148 [37] validation_0-auc:0.88300 validation_1-auc:0.83086 [38] validation_0-auc:0.88390 validation_1-auc:0.83043 [39] validation_0-auc:0.88482 validation_1-auc:0.83050
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.81195 validation_1-auc:0.80615 [1] validation_0-auc:0.81510 validation_1-auc:0.81289 [2] validation_0-auc:0.83475 validation_1-auc:0.82645 [3] validation_0-auc:0.84423 validation_1-auc:0.83140 [4] validation_0-auc:0.84952 validation_1-auc:0.83408 [5] validation_0-auc:0.85317 validation_1-auc:0.83493 [6] validation_0-auc:0.85713 validation_1-auc:0.83850 [7] validation_0-auc:0.85924 validation_1-auc:0.83793 [8] validation_0-auc:0.86016 validation_1-auc:0.83809 [9] validation_0-auc:0.86307 validation_1-auc:0.83816 [10] validation_0-auc:0.86572 validation_1-auc:0.83931 [11] validation_0-auc:0.86748 validation_1-auc:0.83981 [12] validation_0-auc:0.87020 validation_1-auc:0.83906 [13] validation_0-auc:0.87270 validation_1-auc:0.84018 [14] validation_0-auc:0.87371 validation_1-auc:0.84127 [15] validation_0-auc:0.87558 validation_1-auc:0.84117 [16] validation_0-auc:0.87630 validation_1-auc:0.84078 [17] validation_0-auc:0.87673 validation_1-auc:0.84143 [18] validation_0-auc:0.87773 validation_1-auc:0.84117 [19] validation_0-auc:0.87853 validation_1-auc:0.84073 [20] validation_0-auc:0.87944 validation_1-auc:0.84043 [21] validation_0-auc:0.88031 validation_1-auc:0.84016 [22] validation_0-auc:0.88152 validation_1-auc:0.84011 [23] validation_0-auc:0.88229 validation_1-auc:0.84037 [24] validation_0-auc:0.88302 validation_1-auc:0.84030 [25] validation_0-auc:0.88319 validation_1-auc:0.84035 [26] validation_0-auc:0.88331 validation_1-auc:0.84042 [27] validation_0-auc:0.88374 validation_1-auc:0.83981 [28] validation_0-auc:0.88437 validation_1-auc:0.83969 [29] validation_0-auc:0.88430 validation_1-auc:0.83984 [30] validation_0-auc:0.88448 validation_1-auc:0.83927 [31] validation_0-auc:0.88478 validation_1-auc:0.83854 [32] validation_0-auc:0.88527 validation_1-auc:0.83809 [33] validation_0-auc:0.88527 validation_1-auc:0.83816 [34] validation_0-auc:0.88573 validation_1-auc:0.83816 [35] validation_0-auc:0.88579 validation_1-auc:0.83781 [36] validation_0-auc:0.88615 
validation_1-auc:0.83815 [37] validation_0-auc:0.88632 validation_1-auc:0.83822 [38] validation_0-auc:0.88677 validation_1-auc:0.83768 [39] validation_0-auc:0.88687 validation_1-auc:0.83777 [40] validation_0-auc:0.88806 validation_1-auc:0.83728 [41] validation_0-auc:0.88826 validation_1-auc:0.83720 [42] validation_0-auc:0.88844 validation_1-auc:0.83717 [43] validation_0-auc:0.88903 validation_1-auc:0.83634 [44] validation_0-auc:0.88944 validation_1-auc:0.83652 [45] validation_0-auc:0.88957 validation_1-auc:0.83666 [46] validation_0-auc:0.88969 validation_1-auc:0.83630
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.82218 validation_1-auc:0.81819 [1] validation_0-auc:0.82120 validation_1-auc:0.81682 [2] validation_0-auc:0.83459 validation_1-auc:0.82497 [3] validation_0-auc:0.84312 validation_1-auc:0.82940 [4] validation_0-auc:0.84793 validation_1-auc:0.83178 [5] validation_0-auc:0.85339 validation_1-auc:0.83203 [6] validation_0-auc:0.85638 validation_1-auc:0.83512 [7] validation_0-auc:0.86082 validation_1-auc:0.83538 [8] validation_0-auc:0.86250 validation_1-auc:0.83772 [9] validation_0-auc:0.86569 validation_1-auc:0.83589 [10] validation_0-auc:0.86855 validation_1-auc:0.83342 [11] validation_0-auc:0.87103 validation_1-auc:0.83398 [12] validation_0-auc:0.87268 validation_1-auc:0.83472 [13] validation_0-auc:0.87475 validation_1-auc:0.83576 [14] validation_0-auc:0.87519 validation_1-auc:0.83642 [15] validation_0-auc:0.87568 validation_1-auc:0.83708 [16] validation_0-auc:0.87705 validation_1-auc:0.83742 [17] validation_0-auc:0.87816 validation_1-auc:0.83725 [18] validation_0-auc:0.87960 validation_1-auc:0.83748 [19] validation_0-auc:0.88066 validation_1-auc:0.83743 [20] validation_0-auc:0.88106 validation_1-auc:0.83706 [21] validation_0-auc:0.88130 validation_1-auc:0.83685 [22] validation_0-auc:0.88181 validation_1-auc:0.83701 [23] validation_0-auc:0.88268 validation_1-auc:0.83760 [24] validation_0-auc:0.88287 validation_1-auc:0.83759 [25] validation_0-auc:0.88349 validation_1-auc:0.83769 [26] validation_0-auc:0.88356 validation_1-auc:0.83738 [27] validation_0-auc:0.88422 validation_1-auc:0.83697 [28] validation_0-auc:0.88470 validation_1-auc:0.83692 [29] validation_0-auc:0.88499 validation_1-auc:0.83655 [30] validation_0-auc:0.88605 validation_1-auc:0.83731 [31] validation_0-auc:0.88616 validation_1-auc:0.83742 [32] validation_0-auc:0.88634 validation_1-auc:0.83749 [33] validation_0-auc:0.88638 validation_1-auc:0.83769 [34] validation_0-auc:0.88647 validation_1-auc:0.83766 [35] validation_0-auc:0.88742 validation_1-auc:0.83784 [36] validation_0-auc:0.88772 
validation_1-auc:0.83717 [37] validation_0-auc:0.88800 validation_1-auc:0.83703 [38] validation_0-auc:0.88804 validation_1-auc:0.83704 [39] validation_0-auc:0.88806 validation_1-auc:0.83733 [40] validation_0-auc:0.88820 validation_1-auc:0.83711 [41] validation_0-auc:0.88864 validation_1-auc:0.83711 [42] validation_0-auc:0.88884 validation_1-auc:0.83691 [43] validation_0-auc:0.88959 validation_1-auc:0.83599 [44] validation_0-auc:0.89049 validation_1-auc:0.83564 [45] validation_0-auc:0.89064 validation_1-auc:0.83583 [46] validation_0-auc:0.89090 validation_1-auc:0.83548 [47] validation_0-auc:0.89094 validation_1-auc:0.83519 [48] validation_0-auc:0.89103 validation_1-auc:0.83547 [49] validation_0-auc:0.89242 validation_1-auc:0.83543 [50] validation_0-auc:0.89263 validation_1-auc:0.83518 [51] validation_0-auc:0.89269 validation_1-auc:0.83503 [52] validation_0-auc:0.89305 validation_1-auc:0.83429 [53] validation_0-auc:0.89313 validation_1-auc:0.83395 [54] validation_0-auc:0.89387 validation_1-auc:0.83441 [55] validation_0-auc:0.89434 validation_1-auc:0.83376 [56] validation_0-auc:0.89461 validation_1-auc:0.83378 [57] validation_0-auc:0.89464 validation_1-auc:0.83355 [58] validation_0-auc:0.89499 validation_1-auc:0.83349 [59] validation_0-auc:0.89506 validation_1-auc:0.83337 [60] validation_0-auc:0.89596 validation_1-auc:0.83275 [61] validation_0-auc:0.89625 validation_1-auc:0.83229 [62] validation_0-auc:0.89658 validation_1-auc:0.83240 [63] validation_0-auc:0.89643 validation_1-auc:0.83233 [64] validation_0-auc:0.89645 validation_1-auc:0.83215
/usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or`set_params` instead. warnings.warn( /usr/local/lib/python3.10/dist-packages/xgboost/sklearn.py:835: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or`set_params` instead. warnings.warn(
[0] validation_0-auc:0.81270 validation_1-auc:0.81076 [1] validation_0-auc:0.81820 validation_1-auc:0.81360 [2] validation_0-auc:0.83353 validation_1-auc:0.83010 [3] validation_0-auc:0.83823 validation_1-auc:0.83429 [4] validation_0-auc:0.84125 validation_1-auc:0.83363 [5] validation_0-auc:0.83979 validation_1-auc:0.83299 [6] validation_0-auc:0.83796 validation_1-auc:0.83148 [7] validation_0-auc:0.84384 validation_1-auc:0.83526 [8] validation_0-auc:0.84676 validation_1-auc:0.83677 [9] validation_0-auc:0.84731 validation_1-auc:0.83538 [10] validation_0-auc:0.84726 validation_1-auc:0.83485 [11] validation_0-auc:0.85253 validation_1-auc:0.84007 [12] validation_0-auc:0.85678 validation_1-auc:0.84255 [13] validation_0-auc:0.85821 validation_1-auc:0.84121 [14] validation_0-auc:0.86100 validation_1-auc:0.84394 [15] validation_0-auc:0.86250 validation_1-auc:0.84436 [16] validation_0-auc:0.86371 validation_1-auc:0.84416 [17] validation_0-auc:0.86531 validation_1-auc:0.84329 [18] validation_0-auc:0.86661 validation_1-auc:0.84514 [19] validation_0-auc:0.86843 validation_1-auc:0.84475 [20] validation_0-auc:0.86982 validation_1-auc:0.84504 [21] validation_0-auc:0.87086 validation_1-auc:0.84455 [22] validation_0-auc:0.87217 validation_1-auc:0.84348 [23] validation_0-auc:0.87245 validation_1-auc:0.84353 [24] validation_0-auc:0.87288 validation_1-auc:0.84332 [25] validation_0-auc:0.87357 validation_1-auc:0.84280 [26] validation_0-auc:0.87415 validation_1-auc:0.84256 [27] validation_0-auc:0.87457 validation_1-auc:0.84236 [28] validation_0-auc:0.87480 validation_1-auc:0.84238 [29] validation_0-auc:0.87526 validation_1-auc:0.84256 [30] validation_0-auc:0.87542 validation_1-auc:0.84259 [31] validation_0-auc:0.87579 validation_1-auc:0.84232 [32] validation_0-auc:0.87651 validation_1-auc:0.84209 [33] validation_0-auc:0.87710 validation_1-auc:0.84197 [34] validation_0-auc:0.87786 validation_1-auc:0.84234 [35] validation_0-auc:0.87852 validation_1-auc:0.84190 [36] validation_0-auc:0.87900 
validation_1-auc:0.84162 [37] validation_0-auc:0.87939 validation_1-auc:0.84191 [38] validation_0-auc:0.87975 validation_1-auc:0.84166 [39] validation_0-auc:0.88029 validation_1-auc:0.84169 [40] validation_0-auc:0.88142 validation_1-auc:0.84133 [41] validation_0-auc:0.88153 validation_1-auc:0.84140 [42] validation_0-auc:0.88231 validation_1-auc:0.84144 [43] validation_0-auc:0.88254 validation_1-auc:0.84134 [44] validation_0-auc:0.88291 validation_1-auc:0.84132 [45] validation_0-auc:0.88334 validation_1-auc:0.84159 [46] validation_0-auc:0.88343 validation_1-auc:0.84149 [47] validation_0-auc:0.88391 validation_1-auc:0.84161 GridSearchCV 최적 파라미터: {'colsample_bytree': 0.5, 'max_depth': 5, 'min_child_weight': 3} ROC AUC: 0.8451
- Performance is about 0.8451 ROC-AUC.
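In the fold logs above, training halts shortly after the validation AUC peaks: with `early_stopping_rounds`, boosting stops once the validation metric has not improved for that many consecutive rounds, and the best iteration is kept. A minimal sketch of that rule in plain Python, using a made-up toy score curve (illustrative only, not XGBoost's actual implementation):

```python
def early_stop_iteration(val_scores, rounds):
    """Return (best_iteration, best_score) under an early-stopping rule:
    stop once the score has not improved for `rounds` consecutive iterations."""
    best_score = float("-inf")
    best_iter = 0
    for i, score in enumerate(val_scores):
        if score > best_score:           # AUC: higher is better
            best_score, best_iter = score, i
        elif i - best_iter >= rounds:    # no improvement for `rounds` iterations
            break
    return best_iter, best_score

# Toy validation-AUC curve: improves, then plateaus and degrades.
scores = [0.80, 0.82, 0.83, 0.835, 0.834, 0.833, 0.832, 0.831]
best_iter, best = early_stop_iteration(scores, rounds=3)
print(best_iter, best)  # → 3 0.835
```

With a patience of 3, training stops at iteration 6 but reports iteration 3 (score 0.835) as the best, mirroring how the logs above keep running a while past the validation-AUC peak before stopping.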
### Hyperparameter tuning (2)
# Increase n_estimators to 1000, lower learning_rate to 0.02, and add reg_alpha = 0.03
xgb_clf = XGBClassifier(n_estimators = 1000, random_state = 156,
                        learning_rate = 0.02, max_depth = 7,
                        min_child_weight = 1, colsample_bytree = 0.75, reg_alpha = 0.03)

# Train with AUC as the evaluation metric and early stopping after 200 rounds
xgb_clf.fit(X_train, y_train, early_stopping_rounds = 200,
            eval_metric = "auc", eval_set = [(X_train, y_train), (X_test, y_test)])

# Predict & evaluate (average = 'macro' is ignored for binary targets, so it is omitted)
xgb_roc_score = roc_auc_score(y_test, xgb_clf.predict_proba(X_test)[:, 1])
print('ROC AUC: {0:.4f}'.format(xgb_roc_score))
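As a reminder of what `roc_auc_score` is measuring here: for binary labels, ROC-AUC equals the probability that a randomly chosen positive example receives a higher predicted score than a randomly chosen negative one (ties counting as one half). A minimal pure-Python sketch of that pairwise definition (an O(n²) illustration, not sklearn's implementation):

```python
def roc_auc(y_true, y_score):
    """Pairwise ROC-AUC for binary labels: fraction of (positive, negative)
    pairs where the positive example scores higher (ties count 0.5)."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

This is also why AUC suits this imbalanced dataset: it depends only on the ranking of positives versus negatives, not on any classification threshold or on the class ratio.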
[0]   validation_0-auc:0.82311  validation_1-auc:0.81793
[1]   validation_0-auc:0.82585  validation_1-auc:0.81766
[2]   validation_0-auc:0.82950  validation_1-auc:0.81846
...
[142] validation_0-auc:0.87861  validation_1-auc:0.84288
[143] validation_0-auc:0.87879  validation_1-auc:0.84282
[144] 
validation_0-auc:0.87896 validation_1-auc:0.84296 [145] validation_0-auc:0.87911 validation_1-auc:0.84314 [146] validation_0-auc:0.87940 validation_1-auc:0.84316 [147] validation_0-auc:0.87956 validation_1-auc:0.84323 [148] validation_0-auc:0.87977 validation_1-auc:0.84337 [149] validation_0-auc:0.88011 validation_1-auc:0.84347 [150] validation_0-auc:0.88016 validation_1-auc:0.84351 [151] validation_0-auc:0.88036 validation_1-auc:0.84368 [152] validation_0-auc:0.88043 validation_1-auc:0.84369 [153] validation_0-auc:0.88086 validation_1-auc:0.84372 [154] validation_0-auc:0.88116 validation_1-auc:0.84389 [155] validation_0-auc:0.88139 validation_1-auc:0.84393 [156] validation_0-auc:0.88165 validation_1-auc:0.84387 [157] validation_0-auc:0.88188 validation_1-auc:0.84389 [158] validation_0-auc:0.88209 validation_1-auc:0.84390 [159] validation_0-auc:0.88219 validation_1-auc:0.84396 [160] validation_0-auc:0.88243 validation_1-auc:0.84418 [161] validation_0-auc:0.88281 validation_1-auc:0.84436 [162] validation_0-auc:0.88284 validation_1-auc:0.84432 [163] validation_0-auc:0.88293 validation_1-auc:0.84461 [164] validation_0-auc:0.88323 validation_1-auc:0.84463 [165] validation_0-auc:0.88352 validation_1-auc:0.84467 [166] validation_0-auc:0.88386 validation_1-auc:0.84495 [167] validation_0-auc:0.88421 validation_1-auc:0.84493 [168] validation_0-auc:0.88450 validation_1-auc:0.84497 [169] validation_0-auc:0.88477 validation_1-auc:0.84497 [170] validation_0-auc:0.88510 validation_1-auc:0.84503 [171] validation_0-auc:0.88526 validation_1-auc:0.84505 [172] validation_0-auc:0.88553 validation_1-auc:0.84502 [173] validation_0-auc:0.88575 validation_1-auc:0.84502 [174] validation_0-auc:0.88598 validation_1-auc:0.84505 [175] validation_0-auc:0.88622 validation_1-auc:0.84508 [176] validation_0-auc:0.88635 validation_1-auc:0.84517 [177] validation_0-auc:0.88663 validation_1-auc:0.84514 [178] validation_0-auc:0.88676 validation_1-auc:0.84510 [179] validation_0-auc:0.88695 
validation_1-auc:0.84513 [180] validation_0-auc:0.88711 validation_1-auc:0.84525 [181] validation_0-auc:0.88735 validation_1-auc:0.84534 [182] validation_0-auc:0.88771 validation_1-auc:0.84527 [183] validation_0-auc:0.88804 validation_1-auc:0.84529 [184] validation_0-auc:0.88842 validation_1-auc:0.84538 [185] validation_0-auc:0.88862 validation_1-auc:0.84526 [186] validation_0-auc:0.88875 validation_1-auc:0.84521 [187] validation_0-auc:0.88899 validation_1-auc:0.84531 [188] validation_0-auc:0.88918 validation_1-auc:0.84538 [189] validation_0-auc:0.88950 validation_1-auc:0.84530 [190] validation_0-auc:0.88973 validation_1-auc:0.84513 [191] validation_0-auc:0.88990 validation_1-auc:0.84525 [192] validation_0-auc:0.89009 validation_1-auc:0.84528 [193] validation_0-auc:0.89033 validation_1-auc:0.84525 [194] validation_0-auc:0.89049 validation_1-auc:0.84522 [195] validation_0-auc:0.89063 validation_1-auc:0.84529 [196] validation_0-auc:0.89082 validation_1-auc:0.84541 [197] validation_0-auc:0.89096 validation_1-auc:0.84531 [198] validation_0-auc:0.89123 validation_1-auc:0.84544 [199] validation_0-auc:0.89135 validation_1-auc:0.84557 [200] validation_0-auc:0.89146 validation_1-auc:0.84557 [201] validation_0-auc:0.89176 validation_1-auc:0.84560 [202] validation_0-auc:0.89197 validation_1-auc:0.84568 [203] validation_0-auc:0.89227 validation_1-auc:0.84576 [204] validation_0-auc:0.89253 validation_1-auc:0.84571 [205] validation_0-auc:0.89269 validation_1-auc:0.84576 [206] validation_0-auc:0.89288 validation_1-auc:0.84578 [207] validation_0-auc:0.89307 validation_1-auc:0.84584 [208] validation_0-auc:0.89325 validation_1-auc:0.84590 [209] validation_0-auc:0.89344 validation_1-auc:0.84598 [210] validation_0-auc:0.89364 validation_1-auc:0.84598 [211] validation_0-auc:0.89381 validation_1-auc:0.84591 [212] validation_0-auc:0.89400 validation_1-auc:0.84591 [213] validation_0-auc:0.89420 validation_1-auc:0.84590 [214] validation_0-auc:0.89443 validation_1-auc:0.84589 [215] 
validation_0-auc:0.89456 validation_1-auc:0.84585 [216] validation_0-auc:0.89472 validation_1-auc:0.84582 [217] validation_0-auc:0.89490 validation_1-auc:0.84577 [218] validation_0-auc:0.89510 validation_1-auc:0.84579 [219] validation_0-auc:0.89534 validation_1-auc:0.84582 [220] validation_0-auc:0.89549 validation_1-auc:0.84580 [221] validation_0-auc:0.89570 validation_1-auc:0.84588 [222] validation_0-auc:0.89597 validation_1-auc:0.84581 [223] validation_0-auc:0.89619 validation_1-auc:0.84583 [224] validation_0-auc:0.89631 validation_1-auc:0.84582 [225] validation_0-auc:0.89648 validation_1-auc:0.84574 [226] validation_0-auc:0.89665 validation_1-auc:0.84569 [227] validation_0-auc:0.89684 validation_1-auc:0.84571 [228] validation_0-auc:0.89707 validation_1-auc:0.84562 [229] validation_0-auc:0.89720 validation_1-auc:0.84559 [230] validation_0-auc:0.89739 validation_1-auc:0.84546 [231] validation_0-auc:0.89756 validation_1-auc:0.84540 [232] validation_0-auc:0.89779 validation_1-auc:0.84534 [233] validation_0-auc:0.89790 validation_1-auc:0.84528 [234] validation_0-auc:0.89806 validation_1-auc:0.84528 [235] validation_0-auc:0.89819 validation_1-auc:0.84530 [236] validation_0-auc:0.89832 validation_1-auc:0.84528 [237] validation_0-auc:0.89841 validation_1-auc:0.84526 [238] validation_0-auc:0.89856 validation_1-auc:0.84523 [239] validation_0-auc:0.89873 validation_1-auc:0.84516 [240] validation_0-auc:0.89896 validation_1-auc:0.84512 [241] validation_0-auc:0.89912 validation_1-auc:0.84505 [242] validation_0-auc:0.89926 validation_1-auc:0.84501 [243] validation_0-auc:0.89943 validation_1-auc:0.84494 [244] validation_0-auc:0.89957 validation_1-auc:0.84486 [245] validation_0-auc:0.89969 validation_1-auc:0.84489 [246] validation_0-auc:0.89984 validation_1-auc:0.84491 [247] validation_0-auc:0.89997 validation_1-auc:0.84499 [248] validation_0-auc:0.90013 validation_1-auc:0.84494 [249] validation_0-auc:0.90024 validation_1-auc:0.84497 [250] validation_0-auc:0.90037 
validation_1-auc:0.84501 [251] validation_0-auc:0.90046 validation_1-auc:0.84497 [252] validation_0-auc:0.90059 validation_1-auc:0.84494 [253] validation_0-auc:0.90065 validation_1-auc:0.84493 [254] validation_0-auc:0.90076 validation_1-auc:0.84495 [255] validation_0-auc:0.90093 validation_1-auc:0.84500 [256] validation_0-auc:0.90097 validation_1-auc:0.84491 [257] validation_0-auc:0.90107 validation_1-auc:0.84487 [258] validation_0-auc:0.90115 validation_1-auc:0.84484 [259] validation_0-auc:0.90129 validation_1-auc:0.84478 [260] validation_0-auc:0.90133 validation_1-auc:0.84473 [261] validation_0-auc:0.90144 validation_1-auc:0.84468 [262] validation_0-auc:0.90160 validation_1-auc:0.84472 [263] validation_0-auc:0.90177 validation_1-auc:0.84465 [264] validation_0-auc:0.90188 validation_1-auc:0.84466 [265] validation_0-auc:0.90210 validation_1-auc:0.84468 [266] validation_0-auc:0.90222 validation_1-auc:0.84467 [267] validation_0-auc:0.90226 validation_1-auc:0.84469 [268] validation_0-auc:0.90241 validation_1-auc:0.84464 [269] validation_0-auc:0.90245 validation_1-auc:0.84456 [270] validation_0-auc:0.90258 validation_1-auc:0.84463 [271] validation_0-auc:0.90268 validation_1-auc:0.84464 [272] validation_0-auc:0.90274 validation_1-auc:0.84459 [273] validation_0-auc:0.90286 validation_1-auc:0.84461 [274] validation_0-auc:0.90296 validation_1-auc:0.84468 [275] validation_0-auc:0.90305 validation_1-auc:0.84475 [276] validation_0-auc:0.90315 validation_1-auc:0.84471 [277] validation_0-auc:0.90325 validation_1-auc:0.84471 [278] validation_0-auc:0.90339 validation_1-auc:0.84465 [279] validation_0-auc:0.90351 validation_1-auc:0.84459 [280] validation_0-auc:0.90359 validation_1-auc:0.84464 [281] validation_0-auc:0.90366 validation_1-auc:0.84465 [282] validation_0-auc:0.90373 validation_1-auc:0.84465 [283] validation_0-auc:0.90384 validation_1-auc:0.84461 [284] validation_0-auc:0.90388 validation_1-auc:0.84460 [285] validation_0-auc:0.90397 validation_1-auc:0.84455 [286] 
validation_0-auc:0.90416 validation_1-auc:0.84450 [287] validation_0-auc:0.90421 validation_1-auc:0.84450 [288] validation_0-auc:0.90425 validation_1-auc:0.84454 [289] validation_0-auc:0.90431 validation_1-auc:0.84450 [290] validation_0-auc:0.90440 validation_1-auc:0.84443 [291] validation_0-auc:0.90457 validation_1-auc:0.84440 [292] validation_0-auc:0.90476 validation_1-auc:0.84436 [293] validation_0-auc:0.90492 validation_1-auc:0.84442 [294] validation_0-auc:0.90506 validation_1-auc:0.84443 [295] validation_0-auc:0.90516 validation_1-auc:0.84450 [296] validation_0-auc:0.90523 validation_1-auc:0.84449 [297] validation_0-auc:0.90542 validation_1-auc:0.84444 [298] validation_0-auc:0.90550 validation_1-auc:0.84440 [299] validation_0-auc:0.90555 validation_1-auc:0.84439 [300] validation_0-auc:0.90562 validation_1-auc:0.84433 [301] validation_0-auc:0.90564 validation_1-auc:0.84436 [302] validation_0-auc:0.90581 validation_1-auc:0.84440 [303] validation_0-auc:0.90588 validation_1-auc:0.84434 [304] validation_0-auc:0.90602 validation_1-auc:0.84429 [305] validation_0-auc:0.90623 validation_1-auc:0.84427 [306] validation_0-auc:0.90629 validation_1-auc:0.84428 [307] validation_0-auc:0.90639 validation_1-auc:0.84432 [308] validation_0-auc:0.90646 validation_1-auc:0.84432 [309] validation_0-auc:0.90651 validation_1-auc:0.84438 [310] validation_0-auc:0.90658 validation_1-auc:0.84431 [311] validation_0-auc:0.90664 validation_1-auc:0.84432 [312] validation_0-auc:0.90678 validation_1-auc:0.84433 [313] validation_0-auc:0.90684 validation_1-auc:0.84428 [314] validation_0-auc:0.90689 validation_1-auc:0.84438 [315] validation_0-auc:0.90694 validation_1-auc:0.84442 [316] validation_0-auc:0.90702 validation_1-auc:0.84433 [317] validation_0-auc:0.90708 validation_1-auc:0.84431 [318] validation_0-auc:0.90713 validation_1-auc:0.84428 [319] validation_0-auc:0.90719 validation_1-auc:0.84424 [320] validation_0-auc:0.90725 validation_1-auc:0.84423 [321] validation_0-auc:0.90732 
validation_1-auc:0.84424 [322] validation_0-auc:0.90737 validation_1-auc:0.84423 [323] validation_0-auc:0.90740 validation_1-auc:0.84429 [324] validation_0-auc:0.90752 validation_1-auc:0.84426 [325] validation_0-auc:0.90755 validation_1-auc:0.84429 [326] validation_0-auc:0.90759 validation_1-auc:0.84429 [327] validation_0-auc:0.90774 validation_1-auc:0.84434 [328] validation_0-auc:0.90791 validation_1-auc:0.84434 [329] validation_0-auc:0.90800 validation_1-auc:0.84430 [330] validation_0-auc:0.90802 validation_1-auc:0.84432 [331] validation_0-auc:0.90809 validation_1-auc:0.84429 [332] validation_0-auc:0.90819 validation_1-auc:0.84426 [333] validation_0-auc:0.90823 validation_1-auc:0.84427 [334] validation_0-auc:0.90825 validation_1-auc:0.84428 [335] validation_0-auc:0.90841 validation_1-auc:0.84425 [336] validation_0-auc:0.90848 validation_1-auc:0.84424 [337] validation_0-auc:0.90858 validation_1-auc:0.84417 [338] validation_0-auc:0.90870 validation_1-auc:0.84422 [339] validation_0-auc:0.90882 validation_1-auc:0.84424 [340] validation_0-auc:0.90899 validation_1-auc:0.84431 [341] validation_0-auc:0.90905 validation_1-auc:0.84430 [342] validation_0-auc:0.90907 validation_1-auc:0.84432 [343] validation_0-auc:0.90919 validation_1-auc:0.84439 [344] validation_0-auc:0.90926 validation_1-auc:0.84442 [345] validation_0-auc:0.90928 validation_1-auc:0.84443 [346] validation_0-auc:0.90943 validation_1-auc:0.84443 [347] validation_0-auc:0.90947 validation_1-auc:0.84442 [348] validation_0-auc:0.90955 validation_1-auc:0.84441 [349] validation_0-auc:0.90957 validation_1-auc:0.84443 [350] validation_0-auc:0.90962 validation_1-auc:0.84442 [351] validation_0-auc:0.90977 validation_1-auc:0.84441 [352] validation_0-auc:0.90982 validation_1-auc:0.84439 [353] validation_0-auc:0.90995 validation_1-auc:0.84440 [354] validation_0-auc:0.90998 validation_1-auc:0.84440 [355] validation_0-auc:0.91000 validation_1-auc:0.84436 [356] validation_0-auc:0.91002 validation_1-auc:0.84436 [357] 
validation_0-auc:0.91009 validation_1-auc:0.84436 [358] validation_0-auc:0.91021 validation_1-auc:0.84434 [359] validation_0-auc:0.91025 validation_1-auc:0.84433 [360] validation_0-auc:0.91035 validation_1-auc:0.84437 [361] validation_0-auc:0.91045 validation_1-auc:0.84431 [362] validation_0-auc:0.91048 validation_1-auc:0.84435 [363] validation_0-auc:0.91064 validation_1-auc:0.84437 [364] validation_0-auc:0.91065 validation_1-auc:0.84436 [365] validation_0-auc:0.91076 validation_1-auc:0.84433 [366] validation_0-auc:0.91087 validation_1-auc:0.84428 [367] validation_0-auc:0.91091 validation_1-auc:0.84427 [368] validation_0-auc:0.91094 validation_1-auc:0.84428 [369] validation_0-auc:0.91095 validation_1-auc:0.84427 [370] validation_0-auc:0.91097 validation_1-auc:0.84430 [371] validation_0-auc:0.91103 validation_1-auc:0.84433 [372] validation_0-auc:0.91112 validation_1-auc:0.84426 [373] validation_0-auc:0.91125 validation_1-auc:0.84426 [374] validation_0-auc:0.91138 validation_1-auc:0.84424 [375] validation_0-auc:0.91141 validation_1-auc:0.84422 [376] validation_0-auc:0.91145 validation_1-auc:0.84418 [377] validation_0-auc:0.91156 validation_1-auc:0.84416 [378] validation_0-auc:0.91166 validation_1-auc:0.84420 [379] validation_0-auc:0.91170 validation_1-auc:0.84417 [380] validation_0-auc:0.91181 validation_1-auc:0.84414 [381] validation_0-auc:0.91184 validation_1-auc:0.84416 [382] validation_0-auc:0.91188 validation_1-auc:0.84414 [383] validation_0-auc:0.91200 validation_1-auc:0.84408 [384] validation_0-auc:0.91207 validation_1-auc:0.84407 [385] validation_0-auc:0.91218 validation_1-auc:0.84403 [386] validation_0-auc:0.91221 validation_1-auc:0.84401 [387] validation_0-auc:0.91223 validation_1-auc:0.84400 [388] validation_0-auc:0.91229 validation_1-auc:0.84398 [389] validation_0-auc:0.91233 validation_1-auc:0.84402 [390] validation_0-auc:0.91236 validation_1-auc:0.84405 [391] validation_0-auc:0.91239 validation_1-auc:0.84404 [392] validation_0-auc:0.91243 
validation_1-auc:0.84404 [393] validation_0-auc:0.91246 validation_1-auc:0.84405 [394] validation_0-auc:0.91250 validation_1-auc:0.84404 [395] validation_0-auc:0.91250 validation_1-auc:0.84402 [396] validation_0-auc:0.91257 validation_1-auc:0.84401 [397] validation_0-auc:0.91260 validation_1-auc:0.84400 [398] validation_0-auc:0.91261 validation_1-auc:0.84399 [399] validation_0-auc:0.91269 validation_1-auc:0.84398 [400] validation_0-auc:0.91269 validation_1-auc:0.84398 [401] validation_0-auc:0.91273 validation_1-auc:0.84396 [402] validation_0-auc:0.91276 validation_1-auc:0.84394 [403] validation_0-auc:0.91276 validation_1-auc:0.84396 [404] validation_0-auc:0.91280 validation_1-auc:0.84394 [405] validation_0-auc:0.91289 validation_1-auc:0.84396 [406] validation_0-auc:0.91290 validation_1-auc:0.84396 [407] validation_0-auc:0.91292 validation_1-auc:0.84395 [408] validation_0-auc:0.91293 validation_1-auc:0.84396 [409] validation_0-auc:0.91295 validation_1-auc:0.84395 [410] validation_0-auc:0.91296 validation_1-auc:0.84398 ROC AUC: 0.8460
- ROC-AUC가 0.8460으로 이전보다 성능이 더 향상됨
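ROC-AUC is computed from the positive-class probability, not the hard 0/1 prediction, which is why `roc_auc_score` is fed `predict_proba(X_test)[:, 1]` throughout this notebook. A minimal self-contained sketch on made-up toy labels and scores (illustration only, not the Santander data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Toy labels and predicted probabilities for class 1 (illustration only)
y_true = np.array([0, 0, 1, 1, 0, 1])
proba_pos = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9])

# AUC only depends on the ranking of the scores: it equals the probability
# that a randomly chosen positive is scored above a randomly chosen negative
auc = roc_auc_score(y_true, proba_pos)
print(round(auc, 4))  # → 0.8889 (8 of the 9 positive/negative pairs are ordered correctly)
```

Because AUC is rank-based, it stays informative even with the heavy class imbalance of this dataset, where plain accuracy would look deceptively high.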
### 피처 중요도 시각화
from xgboost import plot_importance
import matplotlib.pyplot as plt
%matplotlib inline
fig, ax = plt.subplots(1, 1, figsize = (10,8))
plot_importance(xgb_clf, ax = ax , max_num_features = 20, height = 0.4)
<Axes: title={'center': 'Feature importance'}, xlabel='F score', ylabel='Features'>
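Beyond the plot, the raw importance scores can be read straight off the fitted model. A minimal sketch using sklearn's `GradientBoostingClassifier` on synthetic data as a stand-in for the fitted `xgb_clf` (both expose the same sklearn-style `feature_importances_` attribute; the data and `var_i` names here are purely illustrative):

```python
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for the Santander features (illustration only)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
feature_names = [f'var_{i}' for i in range(X.shape[1])]

clf = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)

# Same attribute XGBClassifier exposes; sort to recover a top-N ranking
importances = pd.Series(clf.feature_importances_, index=feature_names)
top20 = importances.sort_values(ascending=False).head(20)
print(top20.head())
```

Note that `plot_importance` draws split-count F scores by default, so its ranking can differ slightly from the gain-based `feature_importances_` ranking.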
2-2. LightGBM
a) 기본 모델 생성/학습/예측
from lightgbm import LGBMClassifier, early_stopping, log_evaluation

### 모델 객체 생성
lgbm_clf = LGBMClassifier(n_estimators = 500)
evals = [(X_test, y_test)]

### 학습
# 'early_stopping_rounds'와 'verbose' 인자는 deprecated -> callbacks 인자로 전달
lgbm_clf.fit(X_train, y_train, eval_metric = "auc", eval_set = evals,
             callbacks = [early_stopping(stopping_rounds = 100), log_evaluation(period = 1)])
### 평가
lgbm_roc_score = roc_auc_score(y_test, lgbm_clf.predict_proba(X_test)[:,1],average = 'macro')
print('ROC AUC: {0:.4f}'.format(lgbm_roc_score))
[1]	valid_0's auc: 0.817384	valid_0's binary_logloss: 0.165046
[2]	valid_0's auc: 0.818903	valid_0's binary_logloss: 0.160006
...
[15]	valid_0's auc: 0.840928	valid_0's binary_logloss: 0.14161
...
[114]	valid_0's auc: 0.836783	valid_0's binary_logloss: 0.14055
[115]	valid_0's auc: 0.836672	valid_0's binary_logloss: 0.140585
ROC AUC: 0.8409
- ROC AUC: 0.8409
- LightGBM이 XGBoost에 비해 학습에 걸리는 시간이 좀 더 단축됨
b) Hyper Parameter Tuning
- GridSearchCV 활용
- 튜닝 대상: num_leaves, max_depth, min_child_samples, subsample
from sklearn.model_selection import GridSearchCV
from lightgbm import early_stopping

# 하이퍼 파라미터 테스트의 수행 속도를 향상시키기 위해 n_estimators를 200으로 감소
lgbm_clf = LGBMClassifier(n_estimators = 200)

# tuning 할 파라미터 목록
params = {'num_leaves': [32, 64],
          'max_depth': [128, 160],
          'min_child_samples': [60, 100],
          'subsample': [0.8, 1]}

gridcv = GridSearchCV(lgbm_clf, param_grid = params, cv = 3)

# 'early_stopping_rounds' 인자는 deprecated -> early_stopping() 콜백으로 전달
gridcv.fit(X_train, y_train, eval_metric = "auc",
           eval_set = [(X_train, y_train), (X_test, y_test)],
           callbacks = [early_stopping(stopping_rounds = 30)])
print('GridSearchCV 최적 파라미터:', gridcv.best_params_)
lgbm_roc_score = roc_auc_score(y_test, gridcv.predict_proba(X_test)[:,1], average = 'macro')
print('ROC AUC: {0:.4f}'.format(lgbm_roc_score))
[1]	valid_0's auc: 0.820235	valid_0's binary_logloss: 0.156085	valid_1's auc: 0.81613	valid_1's binary_logloss: 0.164992
[2]	valid_0's auc: 0.825778	valid_0's binary_logloss: 0.150951	valid_1's auc: 0.821835	valid_1's binary_logloss: 0.159874
[3]	valid_0's auc: 0.832262	valid_0's binary_logloss: 0.147158	valid_1's auc: 0.826533	valid_1's binary_logloss: 0.156346
...
[7]	valid_0's auc: 0.847144	valid_0's binary_logloss: 0.13794	valid_1's auc: 0.837965	valid_1's binary_logloss: 0.147853
...
[37]	valid_0's auc: 0.878762	valid_0's binary_logloss: 0.122514	valid_1's auc: 0.832581	valid_1's binary_logloss: 0.140533
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1]	valid_0's auc: 0.814371	valid_0's binary_logloss: 0.156452	valid_1's auc: 0.813175	valid_1's binary_logloss: 0.165418
[2]	valid_0's auc: 0.827277	valid_0's binary_logloss: 0.151084	valid_1's auc: 0.819635	valid_1's binary_logloss: 0.160159
[3]	valid_0's auc: 0.837033	valid_0's binary_logloss: 0.14722	valid_1's auc: 0.828221	valid_1's binary_logloss: 0.156492
...
[11]	valid_0's auc: 0.855647	valid_0's binary_logloss: 0.133227	valid_1's auc: 0.840035	valid_1's binary_logloss: 0.143552
...
[41]	valid_0's auc: 0.881919	valid_0's binary_logloss: 0.121223	valid_1's auc: 0.838567	valid_1's binary_logloss: 0.13926
[1]	valid_0's auc: 0.821645	valid_0's binary_logloss: 0.156528	valid_1's auc: 0.81857	valid_1's binary_logloss: 0.165101
[2]	valid_0's auc: 0.827488	valid_0's binary_logloss: 0.151189	valid_1's auc: 0.822299	valid_1's binary_logloss: 0.160072
[3]	valid_0's auc: 0.837855	valid_0's binary_logloss: 0.147263	valid_1's auc: 0.829855	valid_1's binary_logloss: 0.156527
...
[28]	valid_0's auc: 0.87235	valid_0's binary_logloss: 0.12484	valid_1's auc: 0.840114	valid_1's binary_logloss: 0.139236
...
[58]	valid_0's auc: 0.889659	valid_0's binary_logloss: 0.11819	valid_1's auc: 0.837789	valid_1's binary_logloss: 0.139431
[1]	valid_0's auc: 0.832891	valid_0's binary_logloss: 0.155302	valid_1's auc: 0.818851	valid_1's binary_logloss: 0.164826
...
[9]	valid_0's auc: 0.863391	valid_0's binary_logloss: 0.132468	valid_1's auc: 0.835623	valid_1's binary_logloss: 0.145549
...
[39]	valid_0's auc: 0.895738	valid_0's binary_logloss: 0.115415	valid_1's auc: 0.82695	valid_1's binary_logloss: 0.142162
[1]	valid_0's auc: 0.833054	valid_0's binary_logloss: 0.15572	valid_1's auc: 0.817048	valid_1's binary_logloss: 0.165036
...
[10]	valid_0's auc: 0.867693	valid_0's binary_logloss: 0.131066	valid_1's auc: 0.837266	valid_1's binary_logloss: 0.143895
...
[40]	valid_0's auc: 0.89974	valid_0's binary_logloss: 0.11432	valid_1's auc: 0.836096	valid_1's binary_logloss: 0.140042
[1]	valid_0's auc: 0.830643	valid_0's binary_logloss: 0.155759	valid_1's auc: 0.816734	valid_1's binary_logloss: 0.164985
...
[25]	valid_0's auc: 0.885234	valid_0's binary_logloss: 0.120268	valid_1's auc: 0.836722	valid_1's binary_logloss: 0.140403
...
[55]	valid_0's auc: 0.905665	valid_0's binary_logloss: 0.110375	valid_1's auc: 0.833739	valid_1's binary_logloss: 0.141413
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.824873 valid_0's binary_logloss: 0.156222 valid_1's auc: 0.817791 valid_1's binary_logloss: 0.165072
... [41] valid_0's auc: 0.87809 valid_0's binary_logloss: 0.122813 valid_1's auc: 0.834199 valid_1's binary_logloss: 0.140085 (intermediate iterations omitted; training stopped at iteration 41)
[1] valid_0's auc: 0.821831 valid_0's binary_logloss: 0.156466 valid_1's auc: 0.817525 valid_1's binary_logloss: 0.165186
... [37] valid_0's auc: 0.875657 valid_0's binary_logloss: 0.123447 valid_1's auc: 0.838519 valid_1's binary_logloss: 0.139109 (intermediate iterations omitted; training stopped at iteration 37)
[1] valid_0's auc: 0.821427 valid_0's binary_logloss: 0.156592 valid_1's auc: 0.81711 valid_1's binary_logloss: 0.165273
... [61] valid_0's auc: 0.886458 valid_0's binary_logloss: 0.119118 valid_1's auc: 0.837349 valid_1's binary_logloss: 0.140059 (intermediate iterations omitted; training stopped at iteration 61)
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1]	valid_0's auc: 0.835412	valid_0's binary_logloss: 0.155721	valid_1's auc: 0.81973	valid_1's binary_logloss: 0.164844
... (iterations 2-9 omitted) ...
[10]	valid_0's auc: 0.862964	valid_0's binary_logloss: 0.132331	valid_1's auc: 0.835026	valid_1's binary_logloss: 0.144789    (best valid_1 auc)
... (iterations 11-39 omitted) ...
[40]	valid_0's auc: 0.893942	valid_0's binary_logloss: 0.11662	valid_1's auc: 0.827852	valid_1's binary_logloss: 0.141621
[1]	valid_0's auc: 0.830474	valid_0's binary_logloss: 0.155928	valid_1's auc: 0.817343	valid_1's binary_logloss: 0.164928
... (iterations 2-13 omitted) ...
[14]	valid_0's auc: 0.870957	valid_0's binary_logloss: 0.128226	valid_1's auc: 0.838226	valid_1's binary_logloss: 0.141392    (best valid_1 auc)
... (iterations 15-43 omitted) ...
[44]	valid_0's auc: 0.899179	valid_0's binary_logloss: 0.114703	valid_1's auc: 0.836328	valid_1's binary_logloss: 0.139755
[1]	valid_0's auc: 0.834724	valid_0's binary_logloss: 0.15607	valid_1's auc: 0.822983	valid_1's binary_logloss: 0.165104
... (iterations 2-26 omitted) ...
[27]	valid_0's auc: 0.884676	valid_0's binary_logloss: 0.121001	valid_1's auc: 0.838046	valid_1's binary_logloss: 0.140086    (best valid_1 auc)
... (iterations 28-56 omitted) ...
[57]	valid_0's auc: 0.902422	valid_0's binary_logloss: 0.111944	valid_1's auc: 0.835493	valid_1's binary_logloss: 0.140993
[1]	valid_0's auc: 0.835412	valid_0's binary_logloss: 0.155721	valid_1's auc: 0.81973	valid_1's binary_logloss: 0.164844
... (iterations 2-9 omitted) ...
[10]	valid_0's auc: 0.862964	valid_0's binary_logloss: 0.132331	valid_1's auc: 0.835026	valid_1's binary_logloss: 0.144789    (best valid_1 auc)
... (iterations 11-39 omitted) ...
[40]	valid_0's auc: 0.893942	valid_0's binary_logloss: 0.11662	valid_1's auc: 0.827852	valid_1's binary_logloss: 0.141621
[1]	valid_0's auc: 0.830474	valid_0's binary_logloss: 0.155928	valid_1's auc: 0.817343	valid_1's binary_logloss: 0.164928
... (iterations 2-13 omitted) ...
[14]	valid_0's auc: 0.870957	valid_0's binary_logloss: 0.128226	valid_1's auc: 0.838226	valid_1's binary_logloss: 0.141392    (best valid_1 auc)
... (iterations 15-43 omitted) ...
[44]	valid_0's auc: 0.899179	valid_0's binary_logloss: 0.114703	valid_1's auc: 0.836328	valid_1's binary_logloss: 0.139755
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead.
[1] valid_0's auc: 0.834724 valid_0's binary_logloss: 0.15607 valid_1's auc: 0.822983 valid_1's binary_logloss: 0.165104 ... [57] valid_0's auc: 0.902422 valid_0's binary_logloss: 0.111944 valid_1's auc: 0.835493 valid_1's binary_logloss: 0.140993
[1] valid_0's auc: 0.820235 valid_0's binary_logloss: 0.156085 valid_1's auc: 0.81613 valid_1's binary_logloss: 0.164992 ... [37] valid_0's auc: 0.878762 valid_0's binary_logloss: 0.122514 valid_1's auc: 0.832581 valid_1's binary_logloss: 0.140533
[1] valid_0's auc: 0.814371 valid_0's binary_logloss: 0.156452 valid_1's auc: 0.813175 valid_1's binary_logloss: 0.165418 ... [41] valid_0's auc: 0.881919 valid_0's binary_logloss: 0.121223 valid_1's auc: 0.838567 valid_1's binary_logloss: 0.13926
[1] valid_0's auc: 0.821645 valid_0's binary_logloss: 0.156528 valid_1's auc: 0.81857 valid_1's binary_logloss: 0.165101 ... [58] valid_0's auc: 0.889659 valid_0's binary_logloss: 0.11819 valid_1's auc: 0.837789 valid_1's binary_logloss: 0.139431
[1] valid_0's auc: 0.820235 valid_0's binary_logloss: 0.156085 valid_1's auc: 0.81613 valid_1's binary_logloss: 0.164992 ... [37] valid_0's auc: 0.878762 valid_0's binary_logloss: 0.122514 valid_1's auc: 0.832581 valid_1's binary_logloss: 0.140533
[1] valid_0's auc: 0.814371 valid_0's binary_logloss: 0.156452 valid_1's auc: 0.813175 valid_1's binary_logloss: 0.165418 ... [33] valid_0's auc: 0.8767 valid_0's binary_logloss: 0.123182 valid_1's
auc: 0.838809 valid_1's binary_logloss: 0.139329 [34] valid_0's auc: 0.87774 valid_0's binary_logloss: 0.122892 valid_1's auc: 0.838376 valid_1's binary_logloss: 0.139342 [35] valid_0's auc: 0.878372 valid_0's binary_logloss: 0.122634 valid_1's auc: 0.838454 valid_1's binary_logloss: 0.13931 [36] valid_0's auc: 0.879098 valid_0's binary_logloss: 0.122414 valid_1's auc: 0.838895 valid_1's binary_logloss: 0.13925 [37] valid_0's auc: 0.879502 valid_0's binary_logloss: 0.122216 valid_1's auc: 0.838441 valid_1's binary_logloss: 0.139302 [38] valid_0's auc: 0.880036 valid_0's binary_logloss: 0.121998 valid_1's auc: 0.838582 valid_1's binary_logloss: 0.139306 [39] valid_0's auc: 0.880641 valid_0's binary_logloss: 0.121716 valid_1's auc: 0.838787 valid_1's binary_logloss: 0.139269 [40] valid_0's auc: 0.881249 valid_0's binary_logloss: 0.121482 valid_1's auc: 0.838906 valid_1's binary_logloss: 0.139223 [41] valid_0's auc: 0.881919 valid_0's binary_logloss: 0.121223 valid_1's auc: 0.838567 valid_1's binary_logloss: 0.13926
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.821645 valid_0's binary_logloss: 0.156528 valid_1's auc: 0.81857 valid_1's binary_logloss: 0.165101 [2] valid_0's auc: 0.827488 valid_0's binary_logloss: 0.151189 valid_1's auc: 0.822299 valid_1's binary_logloss: 0.160072 [3] valid_0's auc: 0.837855 valid_0's binary_logloss: 0.147263 valid_1's auc: 0.829855 valid_1's binary_logloss: 0.156527 [4] valid_0's auc: 0.840063 valid_0's binary_logloss: 0.144261 valid_1's auc: 0.833088 valid_1's binary_logloss: 0.153446 [5] valid_0's auc: 0.842802 valid_0's binary_logloss: 0.141691 valid_1's auc: 0.834541 valid_1's binary_logloss: 0.151144 [6] valid_0's auc: 0.844 valid_0's binary_logloss: 0.139654 valid_1's auc: 0.834542 valid_1's binary_logloss: 0.149333 [7] valid_0's auc: 0.845838 valid_0's binary_logloss: 0.138002 valid_1's auc: 0.835645 valid_1's binary_logloss: 0.147676 [8] valid_0's auc: 0.846869 valid_0's binary_logloss: 0.136628 valid_1's auc: 0.836118 valid_1's binary_logloss: 0.146491 [9] valid_0's auc: 0.849282 valid_0's binary_logloss: 0.135382 valid_1's auc: 0.837542 valid_1's binary_logloss: 0.14539 [10] valid_0's auc: 0.851021 valid_0's binary_logloss: 0.134282 valid_1's auc: 0.836942 valid_1's binary_logloss: 0.144584 [11] valid_0's auc: 0.852037 valid_0's binary_logloss: 0.133358 valid_1's auc: 0.8374 valid_1's binary_logloss: 0.143836 [12] valid_0's auc: 0.854496 valid_0's binary_logloss: 0.132505 valid_1's auc: 0.838593 valid_1's binary_logloss: 0.143171 [13] valid_0's auc: 0.857514 valid_0's binary_logloss: 0.131695 valid_1's auc: 0.838558 valid_1's binary_logloss: 0.142646 [14] valid_0's auc: 0.858827 valid_0's binary_logloss: 0.131006 valid_1's auc: 0.838498 valid_1's binary_logloss: 0.142158 [15] valid_0's auc: 0.860574 valid_0's binary_logloss: 0.130352 valid_1's auc: 0.837435 valid_1's binary_logloss: 0.141868 [16] valid_0's auc: 0.861239 valid_0's binary_logloss: 0.129765 valid_1's auc: 0.837374 valid_1's binary_logloss: 0.141537 [17] valid_0's auc: 0.86217 valid_0's 
binary_logloss: 0.129164 valid_1's auc: 0.837703 valid_1's binary_logloss: 0.141192 [18] valid_0's auc: 0.863228 valid_0's binary_logloss: 0.128615 valid_1's auc: 0.837526 valid_1's binary_logloss: 0.140917 [19] valid_0's auc: 0.86473 valid_0's binary_logloss: 0.128113 valid_1's auc: 0.838235 valid_1's binary_logloss: 0.140572 [20] valid_0's auc: 0.865797 valid_0's binary_logloss: 0.127679 valid_1's auc: 0.838788 valid_1's binary_logloss: 0.140332 [21] valid_0's auc: 0.866561 valid_0's binary_logloss: 0.127235 valid_1's auc: 0.839171 valid_1's binary_logloss: 0.140108 [22] valid_0's auc: 0.867237 valid_0's binary_logloss: 0.12688 valid_1's auc: 0.839213 valid_1's binary_logloss: 0.13991 [23] valid_0's auc: 0.867894 valid_0's binary_logloss: 0.126519 valid_1's auc: 0.839641 valid_1's binary_logloss: 0.139745 [24] valid_0's auc: 0.868501 valid_0's binary_logloss: 0.126192 valid_1's auc: 0.840025 valid_1's binary_logloss: 0.139593 [25] valid_0's auc: 0.869311 valid_0's binary_logloss: 0.125838 valid_1's auc: 0.839961 valid_1's binary_logloss: 0.139531 [26] valid_0's auc: 0.870325 valid_0's binary_logloss: 0.125518 valid_1's auc: 0.839261 valid_1's binary_logloss: 0.139524 [27] valid_0's auc: 0.871488 valid_0's binary_logloss: 0.125147 valid_1's auc: 0.839671 valid_1's binary_logloss: 0.139365 [28] valid_0's auc: 0.87235 valid_0's binary_logloss: 0.12484 valid_1's auc: 0.840114 valid_1's binary_logloss: 0.139236 [29] valid_0's auc: 0.872991 valid_0's binary_logloss: 0.124593 valid_1's auc: 0.839491 valid_1's binary_logloss: 0.139271 [30] valid_0's auc: 0.874129 valid_0's binary_logloss: 0.124312 valid_1's auc: 0.839589 valid_1's binary_logloss: 0.13918 [31] valid_0's auc: 0.875305 valid_0's binary_logloss: 0.123988 valid_1's auc: 0.839441 valid_1's binary_logloss: 0.139184 [32] valid_0's auc: 0.875943 valid_0's binary_logloss: 0.123748 valid_1's auc: 0.839268 valid_1's binary_logloss: 0.13919 [33] valid_0's auc: 0.876575 valid_0's binary_logloss: 0.123484 valid_1's 
auc: 0.839549 valid_1's binary_logloss: 0.139075 [34] valid_0's auc: 0.877426 valid_0's binary_logloss: 0.123156 valid_1's auc: 0.839087 valid_1's binary_logloss: 0.139148 [35] valid_0's auc: 0.87822 valid_0's binary_logloss: 0.122873 valid_1's auc: 0.8389 valid_1's binary_logloss: 0.139187 [36] valid_0's auc: 0.878932 valid_0's binary_logloss: 0.12259 valid_1's auc: 0.838921 valid_1's binary_logloss: 0.139194 [37] valid_0's auc: 0.879842 valid_0's binary_logloss: 0.12233 valid_1's auc: 0.839133 valid_1's binary_logloss: 0.139161 [38] valid_0's auc: 0.880497 valid_0's binary_logloss: 0.12208 valid_1's auc: 0.838975 valid_1's binary_logloss: 0.139143 [39] valid_0's auc: 0.881056 valid_0's binary_logloss: 0.121827 valid_1's auc: 0.839037 valid_1's binary_logloss: 0.139138 [40] valid_0's auc: 0.881604 valid_0's binary_logloss: 0.121603 valid_1's auc: 0.839204 valid_1's binary_logloss: 0.139119 [41] valid_0's auc: 0.882159 valid_0's binary_logloss: 0.121355 valid_1's auc: 0.839277 valid_1's binary_logloss: 0.139091 [42] valid_0's auc: 0.882757 valid_0's binary_logloss: 0.121116 valid_1's auc: 0.838964 valid_1's binary_logloss: 0.139133 [43] valid_0's auc: 0.883143 valid_0's binary_logloss: 0.120918 valid_1's auc: 0.839024 valid_1's binary_logloss: 0.139124 [44] valid_0's auc: 0.883697 valid_0's binary_logloss: 0.12072 valid_1's auc: 0.838652 valid_1's binary_logloss: 0.139203 [45] valid_0's auc: 0.884292 valid_0's binary_logloss: 0.120482 valid_1's auc: 0.839016 valid_1's binary_logloss: 0.139124 [46] valid_0's auc: 0.884969 valid_0's binary_logloss: 0.120266 valid_1's auc: 0.838683 valid_1's binary_logloss: 0.139184 [47] valid_0's auc: 0.8853 valid_0's binary_logloss: 0.120089 valid_1's auc: 0.838624 valid_1's binary_logloss: 0.139193 [48] valid_0's auc: 0.885876 valid_0's binary_logloss: 0.11993 valid_1's auc: 0.838569 valid_1's binary_logloss: 0.139212 [49] valid_0's auc: 0.886141 valid_0's binary_logloss: 0.119757 valid_1's auc: 0.838345 valid_1's binary_logloss: 
0.139288 [50] valid_0's auc: 0.886433 valid_0's binary_logloss: 0.119595 valid_1's auc: 0.838342 valid_1's binary_logloss: 0.139332 [51] valid_0's auc: 0.886975 valid_0's binary_logloss: 0.119377 valid_1's auc: 0.838335 valid_1's binary_logloss: 0.139331 [52] valid_0's auc: 0.887568 valid_0's binary_logloss: 0.119161 valid_1's auc: 0.838204 valid_1's binary_logloss: 0.139331 [53] valid_0's auc: 0.887867 valid_0's binary_logloss: 0.118974 valid_1's auc: 0.838044 valid_1's binary_logloss: 0.13936 [54] valid_0's auc: 0.888093 valid_0's binary_logloss: 0.118834 valid_1's auc: 0.838137 valid_1's binary_logloss: 0.13935 [55] valid_0's auc: 0.888289 valid_0's binary_logloss: 0.118675 valid_1's auc: 0.837878 valid_1's binary_logloss: 0.139392 [56] valid_0's auc: 0.888615 valid_0's binary_logloss: 0.118561 valid_1's auc: 0.837776 valid_1's binary_logloss: 0.139418 [57] valid_0's auc: 0.889157 valid_0's binary_logloss: 0.118369 valid_1's auc: 0.837585 valid_1's binary_logloss: 0.139447 [58] valid_0's auc: 0.889659 valid_0's binary_logloss: 0.11819 valid_1's auc: 0.837789 valid_1's binary_logloss: 0.139431
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.832891 valid_0's binary_logloss: 0.155302 valid_1's auc: 0.818851 valid_1's binary_logloss: 0.164826 [2] valid_0's auc: 0.84519 valid_0's binary_logloss: 0.149727 valid_1's auc: 0.827144 valid_1's binary_logloss: 0.159879 [3] valid_0's auc: 0.848018 valid_0's binary_logloss: 0.145627 valid_1's auc: 0.826851 valid_1's binary_logloss: 0.15631 [4] valid_0's auc: 0.851096 valid_0's binary_logloss: 0.142423 valid_1's auc: 0.83073 valid_1's binary_logloss: 0.1534 [5] valid_0's auc: 0.854735 valid_0's binary_logloss: 0.139746 valid_1's auc: 0.832753 valid_1's binary_logloss: 0.151136 [6] valid_0's auc: 0.856928 valid_0's binary_logloss: 0.137509 valid_1's auc: 0.835605 valid_1's binary_logloss: 0.14924 [7] valid_0's auc: 0.859448 valid_0's binary_logloss: 0.135575 valid_1's auc: 0.835612 valid_1's binary_logloss: 0.147799 [8] valid_0's auc: 0.861685 valid_0's binary_logloss: 0.133953 valid_1's auc: 0.834408 valid_1's binary_logloss: 0.146634 [9] valid_0's auc: 0.863391 valid_0's binary_logloss: 0.132468 valid_1's auc: 0.835623 valid_1's binary_logloss: 0.145549 [10] valid_0's auc: 0.865858 valid_0's binary_logloss: 0.131185 valid_1's auc: 0.83487 valid_1's binary_logloss: 0.144745 [11] valid_0's auc: 0.867134 valid_0's binary_logloss: 0.130116 valid_1's auc: 0.834692 valid_1's binary_logloss: 0.14411 [12] valid_0's auc: 0.868217 valid_0's binary_logloss: 0.129097 valid_1's auc: 0.834746 valid_1's binary_logloss: 0.143527 [13] valid_0's auc: 0.87073 valid_0's binary_logloss: 0.128129 valid_1's auc: 0.833582 valid_1's binary_logloss: 0.143122 [14] valid_0's auc: 0.872621 valid_0's binary_logloss: 0.12721 valid_1's auc: 0.833205 valid_1's binary_logloss: 0.142745 [15] valid_0's auc: 0.874007 valid_0's binary_logloss: 0.126363 valid_1's auc: 0.83246 valid_1's binary_logloss: 0.142489 [16] valid_0's auc: 0.875141 valid_0's binary_logloss: 0.125606 valid_1's auc: 0.831958 valid_1's binary_logloss: 0.142275 [17] valid_0's auc: 0.876061 valid_0's 
binary_logloss: 0.124928 valid_1's auc: 0.831586 valid_1's binary_logloss: 0.142141 [18] valid_0's auc: 0.876982 valid_0's binary_logloss: 0.124313 valid_1's auc: 0.830954 valid_1's binary_logloss: 0.142066 [19] valid_0's auc: 0.877885 valid_0's binary_logloss: 0.123709 valid_1's auc: 0.830572 valid_1's binary_logloss: 0.14196 [20] valid_0's auc: 0.879378 valid_0's binary_logloss: 0.123088 valid_1's auc: 0.830076 valid_1's binary_logloss: 0.14196 [21] valid_0's auc: 0.880647 valid_0's binary_logloss: 0.122488 valid_1's auc: 0.830109 valid_1's binary_logloss: 0.141858 [22] valid_0's auc: 0.881614 valid_0's binary_logloss: 0.121973 valid_1's auc: 0.829735 valid_1's binary_logloss: 0.141822 [23] valid_0's auc: 0.882402 valid_0's binary_logloss: 0.121554 valid_1's auc: 0.829254 valid_1's binary_logloss: 0.141805 [24] valid_0's auc: 0.883011 valid_0's binary_logloss: 0.121078 valid_1's auc: 0.829054 valid_1's binary_logloss: 0.14178 [25] valid_0's auc: 0.884627 valid_0's binary_logloss: 0.120587 valid_1's auc: 0.82942 valid_1's binary_logloss: 0.141653 [26] valid_0's auc: 0.885304 valid_0's binary_logloss: 0.120169 valid_1's auc: 0.828716 valid_1's binary_logloss: 0.141755 [27] valid_0's auc: 0.88664 valid_0's binary_logloss: 0.119673 valid_1's auc: 0.828869 valid_1's binary_logloss: 0.141682 [28] valid_0's auc: 0.887143 valid_0's binary_logloss: 0.119308 valid_1's auc: 0.828987 valid_1's binary_logloss: 0.141649 [29] valid_0's auc: 0.88825 valid_0's binary_logloss: 0.1189 valid_1's auc: 0.829075 valid_1's binary_logloss: 0.141601 [30] valid_0's auc: 0.889081 valid_0's binary_logloss: 0.118531 valid_1's auc: 0.828871 valid_1's binary_logloss: 0.141605 [31] valid_0's auc: 0.890195 valid_0's binary_logloss: 0.118117 valid_1's auc: 0.828972 valid_1's binary_logloss: 0.141605 [32] valid_0's auc: 0.890928 valid_0's binary_logloss: 0.117735 valid_1's auc: 0.827969 valid_1's binary_logloss: 0.141796 [33] valid_0's auc: 0.891505 valid_0's binary_logloss: 0.117389 valid_1's auc: 
0.827611 valid_1's binary_logloss: 0.141916 [34] valid_0's auc: 0.892223 valid_0's binary_logloss: 0.11707 valid_1's auc: 0.827019 valid_1's binary_logloss: 0.142051 [35] valid_0's auc: 0.892825 valid_0's binary_logloss: 0.116751 valid_1's auc: 0.826865 valid_1's binary_logloss: 0.142116 [36] valid_0's auc: 0.893984 valid_0's binary_logloss: 0.116353 valid_1's auc: 0.827203 valid_1's binary_logloss: 0.14207 [37] valid_0's auc: 0.89456 valid_0's binary_logloss: 0.11603 valid_1's auc: 0.827292 valid_1's binary_logloss: 0.142005 [38] valid_0's auc: 0.89511 valid_0's binary_logloss: 0.115713 valid_1's auc: 0.827214 valid_1's binary_logloss: 0.14206 [39] valid_0's auc: 0.895738 valid_0's binary_logloss: 0.115415 valid_1's auc: 0.82695 valid_1's binary_logloss: 0.142162
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.833054 valid_0's binary_logloss: 0.15572 valid_1's auc: 0.817048 valid_1's binary_logloss: 0.165036 [2] valid_0's auc: 0.841397 valid_0's binary_logloss: 0.149862 valid_1's auc: 0.82157 valid_1's binary_logloss: 0.159575 [3] valid_0's auc: 0.849058 valid_0's binary_logloss: 0.145662 valid_1's auc: 0.829866 valid_1's binary_logloss: 0.155774 [4] valid_0's auc: 0.854301 valid_0's binary_logloss: 0.142356 valid_1's auc: 0.832415 valid_1's binary_logloss: 0.152936 [5] valid_0's auc: 0.858045 valid_0's binary_logloss: 0.139697 valid_1's auc: 0.834554 valid_1's binary_logloss: 0.150635 [6] valid_0's auc: 0.860767 valid_0's binary_logloss: 0.137458 valid_1's auc: 0.834885 valid_1's binary_logloss: 0.148761 [7] valid_0's auc: 0.863011 valid_0's binary_logloss: 0.135522 valid_1's auc: 0.835812 valid_1's binary_logloss: 0.147245 [8] valid_0's auc: 0.864923 valid_0's binary_logloss: 0.133792 valid_1's auc: 0.836656 valid_1's binary_logloss: 0.145923 [9] valid_0's auc: 0.865706 valid_0's binary_logloss: 0.13236 valid_1's auc: 0.836912 valid_1's binary_logloss: 0.144867 [10] valid_0's auc: 0.867693 valid_0's binary_logloss: 0.131066 valid_1's auc: 0.837266 valid_1's binary_logloss: 0.143895 [11] valid_0's auc: 0.868596 valid_0's binary_logloss: 0.129937 valid_1's auc: 0.836466 valid_1's binary_logloss: 0.143255 [12] valid_0's auc: 0.87012 valid_0's binary_logloss: 0.128904 valid_1's auc: 0.836589 valid_1's binary_logloss: 0.142728 [13] valid_0's auc: 0.871703 valid_0's binary_logloss: 0.127913 valid_1's auc: 0.836567 valid_1's binary_logloss: 0.142105 [14] valid_0's auc: 0.873468 valid_0's binary_logloss: 0.126983 valid_1's auc: 0.835538 valid_1's binary_logloss: 0.141771 [15] valid_0's auc: 0.874839 valid_0's binary_logloss: 0.126147 valid_1's auc: 0.835363 valid_1's binary_logloss: 0.141464 [16] valid_0's auc: 0.876399 valid_0's binary_logloss: 0.125331 valid_1's auc: 0.83478 valid_1's binary_logloss: 0.141245 [17] valid_0's auc: 0.877465 valid_0's 
binary_logloss: 0.124655 valid_1's auc: 0.834621 valid_1's binary_logloss: 0.141028 [18] valid_0's auc: 0.878935 valid_0's binary_logloss: 0.123944 valid_1's auc: 0.834165 valid_1's binary_logloss: 0.140935 [19] valid_0's auc: 0.88046 valid_0's binary_logloss: 0.123313 valid_1's auc: 0.834629 valid_1's binary_logloss: 0.140738 [20] valid_0's auc: 0.881517 valid_0's binary_logloss: 0.12269 valid_1's auc: 0.8347 valid_1's binary_logloss: 0.140611 [21] valid_0's auc: 0.882464 valid_0's binary_logloss: 0.122095 valid_1's auc: 0.834656 valid_1's binary_logloss: 0.140487 [22] valid_0's auc: 0.883744 valid_0's binary_logloss: 0.121504 valid_1's auc: 0.834562 valid_1's binary_logloss: 0.140328 [23] valid_0's auc: 0.885301 valid_0's binary_logloss: 0.12091 valid_1's auc: 0.835278 valid_1's binary_logloss: 0.140199 [24] valid_0's auc: 0.886266 valid_0's binary_logloss: 0.120437 valid_1's auc: 0.835728 valid_1's binary_logloss: 0.140094 [25] valid_0's auc: 0.88755 valid_0's binary_logloss: 0.119931 valid_1's auc: 0.836199 valid_1's binary_logloss: 0.140076 [26] valid_0's auc: 0.888525 valid_0's binary_logloss: 0.119473 valid_1's auc: 0.836708 valid_1's binary_logloss: 0.139945 [27] valid_0's auc: 0.889589 valid_0's binary_logloss: 0.119012 valid_1's auc: 0.836951 valid_1's binary_logloss: 0.139843 [28] valid_0's auc: 0.890552 valid_0's binary_logloss: 0.118602 valid_1's auc: 0.836524 valid_1's binary_logloss: 0.139871 [29] valid_0's auc: 0.891402 valid_0's binary_logloss: 0.118166 valid_1's auc: 0.836264 valid_1's binary_logloss: 0.139884 [30] valid_0's auc: 0.891982 valid_0's binary_logloss: 0.117805 valid_1's auc: 0.835959 valid_1's binary_logloss: 0.139937 [31] valid_0's auc: 0.893185 valid_0's binary_logloss: 0.117392 valid_1's auc: 0.836384 valid_1's binary_logloss: 0.13992 [32] valid_0's auc: 0.894065 valid_0's binary_logloss: 0.117017 valid_1's auc: 0.836341 valid_1's binary_logloss: 0.139888 [33] valid_0's auc: 0.894791 valid_0's binary_logloss: 0.116671 valid_1's 
auc: 0.836753 valid_1's binary_logloss: 0.139812 [34] valid_0's auc: 0.895313 valid_0's binary_logloss: 0.116321 valid_1's auc: 0.836733 valid_1's binary_logloss: 0.139826 [35] valid_0's auc: 0.895876 valid_0's binary_logloss: 0.116039 valid_1's auc: 0.836245 valid_1's binary_logloss: 0.139883 [36] valid_0's auc: 0.896909 valid_0's binary_logloss: 0.115684 valid_1's auc: 0.836079 valid_1's binary_logloss: 0.139912 [37] valid_0's auc: 0.897427 valid_0's binary_logloss: 0.115388 valid_1's auc: 0.835564 valid_1's binary_logloss: 0.140024 [38] valid_0's auc: 0.898442 valid_0's binary_logloss: 0.115006 valid_1's auc: 0.835612 valid_1's binary_logloss: 0.140075 [39] valid_0's auc: 0.899304 valid_0's binary_logloss: 0.114592 valid_1's auc: 0.836273 valid_1's binary_logloss: 0.139974 [40] valid_0's auc: 0.89974 valid_0's binary_logloss: 0.11432 valid_1's auc: 0.836096 valid_1's binary_logloss: 0.140042
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.830643 valid_0's binary_logloss: 0.155759 valid_1's auc: 0.816734 valid_1's binary_logloss: 0.164985 [2] valid_0's auc: 0.839353 valid_0's binary_logloss: 0.149977 valid_1's auc: 0.822571 valid_1's binary_logloss: 0.159808 [3] valid_0's auc: 0.847366 valid_0's binary_logloss: 0.145866 valid_1's auc: 0.829312 valid_1's binary_logloss: 0.156171 [4] valid_0's auc: 0.850911 valid_0's binary_logloss: 0.14247 valid_1's auc: 0.830848 valid_1's binary_logloss: 0.153328 [5] valid_0's auc: 0.854674 valid_0's binary_logloss: 0.139764 valid_1's auc: 0.833041 valid_1's binary_logloss: 0.151023 [6] valid_0's auc: 0.856722 valid_0's binary_logloss: 0.1375 valid_1's auc: 0.834264 valid_1's binary_logloss: 0.149166 [7] valid_0's auc: 0.858253 valid_0's binary_logloss: 0.135713 valid_1's auc: 0.834998 valid_1's binary_logloss: 0.147631 [8] valid_0's auc: 0.859768 valid_0's binary_logloss: 0.134063 valid_1's auc: 0.835678 valid_1's binary_logloss: 0.146384 [9] valid_0's auc: 0.86262 valid_0's binary_logloss: 0.132622 valid_1's auc: 0.836272 valid_1's binary_logloss: 0.145313 [10] valid_0's auc: 0.864631 valid_0's binary_logloss: 0.131324 valid_1's auc: 0.835827 valid_1's binary_logloss: 0.144553 [11] valid_0's auc: 0.866805 valid_0's binary_logloss: 0.130172 valid_1's auc: 0.835375 valid_1's binary_logloss: 0.143933 [12] valid_0's auc: 0.868266 valid_0's binary_logloss: 0.129101 valid_1's auc: 0.835951 valid_1's binary_logloss: 0.143342 [13] valid_0's auc: 0.870762 valid_0's binary_logloss: 0.128144 valid_1's auc: 0.83626 valid_1's binary_logloss: 0.142813 [14] valid_0's auc: 0.872747 valid_0's binary_logloss: 0.127222 valid_1's auc: 0.835864 valid_1's binary_logloss: 0.142466 [15] valid_0's auc: 0.874158 valid_0's binary_logloss: 0.126428 valid_1's auc: 0.83548 valid_1's binary_logloss: 0.142108 [16] valid_0's auc: 0.875931 valid_0's binary_logloss: 0.125651 valid_1's auc: 0.836367 valid_1's binary_logloss: 0.141684 [17] valid_0's auc: 0.876854 valid_0's 
binary_logloss: 0.124918 valid_1's auc: 0.835689 valid_1's binary_logloss: 0.141524 [18] valid_0's auc: 0.878211 valid_0's binary_logloss: 0.124197 valid_1's auc: 0.835893 valid_1's binary_logloss: 0.141285 [19] valid_0's auc: 0.879125 valid_0's binary_logloss: 0.123553 valid_1's auc: 0.835877 valid_1's binary_logloss: 0.141128 [20] valid_0's auc: 0.880489 valid_0's binary_logloss: 0.122856 valid_1's auc: 0.835385 valid_1's binary_logloss: 0.141032 [21] valid_0's auc: 0.881696 valid_0's binary_logloss: 0.122219 valid_1's auc: 0.835822 valid_1's binary_logloss: 0.140843 [22] valid_0's auc: 0.882257 valid_0's binary_logloss: 0.121726 valid_1's auc: 0.835849 valid_1's binary_logloss: 0.140761 [23] valid_0's auc: 0.883635 valid_0's binary_logloss: 0.121206 valid_1's auc: 0.836223 valid_1's binary_logloss: 0.140607 [24] valid_0's auc: 0.884533 valid_0's binary_logloss: 0.120734 valid_1's auc: 0.836473 valid_1's binary_logloss: 0.14049 [25] valid_0's auc: 0.885234 valid_0's binary_logloss: 0.120268 valid_1's auc: 0.836722 valid_1's binary_logloss: 0.140403 [26] valid_0's auc: 0.886292 valid_0's binary_logloss: 0.119794 valid_1's auc: 0.836549 valid_1's binary_logloss: 0.140423 [27] valid_0's auc: 0.887064 valid_0's binary_logloss: 0.119366 valid_1's auc: 0.836155 valid_1's binary_logloss: 0.140447 [28] valid_0's auc: 0.887621 valid_0's binary_logloss: 0.119008 valid_1's auc: 0.835594 valid_1's binary_logloss: 0.140532 [29] valid_0's auc: 0.888965 valid_0's binary_logloss: 0.118547 valid_1's auc: 0.835464 valid_1's binary_logloss: 0.140508 [30] valid_0's auc: 0.889898 valid_0's binary_logloss: 0.118139 valid_1's auc: 0.83577 valid_1's binary_logloss: 0.140461 [31] valid_0's auc: 0.890896 valid_0's binary_logloss: 0.117734 valid_1's auc: 0.835475 valid_1's binary_logloss: 0.140463 [32] valid_0's auc: 0.892374 valid_0's binary_logloss: 0.1173 valid_1's auc: 0.835364 valid_1's binary_logloss: 0.140506 [33] valid_0's auc: 0.893164 valid_0's binary_logloss: 0.116978 valid_1's 
auc: 0.835865 valid_1's binary_logloss: 0.14041 [34] valid_0's auc: 0.893848 valid_0's binary_logloss: 0.11662 valid_1's auc: 0.836021 valid_1's binary_logloss: 0.140353 [35] valid_0's auc: 0.894232 valid_0's binary_logloss: 0.116323 valid_1's auc: 0.8359 valid_1's binary_logloss: 0.140396 [36] valid_0's auc: 0.895003 valid_0's binary_logloss: 0.115986 valid_1's auc: 0.835855 valid_1's binary_logloss: 0.140416 [37] valid_0's auc: 0.895898 valid_0's binary_logloss: 0.115609 valid_1's auc: 0.836185 valid_1's binary_logloss: 0.140369 [38] valid_0's auc: 0.896459 valid_0's binary_logloss: 0.11527 valid_1's auc: 0.835754 valid_1's binary_logloss: 0.140443 [39] valid_0's auc: 0.897377 valid_0's binary_logloss: 0.114873 valid_1's auc: 0.835638 valid_1's binary_logloss: 0.140474 [40] valid_0's auc: 0.89776 valid_0's binary_logloss: 0.114588 valid_1's auc: 0.835639 valid_1's binary_logloss: 0.140491 [41] valid_0's auc: 0.898583 valid_0's binary_logloss: 0.114302 valid_1's auc: 0.835705 valid_1's binary_logloss: 0.140506 [42] valid_0's auc: 0.899197 valid_0's binary_logloss: 0.113975 valid_1's auc: 0.835052 valid_1's binary_logloss: 0.14064 [43] valid_0's auc: 0.899803 valid_0's binary_logloss: 0.113654 valid_1's auc: 0.835035 valid_1's binary_logloss: 0.140691 [44] valid_0's auc: 0.900641 valid_0's binary_logloss: 0.113388 valid_1's auc: 0.835214 valid_1's binary_logloss: 0.140703 [45] valid_0's auc: 0.900962 valid_0's binary_logloss: 0.113098 valid_1's auc: 0.835276 valid_1's binary_logloss: 0.140695 [46] valid_0's auc: 0.901584 valid_0's binary_logloss: 0.112771 valid_1's auc: 0.83495 valid_1's binary_logloss: 0.140754 [47] valid_0's auc: 0.902256 valid_0's binary_logloss: 0.112493 valid_1's auc: 0.835639 valid_1's binary_logloss: 0.14064 [48] valid_0's auc: 0.902688 valid_0's binary_logloss: 0.112198 valid_1's auc: 0.835495 valid_1's binary_logloss: 0.140691 [49] valid_0's auc: 0.902922 valid_0's binary_logloss: 0.111944 valid_1's auc: 0.835281 valid_1's binary_logloss: 
0.140819 [50] valid_0's auc: 0.903747 valid_0's binary_logloss: 0.111595 valid_1's auc: 0.835359 valid_1's binary_logloss: 0.140811 [51] valid_0's auc: 0.904427 valid_0's binary_logloss: 0.111354 valid_1's auc: 0.835245 valid_1's binary_logloss: 0.140873 [52] valid_0's auc: 0.90467 valid_0's binary_logloss: 0.111111 valid_1's auc: 0.835057 valid_1's binary_logloss: 0.140993 [53] valid_0's auc: 0.904868 valid_0's binary_logloss: 0.110853 valid_1's auc: 0.834751 valid_1's binary_logloss: 0.14108 [54] valid_0's auc: 0.905166 valid_0's binary_logloss: 0.110627 valid_1's auc: 0.83411 valid_1's binary_logloss: 0.141282 [55] valid_0's auc: 0.905665 valid_0's binary_logloss: 0.110375 valid_1's auc: 0.833739 valid_1's binary_logloss: 0.141413
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.832891 valid_0's binary_logloss: 0.155302 valid_1's auc: 0.818851 valid_1's binary_logloss: 0.164826 [2] valid_0's auc: 0.84519 valid_0's binary_logloss: 0.149727 valid_1's auc: 0.827144 valid_1's binary_logloss: 0.159879 [3] valid_0's auc: 0.848018 valid_0's binary_logloss: 0.145627 valid_1's auc: 0.826851 valid_1's binary_logloss: 0.15631 [4] valid_0's auc: 0.851096 valid_0's binary_logloss: 0.142423 valid_1's auc: 0.83073 valid_1's binary_logloss: 0.1534 [5] valid_0's auc: 0.854735 valid_0's binary_logloss: 0.139746 valid_1's auc: 0.832753 valid_1's binary_logloss: 0.151136 [6] valid_0's auc: 0.856928 valid_0's binary_logloss: 0.137509 valid_1's auc: 0.835605 valid_1's binary_logloss: 0.14924 [7] valid_0's auc: 0.859448 valid_0's binary_logloss: 0.135575 valid_1's auc: 0.835612 valid_1's binary_logloss: 0.147799 [8] valid_0's auc: 0.861685 valid_0's binary_logloss: 0.133953 valid_1's auc: 0.834408 valid_1's binary_logloss: 0.146634 [9] valid_0's auc: 0.863391 valid_0's binary_logloss: 0.132468 valid_1's auc: 0.835623 valid_1's binary_logloss: 0.145549 [10] valid_0's auc: 0.865858 valid_0's binary_logloss: 0.131185 valid_1's auc: 0.83487 valid_1's binary_logloss: 0.144745 [11] valid_0's auc: 0.867134 valid_0's binary_logloss: 0.130116 valid_1's auc: 0.834692 valid_1's binary_logloss: 0.14411 [12] valid_0's auc: 0.868217 valid_0's binary_logloss: 0.129097 valid_1's auc: 0.834746 valid_1's binary_logloss: 0.143527 [13] valid_0's auc: 0.87073 valid_0's binary_logloss: 0.128129 valid_1's auc: 0.833582 valid_1's binary_logloss: 0.143122 [14] valid_0's auc: 0.872621 valid_0's binary_logloss: 0.12721 valid_1's auc: 0.833205 valid_1's binary_logloss: 0.142745 [15] valid_0's auc: 0.874007 valid_0's binary_logloss: 0.126363 valid_1's auc: 0.83246 valid_1's binary_logloss: 0.142489 [16] valid_0's auc: 0.875141 valid_0's binary_logloss: 0.125606 valid_1's auc: 0.831958 valid_1's binary_logloss: 0.142275 [17] valid_0's auc: 0.876061 valid_0's 
binary_logloss: 0.124928 valid_1's auc: 0.831586 valid_1's binary_logloss: 0.142141 [18] valid_0's auc: 0.876982 valid_0's binary_logloss: 0.124313 valid_1's auc: 0.830954 valid_1's binary_logloss: 0.142066 [19] valid_0's auc: 0.877885 valid_0's binary_logloss: 0.123709 valid_1's auc: 0.830572 valid_1's binary_logloss: 0.14196 [20] valid_0's auc: 0.879378 valid_0's binary_logloss: 0.123088 valid_1's auc: 0.830076 valid_1's binary_logloss: 0.14196 [21] valid_0's auc: 0.880647 valid_0's binary_logloss: 0.122488 valid_1's auc: 0.830109 valid_1's binary_logloss: 0.141858 [22] valid_0's auc: 0.881614 valid_0's binary_logloss: 0.121973 valid_1's auc: 0.829735 valid_1's binary_logloss: 0.141822 [23] valid_0's auc: 0.882402 valid_0's binary_logloss: 0.121554 valid_1's auc: 0.829254 valid_1's binary_logloss: 0.141805 [24] valid_0's auc: 0.883011 valid_0's binary_logloss: 0.121078 valid_1's auc: 0.829054 valid_1's binary_logloss: 0.14178 [25] valid_0's auc: 0.884627 valid_0's binary_logloss: 0.120587 valid_1's auc: 0.82942 valid_1's binary_logloss: 0.141653 [26] valid_0's auc: 0.885304 valid_0's binary_logloss: 0.120169 valid_1's auc: 0.828716 valid_1's binary_logloss: 0.141755 [27] valid_0's auc: 0.88664 valid_0's binary_logloss: 0.119673 valid_1's auc: 0.828869 valid_1's binary_logloss: 0.141682 [28] valid_0's auc: 0.887143 valid_0's binary_logloss: 0.119308 valid_1's auc: 0.828987 valid_1's binary_logloss: 0.141649 [29] valid_0's auc: 0.88825 valid_0's binary_logloss: 0.1189 valid_1's auc: 0.829075 valid_1's binary_logloss: 0.141601 [30] valid_0's auc: 0.889081 valid_0's binary_logloss: 0.118531 valid_1's auc: 0.828871 valid_1's binary_logloss: 0.141605 [31] valid_0's auc: 0.890195 valid_0's binary_logloss: 0.118117 valid_1's auc: 0.828972 valid_1's binary_logloss: 0.141605 [32] valid_0's auc: 0.890928 valid_0's binary_logloss: 0.117735 valid_1's auc: 0.827969 valid_1's binary_logloss: 0.141796 [33] valid_0's auc: 0.891505 valid_0's binary_logloss: 0.117389 valid_1's auc: 
0.827611 valid_1's binary_logloss: 0.141916 [34] valid_0's auc: 0.892223 valid_0's binary_logloss: 0.11707 valid_1's auc: 0.827019 valid_1's binary_logloss: 0.142051 [35] valid_0's auc: 0.892825 valid_0's binary_logloss: 0.116751 valid_1's auc: 0.826865 valid_1's binary_logloss: 0.142116 [36] valid_0's auc: 0.893984 valid_0's binary_logloss: 0.116353 valid_1's auc: 0.827203 valid_1's binary_logloss: 0.14207 [37] valid_0's auc: 0.89456 valid_0's binary_logloss: 0.11603 valid_1's auc: 0.827292 valid_1's binary_logloss: 0.142005 [38] valid_0's auc: 0.89511 valid_0's binary_logloss: 0.115713 valid_1's auc: 0.827214 valid_1's binary_logloss: 0.14206 [39] valid_0's auc: 0.895738 valid_0's binary_logloss: 0.115415 valid_1's auc: 0.82695 valid_1's binary_logloss: 0.142162
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.833054 valid_0's binary_logloss: 0.15572 valid_1's auc: 0.817048 valid_1's binary_logloss: 0.165036 [2] valid_0's auc: 0.841397 valid_0's binary_logloss: 0.149862 valid_1's auc: 0.82157 valid_1's binary_logloss: 0.159575 [3] valid_0's auc: 0.849058 valid_0's binary_logloss: 0.145662 valid_1's auc: 0.829866 valid_1's binary_logloss: 0.155774 [4] valid_0's auc: 0.854301 valid_0's binary_logloss: 0.142356 valid_1's auc: 0.832415 valid_1's binary_logloss: 0.152936 [5] valid_0's auc: 0.858045 valid_0's binary_logloss: 0.139697 valid_1's auc: 0.834554 valid_1's binary_logloss: 0.150635 [6] valid_0's auc: 0.860767 valid_0's binary_logloss: 0.137458 valid_1's auc: 0.834885 valid_1's binary_logloss: 0.148761 [7] valid_0's auc: 0.863011 valid_0's binary_logloss: 0.135522 valid_1's auc: 0.835812 valid_1's binary_logloss: 0.147245 [8] valid_0's auc: 0.864923 valid_0's binary_logloss: 0.133792 valid_1's auc: 0.836656 valid_1's binary_logloss: 0.145923 [9] valid_0's auc: 0.865706 valid_0's binary_logloss: 0.13236 valid_1's auc: 0.836912 valid_1's binary_logloss: 0.144867 [10] valid_0's auc: 0.867693 valid_0's binary_logloss: 0.131066 valid_1's auc: 0.837266 valid_1's binary_logloss: 0.143895 [11] valid_0's auc: 0.868596 valid_0's binary_logloss: 0.129937 valid_1's auc: 0.836466 valid_1's binary_logloss: 0.143255 [12] valid_0's auc: 0.87012 valid_0's binary_logloss: 0.128904 valid_1's auc: 0.836589 valid_1's binary_logloss: 0.142728 [13] valid_0's auc: 0.871703 valid_0's binary_logloss: 0.127913 valid_1's auc: 0.836567 valid_1's binary_logloss: 0.142105 [14] valid_0's auc: 0.873468 valid_0's binary_logloss: 0.126983 valid_1's auc: 0.835538 valid_1's binary_logloss: 0.141771 [15] valid_0's auc: 0.874839 valid_0's binary_logloss: 0.126147 valid_1's auc: 0.835363 valid_1's binary_logloss: 0.141464 [16] valid_0's auc: 0.876399 valid_0's binary_logloss: 0.125331 valid_1's auc: 0.83478 valid_1's binary_logloss: 0.141245 [17] valid_0's auc: 0.877465 valid_0's 
binary_logloss: 0.124655 valid_1's auc: 0.834621 valid_1's binary_logloss: 0.141028 [18] valid_0's auc: 0.878935 valid_0's binary_logloss: 0.123944 valid_1's auc: 0.834165 valid_1's binary_logloss: 0.140935 [19] valid_0's auc: 0.88046 valid_0's binary_logloss: 0.123313 valid_1's auc: 0.834629 valid_1's binary_logloss: 0.140738 [20] valid_0's auc: 0.881517 valid_0's binary_logloss: 0.12269 valid_1's auc: 0.8347 valid_1's binary_logloss: 0.140611 [21] valid_0's auc: 0.882464 valid_0's binary_logloss: 0.122095 valid_1's auc: 0.834656 valid_1's binary_logloss: 0.140487 [22] valid_0's auc: 0.883744 valid_0's binary_logloss: 0.121504 valid_1's auc: 0.834562 valid_1's binary_logloss: 0.140328 [23] valid_0's auc: 0.885301 valid_0's binary_logloss: 0.12091 valid_1's auc: 0.835278 valid_1's binary_logloss: 0.140199 [24] valid_0's auc: 0.886266 valid_0's binary_logloss: 0.120437 valid_1's auc: 0.835728 valid_1's binary_logloss: 0.140094 [25] valid_0's auc: 0.88755 valid_0's binary_logloss: 0.119931 valid_1's auc: 0.836199 valid_1's binary_logloss: 0.140076 [26] valid_0's auc: 0.888525 valid_0's binary_logloss: 0.119473 valid_1's auc: 0.836708 valid_1's binary_logloss: 0.139945 [27] valid_0's auc: 0.889589 valid_0's binary_logloss: 0.119012 valid_1's auc: 0.836951 valid_1's binary_logloss: 0.139843 [28] valid_0's auc: 0.890552 valid_0's binary_logloss: 0.118602 valid_1's auc: 0.836524 valid_1's binary_logloss: 0.139871 [29] valid_0's auc: 0.891402 valid_0's binary_logloss: 0.118166 valid_1's auc: 0.836264 valid_1's binary_logloss: 0.139884 [30] valid_0's auc: 0.891982 valid_0's binary_logloss: 0.117805 valid_1's auc: 0.835959 valid_1's binary_logloss: 0.139937 [31] valid_0's auc: 0.893185 valid_0's binary_logloss: 0.117392 valid_1's auc: 0.836384 valid_1's binary_logloss: 0.13992 [32] valid_0's auc: 0.894065 valid_0's binary_logloss: 0.117017 valid_1's auc: 0.836341 valid_1's binary_logloss: 0.139888 [33] valid_0's auc: 0.894791 valid_0's binary_logloss: 0.116671 valid_1's 
auc: 0.836753 valid_1's binary_logloss: 0.139812 [34] valid_0's auc: 0.895313 valid_0's binary_logloss: 0.116321 valid_1's auc: 0.836733 valid_1's binary_logloss: 0.139826 [35] valid_0's auc: 0.895876 valid_0's binary_logloss: 0.116039 valid_1's auc: 0.836245 valid_1's binary_logloss: 0.139883 [36] valid_0's auc: 0.896909 valid_0's binary_logloss: 0.115684 valid_1's auc: 0.836079 valid_1's binary_logloss: 0.139912 [37] valid_0's auc: 0.897427 valid_0's binary_logloss: 0.115388 valid_1's auc: 0.835564 valid_1's binary_logloss: 0.140024 [38] valid_0's auc: 0.898442 valid_0's binary_logloss: 0.115006 valid_1's auc: 0.835612 valid_1's binary_logloss: 0.140075 [39] valid_0's auc: 0.899304 valid_0's binary_logloss: 0.114592 valid_1's auc: 0.836273 valid_1's binary_logloss: 0.139974 [40] valid_0's auc: 0.89974 valid_0's binary_logloss: 0.11432 valid_1's auc: 0.836096 valid_1's binary_logloss: 0.140042
[1] valid_0's auc: 0.830643 valid_0's binary_logloss: 0.155759 valid_1's auc: 0.816734 valid_1's binary_logloss: 0.164985 [2] valid_0's auc: 0.839353 valid_0's binary_logloss: 0.149977 valid_1's auc: 0.822571 valid_1's binary_logloss: 0.159808 [3] valid_0's auc: 0.847366 valid_0's binary_logloss: 0.145866 valid_1's auc: 0.829312 valid_1's binary_logloss: 0.156171 [4] valid_0's auc: 0.850911 valid_0's binary_logloss: 0.14247 valid_1's auc: 0.830848 valid_1's binary_logloss: 0.153328 [5] valid_0's auc: 0.854674 valid_0's binary_logloss: 0.139764 valid_1's auc: 0.833041 valid_1's binary_logloss: 0.151023 [6] valid_0's auc: 0.856722 valid_0's binary_logloss: 0.1375 valid_1's auc: 0.834264 valid_1's binary_logloss: 0.149166 [7] valid_0's auc: 0.858253 valid_0's binary_logloss: 0.135713 valid_1's auc: 0.834998 valid_1's binary_logloss: 0.147631 [8] valid_0's auc: 0.859768 valid_0's binary_logloss: 0.134063 valid_1's auc: 0.835678 valid_1's binary_logloss: 0.146384 [9] valid_0's auc: 0.86262 valid_0's binary_logloss: 0.132622 valid_1's auc: 0.836272 valid_1's binary_logloss: 0.145313 [10] valid_0's auc: 0.864631 valid_0's binary_logloss: 0.131324 valid_1's auc: 0.835827 valid_1's binary_logloss: 0.144553 [11] valid_0's auc: 0.866805 valid_0's binary_logloss: 0.130172 valid_1's auc: 0.835375 valid_1's binary_logloss: 0.143933 [12] valid_0's auc: 0.868266 valid_0's binary_logloss: 0.129101 valid_1's auc: 0.835951 valid_1's binary_logloss: 0.143342 [13] valid_0's auc: 0.870762 valid_0's binary_logloss: 0.128144 valid_1's auc: 0.83626 valid_1's binary_logloss: 0.142813 [14] valid_0's auc: 0.872747 valid_0's binary_logloss: 0.127222 valid_1's auc: 0.835864 valid_1's binary_logloss: 0.142466 [15] valid_0's auc: 0.874158 valid_0's binary_logloss: 0.126428 valid_1's auc: 0.83548 valid_1's binary_logloss: 0.142108 [16] valid_0's auc: 0.875931 valid_0's binary_logloss: 0.125651 valid_1's auc: 0.836367 valid_1's binary_logloss: 0.141684 [17] valid_0's auc: 0.876854 valid_0's 
binary_logloss: 0.124918 valid_1's auc: 0.835689 valid_1's binary_logloss: 0.141524 [18] valid_0's auc: 0.878211 valid_0's binary_logloss: 0.124197 valid_1's auc: 0.835893 valid_1's binary_logloss: 0.141285 [19] valid_0's auc: 0.879125 valid_0's binary_logloss: 0.123553 valid_1's auc: 0.835877 valid_1's binary_logloss: 0.141128 [20] valid_0's auc: 0.880489 valid_0's binary_logloss: 0.122856 valid_1's auc: 0.835385 valid_1's binary_logloss: 0.141032 [21] valid_0's auc: 0.881696 valid_0's binary_logloss: 0.122219 valid_1's auc: 0.835822 valid_1's binary_logloss: 0.140843 [22] valid_0's auc: 0.882257 valid_0's binary_logloss: 0.121726 valid_1's auc: 0.835849 valid_1's binary_logloss: 0.140761 [23] valid_0's auc: 0.883635 valid_0's binary_logloss: 0.121206 valid_1's auc: 0.836223 valid_1's binary_logloss: 0.140607 [24] valid_0's auc: 0.884533 valid_0's binary_logloss: 0.120734 valid_1's auc: 0.836473 valid_1's binary_logloss: 0.14049 [25] valid_0's auc: 0.885234 valid_0's binary_logloss: 0.120268 valid_1's auc: 0.836722 valid_1's binary_logloss: 0.140403 [26] valid_0's auc: 0.886292 valid_0's binary_logloss: 0.119794 valid_1's auc: 0.836549 valid_1's binary_logloss: 0.140423 [27] valid_0's auc: 0.887064 valid_0's binary_logloss: 0.119366 valid_1's auc: 0.836155 valid_1's binary_logloss: 0.140447 [28] valid_0's auc: 0.887621 valid_0's binary_logloss: 0.119008 valid_1's auc: 0.835594 valid_1's binary_logloss: 0.140532 [29] valid_0's auc: 0.888965 valid_0's binary_logloss: 0.118547 valid_1's auc: 0.835464 valid_1's binary_logloss: 0.140508 [30] valid_0's auc: 0.889898 valid_0's binary_logloss: 0.118139 valid_1's auc: 0.83577 valid_1's binary_logloss: 0.140461 [31] valid_0's auc: 0.890896 valid_0's binary_logloss: 0.117734 valid_1's auc: 0.835475 valid_1's binary_logloss: 0.140463 [32] valid_0's auc: 0.892374 valid_0's binary_logloss: 0.1173 valid_1's auc: 0.835364 valid_1's binary_logloss: 0.140506 [33] valid_0's auc: 0.893164 valid_0's binary_logloss: 0.116978 valid_1's 
auc: 0.835865 valid_1's binary_logloss: 0.14041 [34] valid_0's auc: 0.893848 valid_0's binary_logloss: 0.11662 valid_1's auc: 0.836021 valid_1's binary_logloss: 0.140353 [35] valid_0's auc: 0.894232 valid_0's binary_logloss: 0.116323 valid_1's auc: 0.8359 valid_1's binary_logloss: 0.140396 [36] valid_0's auc: 0.895003 valid_0's binary_logloss: 0.115986 valid_1's auc: 0.835855 valid_1's binary_logloss: 0.140416 [37] valid_0's auc: 0.895898 valid_0's binary_logloss: 0.115609 valid_1's auc: 0.836185 valid_1's binary_logloss: 0.140369 [38] valid_0's auc: 0.896459 valid_0's binary_logloss: 0.11527 valid_1's auc: 0.835754 valid_1's binary_logloss: 0.140443 [39] valid_0's auc: 0.897377 valid_0's binary_logloss: 0.114873 valid_1's auc: 0.835638 valid_1's binary_logloss: 0.140474 [40] valid_0's auc: 0.89776 valid_0's binary_logloss: 0.114588 valid_1's auc: 0.835639 valid_1's binary_logloss: 0.140491 [41] valid_0's auc: 0.898583 valid_0's binary_logloss: 0.114302 valid_1's auc: 0.835705 valid_1's binary_logloss: 0.140506 [42] valid_0's auc: 0.899197 valid_0's binary_logloss: 0.113975 valid_1's auc: 0.835052 valid_1's binary_logloss: 0.14064 [43] valid_0's auc: 0.899803 valid_0's binary_logloss: 0.113654 valid_1's auc: 0.835035 valid_1's binary_logloss: 0.140691 [44] valid_0's auc: 0.900641 valid_0's binary_logloss: 0.113388 valid_1's auc: 0.835214 valid_1's binary_logloss: 0.140703 [45] valid_0's auc: 0.900962 valid_0's binary_logloss: 0.113098 valid_1's auc: 0.835276 valid_1's binary_logloss: 0.140695 [46] valid_0's auc: 0.901584 valid_0's binary_logloss: 0.112771 valid_1's auc: 0.83495 valid_1's binary_logloss: 0.140754 [47] valid_0's auc: 0.902256 valid_0's binary_logloss: 0.112493 valid_1's auc: 0.835639 valid_1's binary_logloss: 0.14064 [48] valid_0's auc: 0.902688 valid_0's binary_logloss: 0.112198 valid_1's auc: 0.835495 valid_1's binary_logloss: 0.140691 [49] valid_0's auc: 0.902922 valid_0's binary_logloss: 0.111944 valid_1's auc: 0.835281 valid_1's binary_logloss: 
0.140819 [50] valid_0's auc: 0.903747 valid_0's binary_logloss: 0.111595 valid_1's auc: 0.835359 valid_1's binary_logloss: 0.140811 [51] valid_0's auc: 0.904427 valid_0's binary_logloss: 0.111354 valid_1's auc: 0.835245 valid_1's binary_logloss: 0.140873 [52] valid_0's auc: 0.90467 valid_0's binary_logloss: 0.111111 valid_1's auc: 0.835057 valid_1's binary_logloss: 0.140993 [53] valid_0's auc: 0.904868 valid_0's binary_logloss: 0.110853 valid_1's auc: 0.834751 valid_1's binary_logloss: 0.14108 [54] valid_0's auc: 0.905166 valid_0's binary_logloss: 0.110627 valid_1's auc: 0.83411 valid_1's binary_logloss: 0.141282 [55] valid_0's auc: 0.905665 valid_0's binary_logloss: 0.110375 valid_1's auc: 0.833739 valid_1's binary_logloss: 0.141413
[1] valid_0's auc: 0.824873 valid_0's binary_logloss: 0.156222 valid_1's auc: 0.817791 valid_1's binary_logloss: 0.165072 [2] valid_0's auc: 0.828725 valid_0's binary_logloss: 0.151244 valid_1's auc: 0.822586 valid_1's binary_logloss: 0.160253 [3] valid_0's auc: 0.83594 valid_0's binary_logloss: 0.147423 valid_1's auc: 0.828474 valid_1's binary_logloss: 0.156542 [4] valid_0's auc: 0.839489 valid_0's binary_logloss: 0.144426 valid_1's auc: 0.831396 valid_1's binary_logloss: 0.153706 [5] valid_0's auc: 0.843358 valid_0's binary_logloss: 0.142067 valid_1's auc: 0.833466 valid_1's binary_logloss: 0.151399 [6] valid_0's auc: 0.845601 valid_0's binary_logloss: 0.14009 valid_1's auc: 0.833857 valid_1's binary_logloss: 0.149488 [7] valid_0's auc: 0.846477 valid_0's binary_logloss: 0.138491 valid_1's auc: 0.833143 valid_1's binary_logloss: 0.148023 [8] valid_0's auc: 0.847725 valid_0's binary_logloss: 0.137129 valid_1's auc: 0.833971 valid_1's binary_logloss: 0.146757 [9] valid_0's auc: 0.848442 valid_0's binary_logloss: 0.135908 valid_1's auc: 0.835976 valid_1's binary_logloss: 0.145685 [10] valid_0's auc: 0.849759 valid_0's binary_logloss: 0.134781 valid_1's auc: 0.836214 valid_1's binary_logloss: 0.144769 [11] valid_0's auc: 0.852238 valid_0's binary_logloss: 0.133835 valid_1's auc: 0.837243 valid_1's binary_logloss: 0.143925 [12] valid_0's auc: 0.853743 valid_0's binary_logloss: 0.132972 valid_1's auc: 0.836647 valid_1's binary_logloss: 0.143391 [13] valid_0's auc: 0.854568 valid_0's binary_logloss: 0.132256 valid_1's auc: 0.837182 valid_1's binary_logloss: 0.142849 [14] valid_0's auc: 0.855928 valid_0's binary_logloss: 0.131554 valid_1's auc: 0.835941 valid_1's binary_logloss: 0.142474 [15] valid_0's auc: 0.85712 valid_0's binary_logloss: 0.130984 valid_1's auc: 0.834938 valid_1's binary_logloss: 0.142198 [16] valid_0's auc: 0.858721 valid_0's binary_logloss: 0.130371 valid_1's auc: 0.83561 valid_1's binary_logloss: 0.141802 [17] valid_0's auc: 0.859281 valid_0's 
binary_logloss: 0.129877 valid_1's auc: 0.835146 valid_1's binary_logloss: 0.141605 [18] valid_0's auc: 0.859881 valid_0's binary_logloss: 0.129417 valid_1's auc: 0.835386 valid_1's binary_logloss: 0.14132 [19] valid_0's auc: 0.861409 valid_0's binary_logloss: 0.128929 valid_1's auc: 0.834974 valid_1's binary_logloss: 0.141151 [20] valid_0's auc: 0.862574 valid_0's binary_logloss: 0.128458 valid_1's auc: 0.834949 valid_1's binary_logloss: 0.140968 [21] valid_0's auc: 0.863262 valid_0's binary_logloss: 0.128069 valid_1's auc: 0.834616 valid_1's binary_logloss: 0.14086 [22] valid_0's auc: 0.864655 valid_0's binary_logloss: 0.127684 valid_1's auc: 0.834363 valid_1's binary_logloss: 0.140766 [23] valid_0's auc: 0.865247 valid_0's binary_logloss: 0.127349 valid_1's auc: 0.834317 valid_1's binary_logloss: 0.140688 [24] valid_0's auc: 0.865882 valid_0's binary_logloss: 0.12704 valid_1's auc: 0.833543 valid_1's binary_logloss: 0.14068 [25] valid_0's auc: 0.867496 valid_0's binary_logloss: 0.126629 valid_1's auc: 0.834195 valid_1's binary_logloss: 0.140539 [26] valid_0's auc: 0.867923 valid_0's binary_logloss: 0.126353 valid_1's auc: 0.834028 valid_1's binary_logloss: 0.140506 [27] valid_0's auc: 0.868685 valid_0's binary_logloss: 0.126058 valid_1's auc: 0.834718 valid_1's binary_logloss: 0.140359 [28] valid_0's auc: 0.869304 valid_0's binary_logloss: 0.125764 valid_1's auc: 0.834935 valid_1's binary_logloss: 0.140287 [29] valid_0's auc: 0.870037 valid_0's binary_logloss: 0.125514 valid_1's auc: 0.834481 valid_1's binary_logloss: 0.140258 [30] valid_0's auc: 0.870785 valid_0's binary_logloss: 0.125254 valid_1's auc: 0.834179 valid_1's binary_logloss: 0.140275 [31] valid_0's auc: 0.871706 valid_0's binary_logloss: 0.124992 valid_1's auc: 0.834475 valid_1's binary_logloss: 0.140205 [32] valid_0's auc: 0.872582 valid_0's binary_logloss: 0.124728 valid_1's auc: 0.834353 valid_1's binary_logloss: 0.140189 [33] valid_0's auc: 0.873445 valid_0's binary_logloss: 0.124481 valid_1's 
auc: 0.834592 valid_1's binary_logloss: 0.140082 [34] valid_0's auc: 0.874095 valid_0's binary_logloss: 0.12426 valid_1's auc: 0.83436 valid_1's binary_logloss: 0.140101 [35] valid_0's auc: 0.874869 valid_0's binary_logloss: 0.123982 valid_1's auc: 0.834045 valid_1's binary_logloss: 0.140151 [36] valid_0's auc: 0.875446 valid_0's binary_logloss: 0.123753 valid_1's auc: 0.834073 valid_1's binary_logloss: 0.140125 [37] valid_0's auc: 0.875763 valid_0's binary_logloss: 0.123587 valid_1's auc: 0.833611 valid_1's binary_logloss: 0.140201 [38] valid_0's auc: 0.876603 valid_0's binary_logloss: 0.123335 valid_1's auc: 0.833805 valid_1's binary_logloss: 0.140159 [39] valid_0's auc: 0.877126 valid_0's binary_logloss: 0.123134 valid_1's auc: 0.834422 valid_1's binary_logloss: 0.140048 [40] valid_0's auc: 0.877575 valid_0's binary_logloss: 0.123013 valid_1's auc: 0.834343 valid_1's binary_logloss: 0.140069 [41] valid_0's auc: 0.87809 valid_0's binary_logloss: 0.122813 valid_1's auc: 0.834199 valid_1's binary_logloss: 0.140085
[1] valid_0's auc: 0.821831 valid_0's binary_logloss: 0.156466 valid_1's auc: 0.817525 valid_1's binary_logloss: 0.165186 [2] valid_0's auc: 0.831974 valid_0's binary_logloss: 0.151137 valid_1's auc: 0.82532 valid_1's binary_logloss: 0.159691 [3] valid_0's auc: 0.839496 valid_0's binary_logloss: 0.14733 valid_1's auc: 0.831946 valid_1's binary_logloss: 0.156 [4] valid_0's auc: 0.843984 valid_0's binary_logloss: 0.144371 valid_1's auc: 0.834064 valid_1's binary_logloss: 0.153082 [5] valid_0's auc: 0.845854 valid_0's binary_logloss: 0.142024 valid_1's auc: 0.836918 valid_1's binary_logloss: 0.150735 [6] valid_0's auc: 0.848041 valid_0's binary_logloss: 0.140009 valid_1's auc: 0.838831 valid_1's binary_logloss: 0.148771 [7] valid_0's auc: 0.849655 valid_0's binary_logloss: 0.138307 valid_1's auc: 0.839111 valid_1's binary_logloss: 0.147373 [8] valid_0's auc: 0.85185 valid_0's binary_logloss: 0.136891 valid_1's auc: 0.838955 valid_1's binary_logloss: 0.146094 [9] valid_0's auc: 0.853067 valid_0's binary_logloss: 0.135655 valid_1's auc: 0.838081 valid_1's binary_logloss: 0.14516 [10] valid_0's auc: 0.853922 valid_0's binary_logloss: 0.134622 valid_1's auc: 0.837333 valid_1's binary_logloss: 0.144318 [11] valid_0's auc: 0.854729 valid_0's binary_logloss: 0.133702 valid_1's auc: 0.83725 valid_1's binary_logloss: 0.143512 [12] valid_0's auc: 0.856303 valid_0's binary_logloss: 0.132789 valid_1's auc: 0.837602 valid_1's binary_logloss: 0.142833 [13] valid_0's auc: 0.857206 valid_0's binary_logloss: 0.132038 valid_1's auc: 0.837364 valid_1's binary_logloss: 0.142245 [14] valid_0's auc: 0.858161 valid_0's binary_logloss: 0.131391 valid_1's auc: 0.83777 valid_1's binary_logloss: 0.141759 [15] valid_0's auc: 0.858975 valid_0's binary_logloss: 0.130772 valid_1's auc: 0.837831 valid_1's binary_logloss: 0.14139 [16] valid_0's auc: 0.859623 valid_0's binary_logloss: 0.130219 valid_1's auc: 0.837953 valid_1's binary_logloss: 0.141016 [17] valid_0's auc: 0.860576 valid_0's 
binary_logloss: 0.129684 valid_1's auc: 0.837985 valid_1's binary_logloss: 0.140713 [18] valid_0's auc: 0.861311 valid_0's binary_logloss: 0.129202 valid_1's auc: 0.83796 valid_1's binary_logloss: 0.140452 [19] valid_0's auc: 0.862347 valid_0's binary_logloss: 0.128715 valid_1's auc: 0.838506 valid_1's binary_logloss: 0.140189 [20] valid_0's auc: 0.86305 valid_0's binary_logloss: 0.128312 valid_1's auc: 0.837702 valid_1's binary_logloss: 0.140094 [21] valid_0's auc: 0.863758 valid_0's binary_logloss: 0.127907 valid_1's auc: 0.838127 valid_1's binary_logloss: 0.139858 [22] valid_0's auc: 0.864635 valid_0's binary_logloss: 0.127525 valid_1's auc: 0.838331 valid_1's binary_logloss: 0.139696 [23] valid_0's auc: 0.865866 valid_0's binary_logloss: 0.127143 valid_1's auc: 0.837841 valid_1's binary_logloss: 0.139625 [24] valid_0's auc: 0.867054 valid_0's binary_logloss: 0.126749 valid_1's auc: 0.838187 valid_1's binary_logloss: 0.139526 [25] valid_0's auc: 0.867553 valid_0's binary_logloss: 0.126476 valid_1's auc: 0.838308 valid_1's binary_logloss: 0.13949 [26] valid_0's auc: 0.868108 valid_0's binary_logloss: 0.126164 valid_1's auc: 0.838035 valid_1's binary_logloss: 0.139426 [27] valid_0's auc: 0.869014 valid_0's binary_logloss: 0.125868 valid_1's auc: 0.837545 valid_1's binary_logloss: 0.139445 [28] valid_0's auc: 0.869797 valid_0's binary_logloss: 0.12559 valid_1's auc: 0.837894 valid_1's binary_logloss: 0.139419 [29] valid_0's auc: 0.870435 valid_0's binary_logloss: 0.1253 valid_1's auc: 0.838103 valid_1's binary_logloss: 0.139321 [30] valid_0's auc: 0.87141 valid_0's binary_logloss: 0.125025 valid_1's auc: 0.838164 valid_1's binary_logloss: 0.139275 [31] valid_0's auc: 0.872143 valid_0's binary_logloss: 0.124769 valid_1's auc: 0.837843 valid_1's binary_logloss: 0.139285 [32] valid_0's auc: 0.872606 valid_0's binary_logloss: 0.124561 valid_1's auc: 0.837662 valid_1's binary_logloss: 0.139274 [33] valid_0's auc: 0.873337 valid_0's binary_logloss: 0.124346 valid_1's 
auc: 0.837661 valid_1's binary_logloss: 0.139284 [34] valid_0's auc: 0.873965 valid_0's binary_logloss: 0.124108 valid_1's auc: 0.837639 valid_1's binary_logloss: 0.139263 [35] valid_0's auc: 0.87457 valid_0's binary_logloss: 0.123857 valid_1's auc: 0.838159 valid_1's binary_logloss: 0.139137 [36] valid_0's auc: 0.874973 valid_0's binary_logloss: 0.123651 valid_1's auc: 0.838114 valid_1's binary_logloss: 0.139148 [37] valid_0's auc: 0.875657 valid_0's binary_logloss: 0.123447 valid_1's auc: 0.838519 valid_1's binary_logloss: 0.139109
[1] valid_0's auc: 0.821427 valid_0's binary_logloss: 0.156592 valid_1's auc: 0.81711 valid_1's binary_logloss: 0.165273 [2] valid_0's auc: 0.827893 valid_0's binary_logloss: 0.151336 valid_1's auc: 0.820533 valid_1's binary_logloss: 0.160243 [3] valid_0's auc: 0.83753 valid_0's binary_logloss: 0.147487 valid_1's auc: 0.82841 valid_1's binary_logloss: 0.156547 [4] valid_0's auc: 0.84038 valid_0's binary_logloss: 0.144428 valid_1's auc: 0.8313 valid_1's binary_logloss: 0.153575 [5] valid_0's auc: 0.842945 valid_0's binary_logloss: 0.142089 valid_1's auc: 0.833579 valid_1's binary_logloss: 0.151354 [6] valid_0's auc: 0.843246 valid_0's binary_logloss: 0.140186 valid_1's auc: 0.833781 valid_1's binary_logloss: 0.14953 [7] valid_0's auc: 0.844301 valid_0's binary_logloss: 0.138471 valid_1's auc: 0.834317 valid_1's binary_logloss: 0.147954 [8] valid_0's auc: 0.846945 valid_0's binary_logloss: 0.137078 valid_1's auc: 0.834895 valid_1's binary_logloss: 0.146786 [9] valid_0's auc: 0.849381 valid_0's binary_logloss: 0.135906 valid_1's auc: 0.834922 valid_1's binary_logloss: 0.145762 [10] valid_0's auc: 0.850944 valid_0's binary_logloss: 0.134855 valid_1's auc: 0.835441 valid_1's binary_logloss: 0.144958 [11] valid_0's auc: 0.852557 valid_0's binary_logloss: 0.133895 valid_1's auc: 0.835103 valid_1's binary_logloss: 0.144293 [12] valid_0's auc: 0.854609 valid_0's binary_logloss: 0.133013 valid_1's auc: 0.835686 valid_1's binary_logloss: 0.143793 [13] valid_0's auc: 0.855817 valid_0's binary_logloss: 0.132247 valid_1's auc: 0.835296 valid_1's binary_logloss: 0.143302 [14] valid_0's auc: 0.857501 valid_0's binary_logloss: 0.131545 valid_1's auc: 0.836432 valid_1's binary_logloss: 0.142761 [15] valid_0's auc: 0.858907 valid_0's binary_logloss: 0.130878 valid_1's auc: 0.836329 valid_1's binary_logloss: 0.142383 [16] valid_0's auc: 0.859887 valid_0's binary_logloss: 0.130287 valid_1's auc: 0.836611 valid_1's binary_logloss: 0.141883 [17] valid_0's auc: 0.860889 valid_0's 
binary_logloss: 0.129757 valid_1's auc: 0.836848 valid_1's binary_logloss: 0.141535 [18] valid_0's auc: 0.861827 valid_0's binary_logloss: 0.129301 valid_1's auc: 0.837106 valid_1's binary_logloss: 0.141257 [19] valid_0's auc: 0.862972 valid_0's binary_logloss: 0.128826 valid_1's auc: 0.837185 valid_1's binary_logloss: 0.141043 [20] valid_0's auc: 0.864083 valid_0's binary_logloss: 0.128369 valid_1's auc: 0.837509 valid_1's binary_logloss: 0.140794 [21] valid_0's auc: 0.864747 valid_0's binary_logloss: 0.127959 valid_1's auc: 0.837888 valid_1's binary_logloss: 0.140626 [22] valid_0's auc: 0.865769 valid_0's binary_logloss: 0.127562 valid_1's auc: 0.837811 valid_1's binary_logloss: 0.140487 [23] valid_0's auc: 0.866657 valid_0's binary_logloss: 0.127217 valid_1's auc: 0.837884 valid_1's binary_logloss: 0.140328 [24] valid_0's auc: 0.867293 valid_0's binary_logloss: 0.126875 valid_1's auc: 0.838481 valid_1's binary_logloss: 0.140215 [25] valid_0's auc: 0.867983 valid_0's binary_logloss: 0.126562 valid_1's auc: 0.838239 valid_1's binary_logloss: 0.140124 [26] valid_0's auc: 0.868559 valid_0's binary_logloss: 0.126248 valid_1's auc: 0.837903 valid_1's binary_logloss: 0.140092 [27] valid_0's auc: 0.869394 valid_0's binary_logloss: 0.125936 valid_1's auc: 0.837493 valid_1's binary_logloss: 0.14006 [28] valid_0's auc: 0.87048 valid_0's binary_logloss: 0.125677 valid_1's auc: 0.837623 valid_1's binary_logloss: 0.140007 [29] valid_0's auc: 0.87105 valid_0's binary_logloss: 0.125405 valid_1's auc: 0.838216 valid_1's binary_logloss: 0.13986 [30] valid_0's auc: 0.871749 valid_0's binary_logloss: 0.125147 valid_1's auc: 0.838898 valid_1's binary_logloss: 0.139742 [31] valid_0's auc: 0.87247 valid_0's binary_logloss: 0.124907 valid_1's auc: 0.838959 valid_1's binary_logloss: 0.139727 [32] valid_0's auc: 0.87282 valid_0's binary_logloss: 0.124724 valid_1's auc: 0.838675 valid_1's binary_logloss: 0.139761 [33] valid_0's auc: 0.874106 valid_0's binary_logloss: 0.124412 valid_1's 
auc: 0.838893 valid_1's binary_logloss: 0.139687 [34] valid_0's auc: 0.874887 valid_0's binary_logloss: 0.124169 valid_1's auc: 0.838801 valid_1's binary_logloss: 0.139672 [35] valid_0's auc: 0.875447 valid_0's binary_logloss: 0.123934 valid_1's auc: 0.838835 valid_1's binary_logloss: 0.139667 [36] valid_0's auc: 0.87617 valid_0's binary_logloss: 0.123693 valid_1's auc: 0.838505 valid_1's binary_logloss: 0.139699 [37] valid_0's auc: 0.876793 valid_0's binary_logloss: 0.12346 valid_1's auc: 0.838104 valid_1's binary_logloss: 0.139783 [38] valid_0's auc: 0.877265 valid_0's binary_logloss: 0.123251 valid_1's auc: 0.838267 valid_1's binary_logloss: 0.139787 [39] valid_0's auc: 0.877869 valid_0's binary_logloss: 0.123018 valid_1's auc: 0.838004 valid_1's binary_logloss: 0.139806 [40] valid_0's auc: 0.878509 valid_0's binary_logloss: 0.122803 valid_1's auc: 0.838086 valid_1's binary_logloss: 0.139745 [41] valid_0's auc: 0.879077 valid_0's binary_logloss: 0.122585 valid_1's auc: 0.838538 valid_1's binary_logloss: 0.139694 [42] valid_0's auc: 0.879515 valid_0's binary_logloss: 0.122368 valid_1's auc: 0.838647 valid_1's binary_logloss: 0.139655 [43] valid_0's auc: 0.879985 valid_0's binary_logloss: 0.122166 valid_1's auc: 0.838495 valid_1's binary_logloss: 0.139653 [44] valid_0's auc: 0.88041 valid_0's binary_logloss: 0.121985 valid_1's auc: 0.838221 valid_1's binary_logloss: 0.139755 [45] valid_0's auc: 0.880907 valid_0's binary_logloss: 0.121777 valid_1's auc: 0.837981 valid_1's binary_logloss: 0.139769 [46] valid_0's auc: 0.881216 valid_0's binary_logloss: 0.121594 valid_1's auc: 0.838471 valid_1's binary_logloss: 0.139693 [47] valid_0's auc: 0.881591 valid_0's binary_logloss: 0.121422 valid_1's auc: 0.83861 valid_1's binary_logloss: 0.139687 [48] valid_0's auc: 0.881867 valid_0's binary_logloss: 0.121266 valid_1's auc: 0.838593 valid_1's binary_logloss: 0.139682 [49] valid_0's auc: 0.882285 valid_0's binary_logloss: 0.121041 valid_1's auc: 0.838317 valid_1's 
binary_logloss: 0.139741 [50] valid_0's auc: 0.882828 valid_0's binary_logloss: 0.120853 valid_1's auc: 0.838244 valid_1's binary_logloss: 0.139759 [51] valid_0's auc: 0.883154 valid_0's binary_logloss: 0.120688 valid_1's auc: 0.838222 valid_1's binary_logloss: 0.139803 [52] valid_0's auc: 0.883348 valid_0's binary_logloss: 0.120567 valid_1's auc: 0.838064 valid_1's binary_logloss: 0.139824 [53] valid_0's auc: 0.883583 valid_0's binary_logloss: 0.120424 valid_1's auc: 0.83788 valid_1's binary_logloss: 0.139844 [54] valid_0's auc: 0.884106 valid_0's binary_logloss: 0.120208 valid_1's auc: 0.837625 valid_1's binary_logloss: 0.139886 [55] valid_0's auc: 0.884777 valid_0's binary_logloss: 0.120039 valid_1's auc: 0.837585 valid_1's binary_logloss: 0.139902 [56] valid_0's auc: 0.88511 valid_0's binary_logloss: 0.11989 valid_1's auc: 0.837646 valid_1's binary_logloss: 0.139926 [57] valid_0's auc: 0.885365 valid_0's binary_logloss: 0.11975 valid_1's auc: 0.837639 valid_1's binary_logloss: 0.139934 [58] valid_0's auc: 0.885606 valid_0's binary_logloss: 0.119595 valid_1's auc: 0.837726 valid_1's binary_logloss: 0.139938 [59] valid_0's auc: 0.885965 valid_0's binary_logloss: 0.119403 valid_1's auc: 0.837558 valid_1's binary_logloss: 0.140007 [60] valid_0's auc: 0.886208 valid_0's binary_logloss: 0.119263 valid_1's auc: 0.83744 valid_1's binary_logloss: 0.140079 [61] valid_0's auc: 0.886458 valid_0's binary_logloss: 0.119118 valid_1's auc: 0.837349 valid_1's binary_logloss: 0.140059
(training log truncated) Each cross-validation fold repeats the deprecation warning above and then prints, per boosting iteration, the AUC and binary_logloss for valid_0 (the training fold) and valid_1 (the hold-out fold). Training AUC climbs steadily (≈0.82 → 0.90), while validation AUC plateaus at roughly 0.835–0.839 in every fold, so early stopping halts each fold after about 37–61 iterations.
binary_logloss: 0.126266 valid_1's auc: 0.83279 valid_1's binary_logloss: 0.141891 [18] valid_0's auc: 0.874047 valid_0's binary_logloss: 0.125646 valid_1's auc: 0.831917 valid_1's binary_logloss: 0.141748 [19] valid_0's auc: 0.875336 valid_0's binary_logloss: 0.125072 valid_1's auc: 0.831274 valid_1's binary_logloss: 0.141658 [20] valid_0's auc: 0.876959 valid_0's binary_logloss: 0.124484 valid_1's auc: 0.831275 valid_1's binary_logloss: 0.141511 [21] valid_0's auc: 0.878049 valid_0's binary_logloss: 0.123928 valid_1's auc: 0.830813 valid_1's binary_logloss: 0.141459 [22] valid_0's auc: 0.878905 valid_0's binary_logloss: 0.123447 valid_1's auc: 0.83012 valid_1's binary_logloss: 0.141449 [23] valid_0's auc: 0.879827 valid_0's binary_logloss: 0.12295 valid_1's auc: 0.829554 valid_1's binary_logloss: 0.141492 [24] valid_0's auc: 0.880692 valid_0's binary_logloss: 0.122479 valid_1's auc: 0.829256 valid_1's binary_logloss: 0.141487 [25] valid_0's auc: 0.881715 valid_0's binary_logloss: 0.121994 valid_1's auc: 0.829326 valid_1's binary_logloss: 0.141362 [26] valid_0's auc: 0.883014 valid_0's binary_logloss: 0.121527 valid_1's auc: 0.829553 valid_1's binary_logloss: 0.14132 [27] valid_0's auc: 0.884245 valid_0's binary_logloss: 0.121024 valid_1's auc: 0.829624 valid_1's binary_logloss: 0.14127 [28] valid_0's auc: 0.885238 valid_0's binary_logloss: 0.12058 valid_1's auc: 0.829417 valid_1's binary_logloss: 0.141237 [29] valid_0's auc: 0.88602 valid_0's binary_logloss: 0.120198 valid_1's auc: 0.82917 valid_1's binary_logloss: 0.141201 [30] valid_0's auc: 0.88684 valid_0's binary_logloss: 0.119831 valid_1's auc: 0.82962 valid_1's binary_logloss: 0.141121 [31] valid_0's auc: 0.887965 valid_0's binary_logloss: 0.119437 valid_1's auc: 0.83035 valid_1's binary_logloss: 0.14101 [32] valid_0's auc: 0.88868 valid_0's binary_logloss: 0.119086 valid_1's auc: 0.82975 valid_1's binary_logloss: 0.141093 [33] valid_0's auc: 0.889895 valid_0's binary_logloss: 0.118649 valid_1's auc: 
0.829977 valid_1's binary_logloss: 0.141037 [34] valid_0's auc: 0.890626 valid_0's binary_logloss: 0.118328 valid_1's auc: 0.829368 valid_1's binary_logloss: 0.141161 [35] valid_0's auc: 0.89116 valid_0's binary_logloss: 0.11806 valid_1's auc: 0.829262 valid_1's binary_logloss: 0.141183 [36] valid_0's auc: 0.891999 valid_0's binary_logloss: 0.11775 valid_1's auc: 0.828947 valid_1's binary_logloss: 0.14129 [37] valid_0's auc: 0.892306 valid_0's binary_logloss: 0.117477 valid_1's auc: 0.828544 valid_1's binary_logloss: 0.141389 [38] valid_0's auc: 0.892937 valid_0's binary_logloss: 0.117192 valid_1's auc: 0.827983 valid_1's binary_logloss: 0.141516 [39] valid_0's auc: 0.893563 valid_0's binary_logloss: 0.116869 valid_1's auc: 0.828068 valid_1's binary_logloss: 0.141517 [40] valid_0's auc: 0.893942 valid_0's binary_logloss: 0.11662 valid_1's auc: 0.827852 valid_1's binary_logloss: 0.141621
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.830474 valid_0's binary_logloss: 0.155928 valid_1's auc: 0.817343 valid_1's binary_logloss: 0.164928 [2] valid_0's auc: 0.842931 valid_0's binary_logloss: 0.1503 valid_1's auc: 0.82699 valid_1's binary_logloss: 0.15948 [3] valid_0's auc: 0.850877 valid_0's binary_logloss: 0.14631 valid_1's auc: 0.832212 valid_1's binary_logloss: 0.155775 [4] valid_0's auc: 0.854431 valid_0's binary_logloss: 0.143104 valid_1's auc: 0.83392 valid_1's binary_logloss: 0.152698 [5] valid_0's auc: 0.85663 valid_0's binary_logloss: 0.140582 valid_1's auc: 0.835094 valid_1's binary_logloss: 0.150349 [6] valid_0's auc: 0.859142 valid_0's binary_logloss: 0.138289 valid_1's auc: 0.836166 valid_1's binary_logloss: 0.148424 [7] valid_0's auc: 0.861364 valid_0's binary_logloss: 0.136413 valid_1's auc: 0.837184 valid_1's binary_logloss: 0.146912 [8] valid_0's auc: 0.862199 valid_0's binary_logloss: 0.134841 valid_1's auc: 0.837545 valid_1's binary_logloss: 0.145726 [9] valid_0's auc: 0.864095 valid_0's binary_logloss: 0.133364 valid_1's auc: 0.837242 valid_1's binary_logloss: 0.144736 [10] valid_0's auc: 0.866024 valid_0's binary_logloss: 0.132096 valid_1's auc: 0.837719 valid_1's binary_logloss: 0.143766 [11] valid_0's auc: 0.867454 valid_0's binary_logloss: 0.131002 valid_1's auc: 0.837865 valid_1's binary_logloss: 0.143009 [12] valid_0's auc: 0.868329 valid_0's binary_logloss: 0.130024 valid_1's auc: 0.837259 valid_1's binary_logloss: 0.14244 [13] valid_0's auc: 0.869137 valid_0's binary_logloss: 0.129145 valid_1's auc: 0.837689 valid_1's binary_logloss: 0.141896 [14] valid_0's auc: 0.870957 valid_0's binary_logloss: 0.128226 valid_1's auc: 0.838226 valid_1's binary_logloss: 0.141392 [15] valid_0's auc: 0.872273 valid_0's binary_logloss: 0.12745 valid_1's auc: 0.837906 valid_1's binary_logloss: 0.141019 [16] valid_0's auc: 0.873243 valid_0's binary_logloss: 0.12672 valid_1's auc: 0.837761 valid_1's binary_logloss: 0.140677 [17] valid_0's auc: 0.874251 valid_0's 
binary_logloss: 0.126044 valid_1's auc: 0.83701 valid_1's binary_logloss: 0.140582 [18] valid_0's auc: 0.875622 valid_0's binary_logloss: 0.125387 valid_1's auc: 0.836179 valid_1's binary_logloss: 0.140485 [19] valid_0's auc: 0.877031 valid_0's binary_logloss: 0.124759 valid_1's auc: 0.836188 valid_1's binary_logloss: 0.14029 [20] valid_0's auc: 0.878046 valid_0's binary_logloss: 0.124156 valid_1's auc: 0.836531 valid_1's binary_logloss: 0.140133 [21] valid_0's auc: 0.879478 valid_0's binary_logloss: 0.123507 valid_1's auc: 0.837068 valid_1's binary_logloss: 0.13995 [22] valid_0's auc: 0.880423 valid_0's binary_logloss: 0.123029 valid_1's auc: 0.836817 valid_1's binary_logloss: 0.139912 [23] valid_0's auc: 0.881684 valid_0's binary_logloss: 0.122492 valid_1's auc: 0.836983 valid_1's binary_logloss: 0.139762 [24] valid_0's auc: 0.882873 valid_0's binary_logloss: 0.121986 valid_1's auc: 0.837319 valid_1's binary_logloss: 0.139659 [25] valid_0's auc: 0.883597 valid_0's binary_logloss: 0.121566 valid_1's auc: 0.837154 valid_1's binary_logloss: 0.139623 [26] valid_0's auc: 0.884814 valid_0's binary_logloss: 0.121104 valid_1's auc: 0.836302 valid_1's binary_logloss: 0.139668 [27] valid_0's auc: 0.886026 valid_0's binary_logloss: 0.120635 valid_1's auc: 0.836521 valid_1's binary_logloss: 0.139601 [28] valid_0's auc: 0.887071 valid_0's binary_logloss: 0.120222 valid_1's auc: 0.836646 valid_1's binary_logloss: 0.139557 [29] valid_0's auc: 0.887946 valid_0's binary_logloss: 0.119804 valid_1's auc: 0.836735 valid_1's binary_logloss: 0.139518 [30] valid_0's auc: 0.88898 valid_0's binary_logloss: 0.119416 valid_1's auc: 0.836858 valid_1's binary_logloss: 0.139499 [31] valid_0's auc: 0.889792 valid_0's binary_logloss: 0.119058 valid_1's auc: 0.836917 valid_1's binary_logloss: 0.139463 [32] valid_0's auc: 0.890876 valid_0's binary_logloss: 0.118631 valid_1's auc: 0.836346 valid_1's binary_logloss: 0.139532 [33] valid_0's auc: 0.891629 valid_0's binary_logloss: 0.118259 valid_1's 
auc: 0.836206 valid_1's binary_logloss: 0.139603 [34] valid_0's auc: 0.892446 valid_0's binary_logloss: 0.117893 valid_1's auc: 0.836005 valid_1's binary_logloss: 0.139603 [35] valid_0's auc: 0.893407 valid_0's binary_logloss: 0.11752 valid_1's auc: 0.8361 valid_1's binary_logloss: 0.139574 [36] valid_0's auc: 0.893836 valid_0's binary_logloss: 0.117247 valid_1's auc: 0.836147 valid_1's binary_logloss: 0.139608 [37] valid_0's auc: 0.894774 valid_0's binary_logloss: 0.116913 valid_1's auc: 0.836601 valid_1's binary_logloss: 0.139569 [38] valid_0's auc: 0.895494 valid_0's binary_logloss: 0.116611 valid_1's auc: 0.836232 valid_1's binary_logloss: 0.139645 [39] valid_0's auc: 0.896102 valid_0's binary_logloss: 0.116275 valid_1's auc: 0.836415 valid_1's binary_logloss: 0.139653 [40] valid_0's auc: 0.896715 valid_0's binary_logloss: 0.115934 valid_1's auc: 0.836463 valid_1's binary_logloss: 0.139671 [41] valid_0's auc: 0.897232 valid_0's binary_logloss: 0.115612 valid_1's auc: 0.836223 valid_1's binary_logloss: 0.139762 [42] valid_0's auc: 0.897875 valid_0's binary_logloss: 0.11528 valid_1's auc: 0.836151 valid_1's binary_logloss: 0.139777 [43] valid_0's auc: 0.898493 valid_0's binary_logloss: 0.114999 valid_1's auc: 0.836216 valid_1's binary_logloss: 0.139761 [44] valid_0's auc: 0.899179 valid_0's binary_logloss: 0.114703 valid_1's auc: 0.836328 valid_1's binary_logloss: 0.139755
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] valid_0's auc: 0.834724 valid_0's binary_logloss: 0.15607 valid_1's auc: 0.822983 valid_1's binary_logloss: 0.165104 [2] valid_0's auc: 0.842835 valid_0's binary_logloss: 0.150494 valid_1's auc: 0.830472 valid_1's binary_logloss: 0.159671 [3] valid_0's auc: 0.847187 valid_0's binary_logloss: 0.146306 valid_1's auc: 0.830873 valid_1's binary_logloss: 0.155985 [4] valid_0's auc: 0.850394 valid_0's binary_logloss: 0.143088 valid_1's auc: 0.830975 valid_1's binary_logloss: 0.15321 [5] valid_0's auc: 0.853379 valid_0's binary_logloss: 0.140508 valid_1's auc: 0.832135 valid_1's binary_logloss: 0.150854 [6] valid_0's auc: 0.855463 valid_0's binary_logloss: 0.138297 valid_1's auc: 0.833116 valid_1's binary_logloss: 0.149013 [7] valid_0's auc: 0.856723 valid_0's binary_logloss: 0.136504 valid_1's auc: 0.833811 valid_1's binary_logloss: 0.147577 [8] valid_0's auc: 0.858076 valid_0's binary_logloss: 0.13495 valid_1's auc: 0.835315 valid_1's binary_logloss: 0.146273 [9] valid_0's auc: 0.861024 valid_0's binary_logloss: 0.133583 valid_1's auc: 0.835042 valid_1's binary_logloss: 0.145374 [10] valid_0's auc: 0.862281 valid_0's binary_logloss: 0.132357 valid_1's auc: 0.834154 valid_1's binary_logloss: 0.144649 [11] valid_0's auc: 0.864612 valid_0's binary_logloss: 0.131283 valid_1's auc: 0.834587 valid_1's binary_logloss: 0.143941 [12] valid_0's auc: 0.866377 valid_0's binary_logloss: 0.130299 valid_1's auc: 0.834242 valid_1's binary_logloss: 0.143366 [13] valid_0's auc: 0.868343 valid_0's binary_logloss: 0.129417 valid_1's auc: 0.833273 valid_1's binary_logloss: 0.142976 [14] valid_0's auc: 0.86957 valid_0's binary_logloss: 0.128593 valid_1's auc: 0.833783 valid_1's binary_logloss: 0.142567 [15] valid_0's auc: 0.871109 valid_0's binary_logloss: 0.127759 valid_1's auc: 0.834057 valid_1's binary_logloss: 0.142234 [16] valid_0's auc: 0.872893 valid_0's binary_logloss: 0.126996 valid_1's auc: 0.835329 valid_1's binary_logloss: 0.141809 [17] valid_0's auc: 0.874236 valid_0's 
binary_logloss: 0.12631 valid_1's auc: 0.834985 valid_1's binary_logloss: 0.141613 [18] valid_0's auc: 0.875324 valid_0's binary_logloss: 0.125725 valid_1's auc: 0.834942 valid_1's binary_logloss: 0.141363 [19] valid_0's auc: 0.876659 valid_0's binary_logloss: 0.125068 valid_1's auc: 0.835024 valid_1's binary_logloss: 0.141162 [20] valid_0's auc: 0.877885 valid_0's binary_logloss: 0.124484 valid_1's auc: 0.835893 valid_1's binary_logloss: 0.140933 [21] valid_0's auc: 0.879121 valid_0's binary_logloss: 0.12391 valid_1's auc: 0.837029 valid_1's binary_logloss: 0.140651 [22] valid_0's auc: 0.880116 valid_0's binary_logloss: 0.123339 valid_1's auc: 0.837366 valid_1's binary_logloss: 0.140547 [23] valid_0's auc: 0.881224 valid_0's binary_logloss: 0.12282 valid_1's auc: 0.837357 valid_1's binary_logloss: 0.140445 [24] valid_0's auc: 0.882014 valid_0's binary_logloss: 0.122386 valid_1's auc: 0.837343 valid_1's binary_logloss: 0.140371 [25] valid_0's auc: 0.88318 valid_0's binary_logloss: 0.121861 valid_1's auc: 0.83723 valid_1's binary_logloss: 0.140313 [26] valid_0's auc: 0.884008 valid_0's binary_logloss: 0.121441 valid_1's auc: 0.837761 valid_1's binary_logloss: 0.140173 [27] valid_0's auc: 0.884676 valid_0's binary_logloss: 0.121001 valid_1's auc: 0.838046 valid_1's binary_logloss: 0.140086 [28] valid_0's auc: 0.885524 valid_0's binary_logloss: 0.120598 valid_1's auc: 0.838029 valid_1's binary_logloss: 0.140051 [29] valid_0's auc: 0.886461 valid_0's binary_logloss: 0.120157 valid_1's auc: 0.837775 valid_1's binary_logloss: 0.140057 [30] valid_0's auc: 0.887053 valid_0's binary_logloss: 0.119807 valid_1's auc: 0.837472 valid_1's binary_logloss: 0.140111 [31] valid_0's auc: 0.888177 valid_0's binary_logloss: 0.119425 valid_1's auc: 0.837575 valid_1's binary_logloss: 0.140093 [32] valid_0's auc: 0.889072 valid_0's binary_logloss: 0.119055 valid_1's auc: 0.837158 valid_1's binary_logloss: 0.140195 [33] valid_0's auc: 0.889782 valid_0's binary_logloss: 0.118676 valid_1's 
auc: 0.837296 valid_1's binary_logloss: 0.140221 [34] valid_0's auc: 0.890876 valid_0's binary_logloss: 0.118304 valid_1's auc: 0.837481 valid_1's binary_logloss: 0.140165 [35] valid_0's auc: 0.891448 valid_0's binary_logloss: 0.11798 valid_1's auc: 0.837953 valid_1's binary_logloss: 0.140085 [36] valid_0's auc: 0.892165 valid_0's binary_logloss: 0.11764 valid_1's auc: 0.837794 valid_1's binary_logloss: 0.140112 [37] valid_0's auc: 0.892798 valid_0's binary_logloss: 0.117321 valid_1's auc: 0.837291 valid_1's binary_logloss: 0.140221 [38] valid_0's auc: 0.893318 valid_0's binary_logloss: 0.117028 valid_1's auc: 0.837278 valid_1's binary_logloss: 0.140221 [39] valid_0's auc: 0.894018 valid_0's binary_logloss: 0.116742 valid_1's auc: 0.83724 valid_1's binary_logloss: 0.140232 [40] valid_0's auc: 0.894781 valid_0's binary_logloss: 0.116373 valid_1's auc: 0.836901 valid_1's binary_logloss: 0.140328 [41] valid_0's auc: 0.895222 valid_0's binary_logloss: 0.116075 valid_1's auc: 0.836655 valid_1's binary_logloss: 0.140422 [42] valid_0's auc: 0.895842 valid_0's binary_logloss: 0.115755 valid_1's auc: 0.836383 valid_1's binary_logloss: 0.140503 [43] valid_0's auc: 0.896389 valid_0's binary_logloss: 0.115503 valid_1's auc: 0.836348 valid_1's binary_logloss: 0.140505 [44] valid_0's auc: 0.896843 valid_0's binary_logloss: 0.115204 valid_1's auc: 0.836521 valid_1's binary_logloss: 0.140518 [45] valid_0's auc: 0.897272 valid_0's binary_logloss: 0.114886 valid_1's auc: 0.836311 valid_1's binary_logloss: 0.140581 [46] valid_0's auc: 0.898034 valid_0's binary_logloss: 0.114544 valid_1's auc: 0.835871 valid_1's binary_logloss: 0.140663 [47] valid_0's auc: 0.898562 valid_0's binary_logloss: 0.114262 valid_1's auc: 0.835926 valid_1's binary_logloss: 0.140642 [48] valid_0's auc: 0.898919 valid_0's binary_logloss: 0.114006 valid_1's auc: 0.835849 valid_1's binary_logloss: 0.140687 [49] valid_0's auc: 0.899111 valid_0's binary_logloss: 0.113791 valid_1's auc: 0.835874 valid_1's 
binary_logloss: 0.140728 [50] valid_0's auc: 0.89987 valid_0's binary_logloss: 0.113543 valid_1's auc: 0.835915 valid_1's binary_logloss: 0.14075 [51] valid_0's auc: 0.90004 valid_0's binary_logloss: 0.113342 valid_1's auc: 0.835947 valid_1's binary_logloss: 0.140748 [52] valid_0's auc: 0.900405 valid_0's binary_logloss: 0.113087 valid_1's auc: 0.836011 valid_1's binary_logloss: 0.140767 [53] valid_0's auc: 0.900828 valid_0's binary_logloss: 0.112831 valid_1's auc: 0.836259 valid_1's binary_logloss: 0.140771 [54] valid_0's auc: 0.901597 valid_0's binary_logloss: 0.112604 valid_1's auc: 0.836296 valid_1's binary_logloss: 0.14078 [55] valid_0's auc: 0.901645 valid_0's binary_logloss: 0.112429 valid_1's auc: 0.836095 valid_1's binary_logloss: 0.140822 [56] valid_0's auc: 0.902162 valid_0's binary_logloss: 0.112169 valid_1's auc: 0.835965 valid_1's binary_logloss: 0.14086 [57] valid_0's auc: 0.902422 valid_0's binary_logloss: 0.111944 valid_1's auc: 0.835493 valid_1's binary_logloss: 0.140993
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead. _log_warning("'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. "
[1] training's auc: 0.824305 training's binary_logloss: 0.156217 valid_1's auc: 0.819488 valid_1's binary_logloss: 0.165016 ... [41] training's auc: 0.888425 training's binary_logloss: 0.120677 valid_1's auc: 0.83826 valid_1's binary_logloss: 0.139218
GridSearchCV 최적 파라미터: {'max_depth': 128, 'min_child_samples': 100, 'num_leaves': 32, 'subsample': 0.8}
ROC AUC: 0.8417
### Re-apply the optimized parameters to LightGBM
lgbm_clf = LGBMClassifier(n_estimators = 1000, num_leaves = 32, subsample = 0.8,
                          min_child_samples = 100, max_depth = 128)

evals = [(X_test, y_test)]
lgbm_clf.fit(X_train, y_train, early_stopping_rounds = 100,
             eval_metric = "auc", eval_set = evals, verbose = True)

lgbm_roc_score = roc_auc_score(y_test, lgbm_clf.predict_proba(X_test)[:, 1])
print('ROC AUC: {0:.4f}'.format(lgbm_roc_score))
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:726: UserWarning: 'early_stopping_rounds' argument is deprecated and will be removed in a future release of LightGBM. Pass 'early_stopping()' callback via 'callbacks' argument instead.
/usr/local/lib/python3.10/dist-packages/lightgbm/sklearn.py:736: UserWarning: 'verbose' argument is deprecated and will be removed in a future release of LightGBM. Pass 'log_evaluation()' callback via 'callbacks' argument instead.
[1] valid_0's auc: 0.819488 valid_0's binary_logloss: 0.165016 ... [11] valid_0's auc: 0.841659 valid_0's binary_logloss: 0.14327 ... [111] valid_0's auc: 0.83436 valid_0's binary_logloss: 0.14054 (per-iteration logs truncated; early stopping at iteration 111, best iteration 11)
ROC AUC: 0.8417
- ROC-AUC on the test data is measured at about 0.8417
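ROC-AUC has a useful interpretation here: it is the probability that a randomly chosen unsatisfied customer (TARGET = 1) is scored higher than a randomly chosen satisfied one, which is why it suits this heavily imbalanced problem. A minimal pure-Python sketch of that definition (the `roc_auc` helper is for illustration only):

```python
# Minimal sketch: ROC-AUC as the probability that a randomly chosen positive
# example is scored above a randomly chosen negative one (ties count as 0.5).
def roc_auc(y_true, y_score):
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    # Count pairwise "wins" of positive scores over negative scores
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # -> 0.75
```

Because the metric depends only on the ranking of predicted probabilities, it is insensitive to the 96:4 class imbalance, unlike accuracy.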