Hi, I compiled the code and there is an error in learn.py that I can't figure out how to fix.
Here is the traceback:
OS: OS X 10.11.6
XGBoost: 0.6
Training classifier [0]: PropertyInspectionXGBRegressor(base_score=0.5, colsample_bytree=0.7,
div69=False, early_stopping_rounds=360, eval_set_size=4000,
gamma=0, learning_rate=0.005, max_delta_step=0,
max_depth=9, min_child_weight=6, missing=None,
n_estimators=10000, nthread=10, objective='reg:linear',
seed=0, silent=1, subsample=0.7, take_log=False,
take_pow2=False, take_root3=False, take_sqrt=False)
Traceback (most recent call last):
File "learn.py", line 168, in
clf.fit_cv(X_train, Y_train, [(X_cv, Y_cv)])
File "/Users/able/Contest/Liberty/kaggle-lmgpip-master/xgbex.py", line 87, in fit_cv
return self.fit(X, y, eval_set)
File "/Users/able/Contest/Liberty/kaggle-lmgpip-master/xgbex.py", line 82, in fit
early_stopping_rounds=self.early_stopping_rounds, verbose=0)
File "//anaconda/lib/python2.7/site-packages/xgboost/sklearn.py", line 251, in fit
verbose_eval=verbose)
File "//anaconda/lib/python2.7/site-packages/xgboost/training.py", line 205, in train
xgb_model=xgb_model, callbacks=callbacks)
File "//anaconda/lib/python2.7/site-packages/xgboost/training.py", line 92, in _train_internal
evaluation_result_list = [(k, float(v)) for k, v in res[1:]]
ValueError: need more than 1 value to unpack
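For context, the comprehension quoted in the last traceback frame expects the evaluation message from XGBoost to be a tab-separated string whose tokens after the iteration tag each split on ':' into a (metric-name, value) pair. Below is a minimal, hypothetical sketch of how that unpack can fail; the message strings are made up for illustration and are not taken from this run or from the project's code:

# Hypothetical illustration of the comprehension shown in the traceback.
# Assumed message format: "[<iteration>]\t<name>:<value>\t<name>:<value>...".
msg = "[0]\tvalidation_0-rmse:0.123"            # well-formed message
res = [x.split(':') for x in msg.split()]
print([(k, float(v)) for k, v in res[1:]])      # [('validation_0-rmse', 0.123)]

# If a token after the iteration tag carries no ':'-separated value -- for
# example because a custom eval metric name contains whitespace -- the
# pairwise unpack fails with the same error as in the traceback:
bad_msg = "[0]\tnormalized gini:0.35"           # hypothetical malformed message
bad_res = [x.split(':') for x in bad_msg.split()]
print([(k, float(v)) for k, v in bad_res[1:]])
# ValueError: need more than 1 value to unpack   (Python 2 wording)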
rraabb changed the title from "IndexError: list index out of range" to "ValueError: need more than 1 value to unpack" on Aug 21, 2016.