Reference

xgbtune.tune.tune_xgb_model(params, x_train, y_train, x_val=None, y_val=None, nfold=3, stratified=False, folds=None, shuffle=True, tune_params={}, max_round_count=5000, loss_compare=operator.lt, pass_count=2, verbose=True)

Tunes an XGBoost model

Examples

>>> params, round_count = tune_xgb_model(model_params, x, y, x_val, y_val)
Parameters:
  • params – Dictionary with the base XGBoost parameters to use
  • x_train – Training set
  • y_train – Training labels
  • x_val – Validation set
  • y_val – Validation labels
  • nfold – Number of folds for cross validation
  • stratified – Perform stratified sampling
  • folds – Scikit-learn KFold or StratifiedKFold object
  • shuffle – Shuffle data during cross validation
  • tune_params – Dictionary mapping each parameter to the list of values to test
  • max_round_count – Maximum number of rounds during training
  • loss_compare – Comparison function used to compare evaluation losses; defaults to operator.lt, so a lower loss is treated as better
  • pass_count – Number of tuning passes to perform
  • verbose – Whether to print tuning progress
Returns:

A tuple of tuned parameters and round count.
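Below is a minimal usage sketch showing how the pieces above fit together. The data arrays, the base parameter values, and the final xgb.train step are illustrative assumptions, not part of xgbtune itself:

>>> import numpy as np
>>> import xgboost as xgb
>>> from xgbtune.tune import tune_xgb_model
>>>
>>> # Placeholder data; replace with your own feature matrices and labels.
>>> x_train = np.random.rand(500, 10)
>>> y_train = np.random.randint(0, 2, size=500)
>>> x_val = np.random.rand(100, 10)
>>> y_val = np.random.randint(0, 2, size=100)
>>>
>>> # Base XGBoost parameters to start tuning from (example values).
>>> base_params = {'objective': 'binary:logistic', 'eval_metric': 'logloss'}
>>>
>>> # Tune the parameters and the number of training rounds.
>>> params, round_count = tune_xgb_model(
...     base_params, x_train, y_train,
...     x_val=x_val, y_val=y_val, pass_count=2)
>>>
>>> # Train the final model with the tuned parameters and round count.
>>> dtrain = xgb.DMatrix(x_train, label=y_train)
>>> model = xgb.train(params, dtrain, num_boost_round=round_count)

Based on the parameter list above, leaving x_val and y_val unset should make the nfold, stratified, folds and shuffle options control cross validation instead of an explicit validation set.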