Data Science: Asked by Donald S on September 25, 2021
I am trying to use GridSearchCV with a CatBoostRegressor, but I get "unexpected keyword" errors for 3 different params (classes_count, auto_class_weights, and bayesian_matrix_reg) when I include them in the grid. I got the list of params from the fitted model itself, via get_all_params(), so these params should exist.
I double-checked the website and can find the first 2 in the documentation, but I can't find the bayesian_matrix_reg parameter anywhere. Not sure where it came from, but the model is using it.
For now I am commenting out these 3 params, but I would like to avoid having to manually remove parameters that the model clearly reports yet still trigger unexpected-keyword errors when tuning.
I am using catboost 0.23.2.
My code:
CBoost = CatBoostRegressor(logging_level='Silent')
CBoost.fit(train, y_train)
CBoost.get_all_params()
{'nan_mode': 'Min',
'eval_metric': 'RMSE',
'iterations': 1000,
'sampling_frequency': 'PerTree',
'leaf_estimation_method': 'Newton',
'grow_policy': 'SymmetricTree',
'penalties_coefficient': 1,
'boosting_type': 'Plain',
'model_shrink_mode': 'Constant',
'feature_border_type': 'GreedyLogSum',
'bayesian_matrix_reg': 0.10000000149011612,
'l2_leaf_reg': 3,
'random_strength': 1,
'rsm': 1,
'boost_from_average': True,
'model_size_reg': 0.5,
'subsample': 0.800000011920929,
'use_best_model': False,
'random_seed': 0,
'depth': 6,
'border_count': 254,
'classes_count': 0,
'auto_class_weights': 'None',
'sparse_features_conflict_fraction': 0,
'leaf_estimation_backtracking': 'AnyImprovement',
'best_model_min_trees': 1,
'model_shrink_rate': 0,
'min_data_in_leaf': 1,
'loss_function': 'RMSE',
'learning_rate': 0.04174000024795532,
'score_function': 'Cosine',
'task_type': 'CPU',
'leaf_estimation_iterations': 1,
'bootstrap_type': 'MVS',
'max_leaves': 64}
...
cb = CatBoostRegressor()
...
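The full grid is omitted here, but the relevant part looked roughly like this (the values are placeholders; the three keys flagged above are the ones that fail):
param_grid = {
    'depth': [4, 6, 8],              # example values only
    'learning_rate': [0.03, 0.1],    # example values only
    'classes_count': [0],            # gives the unexpected keyword error
    'auto_class_weights': ['None'],  # gives the unexpected keyword error
    'bayesian_matrix_reg': [0.1],    # gives the unexpected keyword error
}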
gsc = GridSearchCV(
estimator=cb,
param_grid=param_grid,
scoring='neg_mean_squared_error',
cv=inner_cv,
verbose=0,
return_train_score=True,
refit='nmse'
)
...
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-76-a22257b2d93a> in <module>
113 )
114
--> 115 grid_result = runGSCV(2, train, y_train)
<ipython-input-9-594f5556108c> in runGSCV(num_trials, features, y_values)
7 with MyTimer():
8 #grid_result = gsc.fit(train, y_train)
----> 9 grid_result = gsc.fit(features, y_values)
10 non_nested_scores[i] = grid_result.best_score_
11 if (competition == 'SR'):
~\miniconda3\envs\ds_tensorflow\lib\site-packages\sklearn\utils\validation.py in inner_f(*args, **kwargs)
71 FutureWarning)
72 kwargs.update({k: arg for k, arg in zip(sig.parameters, args)})
---> 73 return f(**kwargs)
74 return inner_f
75
~\miniconda3\envs\ds_tensorflow\lib\site-packages\sklearn\model_selection\_search.py in fit(self, X, y, groups, **fit_params)
760 # of the params are estimators as well.
761 self.best_estimator_ = clone(clone(base_estimator).set_params(
--> 762 **self.best_params_))
763 refit_start_time = time.time()
764 if y is not None:
~\miniconda3\envs\ds_tensorflow\lib\site-packages\sklearn\utils\validation.py in inner_f(*args, **kwargs)
71 FutureWarning)
72 kwargs.update({k: arg for k, arg in zip(sig.parameters, args)})
---> 73 return f(**kwargs)
74 return inner_f
75
~\miniconda3\envs\ds_tensorflow\lib\site-packages\sklearn\base.py in clone(estimator, safe)
86 for name, param in new_object_params.items():
87 new_object_params[name] = clone(param, safe=False)
---> 88 new_object = klass(**new_object_params)
89 params_set = new_object.get_params(deep=False)
90
TypeError: __init__() got an unexpected keyword argument 'bayesian_matrix_reg'
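For reference, here is a rough, illustrative sketch (using the standard inspect module, reusing train/y_train from above) of how one could list the parameters that get_all_params() reports but that the CatBoostRegressor constructor does not accept. These are exactly the keys that break clone() during the refit step above, since clone() re-instantiates the estimator with best_params_:
import inspect
from catboost import CatBoostRegressor

fitted = CatBoostRegressor(logging_level='Silent')
fitted.fit(train, y_train)   # train / y_train as in the code above

# Keyword arguments the constructor explicitly accepts
init_args = set(inspect.signature(CatBoostRegressor.__init__).parameters) - {'self'}

# Parameters the trained model reports that cannot be passed back to __init__;
# sklearn's clone() calls klass(**best_params_), so any of these in the grid
# raises the "unexpected keyword argument" TypeError
not_settable = sorted(set(fitted.get_all_params()) - init_args)
print(not_settable)
That would at least confirm which names need to be stripped from the grid before passing it to GridSearchCV, which is the manual filtering I am trying to avoid.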