TransWikia.com

Determine model hyper-parameter values for grid search

Data Science — Asked by randunu galhena on December 18, 2020

I built machine learning models with Ridge, Lasso, Elastic Net, and linear regression, and used grid search for hyper-parameter tuning. How should I choose the value range for `params_Ridge` in the code below? For example, for the `alpha` parameter I used 1, 0.1, 0.01, 0.001, 0.0001, and 0, but I have no idea how to determine these values for each model (Ridge/Lasso/Elastic Net). Can someone explain?

    from sklearn.linear_model import Ridge
    from sklearn.model_selection import GridSearchCV

    ridge_reg = Ridge()

    # Candidate hyper-parameter values for Ridge
    params_Ridge = {
        'alpha': [1, 0.1, 0.01, 0.001, 0.0001, 0],
        'fit_intercept': [True, False],
        'solver': ['svd', 'cholesky', 'lsqr', 'sparse_cg', 'sag', 'saga'],
    }

    Ridge_GS = GridSearchCV(ridge_reg, param_grid=params_Ridge, n_jobs=-1)
    Ridge_GS.fit(x_train, y_train)
    Ridge_GS.best_params_

One Answer

I am not sure if I understand your question correctly, but in your model you are tuning the "alpha" parameter over a range from 1 down to 0 (1 -> 0.1 -> 0.01 -> 0.001 -> 0.0001 -> 0).
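A common way to build such a range (this is general practice, not something from the original post) is to space the candidate alphas logarithmically, so the grid covers several orders of magnitude with few points. A minimal sketch using `numpy.logspace` and synthetic data in place of the poster's `x_train`/`y_train`:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

# Synthetic regression data stands in for x_train / y_train
X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)

# Seven log-spaced alphas from 1e-4 up to 1e2
params = {"alpha": np.logspace(-4, 2, 7)}

search = GridSearchCV(Ridge(), param_grid=params, cv=5)
search.fit(X, y)
print(search.best_params_)
```

If the best alpha lands at an edge of the grid, the usual move is to extend the grid in that direction and search again.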

The grid search will evaluate every combination of your parameters — each solver ('svd', 'cholesky', ...) with each value of "alpha" — and compute a cross-validated score for each combination. The score metric depends on your estimator (for a regressor like Ridge the default is R²; accuracy or AUC would apply to classifiers) or on the `scoring` argument you pass to GridSearchCV.
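To make the scoring explicit, you can pass a `scoring` string and inspect the per-combination results in `cv_results_`. A short sketch (again using synthetic data as a stand-in):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)

# For regressors the default score is R^2; here we switch to negated MSE
search = GridSearchCV(
    Ridge(),
    param_grid={"alpha": [1, 0.1, 0.01]},
    scoring="neg_mean_squared_error",
    cv=5,
)
search.fit(X, y)

# cv_results_ records the mean cross-validated test score for every alpha tried
for alpha, score in zip(search.cv_results_["param_alpha"],
                        search.cv_results_["mean_test_score"]):
    print(alpha, score)
```

Note that sklearn's convention is "higher is better", which is why error metrics are negated.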

Documentation:

https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.GridSearchCV.html

Answered by aze45sq6d on December 18, 2020

