Asked by hasanfarooq on August 17, 2021
I want to quantify the compute cost of a hyper-parameter search for an XGBoost model. One approach would be to measure the training time for one particular hyper-parameter configuration and use it as a proxy for compute cost. Can the compute cost be quantified analytically from the chosen hyper-parameter values, e.g., an expression based on max_depth, n_estimators, min_child_weight, gamma, etc.? Or can you suggest some other way to quantify this compute cost?
I want to measure, for each hyper-parameter configuration trained on the same data, both the model performance and the compute cost.
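For concreteness, a minimal sketch of the timing-as-proxy idea, assuming synthetic data and arbitrary illustrative hyper-parameter values:

```python
import time

import numpy as np
from xgboost import XGBRegressor

# Synthetic data stands in for the real training set.
rng = np.random.default_rng(0)
X = rng.random((10_000, 20))
y = rng.random(10_000)

# One arbitrary hyper-parameter configuration (values are illustrative).
model = XGBRegressor(max_depth=6, n_estimators=200,
                     min_child_weight=1, gamma=0.0)

start = time.perf_counter()
model.fit(X, y)
print(f"Training time: {time.perf_counter() - start:.2f} s")
```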
If you have the time on hand, you could simply measure the time taken for every combination of hyper-parameter values in a grid search, preferably with repetition. It's unlikely that any theoretical analytical expression will predict the compute cost with adequate accuracy, as so many factors contribute noise to the compute time.
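A sketch of that approach, again with synthetic data and a small illustrative grid (timings are averaged over repetitions to smooth out noise):

```python
import time
from itertools import product

import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = rng.random((10_000, 20))
y = rng.random(10_000)

# Small illustrative grid; extend with min_child_weight, gamma, etc. as needed.
grid = {"max_depth": [3, 6, 9], "n_estimators": [100, 300]}
n_repeats = 3  # repetition averages out timing noise

records = []
for max_depth, n_estimators in product(grid["max_depth"], grid["n_estimators"]):
    times = []
    for _ in range(n_repeats):
        model = XGBRegressor(max_depth=max_depth, n_estimators=n_estimators)
        start = time.perf_counter()
        model.fit(X, y)
        times.append(time.perf_counter() - start)
    mean_t = float(np.mean(times))
    records.append((max_depth, n_estimators, mean_t))
    print(f"max_depth={max_depth}, n_estimators={n_estimators}: {mean_t:.2f} s")
```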
You could even build a regression model on those measurements to predict the compute cost of new hyper-parameter combinations.
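A minimal sketch of that idea, assuming timing records like those gathered above, with hyper-parameter values as features and measured training time as the target (the numbers below are illustrative placeholders, not real measurements):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# (max_depth, n_estimators, mean training time in seconds) records,
# e.g. collected by the grid-timing loop above. Placeholder values.
records = np.array([
    (3, 100, 1.2), (3, 300, 3.4),
    (6, 100, 2.1), (6, 300, 6.0),
    (9, 100, 3.0), (9, 300, 8.8),
])

X_cost, y_cost = records[:, :2], records[:, 2]
cost_model = LinearRegression().fit(X_cost, y_cost)

# Predict the compute cost of an unseen hyper-parameter combination.
print(cost_model.predict([[6, 200]]))  # rough estimate in seconds
```

Since boosting trains one tree per round, training time tends to scale roughly linearly with n_estimators, so a linear model is a reasonable first guess; interaction terms or transforms of max_depth may improve the fit.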
Answered by Cameron Chandler on August 17, 2021