I am trying to optimize the hyperparameters of an xgboost model using Bayesian optimization (the mlrMBO R package). The code below seems to produce reasonable results, but the problem I keep running into is that the results are not reproducible, despite setting a consistent seed.
I am including some simplified code below, along with some fake data. As you can see in the output (bayes_1 and bayes_2), the algorithm gives consistent results (i.e. the same hyperparameter values) for the design phase and for the first few steps of the optimization process, but then the values from the two identical runs diverge.
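For context, here is a minimal sketch (not the actual code from this issue) of the kind of mlrMBO + xgboost setup being described, with the two usual sources of divergence pinned down: the R RNG (seeded once, before the initial design and `mbo()` call) and xgboost's own multithreading, which is not governed by R's RNG and can make evaluations non-deterministic unless `nthread = 1`. The objective function, `dtrain`, and the parameter ranges are placeholders, not taken from the issue.

```r
library(mlrMBO)        # Bayesian optimization
library(smoof)         # objective-function wrapper
library(ParamHelpers)  # parameter-set definitions

set.seed(42)  # seed BEFORE generating the initial design and calling mbo()

# Hypothetical objective: cross-validated error of an xgboost model.
# `dtrain` is an assumed, pre-built xgb.DMatrix.
obj.fun <- makeSingleObjectiveFunction(
  name = "xgb_cv",
  fn = function(x) {
    cv <- xgboost::xgb.cv(
      params = list(
        eta = x$eta,
        max_depth = x$max_depth,
        nthread = 1  # >1 thread can make xgboost results non-deterministic
      ),
      data = dtrain, nrounds = 100, nfold = 5, verbose = FALSE
    )
    min(cv$evaluation_log$test_rmse_mean)
  },
  par.set = makeParamSet(
    makeNumericParam("eta", lower = 0.01, upper = 0.3),
    makeIntegerParam("max_depth", lower = 2L, upper = 8L)
  ),
  has.simple.signature = FALSE  # fn receives a named list of parameters
)

ctrl <- makeMBOControl()
ctrl <- setMBOControlTermination(ctrl, iters = 10)
res <- mbo(obj.fun, control = ctrl)
```

If the design phase and first few iterations already match across runs, the R-level seeding is likely fine, and a thread-level source of randomness inside the objective evaluation (or the surrogate fitting) is a plausible suspect.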
I'd appreciate any feedback you may have!