Observe that the output of the two runs of ParametrizedMetaModel differs despite providing the same seed, whereas two runs of MetaModel give exactly the same results.
Observed Results
ParametrizedMetaModel(multivariate_optimizer=CmaFmin2) produced different results for two runs on the same problem, despite using the same seed.
It does not seem to matter how the seed is set; setting it with np.random.seed(seed) instead has the same effect.
Expected Results
When providing the same seed, two runs of the same algorithm on the same problem should give exactly the same results.
(This is the case for all other algorithms I checked, e.g., MetaModel)
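To make the expected contract concrete, here is a toy seeded random search (not nevergrad code; the function and objective are made up for illustration): when every source of randomness is derived from the provided seed, two runs on the same problem are bitwise identical.

```python
import numpy as np

def random_search(seed, budget=100, dim=2):
    """Toy stand-in for a seedable optimizer: all randomness
    flows from the single RNG constructed from `seed`."""
    rng = np.random.RandomState(seed)
    best_x, best_val = None, float("inf")
    for _ in range(budget):
        x = rng.uniform(-5, 5, size=dim)
        val = float(np.sum(x ** 2))  # sphere objective
        if val < best_val:
            best_x, best_val = x, val
    return best_x

# Same seed, same problem -> identical recommendation on every run.
assert np.array_equal(random_search(0), random_search(0))
```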
Ideally I think this should be fixed so that it is seedable; at a minimum, the limitation should be documented and a warning emitted when it is used. I also note that CmaFmin2 is not listed under optimizers in the documentation at all, nor does it appear in the list generated by sorted(nevergrad.optimizers.registry.keys()), which the documentation claims should list all optimizers.
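For what it's worth, one plausible mechanism for this kind of non-determinism (a hypothesis, not a diagnosis of nevergrad's internals) is a wrapper that seeds its own RNG but delegates part of the search to an inner routine with an independent, never-seeded RNG. A minimal sketch:

```python
import random
import numpy as np

def wrapper_optimize(seed, budget=50):
    """Toy wrapper optimizer: the outer RNG is seeded, but some
    steps draw from an 'inner' RNG seeded from OS entropy,
    mimicking a third-party backend that ignores the caller's seed."""
    outer = np.random.RandomState(seed)
    inner = random.Random()  # NOT derived from `seed`
    best = float("inf")
    for _ in range(budget):
        x = outer.uniform(-5, 5) + 0.1 * inner.uniform(-1, 1)
        best = min(best, x * x)
    return best

# Same seed, yet runs diverge (with overwhelming probability),
# because the inner RNG is outside the seeding path.
```

If CmaFmin2's backend draws randomness that is not derived from nevergrad's random state, this would produce exactly the behaviour observed above.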
Steps to reproduce
Relevant Code