updating branch #1377

Merged on Mar 11, 2022 (34 commits)

Commits
426a2a7
Add architecture search for Gym (#1315)
teytaud Dec 15, 2021
de7efba
Adding optimization specifically for RL (#1303)
teytaud Dec 15, 2021
f5dc4f4
Import benchmarks online for fixing dependencies (#1310)
teytaud Dec 15, 2021
7b99d1a
XPs with sparsity on Open AI Gym (#1319)
teytaud Dec 15, 2021
ab97a6e
Adding state of the art results on OpenAI Gym results (#1318)
teytaud Dec 15, 2021
e1f92d7
Adding example for OpenAI Gym (#1320)
teytaud Dec 15, 2021
3dde638
testing more algorithms for permutations (#1321)
teytaud Dec 15, 2021
d3b467f
Adding yahdlbbbob and simplifying stuff (#1145)
teytaud Dec 15, 2021
e98e1c1
Add Olympus benchmark (#1190)
teytaud Dec 16, 2021
31fc166
Documentation for permutations (#1322)
teytaud Dec 16, 2021
5c1c284
change sparsity in gym (#1328)
teytaud Dec 23, 2021
666e009
Unstack ngopt (#1330)
teytaud Jan 12, 2022
c1e83e2
fix plot (#1332)
teytaud Jan 13, 2022
e88580b
Fix gym: thread safety (#1338)
teytaud Jan 20, 2022
8a92aae
Avoid bool check on params (#1349)
jrapin Jan 25, 2022
5acd82d
Keep only final optimizer in portfolio (#1350)
jrapin Jan 26, 2022
c7e0d67
Fix nan in cma (#1347)
teytaud Jan 26, 2022
43d79b5
Keep data in cache for CMA (#1351)
jrapin Jan 26, 2022
4aac4df
Add installation instruction for MuJoCo (#1352)
mathuvu Feb 1, 2022
3778042
enable_pickling for NGOpt (#1356)
bottler Feb 8, 2022
fc93fdc
Bugfix in scrambled progressive optimization. (#1357)
teytaud Feb 8, 2022
0631dc6
Add anisotropic progressive optimization (#1344)
teytaud Feb 8, 2022
bcf33c2
Finish adding enable_pickling for all optimizers (Rescaled) (#1358)
bottler Feb 9, 2022
e8c1188
Residual controllers in RL. (#1359)
teytaud Feb 14, 2022
e0b586f
Bump version to 0.4.3.post10 (#1364)
jrapin Feb 21, 2022
4909e3f
Removing an incomplete sentence from the doc (#1367)
teytaud Mar 2, 2022
f737189
Fix broken CI (#1370)
jrapin Mar 4, 2022
3e1e173
docs: add GH button in support of Ukraine (#1369)
dmitryvinn Mar 4, 2022
c343bfa
Add the FAO crop model (#1343)
teytaud Mar 7, 2022
cb921f6
Use up-to-date headers (#1371)
jrapin Mar 7, 2022
3bda011
Add NLOPT as a solver (#1340)
teytaud Mar 8, 2022
2796b69
Update version and changelog to 0.5.0 (#1372)
jrapin Mar 8, 2022
3a5e9a0
Deactivate mutation test in CI (#1374)
jrapin Mar 9, 2022
c605a47
Reduce noise in gym (#1333)
teytaud Mar 11, 2022
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,8 @@

## main

## 0.5.0 (2022-03-08)

### Breaking changes

- `copy()` method of a `Parameter` does not change the parameter's random state anymore (it used to reset it to `None`) [#1048](https://github.com/facebookresearch/nevergrad/pull/1048)
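A hedged sketch of what this change means in practice (the seeding shown is illustrative, not taken from the changelog):

```python
# Minimal sketch, assuming the standard Parameter API: copying a parameter
# no longer resets the copy's random state to None.
import nevergrad as ng

parent = ng.p.Array(shape=(2,))
parent.random_state.seed(12)   # pin a known random state
child = parent.copy()
child.mutate()                 # uses the inherited state, hence reproducible
```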
@@ -70,6 +72,11 @@
[#1197](https://github.com/facebookresearch/nevergrad/pull/1197).
- An interface with [BayesOptim](https://github.com/wangronin/Bayesian-Optimization) optimizers has been added
[#1179](https://github.com/facebookresearch/nevergrad/pull/1179).
- Fix for abnormally slow iterations for large budgets when using CMA in a portfolio
[#1350](https://github.com/facebookresearch/nevergrad/pull/1350).
- A new `enable_pickling` option was added to optimizers. It is only necessary for some of them (among which the `scipy`-based optimizers) and comes at the cost of additional memory usage
[#1356](https://github.com/facebookresearch/nevergrad/pull/1356)
[#1358](https://github.com/facebookresearch/nevergrad/pull/1358).
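A hedged sketch of the opt-in (the toy ask/tell loop is illustrative; only the option's name comes from this changelog):

```python
# Minimal sketch, assuming enable_pickling is invoked before optimization.
import pickle
import nevergrad as ng

optim = ng.optimizers.NGOpt(parametrization=2, budget=100)
optim.enable_pickling()  # opt-in, at the cost of extra memory (see above)
for _ in range(20):
    cand = optim.ask()
    optim.tell(cand, float(sum(x ** 2 for x in cand.value)))

blob = pickle.dumps(optim)    # fails for some optimizers without the opt-in
resumed = pickle.loads(blob)  # resume ask/tell from the restored state
```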

## 0.4.3 (2021-01-28)

2 changes: 1 addition & 1 deletion README.md
@@ -1,4 +1,4 @@
[![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)

# Nevergrad - A gradient-free optimization platform

2 changes: 1 addition & 1 deletion docs/benchmarking.rst
@@ -74,4 +74,4 @@ Functions used for the experiments must derive from :code:`nevergrad.functions.ExperimentFunction`

See the docstrings for more information, and `arcoating/core.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/functions/arcoating/core.py>`_ and `example.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/benchmark/additional/example.py>`_ for examples.

If you want your experiment plan to be seedable, be extra careful as to how you handle randomness in the experiment generator, since each individual experiment may be run in any order. See `experiments.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/benchmark/experiments.py>`_ for examples of seedable experiment plans. If you do not care for it. For simplicity's sake, the experiment plan generator is however not required to have a seed parameter (but will not be reproducible in this case).
If you want your experiment plan to be seedable, be extra careful as to how you handle randomness in the experiment generator, since each individual experiment may be run in any order. See `experiments.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/benchmark/experiments.py>`_ for examples of seedable experiment plans. For simplicity's sake, the experiment plan generator is however not required to have a seed parameter (but will not be reproducible in this case).
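A minimal seedable plan might look like the sketch below (the plan name and settings are hypothetical; `Experiment` and `create_seed_generator` are used the same way as in `experiments.py`):

```python
# Hypothetical experiment plan: reproducible whenever a seed is provided.
import typing as tp
from nevergrad.benchmark.xpbase import Experiment, create_seed_generator, registry
from nevergrad.functions import ArtificialFunction


@registry.register
def my_tiny_plan(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    seedg = create_seed_generator(seed)  # draws one sub-seed per experiment
    func = ArtificialFunction("sphere", block_dimension=2)
    for budget in [25, 50, 100]:
        for algo in ["OnePlusOne", "DE"]:
            yield Experiment(func, algo, budget=budget, num_workers=1, seed=next(seedg))

# Assumed invocation (add --imports=<your_file.py> for a plan outside the package):
#   python -m nevergrad.benchmark my_tiny_plan --seed=12 --repetitions=1 --plot
```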
5 changes: 5 additions & 0 deletions docs/index.rst
@@ -3,6 +3,10 @@ Nevergrad - A gradient-free optimization platform

.. image:: ./resources/Nevergrad-LogoMark.png

.. image:: https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB
:alt: Support Ukraine - Help Provide Humanitarian Aid to Ukraine.
:target: https://opensource.fb.com/support-ukraine

This documentation is a work in progress; feel free to help us update/improve/restructure it!

Quick start
@@ -82,6 +86,7 @@ License
-------

:code:`nevergrad` is released under the MIT license. See `LICENSE <https://github.com/facebookresearch/nevergrad/blob/main/LICENSE>`_ for additional details about it, as well as our `Terms of Use <https://opensource.facebook.com/legal/terms>`_ and `Privacy Policy <https://opensource.facebook.com/legal/privacy>`_.
Copyright © Meta Platforms, Inc.

Indices and tables
------------------
1 change: 1 addition & 0 deletions docs/machinelearning.rst
@@ -250,6 +250,7 @@ Optimization of parameters for reinforcement learning
We do not average evaluations over multiple episodes - the algorithm is in charge of averaging, if need be.
:code:`TBPSA`, based on population-control mechanisms, performs quite well in this case.

If you want to run OpenAI Gym, see `One-line for learning state-of-the-art OpenAI Gym controllers with Nevergrad <https://docs.google.com/document/d/1noubQ_ZTZ4PZeQ1St7Asi1Af02q7k0nRoX_Pipu9ZKs/edit?usp=sharing/>`_.

.. code-block:: python

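A hedged sketch of the TBPSA recommendation above, with a stand-in noisy objective playing the role of a single-episode return (not taken from the docs):

```python
# TBPSA on a noisy objective; the optimizer handles averaging internally.
import numpy as np
import nevergrad as ng

rng = np.random.RandomState(0)

def one_episode_return(weights: np.ndarray) -> float:
    # stand-in for one noisy episode evaluation, no caller-side averaging
    return float(np.sum(weights ** 2) + rng.normal(0.0, 0.1))

optim = ng.optimizers.TBPSA(parametrization=ng.p.Array(shape=(8,)), budget=2000)
recommendation = optim.minimize(one_episode_return)
print(recommendation.value)
```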
8 changes: 8 additions & 0 deletions docs/optimization.rst
@@ -185,6 +185,14 @@ Or if you want something more aimed at robustly outperforming random search in h
- Use :code:`ScrHammersleySearchPlusMiddlePoint` (:code:`PlusMiddlePoint` only if you have continuous parameters or good default values for discrete parameters).

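A hedged sketch of that one-shot recommendation (the registry lookup and the bounded toy objective are assumptions):

```python
# Fully parallel one-shot sampling, aimed at beating random search.
import nevergrad as ng

opt_cls = ng.optimizers.registry["ScrHammersleySearchPlusMiddlePoint"]
param = ng.p.Array(shape=(20,)).set_bounds(-5.0, 5.0)
optim = opt_cls(parametrization=param, budget=300, num_workers=300)
recommendation = optim.minimize(lambda x: float((x ** 2).sum()))
```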

Example with permutation
------------------------

SimpleTSP and ComplexTSP are two cases of optimization over a domain of permutations:
`example here <https://docs.google.com/document/d/1B5yVOx1H1nnjY3EOf14487hAr8CzwJ9zEkDwQnZ5nbE/edit?usp=sharing>`_.
These benchmarks are relevant both when you optimize a single big permutation and when there are many small permutations.
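In the absence of a dedicated permutation parameter, one common encoding is a continuous vector decoded with argsort; a hedged sketch below (not the STSP benchmark's exact encoding):

```python
# Random-key trick: optimize a real vector, decode it into a permutation.
import numpy as np
import nevergrad as ng

cities = np.random.RandomState(0).rand(12, 2)

def tour_length(keys: np.ndarray) -> float:
    order = np.argsort(keys)  # decode the vector into a visiting order
    tour = cities[order]
    return float(np.linalg.norm(np.diff(tour, axis=0), axis=1).sum())

optim = ng.optimizers.TwoPointsDE(parametrization=ng.p.Array(shape=(12,)), budget=800)
best = optim.minimize(tour_length)
print(np.argsort(best.value))  # recommended permutation
```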

Example of chaining, or inoculation, or initialization of an evolutionary algorithm
-----------------------------------------------------------------------------------

4 changes: 2 additions & 2 deletions mypy.ini
@@ -1,9 +1,9 @@
[mypy]

[mypy-scipy.*,requests,pandas,compiler_gym,compiler_gym.*,gym,gym.*,gym_anm,matplotlib.*,pytest,cma,bayes_opt.*,torchvision.models,torch.*,mpl_toolkits.*,fcmaes.*,tqdm,pillow,PIL,PIL.Image,sklearn.*,pyomo.*,pyproj,IOHexperimenter.*,tensorflow,koncept.models,cv2,imquality,imquality.brisque,lpips,mixsimulator.*,networkx.*,cdt.*,pymoo,pymoo.*,bayes_optim.*]
[mypy-scipy.*,requests,pandas,compiler_gym,compiler_gym.*,gym,gym.*,gym_anm,matplotlib.*,pytest,cma,bayes_opt.*,torchvision.models,torch.*,mpl_toolkits.*,fcmaes.*,tqdm,pillow,PIL,PIL.Image,sklearn.*,pyomo.*,pyproj,IOHexperimenter.*,tensorflow,koncept.models,cv2,imquality,imquality.brisque,lpips,mixsimulator.*,networkx.*,cdt.*,pymoo,pymoo.*,bayes_optim.*,olympus.*]
ignore_missing_imports = True

[mypy-nevergrad.functions.rl.agents,torchvision,torchvision.*,nevergrad.functions.games.*,nevergrad.functions.multiobjective.pyhv,nevergrad.optimization.test_doc,,pymoo,pymoo.*,pybullet,pybullet_envs,pybulletgym,pyvirtualdisplay]
[mypy-nevergrad.functions.rl.agents,torchvision,torchvision.*,nevergrad.functions.games.*,nevergrad.functions.multiobjective.pyhv,nevergrad.optimization.test_doc,,pymoo,pymoo.*,pybullet,pybullet_envs,pybulletgym,pyvirtualdisplay,nlopt,aquacrop.*]
ignore_missing_imports = True
ignore_errors = True

4 changes: 2 additions & 2 deletions nevergrad/__init__.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
@@ -15,4 +15,4 @@
__all__ = ["optimizers", "families", "callbacks", "p", "typing", "errors", "ops"]


__version__ = "0.4.3.post9"
__version__ = "0.5.0"
2 changes: 1 addition & 1 deletion nevergrad/benchmark/__init__.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/__main__.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/additional/example.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/core.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/execution.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
104 changes: 96 additions & 8 deletions nevergrad/benchmark/experiments.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
@@ -21,14 +21,14 @@
from nevergrad.functions.arcoating import ARCoating
from nevergrad.functions import images as imagesxp
from nevergrad.functions.powersystems import PowerSystem
from nevergrad.functions.ac import NgAquacrop
from nevergrad.functions.stsp import STSP
from nevergrad.functions.rocket import Rocket
from nevergrad.functions.mixsimulator import OptimizeMix
from nevergrad.functions.unitcommitment import UnitCommitmentProblem
from nevergrad.functions import control
from nevergrad.functions import rl
from nevergrad.functions.games import game
from nevergrad.functions.causaldiscovery import CausalDiscovery
from nevergrad.functions import iohprofiler
from nevergrad.functions import helpers
from .xpbase import Experiment as Experiment
@@ -619,8 +619,8 @@ def yabbob(
hd: bool = False,
constraint_case: int = 0,
split: bool = False,
tiny: bool = False,
tuning: bool = False,
reduction_factor: int = 1,
bounded: bool = False,
box: bool = False,
) -> tp.Iterator[Experiment]:
@@ -689,8 +689,9 @@
[100, 1000, 3000] if hd else ([2, 5, 10, 15] if tuning else ([40] if bounded else [2, 10, 50]))
)
]
if tiny:
functions = functions[::13]

assert reduction_factor in [1, 7, 13, 17] # needs to be a cofactor
functions = functions[::reduction_factor]

# We possibly add constraints.
max_num_constraints = 4
@@ -735,6 +736,12 @@ def yahdlbbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
return yabbob(seed, hd=True, small=True)


@registry.register
def reduced_yahdlbbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Counterpart of yabbob with HD and low budget."""
return yabbob(seed, hd=True, small=True, reduction_factor=17)


@registry.register
def yanoisysplitbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Counterpart of yabbob with more budget."""
@@ -782,13 +789,13 @@ def yahdsplitbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
@registry.register
def yatuningbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Counterpart of yabbob with less budget."""
return yabbob(seed, parallel=False, big=False, small=True, tiny=True, tuning=True)
return yabbob(seed, parallel=False, big=False, small=True, reduction_factor=13, tuning=True)


@registry.register
def yatinybbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Counterpart of yabbob with less budget."""
return yabbob(seed, parallel=False, big=False, small=True, tiny=True)
return yabbob(seed, parallel=False, big=False, small=True, reduction_factor=13)


@registry.register
@@ -1151,6 +1158,23 @@ def realworld(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
yield xp


@registry.register
def aquacrop_fao(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""FAO Crop simulator. Maximize yield."""

funcs = [NgAquacrop(i, 300.0 + 150.0 * np.cos(i)) for i in range(3, 7)]
seedg = create_seed_generator(seed)
optims = get_optimizers("basics", seed=next(seedg))
for budget in [25, 50, 100, 200, 400, 800, 1600]:
for num_workers in [1, 30]:
if num_workers < budget:
for algo in optims:
for fu in funcs:
xp = Experiment(fu, algo, budget, num_workers=num_workers, seed=next(seedg))
if not xp.is_incoherent:
yield xp


@registry.register
def rocket(seed: tp.Optional[int] = None, seq: bool = False) -> tp.Iterator[Experiment]:
"""Rocket simulator. Maximize max altitude by choosing the thrust schedule, given a total thrust.
@@ -1263,6 +1287,52 @@ def neuro_control_problem(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
yield xp


@registry.register
def olympus_surfaces(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Olympus surfaces"""
from nevergrad.functions.olympussurfaces import OlympusSurface

funcs = []
for kind in OlympusSurface.SURFACE_KINDS:
for k in range(2, 5):
for noise in ["GaussianNoise", "UniformNoise", "GammaNoise"]:
for noise_scale in [0.5, 1]:
funcs.append(OlympusSurface(kind, 10 ** k, noise, noise_scale))

seedg = create_seed_generator(seed)
optims = get_optimizers("basics", "noisy", seed=next(seedg))
for budget in [25, 50, 100, 200, 400, 800, 1600, 3200, 6400, 12800, 25600]:
for num_workers in [1]: # , 10, 100]:
if num_workers < budget:
for algo in optims:
for fu in funcs:
xp = Experiment(fu, algo, budget, num_workers=num_workers, seed=next(seedg))
if not xp.is_incoherent:
yield xp


@registry.register
def olympus_emulators(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Olympus emulators"""
from nevergrad.functions.olympussurfaces import OlympusEmulator

funcs = []
for dataset_kind in OlympusEmulator.DATASETS:
for model_kind in ["BayesNeuralNet", "NeuralNet"]:
funcs.append(OlympusEmulator(dataset_kind, model_kind))

seedg = create_seed_generator(seed)
optims = get_optimizers("basics", "noisy", seed=next(seedg))
for budget in [25, 50, 100, 200, 400, 800, 1600, 3200, 6400, 12800, 25600]:
for num_workers in [1]: # , 10, 100]:
if num_workers < budget:
for algo in optims:
for fu in funcs:
xp = Experiment(fu, algo, budget, num_workers=num_workers, seed=next(seedg))
if not xp.is_incoherent:
yield xp

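A hedged usage sketch for the Olympus functions registered above (only the constructor signature visible in this diff is assumed; the surface kind is taken from the class constant rather than hard-coded):

```python
# Optimizing a single Olympus surface outside the benchmark harness.
import nevergrad as ng
from nevergrad.functions.olympussurfaces import OlympusSurface

kind = sorted(OlympusSurface.SURFACE_KINDS)[0]
func = OlympusSurface(kind, 10 ** 2, "GaussianNoise", 0.5)  # args mirror the 10 ** k loop above
optim = ng.optimizers.TBPSA(parametrization=func.parametrization, budget=300)
recommendation = optim.minimize(func)
```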

@registry.register
def simple_tsp(seed: tp.Optional[int] = None, complex_tsp: bool = False) -> tp.Iterator[Experiment]:
"""Simple TSP problems. Please note that the methods we use could be applied or complex variants, whereas
@@ -1274,7 +1344,22 @@ def simple_tsp(seed: tp.Optional[int] = None, complex_tsp: bool = False) -> tp.Iterator[Experiment]:
"""
funcs = [STSP(10 ** k, complex_tsp) for k in range(2, 6)]
seedg = create_seed_generator(seed)
optims = get_optimizers("basics", "noisy", seed=next(seedg))
optims = [
"RotatedTwoPointsDE",
"DiscreteLenglerOnePlusOne",
"DiscreteDoerrOnePlusOne",
"DiscreteBSOOnePlusOne",
"AdaptiveDiscreteOnePlusOne",
"GeneticDE",
"RotatedTwoPointsDE",
"DE",
"TwoPointsDE",
"DiscreteOnePlusOne",
"NGOpt38",
"CMA",
"MetaModel",
"DiagonalCMA",
]
for budget in [25, 50, 100, 200, 400, 800, 1600, 3200, 6400, 12800, 25600]:
for num_workers in [1]: # , 10, 100]:
if num_workers < budget:
@@ -1851,6 +1936,9 @@ def pbo_suite(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:

def causal_similarity(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
"""Finding the best causal graph"""
# pylint: disable=import-outside-toplevel
from nevergrad.functions.causaldiscovery import CausalDiscovery

seedg = create_seed_generator(seed)
optims = ["CMA", "NGOpt8", "DE", "PSO", "RecES", "RecMixES", "RecMutDE", "ParametrizationDE"]
func = CausalDiscovery()
2 changes: 1 addition & 1 deletion nevergrad/benchmark/exporttable.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/frozenexperiments.py
@@ -1,4 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.