Restructure Parameters around Layered class #1045

Merged
merged 42 commits into master from overridable on Feb 15, 2021
42 commits, all by jrapin:

ec2d41a  Restructure around overridable (Feb 11, 2021)
d3c567c  fix (Feb 11, 2021)
85ecb91  useless (Feb 11, 2021)
5032364  wip (Feb 11, 2021)
2bfd6e7  fix (Feb 11, 2021)
2ae1610  propagatelayers (Feb 12, 2021)
22ec116  doc (Feb 12, 2021)
cf02f80  Prepare integer (Feb 12, 2021)
47ed796  name (Feb 12, 2021)
bd1fb0d  fix (Feb 12, 2021)
c88203f  fixes (Feb 12, 2021)
bfbc9e0  step (Feb 12, 2021)
a0cdf50  working_revert (Feb 12, 2021)
c775aed  reformat (Feb 12, 2021)
9c37dbe  somefixes (Feb 12, 2021)
20586ba  working (Feb 12, 2021)
1a23440  cleaning (Feb 12, 2021)
6c552a2  prints (Feb 12, 2021)
db2902c  move (Feb 12, 2021)
dae5f91  header (Feb 12, 2021)
f7f0a38  remove_attr (Feb 12, 2021)
9e43ad5  prints (Feb 12, 2021)
65d010c  print (Feb 12, 2021)
ba3eed7  fix (Feb 12, 2021)
d2e3370  mypy (Feb 12, 2021)
0e94d22  fix (Feb 12, 2021)
fe93831  module (Feb 12, 2021)
b121776  enum (Feb 12, 2021)
2295e10  Add array casting (Feb 12, 2021)
89ec03f  Start cache deletion (Feb 12, 2021)
6e82ce2  deep (Feb 12, 2021)
4e0ac95  bound_layer (Feb 12, 2021)
58883f3  wip (Feb 14, 2021)
1f5bc65  skip (Feb 14, 2021)
6d14420  sample (Feb 14, 2021)
88856f0  wip (Feb 14, 2021)
146a59c  Merge branch 'master' into overridable (Feb 15, 2021)
1b50443  Merge branch 'master' into overridable (Feb 15, 2021)
74785b4  merge (Feb 15, 2021)
816a254  move_logic (Feb 15, 2021)
6cb68ae  fix (Feb 15, 2021)
fb8acf7  nits (Feb 15, 2021)
23 changes: 18 additions & 5 deletions CHANGELOG.md
@@ -2,17 +2,30 @@

## master

### Breaking changes

- `copy()` method of a `Parameter` does not change the parameter's random state anymore (it used to reset it to `None`) [#1048](https://github.com/facebookresearch/nevergrad/pull/1048)
- `MultiobjectiveFunction` does not exist anymore [#1034](https://github.com/facebookresearch/nevergrad/pull/1034).
- the new `nevergrad.errors` module gathers errors and warnings used throughout the package (WIP) [#1031](https://github.com/facebookresearch/nevergrad/pull/1031).
- `EvolutionStrategy` now defaults to NSGA2 selection in the multiobjective case
- `Parameter` classes are undergoing heavy changes (

### Important changes

- `Parameter` classes are undergoing heavy changes, please open an issue if you encounter any problem.
  The midterm aim is to allow for simpler constraint management.
- `Parameter` classes have undergone heavy changes to ease the handling of their tree structure (
  [#1029](https://github.com/facebookresearch/nevergrad/pull/1029)
  [#1036](https://github.com/facebookresearch/nevergrad/pull/1036)
  [#1038](https://github.com/facebookresearch/nevergrad/pull/1038)
  [#1043](https://github.com/facebookresearch/nevergrad/pull/1043)
  [#1044](https://github.com/facebookresearch/nevergrad/pull/1044)
  and more to come), please open an issue if you encounter any problem. The midterm aim is to allow for simpler constraint management.
- `copy()` method of a `Parameter` does not change the parameter's random state anymore (it used to reset it to `None`) [#1048](https://github.com/facebookresearch/nevergrad/pull/1048)
- `Parameter` classes now have a layer structure [#1045](https://github.com/facebookresearch/nevergrad/pull/1045),
  which simplifies changing their behavior. In future PRs this system will take charge of bounds, other constraints,
  sampling etc.

### Other changes

- the new `nevergrad.errors` module gathers errors and warnings used throughout the package (WIP) [#1031](https://github.com/facebookresearch/nevergrad/pull/1031).
- `EvolutionStrategy` now defaults to NSGA2 selection in the multiobjective case

## 0.4.3 (2021-01-28)

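The layer structure described in the changelog entry can be pictured with a small standalone sketch (hypothetical names, not nevergrad's real API; the real implementation is in `nevergrad/parametrization/_layering.py` below): each layer transforms the value pulled from the layer beneath it, so a behavior change such as integer casting becomes a composable add-on rather than a subclass.

```python
# Standalone sketch of the "layered parameter" idea (hypothetical
# simplification; not nevergrad's actual classes).

class SimpleParam:
    """Root object holding raw data; layers transform the value on the way out."""

    def __init__(self, data):
        self._data = data
        self._layers = []  # applied in order when reading .value

    def add_layer(self, func):
        self._layers.append(func)
        return self  # returning self allows chaining

    @property
    def value(self):
        out = self._data
        for layer in self._layers:
            out = layer(out)
        return out

# An integer-casting behavior is just one more layer, no subclass needed:
param = SimpleParam(3.7).add_layer(round)
# param.value == 4, while SimpleParam(3.7).value stays 3.7
```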
26 changes: 21 additions & 5 deletions nevergrad/common/errors.py
@@ -20,26 +20,42 @@ class NevergradWarning(Warning):
 # errors


-class TellNotAskedNotSupportedError(NotImplementedError, NevergradError):
+class NevergradRuntimeError(RuntimeError, NevergradError):
+    """Runtime error raised by Nevergrad"""
+
+
+class NevergradTypeError(TypeError, NevergradError):
+    """Type error raised by Nevergrad"""
+
+
+class NevergradValueError(ValueError, NevergradError):
+    """Value error raised by Nevergrad"""
+
+
+class NevergradNotImplementedError(NotImplementedError, NevergradError):
+    """Not implemented functionality"""
+
+
+class TellNotAskedNotSupportedError(NevergradNotImplementedError):
     """To be raised by optimizers which do not support the tell_not_asked interface."""


-class ExperimentFunctionCopyError(NotImplementedError, NevergradError):
+class ExperimentFunctionCopyError(NevergradNotImplementedError):
     """Raised when the experiment function fails to copy itself (for benchmarks)"""


-class UnsupportedExperiment(RuntimeError, unittest.SkipTest, NevergradError):
+class UnsupportedExperiment(unittest.SkipTest, NevergradRuntimeError):
     """Raised if the experiment is not compatible with the current settings:
     Eg: missing data, missing import, unsupported OS etc
     This automatically skips tests.
     """


-class NevergradDeprecationError(RuntimeError, NevergradError):
+class NevergradDeprecationError(NevergradRuntimeError):
     """Deprecated function/class"""


-class UnsupportedParameterOperationError(RuntimeError, NevergradError):
+class UnsupportedParameterOperationError(NevergradRuntimeError):
     """This type of operation is not supported by the parameter"""

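A sketch of why this rework helps downstream code: the new intermediate classes keep multiple inheritance from both the builtin exceptions and `NevergradError`, so a single except clause can catch everything the package raises while builtin `isinstance` checks keep working. Class names mirror the diff above; the base `NevergradError` is re-declared here only so the snippet runs on its own.

```python
import unittest

# Self-contained sketch of the reworked error hierarchy.

class NevergradError(Exception):
    """Base class for all errors raised by nevergrad"""

class NevergradRuntimeError(RuntimeError, NevergradError):
    """Runtime error raised by Nevergrad"""

class UnsupportedExperiment(unittest.SkipTest, NevergradRuntimeError):
    """Incompatible experiment; inheriting SkipTest makes tests skip automatically"""

# A single handler now catches any package-specific error...
try:
    raise NevergradRuntimeError("boom")
except NevergradError as err:
    caught = str(err)

# ...and builtin isinstance checks still hold through multiple inheritance:
exc = UnsupportedExperiment("missing data")
# isinstance(exc, RuntimeError), isinstance(exc, unittest.SkipTest) and
# isinstance(exc, NevergradError) are all True
```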
212 changes: 212 additions & 0 deletions nevergrad/parametrization/_layering.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,212 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.

import copy
import bisect
from enum import Enum
import numpy as np
from nevergrad.common import errors
import nevergrad.common.typing as tp


L = tp.TypeVar("L", bound="Layered")
X = tp.TypeVar("X")


class Level(Enum):
    """Lower level is deeper in the structure"""

    ROOT = 0
    OPERATION = 10

    # final
    ARRAY_CASTING = 900
    INTEGER_CASTING = 1000  # must be the last layer


class Layered:
    """Hidden API for overriding/modifying the behavior of a Parameter,
    which is itself a Layered object.

    Layers can be added and will be ordered depending on their level
    """

    _LAYER_LEVEL = Level.OPERATION

    def __init__(self) -> None:
        self._layers = [self]
        self._index = 0
        self._name: tp.Optional[str] = None

    def add_layer(self: L, other: "Layered") -> L:
        """Adds a layer which will modify the object behavior"""
        if self is not self._layers[0] or self._LAYER_LEVEL != Level.ROOT:
            raise errors.NevergradRuntimeError("Layers can only be added from the root.")
        if len(other._layers) > 1:
            raise errors.NevergradRuntimeError("Cannot append multiple layers at once")
        if other._LAYER_LEVEL.value >= self._layers[-1]._LAYER_LEVEL.value:
            other._index = len(self._layers)
            self._layers.append(other)
        else:
            levels = [x._LAYER_LEVEL.value for x in self._layers]
            ind = bisect.bisect_right(levels, other._LAYER_LEVEL.value)
            self._layers.insert(ind, other)
            for k, x in enumerate(self._layers):
                x._index = k
        other._layers = self._layers
        return self

    def _call_deeper(self, name: str, *args: tp.Any, **kwargs: tp.Any) -> tp.Any:
        if not name.startswith("_layered_"):
            raise errors.NevergradValueError("For consistency, only _layered functions can be used.")
        if self._layers[self._index] is not self:
            layers = [f"{l.name}({l._index})" for l in self._layers]
            raise errors.NevergradRuntimeError(
                "Layer indexing has changed for an unknown reason. Please open an issue:\n"
                f"Caller at index {self._index}: {self.name}\n"
                f"Layers: {layers}.\n"
            )
        for index in reversed(range(self._index)):
            func = getattr(self._layers[index], name)
            if func.__func__ is not getattr(Layered, name):  # skip unnecessary stack calls
                return func(*args, **kwargs)
        types = [type(x) for x in self._layers]
        raise errors.NevergradNotImplementedError(f"No implementation for {name} on layers: {types}.")
        # ALTERNATIVE (stacking all calls):
        # if not self._index:  # root must have an implementation
        #     raise errors.NevergradNotImplementedError
        # return getattr(self._layers[self._index - 1], name)(*args, **kwargs)

    def _layered_get_value(self) -> tp.Any:
        return self._call_deeper("_layered_get_value")

    def _layered_set_value(self, value: tp.Any) -> tp.Any:
        return self._call_deeper("_layered_set_value", value)

    def _layered_del_value(self) -> None:
        pass  # called independently on each layer

    def _layered_sample(self) -> "Layered":
        return self._call_deeper("_layered_sample")  # type: ignore

    def copy(self: L) -> L:
        """Creates a new unattached layer with the same behavior"""
        new = copy.copy(self)
        new._layers = [new]
        new._index = 0
        if not self._index:  # attach sublayers if root
            for layer in self._layers[1:]:
                new.add_layer(layer.copy())
        return new

    # naming capacity

    def _get_name(self) -> str:
        """Internal implementation of the parameter name. This should be value independent, and should
        not account for internal/model parameters.
        """
        return self.__class__.__name__

    def __repr__(self) -> str:
        return self.name

    @property
    def name(self) -> str:
        """Name of the parameter.
        This is used to keep track of how this Parameter is configured (including through internal/model
        parameters), mostly for reproducibility. A default version is always provided, but it can be
        overridden directly through the attribute, or through the set_name method (which allows chaining).
        """
        if self._name is not None:
            return self._name
        return self._get_name()

    @name.setter
    def name(self, name: str) -> None:
        self.set_name(name)  # set_name allows chaining

    def set_name(self: L, name: str) -> L:
        """Sets a name and returns the current instrumentation (for chaining)

        Parameters
        ----------
        name: str
            new name to use to represent the Parameter
        """
        self._name = name
        return self


class ValueProperty(tp.Generic[X]):
    """Typed property (descriptor) object so that the value attribute of
    Parameter objects fetches _layered_get_value and _layered_set_value methods
    """

    # This uses the descriptor protocol, like a property:
    # See https://docs.python.org/3/howto/descriptor.html
    #
    # Basically parameter.value calls parameter.value.__get__
    # and then parameter._layered_get_value
    def __init__(self) -> None:
        self.__doc__ = """Value of the Parameter, which should be sent to the function
        to optimize.

        Example
        -------
        >>> ng.p.Array(shape=(2,)).value
        array([0., 0.])
        """

    def __get__(self, obj: Layered, objtype: tp.Optional[tp.Type[object]] = None) -> X:
        return obj._layers[-1]._layered_get_value()  # type: ignore

    def __set__(self, obj: Layered, value: X) -> None:
        obj._layers[-1]._layered_set_value(value)

    def __delete__(self, obj: Layered) -> None:
        for layer in obj._layers:
            layer._layered_del_value()


# Basic data layers


class _ScalarCasting(Layered):
    """Cast Array as a scalar"""

    _LAYER_LEVEL = Level.INTEGER_CASTING

    def _layered_get_value(self) -> float:
        out = super()._layered_get_value()  # pulls from previous layer
        if not isinstance(out, np.ndarray) or not out.size == 1:
            raise errors.NevergradRuntimeError("Scalar casting can only be applied to size=1 Data parameters")
        integer = np.issubdtype(out.dtype, np.integer)
        out = (int if integer else float)(out[0])
        return out  # type: ignore

    def _layered_set_value(self, value: tp.Any) -> None:
        # np.float/np.int are deprecated aliases; np.floating/np.integer cover numpy scalars
        if not isinstance(value, (float, int, np.floating, np.integer)):
            raise TypeError(f"Received a {type(value)} in place of a scalar (float, int)")
        super()._layered_set_value(np.array([value], dtype=float))


class ArrayCasting(Layered):
    """Cast inputs of type tuple/list etc to array"""

    _LAYER_LEVEL = Level.ARRAY_CASTING

    def _layered_set_value(self, value: tp.ArrayLike) -> None:
        if not isinstance(value, (np.ndarray, tuple, list)):
            raise TypeError(f"Received a {type(value)} in place of a np.ndarray/tuple/list")
        super()._layered_set_value(np.asarray(value))


class IntegerCasting(Layered):
    """Cast Data as integer (or integer array)"""

    _LAYER_LEVEL = Level.OPERATION

    def _layered_get_value(self) -> np.ndarray:
        return np.round(super()._layered_get_value()).astype(int)  # type: ignore
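How `add_layer` keeps layers sorted can be seen in a condensed, hypothetical re-implementation of its ordering logic (names simplified from the `Layered` class above): `bisect.bisect_right` over the `Level` values keeps the shared layer list sorted, so casting layers always end up outermost regardless of insertion order.

```python
import bisect

# Condensed, hypothetical re-implementation of the add_layer ordering in
# Layered: layers are kept sorted by their Level value, so casting layers
# stay outermost no matter the order in which they are added.

class MiniLayer:
    LEVEL = 10  # stand-in for Level.OPERATION

    def __init__(self):
        self._layers = [self]

    def add_layer(self, other):
        levels = [layer.LEVEL for layer in self._layers]
        ind = bisect.bisect_right(levels, other.LEVEL)
        self._layers.insert(ind, other)
        other._layers = self._layers  # all layers share the same list
        return self  # chaining, as in Layered.add_layer

class Root(MiniLayer):
    LEVEL = 0  # stand-in for Level.ROOT

class Rounding(MiniLayer):
    LEVEL = 10  # stand-in for Level.OPERATION

class Casting(MiniLayer):
    LEVEL = 900  # stand-in for Level.ARRAY_CASTING, kept near the end

root = Root()
root.add_layer(Casting()).add_layer(Rounding())
order = [type(layer).__name__ for layer in root._layers]
# order is ["Root", "Rounding", "Casting"]: Rounding slotted in before Casting
```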
12 changes: 6 additions & 6 deletions nevergrad/parametrization/choice.py
@@ -84,12 +84,12 @@ def choices(self) -> container.Tuple:
         """The different options, as a Tuple Parameter"""
         return self["choices"]  # type: ignore

-    def _get_value(self) -> tp.Any:
+    def _layered_get_value(self) -> tp.Any:
         if self._repetitions is None:
             return core.as_parameter(self.choices[self.index]).value
         return tuple(core.as_parameter(self.choices[ind]).value for ind in self.indices)

-    def _set_value(self, value: tp.List[tp.Any]) -> np.ndarray:
+    def _layered_set_value(self, value: tp.List[tp.Any]) -> np.ndarray:
         """Must be adapted to each class
         This handles a list of values, not just one
         """  # TODO this is currently very messy, may need some improvement
@@ -197,8 +197,8 @@ def probabilities(self) -> np.ndarray:
         exp = np.exp(self.weights.value)
         return exp / np.sum(exp)  # type: ignore

-    def _set_value(self, value: tp.Any) -> np.ndarray:
-        indices = super()._set_value(value)
+    def _layered_set_value(self, value: tp.Any) -> np.ndarray:
+        indices = super()._layered_set_value(value)
         self._indices = indices
         # force new probabilities
         arity = self.weights.value.shape[1]
@@ -274,8 +274,8 @@ def __init__(
     def indices(self) -> np.ndarray:
         return np.minimum(len(self) - 1e-9, self.positions.value).astype(int)  # type: ignore

-    def _set_value(self, value: tp.Any) -> np.ndarray:
-        indices = super()._set_value(value)  # only one value for this class
+    def _layered_set_value(self, value: tp.Any) -> np.ndarray:
+        indices = super()._layered_set_value(value)  # only one value for this class
         self._set_index(indices)
         return indices
11 changes: 5 additions & 6 deletions nevergrad/parametrization/container.py
@@ -101,10 +101,9 @@ def _internal_set_standardized_data(
             start = end
         assert end == len(data), f"Finished at {end} but expected {len(data)}"

-    def sample(self: D) -> D:
+    def _layered_sample(self: D) -> D:
         child = self.spawn_child()
         child._content = {k: p.sample() for k, p in self._content.items()}
-        child.heritage["lineage"] = child.uid
         return child

@@ -139,10 +138,10 @@ def items(self) -> tp.ItemsView[str, core.Parameter]:
     def values(self) -> tp.ValuesView[core.Parameter]:
         return self._content.values()

-    def _get_value(self) -> tp.Dict[str, tp.Any]:
+    def _layered_get_value(self) -> tp.Dict[str, tp.Any]:
         return {k: p.value for k, p in self.items()}

-    def _set_value(self, value: tp.Dict[str, tp.Any]) -> None:
+    def _layered_set_value(self, value: tp.Dict[str, tp.Any]) -> None:
         cls = self.__class__.__name__
         if not isinstance(value, dict):
             raise TypeError(f"{cls} value must be a dict, got: {value}\nCurrent value: {self.value}")
@@ -190,10 +189,10 @@ def __iter__(self) -> tp.Iterator[core.Parameter]:

     value: core.ValueProperty[tp.Tuple[tp.Any]] = core.ValueProperty()

-    def _get_value(self) -> tp.Tuple[tp.Any, ...]:
+    def _layered_get_value(self) -> tp.Tuple[tp.Any, ...]:
         return tuple(p.value for p in self)

-    def _set_value(self, value: tp.Tuple[tp.Any, ...]) -> None:
+    def _layered_set_value(self, value: tp.Tuple[tp.Any, ...]) -> None:
         if not isinstance(value, tuple) or not len(value) == len(self):
             cls = self.__class__.__name__
             raise ValueError(
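The container changes above boil down to recursive value composition; a minimal standalone sketch (hypothetical `Mini*` classes, not nevergrad imports) shows the pattern that `Dict._layered_get_value` and `Tuple._layered_get_value` implement.

```python
# Minimal standalone sketch of container value composition (hypothetical
# classes; the real logic is in Dict/Tuple in container.py).

class Leaf:
    def __init__(self, value):
        self.value = value

class MiniDict:
    """Mirrors Dict._layered_get_value: {k: p.value for k, p in self.items()}"""

    def __init__(self, **content):
        self._content = content

    @property
    def value(self):
        return {key: param.value for key, param in self._content.items()}

class MiniTuple:
    """Mirrors Tuple._layered_get_value: tuple(p.value for p in self)"""

    def __init__(self, *content):
        self._content = content

    @property
    def value(self):
        return tuple(param.value for param in self._content)

# Values compose recursively through arbitrary nesting:
nested = MiniDict(x=Leaf(1.0), pair=MiniTuple(Leaf(2), Leaf(3)))
# nested.value == {"x": 1.0, "pair": (2, 3)}
```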