update work on metamodel (#1379)
* Check if proposed metamodel value looks good (#1313)

* tentative fix for master: fixing scikit-image and scikit-learn, and pinning others to >= the currently working version (#1317)

* tentative

* fix

* tentative

* tentative

* pouf

* simplify

* fix

* Add architecture search for Gym (#1315)

* as

* black

* add_Test

* type_fix

* typing

* fix

* fix_bug

* fix_weird

* wtf

* fix

* fix

* fix

* black

* switch_to_choice

* mypy

* fix

* fix

* fix

* fix

* fix

* fix_type

* Adding optimization specifically for RL (#1303)

* Adding optimization specifically for RL

* fix

* fix

* Import benchmarks online for fixing dependencies (#1310)

* fix_dependencies

* fix_tests

* : missing

* fix_lint

* fix

* fix

* fix

* XPs with sparsity on Open AI Gym (#1319)

* XPs with sparsity on Open AI Gym

* Update gymexperiments.py

* Adding state-of-the-art results on OpenAI Gym (#1318)

* Update plotting.py

* fix

* ok_good

* fix

* Adding example for OpenAI Gym (#1320)

* testing more algorithms for permutations (#1321)

* testing more algorithms for permutations

* fix

* Adding yahdlbbbob and simplifying stuff (#1145)

* Update experiments.py

* black

* fix_case_1

* fix

* Update experiments.py

* Add Olympus benchmark (#1190)

* Add olympus function and benchmark

* remove surfaces with error

* Add noise and use average of noise-free surfaces for evaluation_function

* fix static tests

* rename olympus to olympussurfaces

* fix test_core olympus

* replace olympus with olymp in requirements

* fix olymp version in requirements

* black reformatting

* Add discrete and GaussianMixture surfaces

* Add discrete and GaussianMixture surfaces

* Move surfaces in a class variable

* no_need_for_42_because_noiseless

* nowindowsforcarraz

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* bettertest

* Add olympus emulators

* add BayesNeuralNet

* reformat code

* minor fix

* minor fix

* minor fix

* add missing package and correct seed problem

* remove unused packages

* fix silence_tensorflow version

* try after merge

* fix

* gamma_fix

* fix

* fix

* fix_naming

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* Update requirements/bench.txt

Co-authored-by: Jérémy Rapin <[email protected]>

* YEEEEEEEES_it_works

Co-authored-by: ncarraz <[email protected]>
Co-authored-by: Jérémy Rapin <[email protected]>

* Documentation for permutations (#1322)

* change sparsity in gym (#1328)

* change sparsity in gym

* fix

* Unstack ngopt (#1330)

* fix plot (#1332)

* fix plot

* black

* Fix gym: thread safety (#1338)

* fixgym

* fix

* fix

* fix

* Avoid bool check on params (#1349)

* Keep only final optimizer in portfolio (#1350)

* Fix nan in cma (#1347)

* fixnan

* fixnan

* fixnan

* fo

* Keep data in cache for CMA (#1351)

* Add installation instruction for MuJoCo (#1352)

* adding installation instruction for MuJoCo

* raise error

* better error handling

Co-authored-by: Mathurin Videau <[email protected]>

* enable_pickling for NGOpt (#1356)

* Bugfix in scrambled progressive optimization. (#1357)

* Add anisotropic progressive optimization (#1344)

* Adding anisotropic progressive optimization

* black

* Finish adding enable_pickling for all optimizers (Rescaled) (#1358)

* Residual controllers in RL. (#1359)

* Residual controllers in RL.

It's not exactly residual; it initializes with something closer to identity. It works pretty well.

* Update multigym.py

* fix

* fix

* fix

* black

* fix

* seed

* Update test_core.py

* fix

* fix

* fix

* Bump version to 0.4.3.post10 (#1364)

* Removing an incomplete sentence from the doc (#1367)

* Fix broken CI (#1370)

* docs: add GH button in support of Ukraine  (#1369)

* Add the FAO crop model (#1343)

* aquacrop

* fix

* fix

* fix

* Update ac.py

* black

* Update experiments.py (#1361)

* fix

* Update bench.txt

* fix

* fix

* fix

* tentative_pip3

* yet_another_tentative_fix

* yet_another_tentative_fix

* fix

* fix_suffering

* desperate_try

* desperate_try

* desperate_try

* desperate_try

* fix

* desperate_try

* desperate_try

* desperate_try

* desperate_try

* fix

* Update config.yml

* fix

* Update setup.py

* Update main.txt

* fix

* Use up-to-date headers (#1371)

* Add NLOPT as a solver (#1340)

* Update version and changelog to 0.5.0 (#1372)

* Deactivate mutation test in CI (#1374)

* Reduce noise in gym (#1333)

* Reduce noise in gym

* Update multigym.py

* Add comment

Co-authored-by: ncarraz <[email protected]>
Co-authored-by: Jérémy Rapin <[email protected]>
Co-authored-by: mathuvu <[email protected]>
Co-authored-by: Mathurin Videau <[email protected]>
Co-authored-by: Jeremy Reizenstein <[email protected]>
Co-authored-by: Dmitry Vinnik <[email protected]>
7 people authored Mar 11, 2022
1 parent 5cf2d38 commit 503299a
Showing 183 changed files with 998 additions and 366 deletions.
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,8 @@

## main

## 0.5.0 (2022-03-08)

### Breaking changes

- `copy()` method of a `Parameter` does not change the parameter's random state anymore (it used to reset it to `None`) [#1048](https://github.com/facebookresearch/nevergrad/pull/1048)
@@ -70,6 +72,11 @@
[#1197](https://github.com/facebookresearch/nevergrad/pull/1197).
- An interface with [BayesOptim](https://github.com/wangronin/Bayesian-Optimization) optimizers has been added
[#1179](https://github.com/facebookresearch/nevergrad/pull/1179).
- Fix for abnormally slow iterations for large budgets using CMA in a portfolio
[#1350](https://github.com/facebookresearch/nevergrad/pull/1350).
- A new `enable_pickling` option was added to optimizers. This is only necessary for some of them (among which the `scipy`-based optimizers), and comes at the cost of additional memory usage
[#1356](https://github.com/facebookresearch/nevergrad/pull/1356)
[#1358](https://github.com/facebookresearch/nevergrad/pull/1358).
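
As a minimal sketch of the option (assuming the `enable_pickling()` method form referenced in the linked PRs; the optimizer choice, budget, and loss below are illustrative, not part of this commit):

```python
import pickle

import nevergrad as ng

opt = ng.optimizers.NGOpt(parametrization=2, budget=100)
opt.enable_pickling()  # must be requested before optimization starts
candidate = opt.ask()
opt.tell(candidate, float((candidate.value ** 2).sum()))
restored = pickle.loads(pickle.dumps(opt))  # resumes with the same internal state
```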

## 0.4.3 (2021-01-28)

2 changes: 1 addition & 1 deletion README.md
@@ -1,4 +1,4 @@
-[![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
+[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)

# Nevergrad - A gradient-free optimization platform

2 changes: 1 addition & 1 deletion docs/benchmarking.rst
@@ -74,4 +74,4 @@ Functions used for the experiments must derive from :code:`nevergrad.functions.E

See the docstrings for more information, and `arcoating/core.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/functions/arcoating/core.py>`_ and `example.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/benchmark/additional/example.py>`_ for examples.

-If you want your experiment plan to be seedable, be extra careful as to how you handle randomness in the experiment generator, since each individual experiment may be run in any order. See `experiments.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/benchmark/experiments.py>`_ for examples of seedable experiment plans. If you do not care for it. For simplicity's sake, the experiment plan generator is however not required to have a seed parameter (but will not be reproducible in this case).
+If you want your experiment plan to be seedable, be extra careful as to how you handle randomness in the experiment generator, since each individual experiment may be run in any order. See `experiments.py <https://github.com/facebookresearch/nevergrad/blob/main/nevergrad/benchmark/experiments.py>`_ for examples of seedable experiment plans. For simplicity's sake, the experiment plan generator is however not required to have a seed parameter (but will not be reproducible in this case).
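
For reference, here is a minimal seedable experiment plan in the style used throughout this commit's experiments.py (the function choice and budgets are illustrative, and it assumes `create_seed_generator` is importable from `nevergrad.benchmark.xpbase`, as in experiments.py):

.. code-block:: python

    import typing as tp

    from nevergrad.benchmark.xpbase import Experiment, create_seed_generator
    from nevergrad.functions import ArtificialFunction

    def my_plan(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
        # drawing one sub-seed per experiment keeps the plan reproducible
        # even if the individual experiments are run in any order
        seedg = create_seed_generator(seed)
        for budget in [25, 50, 100]:
            func = ArtificialFunction("sphere", block_dimension=10)
            yield Experiment(func, "CMA", budget, num_workers=1, seed=next(seedg))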
5 changes: 5 additions & 0 deletions docs/index.rst
@@ -3,6 +3,10 @@ Nevergrad - A gradient-free optimization platform

.. image:: ./resources/Nevergrad-LogoMark.png

.. image:: https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB
   :alt: Support Ukraine - Help Provide Humanitarian Aid to Ukraine.
   :target: https://opensource.fb.com/support-ukraine

This documentation is a work in progress; feel free to help us update/improve/restructure it!

Quick start
@@ -82,6 +86,7 @@ License
-------

:code:`nevergrad` is released under the MIT license. See `LICENSE <https://github.com/facebookresearch/nevergrad/blob/main/LICENSE>`_ for additional details about it, as well as our `Terms of Use <https://opensource.facebook.com/legal/terms>`_ and `Privacy Policy <https://opensource.facebook.com/legal/privacy>`_.
Copyright © Meta Platforms, Inc.

Indices and tables
------------------
1 change: 1 addition & 0 deletions docs/machinelearning.rst
@@ -250,6 +250,7 @@ Optimization of parameters for reinforcement learning
We do not average evaluations over multiple episodes - the algorithm is in charge of averaging, if need be.
:code:`TBPSA`, based on population-control mechanisms, performs quite well in this case.

If you want to run OpenAI Gym, see `One-line for learning state-of-the-art OpenAI Gym controllers with Nevergrad <https://docs.google.com/document/d/1noubQ_ZTZ4PZeQ1St7Asi1Af02q7k0nRoX_Pipu9ZKs/edit?usp=sharing/>`_.

.. code-block:: python
8 changes: 8 additions & 0 deletions docs/optimization.rst
@@ -185,6 +185,14 @@ Or if you want something more aimed at robustly outperforming random search in h
- Use :code:`ScrHammersleySearchPlusMiddlePoint` (:code:`PlusMiddlePoint` only if you have continuous parameters or good default values for discrete parameters).


Example with permutation
------------------------

SimpleTSP and ComplexTSP are two cases of optimization over a domain of permutations: see the
`example here <https://docs.google.com/document/d/1B5yVOx1H1nnjY3EOf14487hAr8CzwJ9zEkDwQnZ5nbE/edit?usp=sharing>`_.
This is relevant when you optimize a single big permutation; the benchmark also includes cases with many small permutations.
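
As a rough sketch of one standard encoding for such problems (an illustration, not necessarily what the linked example does): represent the permutation as the argsort of a continuous vector, and let a continuous optimizer work on that vector. The cost matrix below is a toy placeholder.

.. code-block:: python

    import numpy as np
    import nevergrad as ng

    rng = np.random.RandomState(0)
    distances = rng.rand(10, 10)  # toy pairwise-cost matrix for a small TSP-like problem

    def tour_cost(x: np.ndarray) -> float:
        perm = np.argsort(x)  # decode the continuous vector into a permutation
        return float(sum(distances[perm[i], perm[(i + 1) % 10]] for i in range(10)))

    optimizer = ng.optimizers.CMA(parametrization=10, budget=1000)
    recommendation = optimizer.minimize(tour_cost)
    print(np.argsort(recommendation.value))  # best permutation found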

Example of chaining, or inoculation, or initialization of an evolutionary algorithm
-----------------------------------------------------------------------------------

4 changes: 2 additions & 2 deletions mypy.ini
@@ -1,9 +1,9 @@
[mypy]

-[mypy-scipy.*,requests,pandas,compiler_gym,compiler_gym.*,gym,gym.*,gym_anm,matplotlib.*,pytest,cma,bayes_opt.*,torchvision.models,torch.*,mpl_toolkits.*,fcmaes.*,tqdm,pillow,PIL,PIL.Image,sklearn.*,pyomo.*,pyproj,IOHexperimenter.*,tensorflow,koncept.models,cv2,imquality,imquality.brisque,lpips,mixsimulator.*,networkx.*,cdt.*,pymoo,pymoo.*,bayes_optim.*]
+[mypy-scipy.*,requests,pandas,compiler_gym,compiler_gym.*,gym,gym.*,gym_anm,matplotlib.*,pytest,cma,bayes_opt.*,torchvision.models,torch.*,mpl_toolkits.*,fcmaes.*,tqdm,pillow,PIL,PIL.Image,sklearn.*,pyomo.*,pyproj,IOHexperimenter.*,tensorflow,koncept.models,cv2,imquality,imquality.brisque,lpips,mixsimulator.*,networkx.*,cdt.*,pymoo,pymoo.*,bayes_optim.*,olympus.*]
ignore_missing_imports = True

-[mypy-nevergrad.functions.rl.agents,torchvision,torchvision.*,nevergrad.functions.games.*,nevergrad.functions.multiobjective.pyhv,nevergrad.optimization.test_doc,,pymoo,pymoo.*,pybullet,pybullet_envs,pybulletgym,pyvirtualdisplay]
+[mypy-nevergrad.functions.rl.agents,torchvision,torchvision.*,nevergrad.functions.games.*,nevergrad.functions.multiobjective.pyhv,nevergrad.optimization.test_doc,,pymoo,pymoo.*,pybullet,pybullet_envs,pybulletgym,pyvirtualdisplay,nlopt,aquacrop.*]
ignore_missing_imports = True
ignore_errors = True

4 changes: 2 additions & 2 deletions nevergrad/__init__.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
@@ -15,4 +15,4 @@
__all__ = ["optimizers", "families", "callbacks", "p", "typing", "errors", "ops"]


__version__ = "0.4.3.post9"
__version__ = "0.5.0"
2 changes: 1 addition & 1 deletion nevergrad/benchmark/__init__.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/__main__.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/additional/example.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/core.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/execution.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
104 changes: 96 additions & 8 deletions nevergrad/benchmark/experiments.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
@@ -21,14 +21,14 @@
from nevergrad.functions.arcoating import ARCoating
from nevergrad.functions import images as imagesxp
from nevergrad.functions.powersystems import PowerSystem
+from nevergrad.functions.ac import NgAquacrop
from nevergrad.functions.stsp import STSP
from nevergrad.functions.rocket import Rocket
from nevergrad.functions.mixsimulator import OptimizeMix
from nevergrad.functions.unitcommitment import UnitCommitmentProblem
from nevergrad.functions import control
from nevergrad.functions import rl
from nevergrad.functions.games import game
-from nevergrad.functions.causaldiscovery import CausalDiscovery
from nevergrad.functions import iohprofiler
from nevergrad.functions import helpers
from .xpbase import Experiment as Experiment
@@ -619,8 +619,8 @@ def yabbob(
    hd: bool = False,
    constraint_case: int = 0,
    split: bool = False,
-    tiny: bool = False,
    tuning: bool = False,
+    reduction_factor: int = 1,
    bounded: bool = False,
    box: bool = False,
) -> tp.Iterator[Experiment]:
@@ -689,8 +689,9 @@ def yabbob(
            [100, 1000, 3000] if hd else ([2, 5, 10, 15] if tuning else ([40] if bounded else [2, 10, 50]))
        )
    ]
-    if tiny:
-        functions = functions[::13]
+
+    assert reduction_factor in [1, 7, 13, 17]  # needs to be a cofactor
+    functions = functions[::reduction_factor]

    # We possibly add constraints.
    max_num_constraints = 4
@@ -735,6 +736,12 @@ def yahdlbbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    return yabbob(seed, hd=True, small=True)


@registry.register
def reduced_yahdlbbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Counterpart of yabbob with HD and low budget."""
    return yabbob(seed, hd=True, small=True, reduction_factor=17)


@registry.register
def yanoisysplitbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Counterpart of yabbob with more budget."""
@@ -782,13 +789,13 @@ def yahdsplitbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
@registry.register
def yatuningbbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Counterpart of yabbob with less budget."""
-    return yabbob(seed, parallel=False, big=False, small=True, tiny=True, tuning=True)
+    return yabbob(seed, parallel=False, big=False, small=True, reduction_factor=13, tuning=True)


@registry.register
def yatinybbob(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Counterpart of yabbob with less budget."""
-    return yabbob(seed, parallel=False, big=False, small=True, tiny=True)
+    return yabbob(seed, parallel=False, big=False, small=True, reduction_factor=13)


@registry.register
@@ -1151,6 +1158,23 @@ def realworld(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
                            yield xp


@registry.register
def aquacrop_fao(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """FAO Crop simulator. Maximize yield."""

    funcs = [NgAquacrop(i, 300.0 + 150.0 * np.cos(i)) for i in range(3, 7)]
    seedg = create_seed_generator(seed)
    optims = get_optimizers("basics", seed=next(seedg))
    for budget in [25, 50, 100, 200, 400, 800, 1600]:
        for num_workers in [1, 30]:
            if num_workers < budget:
                for algo in optims:
                    for fu in funcs:
                        xp = Experiment(fu, algo, budget, num_workers=num_workers, seed=next(seedg))
                        if not xp.is_incoherent:
                            yield xp


@registry.register
def rocket(seed: tp.Optional[int] = None, seq: bool = False) -> tp.Iterator[Experiment]:
"""Rocket simulator. Maximize max altitude by choosing the thrust schedule, given a total thrust.
@@ -1263,6 +1287,52 @@ def neuro_control_problem(seed: tp.Optional[int] = None) -> tp.Iterator[Experime
                    yield xp


@registry.register
def olympus_surfaces(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Olympus surfaces"""
    from nevergrad.functions.olympussurfaces import OlympusSurface

    funcs = []
    for kind in OlympusSurface.SURFACE_KINDS:
        for k in range(2, 5):
            for noise in ["GaussianNoise", "UniformNoise", "GammaNoise"]:
                for noise_scale in [0.5, 1]:
                    funcs.append(OlympusSurface(kind, 10 ** k, noise, noise_scale))

    seedg = create_seed_generator(seed)
    optims = get_optimizers("basics", "noisy", seed=next(seedg))
    for budget in [25, 50, 100, 200, 400, 800, 1600, 3200, 6400, 12800, 25600]:
        for num_workers in [1]:  # , 10, 100]:
            if num_workers < budget:
                for algo in optims:
                    for fu in funcs:
                        xp = Experiment(fu, algo, budget, num_workers=num_workers, seed=next(seedg))
                        if not xp.is_incoherent:
                            yield xp


@registry.register
def olympus_emulators(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Olympus emulators"""
    from nevergrad.functions.olympussurfaces import OlympusEmulator

    funcs = []
    for dataset_kind in OlympusEmulator.DATASETS:
        for model_kind in ["BayesNeuralNet", "NeuralNet"]:
            funcs.append(OlympusEmulator(dataset_kind, model_kind))

    seedg = create_seed_generator(seed)
    optims = get_optimizers("basics", "noisy", seed=next(seedg))
    for budget in [25, 50, 100, 200, 400, 800, 1600, 3200, 6400, 12800, 25600]:
        for num_workers in [1]:  # , 10, 100]:
            if num_workers < budget:
                for algo in optims:
                    for fu in funcs:
                        xp = Experiment(fu, algo, budget, num_workers=num_workers, seed=next(seedg))
                        if not xp.is_incoherent:
                            yield xp


@registry.register
def simple_tsp(seed: tp.Optional[int] = None, complex_tsp: bool = False) -> tp.Iterator[Experiment]:
"""Simple TSP problems. Please note that the methods we use could be applied or complex variants, whereas
Expand All @@ -1274,7 +1344,22 @@ def simple_tsp(seed: tp.Optional[int] = None, complex_tsp: bool = False) -> tp.I
"""
funcs = [STSP(10 ** k, complex_tsp) for k in range(2, 6)]
seedg = create_seed_generator(seed)
optims = get_optimizers("basics", "noisy", seed=next(seedg))
optims = [
"RotatedTwoPointsDE",
"DiscreteLenglerOnePlusOne",
"DiscreteDoerrOnePlusOne",
"DiscreteBSOOnePlusOne",
"AdaptiveDiscreteOnePlusOne",
"GeneticDE",
"RotatedTwoPointsDE",
"DE",
"TwoPointsDE",
"DiscreteOnePlusOne",
"NGOpt38",
"CMA",
"MetaModel",
"DiagonalCMA",
]
for budget in [25, 50, 100, 200, 400, 800, 1600, 3200, 6400, 12800, 25600]:
for num_workers in [1]: # , 10, 100]:
if num_workers < budget:
@@ -1851,6 +1936,9 @@ def pbo_suite(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:

def causal_similarity(seed: tp.Optional[int] = None) -> tp.Iterator[Experiment]:
    """Finding the best causal graph"""
    # pylint: disable=import-outside-toplevel
    from nevergrad.functions.causaldiscovery import CausalDiscovery

    seedg = create_seed_generator(seed)
    optims = ["CMA", "NGOpt8", "DE", "PSO", "RecES", "RecMixES", "RecMutDE", "ParametrizationDE"]
    func = CausalDiscovery()
2 changes: 1 addition & 1 deletion nevergrad/benchmark/exporttable.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
2 changes: 1 addition & 1 deletion nevergrad/benchmark/frozenexperiments.py
@@ -1,4 +1,4 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
+# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
