Logic-synthesis-oriented Bayesian optimisation library developed by Huawei Noah's Ark Lab. It was developed to carry out the experiments reported in BOiLS: Bayesian Optimisation for Logic Synthesis, accepted at the DATE 2022 conference.
Antoine Grosnit, Cedric Malherbe, Rasul Tutunov, Xingchen Wan, Jun Wang, Haitham Bou-Ammar -- Huawei Noah's Ark Lab.
Our experiments were performed on two machines with an Intel Xeon CPU E5-2699 [email protected], 64GB RAM, running Ubuntu 18.04.4 LTS and equipped with one NVIDIA Tesla V100 GPU. All algorithms were implemented in Python 3.7, relying on ABC v1.01.
- Install yosys
sudo apt-get update -y
sudo apt-get install -y yosys
- Create Python 3.7 venv
# Create virtualenv
python3.7 -m venv ./venv
# Activate venv
source venv/bin/activate
# Try installing requirements
pip install -r requirements.txt # if you hit issues with the torch installation, see: https://pytorch.org/get-started/previous-versions/
#----- Begin Graph-RL: to run the Graph-RL experiments you also need to install the following (skip this if you only need BOiLS):
# follow the instructions from: https://github.com/krzhu/abc_py
#----- End Graph-RL -----
- Dataset and results should be stored in the same directory `STORAGE_DIRECTORY`: run `python utils_save.py` and follow the instructions given in the `FileNotFoundError` message. Rerun `python utils_save.py` to check where the data should be saved (`DATA_PATH`) and where the results will be stored.
- Download the circuits from the EPFL Combinatorial Benchmark Suite and put them (only the `*.blif` files are needed) in `DATA_PATH/benchmark_blif/` (not in a subfolder, as the code will look for the circuits directly as `DATA_PATH/benchmark_blif/*.blif`).
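To double-check the placement, the glob pattern above can be reproduced in a few lines of Python. This is an illustrative helper, not part of the repository; it simply lists the circuits the experiments would find (files hidden in subfolders are ignored, matching the behaviour described above):

```python
import glob
import os

def list_benchmark_circuits(data_path: str) -> list:
    """Return the .blif circuit names found under data_path/benchmark_blif/."""
    # The experiments look for circuits directly at DATA_PATH/benchmark_blif/*.blif,
    # so files placed in a subfolder would be silently ignored.
    pattern = os.path.join(data_path, "benchmark_blif", "*.blif")
    return sorted(os.path.basename(p) for p in glob.glob(pattern))

# Example (DATA_PATH is whatever utils_save.py reported on your machine):
print(list_benchmark_circuits(os.environ.get("DATA_PATH", ".")))
```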
To compare with our reported results, run the following in your environment and make sure the output statistics match:
DATA_PATH=... # change with your DATA_PATH
yosys-abc -c "read $DATA_PATH/benchmark_blif/sqrt.blif; strash; balance; rewrite -z; if -K 6; print_stats;"
# Should output:
# top : i/o = 128/ 64 lat = 0 nd = 4005 edge = 19803 aig = 29793 lev = 1023
The code is organised in a modular way, providing a coherent API for all optimisation methods. Third-party libraries used for the baseline implementations can be found in the `resources` directory, while the scripts to run the synthesis-flow optimisation experiments are in the `core` folder. The only exception to this organisation is the DRiLLS algorithm, whose implementation is stored in `DRiLLS`.
BOiLS can be run as shown below to find a sequence of logic synthesis primitives optimising the area / delay of a given circuit (e.g. `log2.blif` from the EPFL benchmark).
python ./core/algos/bo/boils/main_multi_boils.py --designs_group_id log2 --n_parallel 1 \
    --seq_length 20 --mapping fpga --action_space_id extended --ref_abc_seq resyn2 \
    --n_total_evals 200 --n_initial 20 --device 0 --lut_inputs 4 --use_yosys 1 \
    --standardise --ard --acq ei --kernel_type ssk \
    --length_init_discrete_factor .666 --failtol 40 \
    --objective area \
    --seed 0
The meaning of all the parameters is documented in the script ./core/algos/bo/hebo/multi_hebo_exp.sh. We created similar scripts for a wide set of optimisers, as detailed in the following section.
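For context, runs are scored relative to the `resyn2` reference sequence. A minimal sketch of a ratio-based QoR metric of the kind involved (an illustrative formula; the exact objective implemented in the repository may differ) is:

```python
def qor_ratio(area: float, delay: float, ref_area: float, ref_delay: float) -> float:
    """Average of area and delay ratios relative to a reference flow (e.g. resyn2).
    Lower is better; 1.0 matches the reference. Illustrative only, not
    necessarily the exact objective implemented in the repository."""
    return 0.5 * (area / ref_area + delay / ref_delay)

# Hypothetical numbers: a sequence reaching 90% of the reference area
# at equal delay scores below 1.0, i.e. better than the reference.
print(qor_ratio(area=900.0, delay=50.0, ref_area=1000.0, ref_delay=50.0))  # 0.95
```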
To run sequence optimisation using COMBO, you need to download the code of the official implementation and put it in the `./resources/` folder.
cd resources
wget https://github.com/QUVA-Lab/COMBO/archive/refs/heads/master.zip
unzip master.zip
mv COMBO-master/ COMBO
| Algorithm | Implementation | Optimisation script | Comment |
|---|---|---|---|
| **Reinforcement Learning** | | | |
| DRiLLS | ./DRiLLS | ./DRiLLS/drills_script.sh | Implementation taken from the official DRiLLS repository, adapted to run with PPO and A2C update rules using the stable-baselines library. |
| Graph-RL | ./resources/abcRL | ./core/algos/GRiLLS/multi_grills_exp.sh | Implementation taken from the official abcRL repository. The reward function has been changed so that agents optimise the QoR improvement on both area and delay. |
| **Bayesian optimisation** | | | |
| Standard BO | ./core/algos/bo/hebo | ./core/algos/bo/hebo/multi_hebo_exp.sh | Implementation taken from HEBO. |
| COMBO | ./core/algos/bo/combo | ./core/algos/bo/boils/multi_combo_exp.sh | Uses the official COMBO implementation. |
| BOiLS | ./core/algos/bo/boils | ./core/algos/bo/boils/multiseq_boils_exp.sh | Adaptation of the Casmopolitan implementation using a string-subsequence kernel (SSK) in the surrogate model. The SSK is a PyTorch rewriting of the BOSS implementation. |
| **Genetic Algorithm** | | | |
| Simple Genetic Algorithm | ./core/algos/genetic/sga | ./core/algos/genetic/sga/multi_sga_exp.sh | Uses the simple genetic algorithm from geneticalgorithm2. |
| **Random Search** | | | |
| Latin Hypercube Sampling (LHS) | ./core/algos/random | ./core/algos/random/multi_random_exp.sh | Uses LHS from pymoo. |
| **Greedy Search** | | | |
| Greedy optimisation | ./core/algos/greedy | ./core/algos/greedy/main_greedy_exp.sh | Implementation from scratch. |
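As an illustration of the greedy strategy listed above, a from-scratch greedy search builds the sequence one primitive at a time, keeping at each step the primitive that most improves the objective. The sketch below uses a synthetic toy objective standing in for a real synthesis run (which would evaluate area/delay via ABC), and is not the repository's actual code:

```python
from typing import Callable, List, Sequence

def greedy_sequence(primitives: Sequence[str],
                    seq_length: int,
                    objective: Callable[[List[str]], float]) -> List[str]:
    """Greedily extend the sequence, at each step appending the primitive
    that minimises the objective (lower is better)."""
    seq: List[str] = []
    for _ in range(seq_length):
        best = min(primitives, key=lambda p: objective(seq + [p]))
        seq.append(best)
    return seq

# Toy stand-in objective: penalise repeated adjacent primitives and
# mildly reward "rewrite" (purely illustrative, not a real QoR).
def toy_objective(seq: List[str]) -> float:
    return sum(1.0 for a, b in zip(seq, seq[1:]) if a == b) - 0.1 * seq.count("rewrite")

print(greedy_sequence(["rewrite", "balance", "refactor"], 4, toy_objective))
# -> ['rewrite', 'balance', 'rewrite', 'balance']
```

In the real setting each `objective` call is expensive (a full synthesis run), which is why greedy search evaluates only `len(primitives) * seq_length` sequences instead of exploring the whole space.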
Grosnit, Antoine, et al. "BOiLS: Bayesian Optimisation for Logic Synthesis", DATE (2022).
@inproceedings{DBLP:conf/date/GrosnitMTWWB22,
author = {Antoine Grosnit and
C{\'{e}}dric Malherbe and
Rasul Tutunov and
Xingchen Wan and
Jun Wang and
Haitham Bou{-}Ammar},
editor = {Cristiana Bolchini and
Ingrid Verbauwhede and
Ioana Vatajelu},
title = {BOiLS: Bayesian Optimisation for Logic Synthesis},
booktitle = {2022 Design, Automation {\&} Test in Europe Conference {\&}
Exhibition, {DATE} 2022, Antwerp, Belgium, March 14-23, 2022},
pages = {1193--1196},
publisher = {{IEEE}},
year = {2022},
url = {https://doi.org/10.23919/DATE54114.2022.9774632},
doi = {10.23919/DATE54114.2022.9774632},
timestamp = {Wed, 25 May 2022 22:56:20 +0200},
biburl = {https://dblp.org/rec/conf/date/GrosnitMTWWB22.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
- Stable baselines: A. Hill, A. Raffin et al., "Stable Baselines," https://github.com/hill-a/stable-baselines, 2018.
- DRiLLS: A. Hosny, S. Hashemi et al., "DRiLLS: Deep Reinforcement Learning for Logic Synthesis," 25th Asia and South Pacific Design Automation Conference (ASP-DAC), 2020.
- abcRL: K. Zhu et al., "Exploring Logic Optimizations with Reinforcement Learning and Graph Convolutional Network," Proceedings of the 2020 ACM/IEEE Workshop on Machine Learning for CAD, 2020.
- HEBO: A. Cowen-Rivers et al., "An Empirical Study of Assumptions in Bayesian Optimisation," arXiv preprint arXiv:2012.03826, 2020.
- Casmopolitan: X. Wan et al., "Think Global and Act Local: Bayesian Optimisation over High-Dimensional Categorical and Mixed Search Spaces," International Conference on Machine Learning (ICML), 2021.
- BOSS: H. B. Moss et al., "BOSS: Bayesian Optimization over String Spaces," NeurIPS, 2020.
- geneticalgorithm2: D. Pascal, "geneticalgorithm2 (v.6.2.12)," https://github.com/PasaOpasen/geneticalgorithm2, 2021.
- pymoo: J. Blank and K. Deb, "pymoo: Multi-Objective Optimization in Python," IEEE Access, 2020.