In this assignment, you are required to use an open-source AutoML toolkit (NNI) for neural architecture search. You are asked to compare two NAS methods: Efficient Neural Architecture Search via Parameter Sharing (ENAS) [1] and DARTS: Differentiable Architecture Search [2].
[1] Pham, H., Guan, M., Zoph, B., Le, Q. and Dean, J., 2018, July. Efficient neural architecture search via parameters sharing. In International Conference on Machine Learning (pp. 4095-4104). PMLR.
[2] Liu, H., Simonyan, K. and Yang, Y., 2018. Darts: Differentiable architecture search. arXiv preprint arXiv:1806.09055.
- Install the NNI toolkit following its installation instructions.
- Perform CNN architecture search on CIFAR-10 (the training and test datasets are split automatically). You need to compare ENAS and DARTS; both methods require retraining after the search.
- Implement retrain.py for the ENAS method. Please refer to the code.
- Please record the top-1 test accuracy against GPU hours. You can present the results in a figure (see Figure 3 of [2] for reference); a plotting sketch is given after this list. Single-trial experiments are fine. Please use the same GPU type for a fair comparison.
- Show the final architecture for each method. Bonus: check the visualization methods in the NNI documentation.
- You are required to submit a report that shows the comparison results, discusses them, and analyzes the limitations of ENAS and DARTS.
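The following is a minimal plotting sketch for the accuracy-versus-GPU-hours figure requested above. The log format (a CSV per method with `gpu_hours` and `top1` columns, and the file names `enas_log.csv` / `darts_log.csv`) is an assumption; adapt the loader to however you record your runs.

```python
# Sketch: plot top-1 test accuracy vs. GPU hours for ENAS and DARTS,
# in the spirit of Figure 3 of the DARTS paper.
# Assumed (hypothetical) log format: CSV with columns gpu_hours, top1.
import csv
import matplotlib.pyplot as plt

def load_log(path):
    hours, acc = [], []
    with open(path) as f:
        for row in csv.DictReader(f):
            hours.append(float(row["gpu_hours"]))
            acc.append(float(row["top1"]))
    return hours, acc

for name, path in [("ENAS", "enas_log.csv"), ("DARTS", "darts_log.csv")]:
    hours, acc = load_log(path)
    plt.plot(hours, acc, marker="o", label=name)

plt.xlabel("GPU hours")
plt.ylabel("Top-1 test accuracy (%)")
plt.legend()
plt.savefig("accuracy_vs_gpu_hours.png", dpi=200)
```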
Python >= 3.6, PyTorch >= 1.7.0
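A quick sanity check of the environment (nothing NNI-specific; it just confirms the versions above and GPU availability):

```python
# Print Python / PyTorch versions and CUDA availability.
import sys
import torch

print("Python :", sys.version.split()[0])
print("PyTorch:", torch.__version__)
print("CUDA   :", torch.cuda.is_available())
```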
# search the best architecture
cd examples/nas/darts
python3 search.py --v1 --visualization
# train the best architecture
python3 retrain.py --arc-checkpoint ${Your saved checkpoint}
cd examples/nas/enas/
# search in micro search space, where each unit is a cell
python3 search.py --search-for micro --v1 --visualization
You need to implement the retraining code for ENAS first and then run it. You can refer to retrain.py of DARTS or this repo; a minimal sketch follows.
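Below is a minimal sketch of what retrain.py for ENAS could look like, not a reference implementation. It assumes the MicroNetwork class from the ENAS example's micro.py, the legacy v1 helper apply_fixed_architecture (its import path can differ between NNI versions; check the DARTS retrain.py in your NNI checkout), and an architecture JSON checkpoint produced by search.py. Constructor arguments, hyperparameters, and augmentation are placeholders.

```python
# retrain.py for ENAS -- a minimal sketch under stated assumptions.
# Verify every import against your NNI version.
import argparse

import torch
import torch.nn as nn
from torchvision import datasets, transforms

# Assumption: legacy v1 NAS API; the import path of apply_fixed_architecture
# may differ between NNI versions (see the DARTS retrain.py in your checkout).
from nni.nas.pytorch.fixed import apply_fixed_architecture
# Assumption: model definition from the ENAS example's micro.py; constructor
# arguments must match whatever search.py used.
from micro import MicroNetwork


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--arc-checkpoint", required=True,
                        help="architecture JSON saved during the search phase")
    parser.add_argument("--epochs", type=int, default=150)
    args = parser.parse_args()

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Standard CIFAR-10 preprocessing; add Cutout etc. if you follow the papers.
    normalize = transforms.Normalize((0.4914, 0.4822, 0.4465),
                                     (0.2470, 0.2435, 0.2616))
    train_set = datasets.CIFAR10("./data", train=True, download=True,
                                 transform=transforms.Compose([
                                     transforms.RandomCrop(32, padding=4),
                                     transforms.RandomHorizontalFlip(),
                                     transforms.ToTensor(), normalize]))
    test_set = datasets.CIFAR10("./data", train=False, download=True,
                                transform=transforms.Compose([
                                    transforms.ToTensor(), normalize]))
    train_loader = torch.utils.data.DataLoader(train_set, batch_size=96,
                                               shuffle=True, num_workers=4)
    test_loader = torch.utils.data.DataLoader(test_set, batch_size=256,
                                              num_workers=4)

    # Build the search-space model, then fix it to the searched architecture.
    model = MicroNetwork()
    apply_fixed_architecture(model, args.arc_checkpoint)
    model.to(device)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.05,
                                momentum=0.9, weight_decay=3e-4)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer,
                                                           T_max=args.epochs)

    for epoch in range(args.epochs):
        model.train()
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            out = model(x)
            if isinstance(out, tuple):  # some example models also return aux logits
                out = out[0]
            loss = criterion(out, y)
            loss.backward()
            optimizer.step()
        scheduler.step()

        # Report top-1 test accuracy each epoch (used for the GPU-hours figure).
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in test_loader:
                x, y = x.to(device), y.to(device)
                out = model(x)
                if isinstance(out, tuple):
                    out = out[0]
                correct += (out.argmax(dim=1) == y).sum().item()
                total += y.size(0)
        print(f"epoch {epoch}: top-1 test accuracy = {100.0 * correct / total:.2f}%")


if __name__ == "__main__":
    main()
```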
Remark: you may encounter an error with the .view function in utils.py. Changing it to .reshape will fix it.
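For reference, here is a self-contained sketch of such a top-k accuracy helper with the fix applied (your utils.py may differ slightly):

```python
import torch

def accuracy(output, target, topk=(1,)):
    """Top-k accuracy. Note .reshape(-1) instead of .view(-1): the slice of the
    transposed prediction tensor is non-contiguous, which is what triggers the
    .view error mentioned above; .reshape copies when necessary."""
    maxk = max(topk)
    batch_size = target.size(0)
    _, pred = output.topk(maxk, 1, True, True)
    pred = pred.t()
    correct = pred.eq(target.view(1, -1).expand_as(pred))
    res = []
    for k in topk:
        correct_k = correct[:k].reshape(-1).float().sum(0)
        res.append(correct_k.mul_(100.0 / batch_size))
    return res
```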