Home
Using nbconvert:
jupyter nbconvert --to notebook --inplace --execute --ExecutePreprocessor.timeout=1200 notebooks/*
This executes all notebooks in-place, overwriting the previous state of the output cells.
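If the originals should be preserved, a non-destructive variant (a sketch; `--output-dir` is a standard nbconvert option, and the target directory name here is arbitrary) writes the executed copies elsewhere:

```bash
# Execute the notebooks but write the executed copies to a separate directory,
# leaving the originals untouched ("executed" is just an example name)
jupyter nbconvert --to notebook --execute \
    --ExecutePreprocessor.timeout=1200 \
    --output-dir executed notebooks/*.ipynb
```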
Another option is jupyter-cache, which caches the executed state in a separate location and intelligently determines whether it needs to be updated. jupyter-cache is used by jupyter-book, which is in turn used by nbmake. nbmake is a pytest plugin, so notebooks can be executed in parallel (via pytest-xdist, as below), and, through jupyter-book, it offers some extra control over the generated notebook HTML using notebook cell metadata. One way to configure the caching is sketched below.
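A minimal sketch, assuming the book's `_config.yml` sits at the repository root and does not already contain an `execute` section; `execute_notebooks: cache` is the documented jupyter-book setting that enables jupyter-cache:

```bash
# Sketch: tell jupyter-book to execute notebooks via jupyter-cache, so that
# only notebooks whose source has changed are re-run on the next build
cat >> _config.yml <<'EOF'
execute:
  execute_notebooks: cache   # alternatives: auto, force, "off"
  timeout: 1200              # per-cell timeout in seconds, matching the nbconvert call above
EOF
```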
A non-automated initial Jupyter Book is at https://vigilant-kirch-b18653.netlify.app; it is transitioning to swarm.magneticearth.org as its permanent location.
Recipe to build the book locally:
git clone https://github.com/Swarm-DISC/Swarm_notebooks.git
cd Swarm_notebooks
docker run -v $(pwd):/home/jovyan -u root -e VIRES_TOKEN=$VIRES_TOKEN \
registry.gitlab.eox.at/esa/vires_vre_ops/vre-swarm-notebook:0.8.0 \
bash -c \
'
export CDF_LIB=/opt/conda/lib && \
pip install --upgrade viresclient && \
viresclient set_token https://vires.services/ows $VIRES_TOKEN && \
viresclient set_default_server https://vires.services/ows && \
pip install jupyter-book --ignore-installed && \
jupyter-book build .
'
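By default jupyter-book writes the generated HTML to _build/html; to inspect the result locally, any static file server will do, for example (a sketch):

```bash
# Serve the generated book at http://localhost:8000 for inspection
python -m http.server --directory _build/html 8000
```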
Experiment with using nbmake to benefit from parallelisation:
pip install nbmake[html] pytest-xdist[psutil] jupyter-book
pytest --numprocesses 2 --nbmake notebooks/*.ipynb --deselect=notebooks/04c1_Geomag-Ground-Data-FTP.ipynb
This should store the executed state in jupyter-cache (to be confirmed); then use
jupyter-book build .
to build the book from the cache. Running
jupyter-book clean .
may be needed to fix things between builds. If the cache does not work, the execution step could instead be
pytest --numprocesses 2 --nbmake --overwrite
so that the executed outputs are written back into the notebooks themselves; see the sketch below.
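A possible fallback sequence, using only commands already mentioned above (a sketch: `--overwrite` makes nbmake write the executed outputs back into the source notebooks, and in that case `execute_notebooks` in `_config.yml` would presumably be set to "off" so that jupyter-book does not re-execute anything):

```bash
# Execute notebooks in parallel and write the outputs back into the .ipynb files
pytest --numprocesses 2 --nbmake --overwrite notebooks/*.ipynb \
    --deselect=notebooks/04c1_Geomag-Ground-Data-FTP.ipynb
# Start from a clean state, then build the book from the already-executed notebooks
jupyter-book clean .
jupyter-book build .
```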