Port GH actions and other fixes to master #3061

Status: Closed · wants to merge 7 commits
25 changes: 25 additions & 0 deletions .travis.yml → .disabled-travis.yml
@@ -111,6 +111,31 @@ jobs:
INSTALL_HOLE="false"
CODECOV="true" SETUP_CMD="${PYTEST_FLAGS} --cov=MDAnalysis"

- os: linux
Member: Are we really planning to bring aarch64 support to 1.0.x? Given it's still experimental (and I think it requires as-of-yet unreleased numpy versions), maybe it's best to leave it out for now at least?

Member: Perhaps @tylerjereddy has a better view of whether it fits in a near-future 1.0.1 release?

  language: python
  arch: arm64-graviton2
  python:
    - "3.7"
  dist: focal
  virt: vm
  group: edge
  before_install:
    - python -m pip install cython numpy
    # special test SciPy wheel for ARM64:
    - python -m pip install https://anaconda.org/multibuild-wheels-staging/scipy/1.6.0.dev0+a240c17/download/scipy-1.6.0.dev0+a240c17-cp37-cp37m-manylinux2014_aarch64.whl
    - python -m pip install --no-build-isolation hypothesis matplotlib pytest pytest-cov pytest-xdist tqdm
  install:
    - cd package
    - python setup.py install
    - cd ../testsuite
    - python setup.py install
    - cd ..
  script:
    - cd testsuite
    - python -m pytest ./MDAnalysisTests --disable-pytest-warnings -n 8 -rsx --cov=MDAnalysis
  after_success:
    - echo "Override this stage for ARM64"

allow_failures:
  - env: NUMPY_VERSION=dev EVENT_TYPE="cron"

357 changes: 357 additions & 0 deletions .github/workflows/gh-ci.yaml
@@ -0,0 +1,357 @@
name: mda_gh_ci
on:
  push:
    branches:
      - develop
      - master
  pull_request:
    branches:
      - develop
      - master

defaults:
  run:
    shell: bash -l {0}

env:
  MDA_CONDA_MIN_DEPS: "pip pytest mmtf-python biopython networkx cython matplotlib scipy griddataformats hypothesis gsd codecov"
  MDA_CONDA_EXTRA_DEPS: "seaborn"
  MDA_CONDA_PY3_DEPS: "netcdf4 scikit-learn joblib>=0.12 clustalw=2.1 chemfiles mock"
  MDA_CONDA_PY27_DEPS: "clustalw=2.1 netcdf4=1.3.1 six=1.15.0 funcsigs<=1.0.2 mock<=3.0.5"
  MDA_CONDA_PY35_DEPS: "netcdf4 scikit-learn joblib>=0.12 clustalw=2.1 mock<=3.0.5"
  MDA_PIP_MIN_DEPS: 'coveralls coverage<5 pytest-cov pytest-xdist'
  MDA_PIP_EXTRA_DEPS: 'duecredit parmed'
  MDA_PIP_PY27_DEPS: "setuptools<45.0.0 chemfiles scikit-learn joblib>=0.12"
  MDA_PIP_PY3_DEPS: ""
  MDA_PIP_PY35_DEPS: ""
Comment on lines +17 to +26

Member Author: This and adding the 2.7 and 3.5 jobs are the primary changes I made. Py3.5 and 2.7 seemed to require a lot of pinning in GH Actions for conda to resolve dependencies.

Member: If we have to pin versions, should this be reflected in our setup (or maybe it is)? We probably want to avoid further matplotlib-like issues in 1.0.1.

Member Author: So conda never refused to resolve dependencies in the 1-2 hours I would let it run. In addition, on both my linux and mac machines it resolved the original unpinned versions easily within a couple of minutes. Moreover, Travis didn't require pinning. I'm more inclined to think that this is a GH Actions quirk than anything else. On the other hand, seeing as we know those versions work, I am up for it.

Member: To be honest this sounds more like miniconda trickery than GH issues :/ I don't know, I guess we can't release a new version that goes beyond tested limits.


jobs:
  main_tests:
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu-latest, ]
        python-version: [3.6, 3.7, 3.8]
lilyminium marked this conversation as resolved.
        extra_deps: ["EXTRA PY3", ]
        install_hole: [true, ]
        codecov: [true, ]
        conda-version: ["latest", ]
        include:
          - name: python-2.7
            os: ubuntu-latest
            python-version: 2.7
            install_hole: true
            codecov: true
            numpy: numpy=1.16
            extra_deps: "EXTRA PY27"
            conda-version: 4.7.10
Member Author: Pinning conda didn't really help in resolving dependencies, but I found astropy's reason for pinning persuasive (conda is not always so stable from update to update).

Member: Given that the old CI behaviour was just to pin everything, wouldn't it be cleaner to enforce this as a blanket thing for the entire matrix?

Member Author: I think it broke for a 3.7 job?

Member: How odd, is this the same version that the ci helpers were pinning to?

          - name: python-3.5
            os: ubuntu-latest
            python-version: 3.5
            install_hole: true
            codecov: true
            extra_deps: "EXTRA PY35"
          - name: macOS
            os: macOS-latest
            python-version: 3.7
            run_type: FULL
            install_hole: true
            codecov: true
          - name: minimal-ubuntu
            os: ubuntu-latest
            python-version: 3.6
            extra_deps: ""
Member: This has an empty extra_deps, but macOS doesn't? Can we get away with nothing here?

            install_hole: false
            codecov: true
          - name: numpy_min
            os: ubuntu-latest
            python-version: 3.6
            extra_deps: ""
            install_hole: false
            codecov: false
            numpy: numpy=1.16.0
          - name: asv_check
            os: ubuntu-latest
            python-version: 3.7
            extra_deps: "EXTRA PY3"
            install_hole: false
            codecov: false
    env:
      CYTHON_TRACE_NOGIL: 1
      MPLBACKEND: agg
      GH_OS: ${{ matrix.os }}

    steps:
      - uses: actions/checkout@v2

      - name: setup_osx
        if: startsWith(matrix.os, 'macOS')
        run: |
          # Set OS specific vars and compiler flags
          echo "OS_NAME=osx" >> $GITHUB_ENV
          # TODO: work out why this is necessary (from CI helpers)
          echo "MACOSX_DEPLOYMENT_TARGET=10.9" >> $GITHUB_ENV
          ulimit -S -n 2048
          clang -v
          echo "CC=clang" >> $GITHUB_ENV
          clang++ -v
          echo "CXX=clang++" >> $GITHUB_ENV
          gfortran-9 -v
          echo "FC=gfortran-9" >> $GITHUB_ENV

      - name: setup_linux
        if: startsWith(matrix.os, 'ubuntu')
        run: |
          # Set OS specific vars and compiler flags
          echo "OS_NAME=linux" >> $GITHUB_ENV
          gcc -v
          echo "CC=gcc" >> $GITHUB_ENV
          g++ -v
          echo "CXX=g++" >> $GITHUB_ENV
          gfortran -v
          echo "FC=gfortran" >> $GITHUB_ENV

      - name: setup_miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          python-version: ${{ matrix.python-version }}
          auto-update-conda: true
          channel-priority: flexible
          channels: biobuilds, conda-forge, defaults
          add-pip-as-python-dependency: true
          miniconda-version: ${{ matrix.conda-version }}
          # TODO: mamba causes pip to segfault, switch when fixed
          #mamba-version: "*"
          architecture: x64

      - name: install_deps
        run: |
          # NOTE: vars need to be re-assigned
          # NOTE: set matrix.numpy to pin to a specific numpy version
          conda_deps="${{ matrix.numpy }} ${MDA_CONDA_MIN_DEPS}"
          pip_deps="${MDA_PIP_MIN_DEPS}"

          for dep in ${{ matrix.extra_deps }}; do
            conda_var="MDA_CONDA_${dep}_DEPS"
            pip_var="MDA_PIP_${dep}_DEPS"
            echo "${!conda_var}"
            conda_deps="$conda_deps ${!conda_var}"
            echo $conda_deps
            pip_deps="$pip_deps ${!pip_var}"
          done
Member Author: I changed this part from @IAlibay's original setup because setting up all the environment variables was a little bit of a pain. I am however happy to go back to the original construct now that we've figured out which dependencies to list.

Member: I don't particularly think we should push for keeping things the same as what they are on develop (they are both completely different kettles of fish).

That being said, at least to me, the dependency list is getting quite bloated/hard to follow.

As a thought here, is there maybe a way we could improve readability/maintainability by splitting up the job into multiple versions of it? It would mean a lot of duplication, but if it comes with increased readability, I personally think it might be a good tradeoff.

Member Author: Could do -- we could also have environment.yml files for pythons 2.7, 3.5, fully featured, and minimal. That also saves users from having to work this stuff out.

Member: So the reason I chose to avoid env files in develop is that they cause really big slowdowns for some reason; conda setup time was going from ~3 mins to ~15 mins. That and mamba 100% hates pip in environment files.

I'd prefer to stay away from them on develop, but if we need them to clean up the matrix here, I don't have any objections (as long as CI actually completes).
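For reference, a minimal sketch of what one such per-version environment file could look like (the file name, channel list, and pins below are illustrative, not the project's actual dependency spec):

```yaml
# hypothetical environment-py35.yml -- illustrative only
name: mda-py35
channels:
  - conda-forge
dependencies:
  - python=3.5
  - numpy
  - netcdf4
  - mock<=3.0.5
  - pip
  - pip:
      - pytest-xdist
```

A file like this would be consumed with `conda env create -f environment-py35.yml`; as noted above, the `pip:` subsection is the part that reportedly interacts badly with mamba.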


          conda install ${conda_deps}
          pip install ${pip_deps}

          # also install asv if required
          if [ ${{ matrix.name }} = "asv_check" ]; then
            pip install asv
          fi
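The dependency loop in install_deps relies on bash indirect expansion. A small standalone sketch of the same pattern (the variable values here are illustrative, trimmed from the real lists):

```shell
# Each matrix token (e.g. "EXTRA", "PY3") names an environment variable
# indirectly: first build the variable *name*, then expand it with ${!name}.
MDA_CONDA_EXTRA_DEPS="seaborn"
MDA_CONDA_PY3_DEPS="netcdf4 scikit-learn"

conda_deps="numpy"
for dep in EXTRA PY3; do
  conda_var="MDA_CONDA_${dep}_DEPS"       # e.g. "MDA_CONDA_EXTRA_DEPS"
  conda_deps="$conda_deps ${!conda_var}"  # append that variable's value
done
echo "$conda_deps"   # -> numpy seaborn netcdf4 scikit-learn
```

Note that `${!name}` is a bashism, which is why the workflow sets `shell: bash -l {0}` in its defaults.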

      - name: check_setup
        run: |
          # Check OS and python setup
          echo "OS: ${OS_NAME}"
          which python
          which pip
          pip list
          conda info
          conda list

      - name: install_hole
        if: matrix.install_hole
        run: |
          # We manually build hole2 to avoid OS incompatibilities
          git clone https://github.com/MDAnalysis/hole2.git
          cd hole2/src
          source ../source.apache
          (make FC=${FC}) && (make PREFIX=${HOME}/hole2 FC=${FC} install)
          source ../source.unset
          echo "HOLE_BINDIR=${HOME}/hole2/bin" >> $GITHUB_ENV
          echo "${HOME}/hole2/bin" >> $GITHUB_PATH

      - name: install_mda
        run: |
          # TODO: using install instead of develop here causes coverage to drop
          # for .pyx files. If possible a solution for this should be found.
          (cd package/ && python setup.py develop) && (cd testsuite/ && python setup.py install)

      - name: run_tests
        if: contains(matrix.name, 'asv_check') != true
        run: |
          PYTEST_FLAGS="--disable-pytest-warnings --durations=50"
          if [ ${{ matrix.codecov }} = "true" ]; then
            PYTEST_FLAGS="${PYTEST_FLAGS} --cov=MDAnalysis --cov-report=xml"
          fi
          echo $PYTEST_FLAGS
          pytest -n 2 testsuite/MDAnalysisTests $PYTEST_FLAGS

      - name: run_asv
        if: contains(matrix.name, 'asv_check')
        run: |
          cd benchmarks
          time python -m asv check -E existing

      - name: codecov
        if: matrix.codecov
        uses: codecov/codecov-action@v1
        with:
          file: coverage.xml
          fail_ci_if_error: True
          verbose: True


  build_docs:
    if: "github.repository == 'MDAnalysis/mdanalysis'"
    runs-on: ubuntu-latest
    env:
      CYTHON_TRACE_NOGIL: 1
      MPLBACKEND: agg

    steps:
      - uses: actions/checkout@v2

      - name: setup_miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          python-version: 3.7
          auto-update-conda: true
          channel-priority: flexible
          channels: biobuilds, conda-forge
          add-pip-as-python-dependency: true
          architecture: x64

      - name: install_deps
        run: |
          conda_deps="${{ env.MDA_CONDA_MIN_DEPS }} ${{ env.MDA_CONDA_EXTRA_DEPS }}"
          pip_deps="${{ env.MDA_PIP_MIN_DEPS }} ${{ env.MDA_PIP_EXTRA_DEPS }} sphinx==1.8.5 sphinx-sitemap sphinx_rtd_theme msmb_theme==1.2.0"
          conda install ${conda_deps}
          pip install ${pip_deps}

      - name: install_mda
        run: |
          cd package && python setup.py develop

      - name: build_docs
        run: |
          cd package && python setup.py build_sphinx -E

      - name: deploy_docs
        if: github.event_name != 'pull_request'
        env:
          GH_USER: github-actions
          GH_EMAIL: "[email protected]"
          GH_REPOSITORY: "github.com/${{ github.repository }}.git"
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          URL: https://docs.mdanalysis.org
        run: |
          # set up environment variables
          # cannot execute bash to make variables in env section
          # export URL for the Python script $UPDATE_JSON
          export URL
          export VERSION=$(cd package/MDAnalysis; python -c 'import version; print(version.__version__)')
          UPDATE_JSON=$(pwd)/maintainer/update_json_stubs_sitemap.py
          BRANCH="${GITHUB_REF#refs/heads/}"

          # the below turns off non-blocking as it causes large writes to stdout to fail
          # (see https://github.com/travis-ci/travis-ci/issues/4704)
          # commented out as this is not a problem with gh-actions
          # python -c 'import os,sys,fcntl; flags = fcntl.fcntl(sys.stdout, fcntl.F_GETFL); fcntl.fcntl(sys.stdout, fcntl.F_SETFL, flags&~os.O_NONBLOCK);'
          cd package/doc/html/html

          # move docs into version subfolder
          mkdir ../${VERSION} && mv * ../${VERSION} && mv ../${VERSION} $VERSION

          # set up git
          REV=$(git rev-parse --short HEAD)
          git init
          git config user.name $GH_USER
          git config user.email $GH_EMAIL
          git remote add upstream "https://${GH_USER}:${GH_TOKEN}@${GH_REPOSITORY}"
          git fetch --depth 50 upstream $BRANCH gh-pages
          git reset upstream/gh-pages

          # redirects and copies
          mkdir latest
          python $UPDATE_JSON
          touch .
          touch .nojekyll

          git add -A ${VERSION}/
          git add .nojekyll versions.json *.xml *.html index.html latest

          for dirname in dev stable documentation_pages ; do
            if [ -d $dirname ]; then git add $dirname; fi
          done

          # check for anything to commit
          # https://stackoverflow.com/questions/3878624/how-do-i-programmatically-determine-if-there-are-uncommited-changes
          git diff-index --quiet HEAD -- || git commit -m "rebuilt html docs for version ${VERSION} from branch ${BRANCH} with sphinx at ${REV}"
          git push -q upstream HEAD:gh-pages
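One detail of the deploy script worth noting: the branch name is recovered from `GITHUB_REF` with bash prefix removal. A minimal sketch (the example ref value below is made up; in CI, GitHub Actions sets `GITHUB_REF`):

```shell
# "${var#pattern}" strips the shortest match of pattern from the front of var,
# so a ref like "refs/heads/master" reduces to the bare branch name.
GITHUB_REF="refs/heads/master"      # illustrative; provided by the runner in CI
BRANCH="${GITHUB_REF#refs/heads/}"
echo "$BRANCH"   # -> master
```

For tag events (`refs/tags/...`) the prefix would not match and `BRANCH` would keep the full ref, which is fine here since the job only runs on branch pushes.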


  pylint_check:
    if: "github.repository == 'MDAnalysis/mdanalysis'"
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: setup_miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          python-version: 3.7
          auto-update-conda: true
          channel-priority: flexible
          channels: conda-forge
          add-pip-as-python-dependency: true
          mamba-version: "*"
          architecture: x64

      - name: install
        run: |
          which pip
          which python
          pip install pylint

      - name: pylint
        env:
          PYLINTRC: package/.pylintrc
        run: |
          pylint --py3k package/MDAnalysis && pylint --py3k testsuite/MDAnalysisTests


  pypi_check:
    if: "github.repository == 'MDAnalysis/mdanalysis'"
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: setup_miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          python-version: 3.7
          auto-update-conda: true
          channel-priority: flexible
          channels: conda-forge
          add-pip-as-python-dependency: true
          mamba-version: "*"
          architecture: x64

      - name: install_conda
        run: |
          conda install setuptools cython numpy twine

      - name: install_mdanalysis
        run: |
          cd package && python setup.py sdist

      - name: check_package_build
        run: |
          DISTRIBUTION=$(ls -t1 package/dist/MDAnalysis-*.tar.gz | head -n 1)
          test -n "${DISTRIBUTION}" || { echo "no distribution package/dist/MDAnalysis-*.tar.gz found"; exit 1; }
          echo "twine check $DISTRIBUTION"
          twine check $DISTRIBUTION
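The check_package_build step uses a "newest matching file plus guard" idiom: pick the most recent sdist and fail loudly if the glob matched nothing. A standalone sketch of the same pattern (the temp directory and file name are illustrative, not the real `package/dist` output):

```shell
# Create a throwaway directory with one fake sdist to demonstrate the idiom.
demo_dist=$(mktemp -d)
touch "${demo_dist}/MDAnalysis-1.0.0.tar.gz"

# ls -t1 sorts newest-first, one per line; head -n 1 keeps the newest match.
DISTRIBUTION=$(ls -t1 ${demo_dist}/MDAnalysis-*.tar.gz | head -n 1)

# Guard: an unmatched glob makes ls fail and leaves DISTRIBUTION empty.
test -n "${DISTRIBUTION}" || { echo "no distribution found"; exit 1; }
echo "${DISTRIBUTION}"
```

Without the `test -n` guard, an empty result would let `twine check` run against nothing and the job could pass vacuously.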