
update benchmark env to > 3.7 #11

Closed
orbeckst opened this issue Sep 17, 2021 · 7 comments · Fixed by #12
orbeckst (Member) commented Sep 17, 2021

The benchmarks for 2.0 and greater fail because we discontinued Python 3.6 support:

· Getting existing hashes
· Creating environments
· Discovering benchmarks
·· Uninstalling from conda-py3.6-Cython-mdanalysistests-numpy-scipy
·· Building 39850215 <develop> for conda-py3.6-Cython-mdanalysistests-numpy-scipy
·· Error running ~/MDA/repositories/mdanalysis/benchmarks/env/80c5ba7efaa89366b97da3ced85c0c44/bin/python setup.py build (exit status 255)
  STDOUT -------->
  MDAnalysis requires Python 3.7 or better. Python 3.6 detected
  Please upgrade your version of Python.
  STDERR -------->

Update the benchmarking environment to 3.8.
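The failure in the log above comes from a version guard that runs at build time. A minimal sketch of such a guard (the helper name `check_python_version` is hypothetical; MDAnalysis's actual setup.py logic may differ):

```python
import sys

MIN_PYTHON = (3, 7)  # MDAnalysis >= 2.0 requires Python 3.7 or better

def check_python_version(version_info=None, minimum=MIN_PYTHON):
    """Return True if the interpreter meets the minimum version.

    Illustrative sketch of the guard that produces the
    'MDAnalysis requires Python 3.7 or better' error above.
    """
    version_info = version_info or sys.version_info
    if tuple(version_info[:2]) < minimum:
        print("MDAnalysis requires Python {0}.{1} or better. "
              "Python {2}.{3} detected".format(*minimum, *version_info[:2]))
        return False
    return True

# A 3.6 environment fails the check, while a 3.8 environment passes:
assert check_python_version((3, 6, 0)) is False
assert check_python_version((3, 8, 12)) is True
```

Because asv builds each commit inside the benchmark environment, any environment pinned to 3.6 will trip this guard for every commit after the 3.6 support drop.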

IAlibay (Member) commented Jan 5, 2022

I suggest jumping to 3.8 so we don't have to do this again for a year or so? (assuming we eventually start following NEP29)

Based on #9

Things to do

  • make sure that we only run the 3.8 benchmarks from release 1.0.1 onwards (1.0.0 was broken), in benchmarking/benchmarks/scripts/cron.sh:
 timeout 20h asv run -e -j 4 --config ${ASV_CONFIG} "release-1.0.1..HEAD --merges" --skip-existing || true
  • change matrix to 3.8 in ASV_CONFIG="${ASV_RUN_DIR}/asv_c3potato.conf.json"
  • disable cron job until history is built up
  • manually run to build a history from the earlier merge commits
  • test that new results appear in https://www.mdanalysis.org/benchmarks/
  • test cron job
  • update wiki docs
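The matrix change in the steps above can be scripted. A minimal sketch, assuming the asv config file is plain JSON with asv's standard `pythons` key (the actual asv_c3potato.conf.json may contain comments, which `json.load` would reject):

```python
import json

def bump_asv_python(config_path, new_version="3.8"):
    """Rewrite the `pythons` list in an asv configuration file.

    Sketch only: assumes a plain-JSON config with asv's standard
    `pythons` key; paths and file names follow the cron script above.
    """
    with open(config_path) as fh:
        config = json.load(fh)
    config["pythons"] = [new_version]
    with open(config_path, "w") as fh:
        json.dump(config, fh, indent=4)
    return config["pythons"]
```

After editing the config, asv rebuilds the environment on the next run, which is why the remaining steps (disable cron, backfill history, re-test) follow in that order.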

orbeckst (Member, Author) commented Jan 5, 2022

Thanks for taking the time to write up the step-by-step.

Given that I have to do this (or most of it), how urgent is it?

IAlibay (Member) commented Jan 5, 2022

@orbeckst not at all urgent.

For context: I wrote out the step-by-step while checking how much of it could be done by someone else, and then realised that this wasn't the case :( (I thought it was better to post the steps than to delete what I had written.)

Given how much this ends up being your responsibility, is this maybe something we might want to apply for some AWS credits for?

orbeckst (Member, Author) commented Jan 5, 2022

AWS would give us more flexibility.

Things to consider

  • If the hardware changes, the benchmark comparison becomes meaningless between commits.
  • Benchmarking a single commit takes a few hours on my hardware, so building up the history could consume a sizable number of CPU-hours, although this could be run in parallel, and one might restrict the history to releases only (?).
  • Not all commits were added to the history, only merge commits (IIRC) to ensure that only working code was benchmarked.
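To put rough numbers on the second point above (all figures here are assumptions for illustration, not measurements from the actual benchmark host):

```python
def backfill_cost(n_commits, hours_per_commit=3.0, workers=1):
    """Estimate the cost of backfilling a benchmark history.

    The per-commit cost ("a few hours") and the commit count are
    assumed values; returns (CPU-hours, wall-clock hours).
    """
    cpu_hours = n_commits * hours_per_commit
    wall_hours = cpu_hours / workers
    return cpu_hours, wall_hours

# e.g. 200 merge commits at ~3 h each: 600 CPU-hours total,
# or ~150 h of wall-clock time spread over 4 parallel workers
cpu, wall = backfill_cost(200, workers=4)
```

The linear scaling also shows why restricting the history to releases only (a few dozen commits instead of hundreds of merges) would cut the cost by an order of magnitude.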

orbeckst (Member, Author) commented Feb 4, 2022

I am trying it out by changing 3.6 -> 3.8 ...

orbeckst (Member, Author) commented Feb 7, 2022

need to check wiki

orbeckst reopened this Feb 7, 2022
orbeckst (Member, Author) commented Feb 7, 2022

I updated the relevant wiki pages and started a table of the versions covered by the automated benchmarks.

orbeckst closed this as completed Feb 7, 2022
orbeckst added a commit to MDAnalysis/mdanalysis that referenced this issue Feb 7, 2022
- fix #3430
- automated benchmarks were updated (see MDAnalysis/benchmarks#11)
IAlibay pushed a commit to MDAnalysis/mdanalysis that referenced this issue Feb 8, 2022