Benchmarks expect flags #3519

Closed
jbarnoud opened this issue Feb 4, 2022 · 1 comment · Fixed by #4360
Labels
benchmarks (performance benchmarking with ASV), Continuous Integration, defect

Comments

@jbarnoud (Contributor) commented Feb 4, 2022

Expected behavior

The benchmarks run without error.

Actual behavior

Some benchmarks fail because they expect flags that were removed in version 2.0.0. Here is a sampling of the errors:

For parameters: 'prop abs z <= 5.0', False, [True, False]
Traceback (most recent call last):
 File "/home/benchy/mambaforge/lib/python3.9/site-packages/asv/benchmark.py", line 1184, in main_run_server
   main_run(run_args)
 File "/home/benchy/mambaforge/lib/python3.9/site-packages/asv/benchmark.py", line 1058, in main_run
   result = benchmark.do_run()
 File "/home/benchy/mambaforge/lib/python3.9/site-packages/asv/benchmark.py", line 537, in do_run
   return self.run(*self._current_params)
 File "/home/benchy/mambaforge/lib/python3.9/site-packages/asv/benchmark.py", line 627, in run
   samples, number = self.benchmark_timing(timer, min_repeat, max_repeat,
 File "/home/benchy/mambaforge/lib/python3.9/site-packages/asv/benchmark.py", line 694, in benchmark_timing
   timing = timer.timeit(number)
 File "/home/benchy/mambaforge/lib/python3.9/timeit.py", line 177, in timeit
   timing = self.inner(it, self.timer)
 File "<timeit-src>", line 6, in inner
 File "/home/benchy/mambaforge/lib/python3.9/site-packages/asv/benchmark.py", line 599, in <lambda>
   func = lambda: self.func(*param)
 File "/home/benchy/mdanalysis/benchmarks/benchmarks/selections.py", line 76, in time_geometric_selections
   MDAnalysis.core.flags['use_periodic_selections'] = periodic_selection[0]
AttributeError: module 'MDAnalysis.core' has no attribute 'flags'
@orbeckst added the benchmarks (performance benchmarking with ASV), Continuous Integration, and defect labels on Feb 20, 2022
@orbeckst (Member) commented:

These flags can be removed: we only benchmark releases from 1.0.1 onwards, and the flags no longer existed by then. One still has to check whether the equivalent behavior is now provided by kwargs on the relevant methods.
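For the geometric selections in the traceback, the likely kwarg equivalent is the `periodic` keyword of `select_atoms`. A minimal sketch of the one-line replacement, assuming that mapping holds (the topology file is purely illustrative):

```python
import MDAnalysis
from MDAnalysisTests.datafiles import GRO  # any test topology would do

u = MDAnalysis.Universe(GRO)

# Pre-1.0 style, as still used in the benchmark; this now raises
# AttributeError because MDAnalysis.core.flags no longer exists:
# MDAnalysis.core.flags['use_periodic_selections'] = True

# Current style: pass the behavior per call as a keyword argument.
ag = u.select_atoms("prop abs z <= 5.0", periodic=True)
```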

tylerjereddy added a commit to tylerjereddy/mdanalysis that referenced this issue Dec 10, 2023

* Fixes gh-3519

* There is no good reason to keep the flags in the benchmark: it has been broken for over a year, so we should just use the appropriate kwargs where possible and leave a `TODO` comment for core flags that we don't have a substitute for.

* Better to have this running than hard failing all the time, even if we've lost one of the parametrization dimensions.

* If someone comes along with the confidence to say we can delete the `TODO` comment, then we can also delete some of the extra parametrization complexity, of course.

[skip cirrus]
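A sketch of what the repaired benchmark could look like under that approach; the class, selection strings, and data file are illustrative, not the actual contents of benchmarks/benchmarks/selections.py:

```python
import MDAnalysis
from MDAnalysisTests.datafiles import GRO  # illustrative test topology

class GeometricSelectionsBench(object):
    # Keep the periodic on/off parametrization dimension, now expressed
    # as a kwarg instead of a core flag.
    params = (["prop abs z <= 5.0", "around 5.0 resname SOL"],
              [True, False])
    param_names = ["selection_string", "periodic"]

    def setup(self, selection_string, periodic):
        self.u = MDAnalysis.Universe(GRO)

    def time_geometric_selections(self, selection_string, periodic):
        # TODO: some removed core flags have no kwarg substitute yet
        self.u.select_atoms(selection_string, periodic=periodic)
```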
RMeli pushed a commit that referenced this issue Dec 11, 2023 (same commit message as above).