
Fix nightly NumPy cron job, add numpy 1.25-dev0 fixes #3977

Merged Jan 4, 2023 · 31 commits (changes shown from 27 commits)
4bc48d5
check what's installed by cron job
IAlibay Dec 20, 2022
0a75811
add workflow trigger
IAlibay Dec 20, 2022
0b1f58f
move away from mamba since it won't respect pip installs
IAlibay Dec 20, 2022
ca2c07a
add build dependencies
IAlibay Dec 20, 2022
4324868
Merge branch 'develop' into investigate-cron
IAlibay Dec 23, 2022
8b1e8bf
strict channel priority for cron ci
IAlibay Dec 27, 2022
a6feb4c
switch things around
IAlibay Dec 27, 2022
4cd3458
commas around your 3.10
IAlibay Dec 27, 2022
58249dc
upgrade on the pip install
IAlibay Dec 27, 2022
2895585
use pre flag for pip
IAlibay Dec 27, 2022
f239d37
check if it's caused by numpy 1.25
IAlibay Dec 27, 2022
4f82253
add numpy nightly wheels back in
IAlibay Dec 28, 2022
1ad1508
circumvent elementwise comparison completely
IAlibay Jan 2, 2023
e2a2da0
is not ==
IAlibay Jan 2, 2023
4ccd37c
Add changelog entry, make cron-ci cron-only
IAlibay Jan 2, 2023
3dc9773
tie off 2.4.2 release
IAlibay Jan 2, 2023
3724efe
Is actions CI different?
IAlibay Jan 2, 2023
b5b521c
minor changes to CI yaml file
IAlibay Jan 2, 2023
3b3e396
try directly invoking it from the commandline
IAlibay Jan 2, 2023
10f9ae1
remove intentional ValueError raising
IAlibay Jan 2, 2023
ff70268
try using the sigterm variable
IAlibay Jan 2, 2023
deb7499
Maybe the try/finally encapsulation does some magic?
IAlibay Jan 3, 2023
301a359
try just one entry in coveragerc
IAlibay Jan 3, 2023
d3c4130
add streamlines 2D fix
IAlibay Jan 4, 2023
2a008a3
get pytest to fail on FutureWarning for cron CI
IAlibay Jan 4, 2023
9d96661
temporarily undo cron only trigger for action
IAlibay Jan 4, 2023
a16d035
does reverting coveragerc changes affect codecov report?
IAlibay Jan 4, 2023
2bc1f88
Update changelog, remove try branch in streamlines
IAlibay Jan 4, 2023
c5dab47
update changelog date for 2.4.2
IAlibay Jan 4, 2023
81e2891
set cron job to only work via schedule
IAlibay Jan 4, 2023
2f6f577
correct 1.25 not 1.24
IAlibay Jan 4, 2023
2 changes: 1 addition & 1 deletion .coveragerc
@@ -7,7 +7,7 @@ omit =
     */legacy/*
     */due.py
 concurrency = thread, multiprocessing
-parallel = True
+parallel = true
 
 [report]
 exclude_lines =
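For context on the coverage discussion later in this PR, here is a sketch of the `[run]` options coverage.py documents for measuring code in forked worker processes. This is drawn from the coverage.py documentation, not necessarily this PR's final configuration; the `sigterm` option needs coverage >= 6.4.

```ini
[run]
# write one data file per process, to be combined afterwards
parallel = true
# trace code running in threads and in multiprocessing workers
concurrency = thread, multiprocessing
# flush coverage data when a worker receives SIGTERM (coverage >= 6.4)
sigterm = true
```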
38 changes: 25 additions & 13 deletions .github/workflows/gh-ci-cron.yaml
@@ -1,5 +1,11 @@
 name: GH Actions Cron CI
 on:
+  push:
+    branches:
+      - develop
+  pull_request:
+    branches:
+      - develop
   schedule:
     # 3 am Tuesdays and Fridays
     - cron: "0 3 * * 2,5"
@@ -35,26 +41,32 @@ jobs:
- name: setup_miniconda
uses: conda-incubator/setup-miniconda@v2
with:
python-version: 3.9
auto-update-conda: true
channel-priority: flexible
python-version: "3.10"
miniforge-variant: Mambaforge
miniforge-version: latest
channel-priority: strict
channels: conda-forge, bioconda
mamba-version: "*"
add-pip-as-python-dependency: true
architecture: x64

- name: install_dev_versions
run: |
pip install -i https://pypi.anaconda.org/scipy-wheels-nightly/simple numpy
pip install -i https://pypi.anaconda.org/scipy-wheels-nightly/simple scipy

- name: install_other_deps
- name: install_deps
uses: ./.github/actions/setup-deps
with:
mamba: true
numpy: ""
scipy: ""
full-deps: true

# overwrite installs by picking up nightly wheels
- name: nightly_wheels
run: |
pip install --pre -U -i https://pypi.anaconda.org/scipy-wheels-nightly/simple scipy numpy h5py matplotlib
Reviewer comment (Member): there's a lot of magic hidden in these action files... I would have no idea where to look for these wheels.

IAlibay (author) replied: I think one of the next major tasks here will be to a) document the CI actions, and b) clean them up (they've grown a lot beyond what I initially planned back at the start of last year when I squashed things into using re-usable actions).

IAlibay (author): Just to add here before I forget - there is a draft SPEC for nightly wheels: https://scientific-python.org/specs/spec-0004/

It needs a lot of fleshing out though...



- name: list_deps
run: |
mamba list
pip list

# Intentionally going with setup.py builds so we can build with latest
- name: build_srcs
uses: ./.github/actions/build-src
with:
Expand All @@ -64,7 +76,7 @@ jobs:

- name: run_tests
run: |
-pytest -n $numprocs testsuite/MDAnalysisTests --disable-pytest-warnings --durations=50
+pytest -n $numprocs testsuite/MDAnalysisTests --durations=50 -W error::FutureWarning
IAlibay (author) commented: Ideally we'd error on DeprecationWarnings too, but we emit too many of them internally for this to be viable. At least this way we can maybe pick up potential issues?



# Issue #3442
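The `-W error::FutureWarning` flag added to the pytest call maps onto Python's standard warning filters. A minimal sketch of the same promotion done directly with the warnings module (illustrative only, not code from this PR):

```python
import warnings

# Promote FutureWarning to an error, as `pytest -W error::FutureWarning`
# does, so deprecations fail loudly instead of scrolling past in CI logs.
warnings.simplefilter("error", FutureWarning)

try:
    warnings.warn("this behaviour will change", FutureWarning)
    raised = False
except FutureWarning:
    raised = True

assert raised  # the warning surfaced as an exception
```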
5 changes: 4 additions & 1 deletion package/CHANGELOG
@@ -26,13 +26,16 @@ Changes
Deprecations


-12/21/22 IAlibay
+01/03/23 IAlibay

* 2.4.2

Fixes
* np.histogramdd calls in :class:`DensityAnalysis` now pass the `density`
argument rather than the NumPy 1.24 removed `normed` (PR #3976)
* visualization.streamlines_3D no longer relies on the scalar return of a
  numpy array elementwise comparison, which is no longer supported as of
  NumPy 1.25 (PR #3977)


12/17/22 IAlibay
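The first changelog fix swaps `np.histogramdd`'s removed `normed` keyword for `density`. A small standalone sketch of the replacement call (illustration only, not MDAnalysis code):

```python
import numpy as np

# NumPy 1.24 removed the long-deprecated `normed` keyword from
# np.histogramdd; `density=True` is the replacement.
sample = np.random.default_rng(0).random((100, 3))
hist, edges = np.histogramdd(sample, bins=4, density=True)

# with density=True the histogram integrates to 1 over its volume
widths = [np.diff(e) for e in edges]
volume = np.einsum('i,j,k->ijk', *widths)
assert np.isclose((hist * volume).sum(), 1.0)
```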
4 changes: 2 additions & 2 deletions package/MDAnalysis/visualization/streamlines.py
@@ -176,7 +176,7 @@ def produce_list_centroids_this_frame(list_indices_in_polygon):
list_centroids_this_frame = []
for indices in list_indices_in_polygon:
if not indices[0].size > 0: # if there are no particles of interest in this particular square
list_centroids_this_frame.append('empty')
IAlibay (author) commented: Same issue as per streamlines_3D; also can't get coverage over it (even though the new test assigns None to everything, hence the zeroed array return).
list_centroids_this_frame.append(None)
else:
current_coordinate_array_in_square = relevant_particle_coordinate_array_xy[indices]
current_square_indices_centroid = np.average(current_coordinate_array_in_square, axis=0)
@@ -201,7 +201,7 @@ def produce_list_centroids_this_frame(list_indices_in_polygon):
xy_deltas_to_write = []
for square_1_centroid, square_2_centroid in zip(list_centroids_this_frame_using_indices_from_last_frame,
list_previous_frame_centroids):
-if square_1_centroid == 'empty' or square_2_centroid == 'empty':
+if square_1_centroid is None or square_2_centroid is None:
xy_deltas_to_write.append([0, 0])
else:
xy_deltas_to_write.append(np.subtract(square_1_centroid, square_2_centroid).tolist())
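The switch above from the string sentinel 'empty' to None avoids comparing a NumPy array against a string, whose scalar collapse NumPy 1.25 removes. A distilled sketch of the patched pattern (hypothetical helper name, not the actual function):

```python
import numpy as np

def centroid_delta(square_1_centroid, square_2_centroid):
    # None marks a square with no particles of interest; identity checks
    # never trigger NumPy's elementwise comparison, unlike `== 'empty'`.
    if square_1_centroid is None or square_2_centroid is None:
        return [0, 0]
    return np.subtract(square_1_centroid, square_2_centroid).tolist()

assert centroid_delta(None, np.array([1.0, 2.0])) == [0, 0]
assert centroid_delta(np.array([3.0, 4.0]), np.array([1.0, 2.0])) == [2.0, 2.0]
```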
22 changes: 13 additions & 9 deletions package/MDAnalysis/visualization/streamlines_3D.py
@@ -61,7 +61,9 @@

import MDAnalysis

def determine_container_limits(topology_file_path, trajectory_file_path, buffer_value):

def determine_container_limits(topology_file_path, trajectory_file_path,
buffer_value):
"""Calculate the extent of the atom coordinates + buffer.

A function for the parent process which should take the input trajectory
@@ -281,15 +283,15 @@ def update_dictionary_point_in_cube_start_frame(array_simulation_particle_coordi
axis=0)
cube['centroid_of_particles_first_frame'] = centroid_particles_in_cube
else: # empty cube
-cube['centroid_of_particles_first_frame'] = 'empty'
+cube['centroid_of_particles_first_frame'] = None
Reviewer comment (Member): apparently not covered by tests... not sure if this is incorrect, or if we must or can cover it, but I'll mention it.

IAlibay (author) replied: Oh, that is a bit problematic - we really must cover this. It also means we're getting 'empty' entries being fed somewhere else 🙀 Could you place a blocking review here?

Reviewer (Member): done

IAlibay (author), Jan 2, 2023: aaaaahhh, I was wondering why I was getting prints in that branch but CI wasn't picking up forced errors... we're using multiprocessing 🤦🏽

I'll have a quick look around, but it looks like:
a) errors aren't being propagated from child workers (hence why it didn't raise the relevant "truth of an array" error), and
b) codecov isn't picking up on coverage since it's not being properly logged by the child workers... we do have the multiprocessing setting in .coveragerc, but it's obviously not working.

cube_counter += 1

def update_dictionary_end_frame(array_simulation_particle_coordinates, dictionary_cube_data_this_core):
"""Update the cube dictionary objects again as appropriate for the second and final frame."""
cube_counter = 0
for key, cube in dictionary_cube_data_this_core.items():
# if there were no particles in the cube in the first frame, then set dx,dy,dz each to 0
-if cube['centroid_of_particles_first_frame'] == 'empty':
+if cube['centroid_of_particles_first_frame'] is None:
Reviewer comment (Member): Looked innocent enough...

IAlibay (author) replied: I'm surprised it's the only failure we have! 1.25 is going to cause a lot of trouble for legacy downstream codes.

IAlibay (author): there's a really similar 'empty' check in:

if square_1_centroid == 'empty' or square_2_centroid == 'empty':

Those probably also need to get changed...

IAlibay (author): Can confirm the same FutureWarning is happening in streamlines :/ the test isn't good enough to pick it up though.

cube['dx'] = 0
cube['dy'] = 0
cube['dz'] = 0
@@ -470,12 +472,14 @@ def log_result_to_parent(process_dict):
start_frame, end_frame)
#step 4: per process work using the above grid data split
pool = multiprocessing.Pool(num_cores)
for sub_dictionary_of_cube_data in list_dictionaries_for_cores:
pool.apply_async(per_core_work, args=(
start_frame_coord_array, end_frame_coord_array, sub_dictionary_of_cube_data, MDA_selection, start_frame,
end_frame), callback=log_result_to_parent)
pool.close()
pool.join()
try:
for sub_dictionary_of_cube_data in list_dictionaries_for_cores:
pool.apply_async(per_core_work, args=(
start_frame_coord_array, end_frame_coord_array, sub_dictionary_of_cube_data, MDA_selection, start_frame,
end_frame), callback=log_result_to_parent)
finally:
pool.close()
pool.join()
#so, at this stage the parent process now has a single dictionary with all the cube objects updated from all
# available cores
#the 3D streamplot (i.e, mayavi flow() function) will require separate 3D np arrays for dx,dy,dz
24 changes: 24 additions & 0 deletions testsuite/MDAnalysisTests/visualization/test_streamlines.py
@@ -21,6 +21,7 @@
# J. Comput. Chem. 32 (2011), 2319--2327, doi:10.1002/jcc.21787
#
import numpy as np
from numpy.testing import assert_allclose
import MDAnalysis
from MDAnalysis.visualization import (streamlines,
streamlines_3D)
@@ -83,6 +84,29 @@ def test_streamplot_2D(membrane_xtc, univ, tmpdir):
with open(plot_outpath, 'rb'):
pass


def test_streamplot_2D_zero_return(membrane_xtc, univ, tmpdir):
# simple roundtrip test to ensure that
# zeroed arrays are returned by the 2D streamplot
# code when called with an empty selection
u1, v1, avg, std = streamlines.generate_streamlines(topology_file_path=Martini_membrane_gro,
trajectory_file_path=membrane_xtc,
grid_spacing=20,
MDA_selection='name POX',
start_frame=1,
end_frame=2,
xmin=univ.atoms.positions[...,0].min(),
xmax=univ.atoms.positions[...,0].max(),
ymin=univ.atoms.positions[...,1].min(),
ymax=univ.atoms.positions[...,1].max(),
maximum_delta_magnitude=2.0,
num_cores=1)
assert_allclose(u1, np.zeros((5, 5)))
assert_allclose(v1, np.zeros((5, 5)))
assert avg == approx(0.0)
assert std == approx(0.0)


def test_streamplot_3D(membrane_xtc, univ, tmpdir):
# because mayavi is too heavy of a dependency
# for a roundtrip plotting test, simply