Fix a build problem with numpy 1.23 #431
Conversation
@lfarv in #363 you showed that for the moment we cannot build with MPI using …. In which case I would suggest that either #363 is solved (I do not see much problem in adding …).

@swhite2401: in this situation, you could still upgrade …. For the moment, I think the problem you and @simoneliuzzo encountered recently, and that everybody running python 3.8+ will encounter, is much worse. So I'm looking at solutions for building with MPI.
@swhite2401: here is a solution to build with MPI using …, with the …. There is still hope of finding a cleaner solution, but does this allow us to merge and solve this critical problem?
Yes, OK for me.
@swhite2401: did you try running with MPI? I just compiled and checked that ….
Not since it was introduced a while ago. Should I try again now?
I haven't got sufficient Python skills to approve or reject this pull request.
How does this affect the building of wheels as part of our release process?
@MJGaughran: if we were to produce a wheel now, the C integrators for python 3.8+ would use the numpy 1.23 ABI, and so would be incompatible with any older version. As mentioned above, this would force users to upgrade their numpy to 1.23. With this PR, wheels will be compiled against the numpy version in the table above, that is, the oldest numpy compatible with each python version. Consequently, they will be compatible at run time with any newer numpy version. We had no problem until now because the numpy ABI had apparently been stable for a long time. Now we can take advantage of the fact that with the new packaging method (`pyproject.toml`), we can have different requirements for building (freeze a version) and for running (allow any version recent enough).
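The build/run split described above can be sketched as follows. This is an illustrative fragment only, not the actual PyAT configuration; the pinned numpy versions and environment markers are placeholders, not the table used in this PR:

```toml
# Illustrative pyproject.toml fragment (hypothetical pins, for explanation only).
[build-system]
# Pin an old numpy at build time so the compiled extensions use the oldest ABI;
# environment markers select one pin per Python version.
requires = [
    "setuptools",
    "wheel",
    "numpy == 1.17.3; python_version == '3.8'",
    "numpy == 1.19.3; python_version == '3.9'",
]
build-backend = "setuptools.build_meta"
```

The run-time requirement (e.g. `install_requires` in `setup.cfg`) stays loose, so users keep whatever newer numpy they have installed.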
Looking at #346, I can now remember how it works. The `pyproject.toml` file is used for configuration. As the GitHub workflow specifies Python 3.9, it builds using numpy 1.16.6, which is what we need for ABI compatibility.
Problem:

The recent release of numpy 1.23 (available for python 3.8 to 3.10) introduced a problem for users of python 3.8+ building from source. `pip install` runs in its own environment, and installs there the most recent numpy version to compile the C extensions of AT (so numpy 1.23 now). At run time, the code links with the release installed by the user in his environment, possibly an older one. It appears that numpy 1.23 changes something so that `at` fails to import. The only solution for the user is to upgrade numpy. Though usually this can be done without side effects, this should not be required!

Proposed solution:
The solution proposed here is to select, for building with each python version, the oldest compatible numpy release. The version used at runtime will then always be more recent, ensuring full compatibility (except for documented deprecations). The following table is used:

This is the version used for building only (compilation of C extensions), configured in the `[build-system]` section of `pyproject.toml`. The requirement for running PyAT (`install_requires` in `setup.cfg`) stays as it is: numpy > 1.16.6, to get the necessary features. In practice, it will be the current version at the time of first installation, but the user may upgrade as he wants.

Possible side effects:
- `python setup.py install` does not run in a separate environment. Instead, it will downgrade numpy in the user's environment. So always use `pip` for installation!
- Building against an old release means the C extensions cannot benefit from newer features of numpy. However, the AT C code does not call any numpy function; it just needs access to the internal structure of a `ndarray`. It's very unlikely that any new feature would improve (accelerate) this significantly, so this is not relevant.
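The compatibility rule this whole PR relies on — an extension compiled against an older numpy imports cleanly under any newer runtime numpy, but not the reverse — can be sketched with a deliberately simplified model. The helper functions below are hypothetical illustrations using only the standard library, not part of PyAT:

```python
# Simplified model of the forward-compatibility rule (illustration only):
# a C extension compiled against numpy version B imports cleanly only
# when the runtime numpy version R satisfies R >= B.

def parse_version(v: str) -> tuple:
    """Turn a version string like '1.16.6' into a comparable tuple (1, 16, 6)."""
    return tuple(int(part) for part in v.split("."))

def abi_compatible(build_numpy: str, runtime_numpy: str) -> bool:
    """True when the runtime numpy is at least as new as the build-time numpy."""
    return parse_version(runtime_numpy) >= parse_version(build_numpy)

# Building against the oldest supported numpy keeps every newer runtime valid:
assert abi_compatible("1.16.6", "1.23.0")      # old build, new runtime: OK
# Building against numpy 1.23 breaks users still on an older release:
assert not abi_compatible("1.23.0", "1.21.5")  # new build, old runtime: fails
```

This is why pinning the oldest compatible numpy at build time (rather than letting `pip`'s isolated build environment grab the latest) removes the forced-upgrade problem for users.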