Increasing the ABI level to follow NEP29 #324
Comments
That doesn't matter at all, it's a lower bound only. Is this a hypothetical worry, or do you have a real-world problem?
Ah wait, missed this bit. You should also be setting a lower bound in your run requirements.
You have a direct runtime dependency on `numpy`, but you only get a constraint on it via numpy's `run_exports`. That's a bug (and technically always was one).
Do you want all downstream packages to have the upper bound hardcoded? The whole point of `run_exports` was that downstream packages didn't have to worry about the upper bound of compatibility. It would have to be a
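(For context, here is a minimal sketch of the `run_exports` mechanism under discussion; the recipe fragments are illustrative, not copied from the actual feedstocks, though the `>=1.19,<3` pin matches what's cited later in this thread:)

```yaml
# In the exporting recipe (numpy's own), run_exports declares the
# ABI-compatibility guarantee that downstream builds inherit:
build:
  run_exports:
    - numpy >=1.19,<3
---
# In a downstream recipe, merely listing numpy in host is enough
# for "numpy >=1.19,<3" to be injected into the run requirements,
# so no upper bound needs to be hardcoded downstream:
requirements:
  host:
    - numpy
```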
I'm happy to make the PR to scikit-image, but the point is to make it easy for other maintainers not to make these mistakes. I'm with you that it is a bug, but I think that having the default point to 1.19 is really counter to the whole spirit of SPEC0 and NEP29. In the spirit of SPEC0/NEP29, support for 1.19 should be opt-in rather than opt-out. In my mind, we should help downstream packages shed the burden of infinitely long support windows. If we can let the defaults follow SPEC0 and/or NEP29, that would take a lot of the maintenance burden off of us in general.
NEP29 or not, the issue is that by starting the numpy 2.0 migration, the default lower bound unexpectedly (to me) changed from 1.22 to 1.19.
Ultimately, @rgommers, this isn't just a scikit-image problem. Even the scipy feedstock (which I would argue has the most knowledgeable eyes on it) suffers from the same problem: https://github.com/scipy/scipy/blob/v1.14.0/pyproject.toml#L55
Missed this discussion when looking at the scikit-image PR. It should be possible to just add the additional constraint. If that doesn't work, would think that is a bug in the build tools (like conda-build), though I'm pretty sure I've seen this work elsewhere.
I understand the sentiment; I also didn't want to go "backward". We discussed this a lot in conda-forge/conda-forge.github.io#1997, see comments starting here. There are two separate concerns here:

1. where the default lower bound (the 1.19) comes from;
2. which numpy API level a given package actually requires at runtime.
The first part is where we get the 1.19 from; we could override this on our side. For the second point, packages can of course require a newer numpy API. That constraint should still be specified in the run requirements, as the 2.0 migrator ToDos said:
In retrospect that should really have been formulated as "you need to add" rather than "you can" - sorry about that. I'll open a PR to improve that.

For SciPy, we do catch the upper bound, but indeed we're missing the lower bound (and that it changed for 1.14). Thanks for pointing that out, I'll go fix that. I'm sure we can improve the messaging on this (i.e. make clear that removing the old pinning does not remove the need for an explicit lower bound).
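As a concrete illustration of that ToDo item (a hypothetical downstream recipe; the `>=1.23` bound is only an example, not a recommendation):

```yaml
requirements:
  host:
    - python
    - numpy          # run_exports injects the ABI pin (>=1.19,<3)
  run:
    - python
    # the ABI pin alone is not enough: the API lower bound the
    # package actually needs must be added explicitly
    - numpy >=1.23
```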
This is being done in these PRs:
Also this is being done in this PR:
Just to follow up on this point, this is in fact what is now happening in the feedstock. If we look at a recent build, both constraints show up in the package metadata. IOW, adding multiple constraints on the same package works (see the sketch below).
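A sketch of what the resulting merged run requirements look like (version numbers illustrative):

```yaml
run:
  # injected via numpy's run_exports (ABI bound):
  - numpy >=1.19,<3
  # added explicitly in the recipe (API lower bound):
  - numpy >=1.23.5
# a solver intersects these, i.e. effectively "numpy >=1.23.5,<3"
```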
Maybe my knowledge is outdated, but previously at least one solver had real trouble with double run constraints like the one you added for scipy. It's somewhat hard to test this today in 2024, since numpy 2.4 doesn't exist.
Honestly, I probably need to get over myself and just take the recommendations. NEP29/SPEC0 are so loose and non-binding that it seems the burden continues to get pushed onto package maintainers to learn new "best practices" on a daily basis. I'll just close this issue on the note that none of the four popular packages I listed above had a lower bound set on numpy at the time I opened this issue.

So instead of helping people make better packages, we are actively encouraging maintainers to keep spending their own resources (their own human time) making PRs and writing repodata patches (that only core can merge). While it is true that numpy 2.0 is ABI compatible with numpy 1.19, that doesn't mean we have to allow this at conda-forge. I don't see anybody in the near future making a PR to the 1.19 branch in case anything accidentally breaks.

Setting the lower bound to 1.22 would just help create a better safety net for when mistakes in packages happen (and they did, in the 4 popular packages I listed above).

edit: added "1.19" in the sentence above.
Think some misunderstanding may be happening here. No one is saying packages need to support numpy 1.19. If anything, think the improvements on the NumPy ABI side with NumPy 2 and the use of `run_exports` make this easier to manage than before.

There is no such requirement. If you are still feeling that there are pain points here, would encourage reopening and discussing further. Would like us to sort out any remaining issues so that the experience of using these packages is a good one.

As a particular example, if there are issues with repeated constraints, we should identify what they are, track them down, and make sure they are fixed. It should also be possible to create example test cases with any package (not just numpy). IOW this...

```
- numpy >=1.23.5
- numpy <2.0
```

...should be identical to one constraint using

```
- numpy >=1.23.5,<2.0
```

...and solvers should be able to handle both easily (as they can map to one another). If that is not the case, we can raise issues and work on resolution.

We could also look at having build tooling do this transformation for us when generating the package metadata.
FWIW, with the upcoming numpy 2.1.0, we'll bump this to 1.21: numpy/numpy#27187
The bugs in recipes are unfortunate indeed. On the upside, the only reason we didn't catch them yet is that no one opened a bug report (until this one yesterday) - so in practice it is very rare to get the latest version of scipy/pandas/etc. with a 3-year-old numpy version.

+1 to @jakirkham's explanation. ABI is managed by `run_exports`; the API lower bound needs to be set explicitly in each recipe.
I closed this issue because I agree on the technical merits of your decisions. Numpy's ABI is compatible down to 1.19, and strictly technically we should expose that. Conda-forge shouldn't be picking and choosing which features it exposes.

I do think we don't do enough to set lower bounds more aggressively, to help ourselves minimize our bug-exposure surface. Few people are asking for a 3-year-old numpy in new packages in 2024, so why make that the default?
Because everything is a trade-off... taking numpy's default ABI saves us some bug surface (in packaging) too, at the risk that someone installing a very old numpy might get something broken. But that's not a very common case, so IMO the current trade-off is still favourable. 🤷‍♂️
@jakirkham where should I post this kind of feature suggestion? I was going to post on conda/grayskull but I'm not sure if that is the appropriate spot.

**Feature request to discuss**

**Is your feature request related to a problem? Please describe.**
Often these issues arise for hardware configurations not available upstream to test on, be it osx-arm, aarch64, or just package combinations that affect how each other work (like numpy / cupy / dask / xarray dispatching). As a maintainer, I would like the choice to restrict the lower bound of the packages to those suggested by NEP29, and not really think about it.

**Describe the solution you'd like**
A requirements section that could be filtered with a command, so that it would update on August 17 to the then-current NEP29 minimums (see the sketch after this comment).
I have a package called https://github.com/hmaarrfk/nep29 (on pypi and conda-forge) that can generate these lower bounds for you, according to my understanding of the heuristics spelled out in https://numpy.org/neps/nep-0029-deprecation_policy.html

**Describe alternatives you've considered**
Ultimately the answer seems to be:
This is a solution that I think would provide an option for maintainers to opt in.
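For illustration, a hypothetical before/after of the kind of filtering being proposed. The bounds are only examples of what a NEP29 filter might emit around August 2024 (the numpy value matching the 1.24 minimum cited in this issue; the python value is an assumption):

```yaml
# before filtering: unconstrained run requirements
run:
  - python
  - numpy
---
# after filtering on 2024-08-17 (hypothetical NEP29 minimums):
run:
  - python >=3.10
  - numpy >=1.24
```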
I think the best place would be an issue in https://github.com/conda-forge/conda-forge.github.io
Follows up on conda-forge/numpy-feedstock#324 (comment), which pointed out that the dependency is currently missing and that it isn't correct to depend only on the `run_exports` of `numpy` itself (that's ABI-only and allows >=1.19). The upstream constraints for v2.2.2 are at https://github.com/pandas-dev/pandas/blob/v2.2.2/pyproject.toml#L30-L32. Note that (unlike in `pyproject.toml`) it is not necessary to have separate lower bounds for different Python versions that correspond to the first `numpy` version with support for that Python version. A single lower bound is enough; there will not be (for example) `numpy` conda-forge packages for 1.23.0 and python 3.11. In `pyproject.toml` the constraints have to be more complex only to avoid `pip` & co trying to build a too-old version from source.
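A sketch of the difference being described (the per-Python bounds are approximate readings of the linked upstream file; the conda recipe fragment is illustrative):

```yaml
# upstream pyproject.toml needs per-Python lower bounds, roughly:
#   numpy >=1.22.4  (python <3.11)
#   numpy >=1.23.2  (python ==3.11)
#   numpy >=1.26.0  (python >=3.12)
# in the conda recipe a single lower bound suffices, because
# conda-forge never built e.g. numpy 1.22.4 for python 3.12:
run:
  - numpy >=1.22.4
```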
I noticed that the ABI level is set to 1.19, which is quite aggressively backward.

This is somewhat troublesome since we have encouraged the practice of relying on numpy's `run_exports` rather than putting explicit numpy constraints in the `run` section (see the sketch below). While the ABI is compatible with 1.19, many packages likely do not support a numpy that old.
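A sketch of the encouraged pattern in question (hypothetical recipe fragment):

```yaml
requirements:
  host:
    - numpy            # run_exports supplies the run constraint
  run:
    # no explicit numpy entry here -- so the effective lower
    # bound is whatever run_exports allows (currently 1.19)
    - python
```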
NEP29 and SPEC0 suggest numpy version 1.24 as the minimum today.

For example, scikit-image now has a constraint of `numpy >=1.19,<3`, but the runtime is not tested upstream with 1.19: https://conda-metadata-app.streamlit.app/?q=conda-forge%2Flinux-64%2Fscikit-image-0.24.0-py39hfc16268_1.conda