Many noarch packages that are installable but broken on early Pythons #2210
I think we should specify the minimum Python in the global pinnings and let people override it if they want. This will simplify migration logic in the bot and will allow us to parameterize the recipe, avoiding having to parse YAML to change the version.
I think logic like python {{ min_noarch_python_version }} should work, and it would let us embed the same constraint in the test section.
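For illustration, a minimal sketch of what that could look like in a recipe's meta.yaml, assuming a globally pinned variable (the name min_noarch_python_version is the one floated above, not a final decision):

```yaml
# Sketch only: min_noarch_python_version would be defined centrally in
# conda-forge-pinning's conda_build_config.yaml and bumped there.
requirements:
  run:
    - python >={{ min_noarch_python_version }}
test:
  requires:
    - python ={{ min_noarch_python_version }}
```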
Interesting, we have a similar variable for CUDA. Maybe we could do something similar here. Perhaps this is useful beyond this case, too.
Yeah that'd be fine too.
I had commented in another thread (thanks for the xref @jakirkham). So my full support for fixing this, and 👍 to the proposal.
Isn't it more of a linter change? We nag users saying "add lower constraints in your noarch python!" but we could change the message to instead say "use the pinned minimum Python variable".
Yeah, the key is to parameterize the bound so the ecosystem moves together and we eliminate crazy solves. We do lint on this already, but we don't lint on moving the bound.
Note that we don't actually need to pin the bound in host if we parameterize it globally. We could use >= there as well.
But when a package requires a minimum lower bound higher than our default, they will have to override it, and at that point they have opted out of the defaults. I'm not sure they would revert once the default catches up with their lower bound. Isn't it more explicit for noarch users to always specify the pinned lower bound in the recipe?
Yeah, people can add a bound instead of overriding (say, a tighter lower bound of their own).
However, more to the point, I don't think people should have to manually change their recipes when we drop a Python version. That leaves the bot or explicit bounds in the pinnings. The bot can only take us so far in updates due to inherent limits in its ability to parse and understand a recipe. Having an explicit bound in the pinnings ensures no manual updates, specifies the intent precisely, and ensures that we can change it globally. Folks who opt out of a global pinning have always been on their own in conda-forge, and this case is no different.
That's the wrong thing to do. We want an explicit equal bound in host.
With your suggestion, we will still install Python 3.12, or whatever is the latest, in host.
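To make that concern concrete (an illustrative sketch, not from the thread): with only a lower bound in host, the solver picks the newest available interpreter at build time, so the declared minimum is never exercised:

```yaml
# Sketch: versions are illustrative.
requirements:
  host:
    - python >=3.8   # the solver resolves this to the latest Python (e.g. 3.12),
                     # so the package is never built or tested against 3.8
```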
Yeah, I am happy with an exact pin in host too.
Ok, you convinced me :)
So I think our consensus is to have

```yaml
host:
  - python {{ minpy }}
run:
  - python >={{ minpy }}
test:
  requires:
    - python ={{ minpy }}
```

We'll need a better name for minpy. (edit: changed the exact pin in test to pin_compatible since that will respect the patch version too.)
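On the pinning side, this would presumably be backed by a single entry in conda-forge-pinning's conda_build_config.yaml, along these lines (a sketch using the thread's minpy placeholder; the name and value are illustrative, not decided):

```yaml
# Sketch: a global pinning entry, bumped centrally whenever
# conda-forge drops a Python version.
minpy:
  - "3.8"
```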
There's also the concern from @dopplershift at conda-forge/conda-forge-pinning-feedstock#5013 (comment).
It is unclear to me which of the two readings @dopplershift's comment is for. Maybe he can expand on his comment? I really don't think we should be marking packages we ship with lower bounds that we no longer build or test.
I can't speak for Ryan, but I believe that he wants the original minimum Python, from the upstream metadata, to be used. As someone who re-builds conda-forge's recipes for my day job, back when we had a delay to bump Python, this was extremely useful. While I no longer need that, I do see the value in it.
So then how do we accommodate that use case?
@ocefpaf If we parameterize the bound in the pinnings, it should be easy enough to roll back globally via a custom pinnings file for your delay, right?
Formally yes, @jaimergp, but in practice accessing the requires-python metadata for the upstream code is non-trivial. Thus we should, IMO, leave that to the user and simply use our own minimum Python version.
I was guessing at what Ryan meant by projecting an old requirement that I no longer have, but yes, I could use a custom pinning in that case. As it turns out, it was easier for everyone at the day job to just keep updated with Python instead, for many other reasons beyond conda-forge. BTW, I am +1 to #2210 (comment).
But we do still ship 3.6? I just created a perfectly fine 3.6 environment on my Mac. I would be extremely confused if I went to install an upstream package that says it supports 3.6 into this environment, and the conda solver gave me either an older version or failed completely to resolve the environment.
I suppose my solution in this case is just to use pip to install, since it doesn't need to build. I can be convinced that my problem is far smaller than the one this issue is solving.
We ship conda packages. Mixing pip and conda is asking for trouble.
This is IMO a much lesser evil than shipping broken packages because we get the bounds wrong the other way without noticing. To me, conda-forge definitely has the right to make choices here, like "we support CPython versions for 5 years, after that it's over for all new builds", even if that ends up tightening the upstream package requirements. You'd still get the old builds of course if you want to use a py36 environment today. And there'd be an escape hatch too, in that the feedstock can choose to override the global pin, as sketched below.
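That escape hatch would presumably take the form of a feedstock-local override of the global value, e.g. in the feedstock's recipe/conda_build_config.yaml (a sketch using the minpy placeholder from above):

```yaml
# Sketch: the feedstock opts out of the global default
# and maintains this bound itself.
minpy:
  - "3.10"
```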
@h-vetinari I'm at peace with that. I will note that my original comment was in the context of a linter message blindly telling people to specify "> 3.10", rather than a more formal change to conda-forge's infrastructure and pinning around noarch.
OK, so I think we're at consensus. We'll do the usual announcement + migrator + lint + hint thing here.
xref: conda-forge/cfep#56
This will be addressed by CFEP 25 (recently passed). The next steps are enumerated in issue #2351. Closing this one out. Let's move any implementation discussion to the new tracking issue.
Your question:
As Python is released more often and the use of requires-python/python-requires metadata is getting adopted, we are creating modern packages that should not be installable on earlier versions of Python, due to the use of >= in both host and run (see the sketch after the list below). Many issues like conda-forge/urllib3-feedstock#84 are cropping up, and the bot does not use grayskull to update Python, only the other dependencies. Here is what we discussed in today's meeting:
- Pin the minimum Python in host and test to ensure a recipe will break at build time if the min Python version was updated,
- or improve the bot to use grayskull to fully regenerate the recipe, or both together.
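For context, a sketch of the problematic pattern referenced above (versions illustrative): both host and run carry only a lower bound, so nothing fails at build time when the true minimum moves:

```yaml
# Sketch: a typical noarch recipe today. It stays installable on
# Python 3.6 even after upstream has dropped support for it.
requirements:
  host:
    - python >=3.6
  run:
    - python >=3.6
```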
The former seems to have a clear path for implementation: one linter update, plus a migration to modify the recipes.
Note that this will cause a lot of churn! There are many noarch recipes.
cc: @isuruf, @jaimergp, @jakirkham, and @beckermr, who are probably the key people interested in this.