
Many noarch packages that are installable but broken on early Pythons #2210

Closed
ocefpaf opened this issue Jun 26, 2024 · 30 comments

@ocefpaf
Member

ocefpaf commented Jun 26, 2024

Your question:

As Python is released more often and the requires-python/python-requires metadata is becoming widely adopted, we are creating modern packages that remain installable on earlier versions of Python, even though they should not be, because we use only >= bounds in both host and run.

Many issues like conda-forge/urllib3-feedstock#84 are cropping up, and the bot does not use grayskull to update Python, only the other dependencies. Here is what we discussed in today's meeting.

Pin the minimum Python in host and test so that a recipe breaks at build time if the minimum Python version changes.

```yaml
requirements:
  host:
    - python 3.8.*
  run:
    # option 1
    - python >=3.8
    # option 2
    - {{ pin_compatible("python", max_pin=None) }}
test:
  requires:
    - python 3.8.*
```

or we could improve the bot to use grayskull to fully regenerate the recipe, or do both together.

The former seems to have a clear path for implementation: one linter update and a migration to modify the recipes.

Note that this will cause a lot of churn! There are many noarch recipes.

cc: @isuruf, @jaimergp, @jakirkham, and @beckermr who are probably the key people who may be interested in this.

@jakirkham
Member

cc @marcelotrevisani

@beckermr
Member

I think we should specify the minimum python in the global pinnings and let people override if they want. This will simplify migration logic in the bot and will allow us to parameterize the recipe to avoid having to parse yaml to change the version.

@beckermr
Member

I think logic like

python {{ min_noarch_python_version }}

should work and let us embed the same constraint in the test section.
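As a sketch of what that could look like (the variable name and value here are illustrative, not decided), the global pinning would define the bound once and the recipe would reference it everywhere:

```yaml
# conda_build_config.yaml in conda-forge-pinning (name and value illustrative)
min_noarch_python_version:
  - 3.8

# meta.yaml in a noarch feedstock
requirements:
  host:
    - python {{ min_noarch_python_version }}
  run:
    - python >={{ min_noarch_python_version }}
test:
  requires:
    - python {{ min_noarch_python_version }}
```

Because the value lives in the pinnings, a migrator can bump it globally without parsing each recipe's YAML.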

@jakirkham
Member

Interestingly, we have a similar variable for CUDA. We name it cuda_compiler_version_min, i.e. the other variable's name, cuda_compiler_version, plus a _min suffix.

Maybe we could do something similar like python_min?

Perhaps this is useful beyond noarch

@beckermr
Member

Yeah that'd be fine too.

@h-vetinari
Member

I had commented in another thread (thanks for the xref @jakirkham):

> I think it would make sense to set the lower bound in noarch recipes to our oldest supported python version. That's realistically the only thing we can test and support. Upstream may support more, but conda-forge doesn't have to - when we drop python 3.8, that IMO should include noarch packages.

So my full support for fixing this, and 👍 to python_min or python_noarch_min. Noarch feedstocks can then still override that in CBC as necessary.

@jaimergp
Member

Isn't it more of a linter change? We nag users saying "add lower constraints in your noarch python!" but we could change the message and say instead "Use the =lower.bound in host and test, and >=lower.bound in run, thankssss"?

@beckermr
Member

Yeah, the key is to parameterize the bound so the ecosystem moves together and we eliminate crazy solves. We already lint on this, but we don't lint on moving the bound.

@beckermr
Member

Note that we don't actually need to pin the bound in host if we parameterize it globally. We could use >= {{ min_python_ver }} in all sections and it'd be fine.

@jaimergp
Member

But when a package requires a minimum lower bound higher than our default, they will have to override it and at that point they have opted out of the defaults. Not sure if they would revert once the default catches up with their lower bound. Isn't it more explicit for noarch users to always specify the pinned lower bound in host? They have to do it for run with >= anyway, right?

@beckermr
Member

Yeah, people can add a tighter bound instead of overriding (say minpy=3.9):

```yaml
host:
  - python >={{ minpy }}
  - python >=3.11
run:
  - python >={{ minpy }}
  - python >=3.11
```

However, more to the point, I don't think people should have to manually change their recipes when we drop a python version. That leaves the bot or explicit bounds in pinnings.

The bot can only take us so far in updates, due to inherent limits in its ability to parse and understand a recipe (i.e., when we find python >=XYZ in host for noarch, it is probably the right thing to change, but I am sure we will hit edge cases).

Having an explicit bound in the pinnings ensures no manual updates, specifies the intent precisely, and ensures that we can change it globally. Folks who opt-out of a global pinning have always been on their own in conda-forge, and this case is no different.

@isuruf
Member

isuruf commented Aug 28, 2024

That's the wrong thing to do. We want an explicit equality bound in host to make sure the package actually works.

@isuruf
Member

isuruf commented Aug 28, 2024

With your suggestion, we will still install python 3.12 or whatever is the latest

@beckermr
Member

Yeah, I am happy with an == in host. My main thing is that we should have a global pinning value for it. From my perspective, noarch packages fail on both old and new Python versions. Pinning the oldest in host+test versus not is a question of whether you think a failure on the oldest or the newest version is more likely.

@jaimergp
Member

Ok you convinced me :)

@beckermr
Member

beckermr commented Aug 28, 2024

So I think our consensus is to have

```yaml
host:
  - python {{ minpy }}
run:
  - python >={{ minpy }}

test:
  requires:
    - python ={{ minpy }}
```

We'll need a better name for minpy as suggested above.

(edit: changed to pin_compatible since that will respect the patch version too.)
(edit: changed back to >= since we don't want to pin against patch version.)
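Concretely, if the variable ends up being called python_min (one of the names floated above; still to be decided), the pinning plus recipe might look like the following sketch:

```yaml
# conda_build_config.yaml in conda-forge-pinning (variable name illustrative)
python_min:
  - 3.8

# meta.yaml in a noarch feedstock
requirements:
  host:
    - python {{ python_min }}
  run:
    - python >={{ python_min }}
test:
  requires:
    - python ={{ python_min }}
```

The equality bounds in host and test make the build and test phases run against the oldest supported Python, so dropping support breaks loudly at build time, while run keeps the open >= bound for users.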

@isuruf
Member

isuruf commented Aug 28, 2024

There's also the concern from @dopplershift at conda-forge/conda-forge-pinning-feedstock#5013 (comment)

@beckermr
Member

It is unclear to me if @dopplershift's comment is about

- not supporting 3.9 for noarch when we still support it as part of our standard build matrix

or about

- any noarch pin that differs from an upstream package.

Maybe he can expand on his comment?

I really don't think we should be marking packages we ship with bounds like >=3.6 when we have not shipped python 3.6 in a long, long time and no longer offer support for it.

@ocefpaf
Member Author

ocefpaf commented Aug 28, 2024

I can't speak for Ryan, but I believe he wants the original minimum Python from the upstream metadata to be used. As someone who rebuilds conda-forge's recipes for my day job, back when we had a delay in bumping Python, this was extremely useful. While I no longer need that, I do see the value in it.

@jaimergp
Member

So then host should have python pinned to max([conda_forge_min_python, upstream_python_requires])?

@beckermr
Member

@ocefpaf If we parameterize the bound in the pinnings, it should be easy enough to roll back globally via a custom pinnings file for your delay, right?

> So then host should have python pinned to max([conda_forge_min_python, upstream_python_requires])?

Formally, yes @jaimergp, but in practice accessing the requires-python metadata for the upstream code is non-trivial. Thus, IMO, we should leave that to the user and simply use our own minimum Python version.
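For what it's worth, the "formally yes" part is simple if the two bounds are known; a minimal sketch (function and parameter names hypothetical), comparing versions numerically rather than lexically so that 3.10 correctly sorts above 3.9:

```python
def _version_key(version):
    """Turn a version string like '3.10' into a tuple (3, 10) for numeric comparison."""
    return tuple(int(part) for part in version.split("."))


def effective_python_min(conda_forge_min, upstream_min):
    """Pick the tighter (higher) of the two lower bounds for use in host."""
    return max(conda_forge_min, upstream_min, key=_version_key)


print(effective_python_min("3.8", "3.10"))  # -> 3.10 (upstream floor wins)
print(effective_python_min("3.9", "3.6"))   # -> 3.9 (conda-forge floor wins)
```

The hard part, as noted above, is obtaining upstream_min reliably for an arbitrary recipe, not the comparison itself.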

@ocefpaf
Member Author

ocefpaf commented Aug 28, 2024

> @ocefpaf If we parameterize the bound in the pinnings, it should be easy enough to roll back globally via a custom pinnings file for your delay, right?

I was guessing at what Ryan meant by projecting an old requirement that I no longer have, but yes, I could use a custom pinning in that case. As it turns out, it was easier for everyone at the day job to just keep up to date with Python, for many other reasons beyond conda-forge.

BTW, I am +1 to #2210 (comment)

@dopplershift
Member

> I really don't think we should be marking packages we ship with bounds like >=3.6 when we have not shipped python 3.6 in a long, long time and no longer offer support for it.

But we do still ship 3.6? I just created a perfectly fine 3.6 environment on my mac. I would be extremely confused if I went to install an upstream package that says it supports 3.6 into this environment, and the conda solver gave me either an older version or failed completely to resolve the environment.

@dopplershift
Member

> But we do still ship 3.6? I just created a perfectly fine 3.6 environment on my mac. I would be extremely confused if I went to install an upstream package that says it supports 3.6 into this environment, and the conda solver gave me either an older version or failed completely to resolve the environment.

I suppose my solution in this case is just to use pip to install since it doesn't need to build. I can be convinced that my problem is far smaller than solving the problem that this issue is solving.

@beckermr
Member

We ship conda packages. Mixing pip and conda is asking for trouble.

@h-vetinari
Member

h-vetinari commented Aug 29, 2024

> I would be extremely confused if I went to install an upstream package that says it supports 3.6 into this environment, and the conda solver gave me either an older version or failed completely to resolve the environment.

This is IMO a much lesser evil than shipping broken packages because we get the bounds wrong the other way without noticing.

To me, conda-forge definitely has the right to make choices here, like "we support CPython versions for 5 years, after that it's over for all new builds", even if that ends up tightening the upstream package requirements.

You'd still get the old builds of course if you want to use a py36 environment today. And there'd be an escape hatch too, in that the feedstock can choose to override min_py locally to a lower version than the pinning. That to me seems plenty accommodating for way-EOL Python versions.

@dopplershift
Member

@h-vetinari I'm at peace with that. I will note that my original comment was in the context of a linter message blindly telling people to specify "> 3.10" rather than a more formal change to conda-forge's infrastructure and pinning around noarch.

@beckermr
Member

OK so I think we're at consensus. We'll do the usual, announcement + migrator + lint + hint thing here.

@beckermr
Member

xref: conda-forge/cfep#56

@jakirkham
Member

This will be addressed by CFEP-25 (recently passed). The next steps are enumerated in issue #2351. Closing this one out. Let's move any implementation discussion to the new tracking issue.

7 participants