
run pip-compile on all of our requirements files #11912

Merged · 7 commits · Jul 21, 2022
Conversation

@dstufft (Member) commented on Jul 21, 2022:

Closes #11911
Closes #11910
Closes #11909
Closes #11908
Closes #11907
Closes #11906
Closes #11905
Closes #11904
Closes #11895
Closes #11894
Closes #11892
Closes #11890
Closes #11881
Closes #11879
Closes #11876
Closes #11875
Closes #11870
Closes #11869
Closes #11867
Closes #11852
Closes #11851
Closes #11847
Closes #11843
Closes #11840
Closes #11837
Closes #11836
Closes #11832
Closes #11830
Closes #11828
Closes #11814
Closes #11291

--hash=sha256:e3d3df3292ab4bae85213b9ebef566b5aedb45f97425a92fac5b2e431d31e71c \
--hash=sha256:ef0768a609a02b2b412fa0f59f1242f1597e9bb15188d043f3fde09115ca6c69 \
--hash=sha256:f2f43ae8dff452aee3026b59ea0a09245ab2529a55a0984992e76bcf848610e1
protobuf==3.20.1 \
Member commented:

Is this downgrade intentional?

Member Author (@dstufft) replied:

Yes, ish.

We're currently actually installing a "broken" (as far as dependency constraints are concerned) environment because of --no-deps (I think). I was actually going to ask you about this, but figured I'd get the rest of the updates working first.

So we're using google-cloud-bigquery<3.0.0, which I see you pinned due to googleapis/python-bigquery#1196.

That pin causes us to install google-cloud-bigquery==2.34.4, and 2.34.4 requires protobuf<4.

Obviously, it seems to work fine in practice, but I don't think we want to be installing "broken" sets of dependencies like that, and I think the reasons we were able to do that to begin with are:

  • The --no-deps in Dockerfile means pip won't check the version constraints.
  • Dependabot doesn't check version constraints; it just blindly updates the pins.
  • make deps doesn't check that the pins match, just that all of the names in the old and new ones match.
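That name-only check can be sketched like this (a hypothetical simplification, not the actual make deps implementation; the package names and pins are illustrative):

```shell
# Hypothetical sketch: strip the version pins and compare only the names,
# which is roughly what a name-only check does. A changed pin (here
# protobuf 4.x -> 3.x) slips straight through.
old_names=$(printf '%s\n' 'protobuf==4.21.2' 'google-cloud-bigquery==2.34.4' | cut -d '=' -f 1 | sort)
new_names=$(printf '%s\n' 'protobuf==3.20.1' 'google-cloud-bigquery==2.34.4' | cut -d '=' -f 1 | sort)

if [ "$old_names" = "$new_names" ]; then
    echo "names match: check passes despite the changed protobuf pin"
fi
```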

So, this brings us back to a "clean" set of dependencies. If we want a newer version of protobuf, we'll need to unpin google-cloud-bigquery, though I'm not sure why it was pinned to start with. It looks like it was just because it introduced a (needless) dependency on pyarrow and numpy, but I can't tell if that was actually causing problems or if we just conceptually felt we didn't want those deps dragged in.

Member replied:

> We're currently actually installing a "broken" (as far as dependency constraints are concerned) environment because of --no-deps (I think). I was actually going to ask you about this, but figured I'd get the rest of the updates working first.

The --no-deps flag was added in #10463 as a workaround for pypa/pip#9644, which unfortunately still seems to be a bug.

> Dependabot doesn't check version constraints; it just blindly updates the pins.

Huh, I was under the impression that it was re-compiling our .in files to determine upgrades, but I agree that that doesn't appear to be the case.

> It looks like it was just because it introduced a (needless) dependency on pyarrow and numpy, but I can't tell if that was actually causing problems or if we just conceptually felt we didn't want those deps dragged in.

Yes, this is exactly why.

Member Author (@dstufft) replied:

> Huh, I was under the impression that it was re-compiling our .in files to determine upgrades, but I agree that that doesn't appear to be the case.

I think Renovate might support that, and maybe Dependabot does too; I'm not sure.

Though we have a more foundational issue: even pip-compile isn't really right here, because we need all of our requirements files to be resolvable together. Right now, pip-compile on deploy.in gives you protobuf==4.* while on main.in it gives you protobuf==3.*, and even with --no-deps pip will error because you can't install both.
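A sketch of that failure mode, assuming the usual pip-compile layout (file names and versions here are illustrative):

```shell
# Each .in file is resolved in isolation, so nothing forces the two
# compiled outputs to agree on a shared dependency:
pip-compile requirements/deploy.in   # may emit protobuf==4.21.2
pip-compile requirements/main.in     # may emit protobuf==3.20.1

# Installing both .txt files together then fails even with --no-deps,
# because pip refuses two different pins for the same project in one
# invocation (an error along the lines of "Double requirement given"):
pip install --no-deps -r requirements/deploy.txt -r requirements/main.txt
```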

So blindly upgrading the .txt files is ~fine as long as we ensure that our dependency set is resolvable. Prior to #10463 that was done by just having pip install it, but I just pushed a new commit to this PR that adds pip check after we install, which will restore checking for resolvability.
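The restored check is essentially the following (paths are illustrative; the exact command in the Dockerfile may differ):

```shell
# Install the fully pinned set without resolving dependencies (the
# pypa/pip#9644 workaround), then verify that what actually got
# installed is internally consistent:
pip install --no-deps -r requirements/main.txt -r requirements/deploy.txt

# pip check exits non-zero if any installed distribution has a missing
# or conflicting dependency (e.g. protobuf 3.x where >=4 is required),
# which fails the build.
pip check
```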

> Yes, this is exactly why.

I think the edge cases caused by pinning might be sufficiently annoying that skipping the extra dependencies isn't worth the pin? The issue is open with Google, so if/when they fix it we can drop those dependencies ourselves, but I'm not sure that we're really losing much by installing them until then.

What do you think?

@dstufft dstufft marked this pull request as ready for review July 21, 2022 15:05
@dstufft dstufft requested a review from a team as a code owner July 21, 2022 15:05