Manual pypy37 migration #155
Conversation
…da-forge-pinning 2021.01.11.17.02.47
Hi! This is the friendly automated conda-forge-linting service. I just wanted to let you know that I linted all conda-recipes in your PR (…)
Thank you @h-vetinari!
The last drone job timed out - I've seen this pattern a lot in the blas-variant PRs: 2-3 drone jobs go through in 25-30min, the ones beyond that get fewer resources(?) and then ultimately fail at the 60min timeout (somewhere between finishing the build and the middle of the test suite). Let's see if following https://github.com/scipy/scipy/blob/v1.6.0/setup.py#L234-L237 has any effect.
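A minimal sketch of what such a change to the recipe's build.sh could look like, assuming the referenced setup.py lines concern the parallel-build job count that numpy.distutils reads from NPY_NUM_BUILD_JOBS, and that CPU_COUNT is the core count conda-build exposes on the CI machine - an illustration, not the exact patch that was tried:

# Sketch for recipe/build.sh: let numpy.distutils compile the extension modules in
# parallel, using however many CPUs the CI machine reports, so the build finishes
# before the 60min timeout. NPY_NUM_BUILD_JOBS is read by numpy.distutils'
# get_num_build_jobs(); CPU_COUNT is provided by conda-build.
export NPY_NUM_BUILD_JOBS="${CPU_COUNT:-2}"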
There are two different machines that drone runs jobs on. One is faster than the other, I think? @isuruf has details.
How much control do we have over this process? If the timeout were 90min or so, the build would very likely pass even on the slow machines (instead of retrying it all the time and spending much more CI time in total).
None. Yay!
Then I suggest merging as is and manually restarting the drone build once or twice until all variants eventually get built.
OK. Let's let it get as far as it can to make sure the tests pass. Then we can merge.
I just removed the patch:
commit 07dd65932da4c773990bb6480503724ba78b74a4
Author: H. Vetinari <[email protected]>
Date: Mon Jan 11 20:29:17 2021 +0100
try upping build parallelism
diff --git a/recipe/build.sh b/recipe/build.sh
index 2ebd6d5d5..9cf1c5f4a 100644
--- a/recipe/build.sh
+++ b/recipe/build.sh
@@ -4,6 +4,7 @@
# can have a G77 ABI (currently only MKL)
export SCIPY_USE_G77_ABI_WRAPPER=1
+set NPY_NUM_BUILD_JOBS=${CPU_COUNT}
if [[ "$python_impl" == "pypy" && "$target_platform" == "linux-ppc64le" ]]; then
$PYTHON setup.py install --single-version-externally-managed --record=record.txt
else
The pipeline with this exact state had already run through before, with everything but one aarch build passing. Proof: compare the first commit in "merge xxx into yyy" before (clickable from here) and after (clickable from here). I say this because the aarch builds will almost certainly fail again, and so the automerge label will be useless. Which of the 4 aarch jobs fails is pretty random, so IMO this can be handled after merging by restarting until the missing job also runs through on the faster machine.
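(Side note on the removed patch: in bash, set NPY_NUM_BUILD_JOBS=${CPU_COUNT} replaces the positional parameters rather than defining an environment variable, so if the intent was to hand a job count to numpy.distutils, the conventional spelling would be export NPY_NUM_BUILD_JOBS=${CPU_COUNT}, as in the sketch above.)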
Hi! This is the friendly conda-forge automerge bot! I considered the following status checks when analyzing this PR:
Thus the PR was not passing and not merged.
There are some weird cycles in the dependency graph that make the migrator think that scipy depends on pandas and psycopg2, among others; see regro/cf-scripts#1331.
I verified that all dependencies have been built (hopefully without overlooking anything), so I'm raising this PR to break some of these cycles.