Latest YCM devel of robotology-superbuild on Travis with Docker blocks during bootstrap on "Performing build step for 'YCM'" #254
Apparently the problem is also present with v0.10.2 (see https://travis-ci.org/robotology/robotology-superbuild/builds/527850737?utm_source=github_status&utm_medium=notification), so it is probably some weird Travis-side problem in accessing one of the remote repos used in the YCM bootstrap (ref #105).
This commit is just for debugging robotology/ycm-cmake-modules#254 on Travis.
I am unable to reproduce the problem on my local machine.
I enabled
took ~2-3 minutes, and now the build seems stuck at:
See https://travis-ci.org/robotology/robotology-superbuild/jobs/527863239. After a while, downloading this file results in a timeout:
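For reference, a minimal sketch of how one could check from inside the build environment whether a download stalls; the URL below is only a placeholder, since the exact file from the log is not quoted here:

```python
# Minimal sketch (placeholder URL): time a download and report whether it
# completes or times out, similar to what the YCM bootstrap does when it
# fetches remote repositories/archives.
import time
import urllib.request

URL = "https://github.com/robotology/ycm/archive/master.tar.gz"  # placeholder

start = time.monotonic()
try:
    with urllib.request.urlopen(URL, timeout=60) as response:
        data = response.read()
    print(f"downloaded {len(data)} bytes in {time.monotonic() - start:.1f}s")
except Exception as exc:  # e.g. socket.timeout when the connection stalls
    print(f"failed after {time.monotonic() - start:.1f}s: {exc!r}")
```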
Related tweets:
i'm in contact with travis support and will let y'all know if i hear anything useful from them :)
I tried replicating the issue on YCM's Travis, but apparently everything is working correctly there. Probably the issue is related to the fact that the robotology-superbuild Travis build is running inside a Docker container.
Thanks a lot @beaugunderson !
if you want to contact their support as well you can reference our support ticket so they can cross-reference it; it's 6994
I'm not familiar at all with this repository, but my builds have also been getting stuck and timing out. Strangely, it happens during different stages, and only for one of my repositories. I reran builds from old commits that used to pass, and those are also breaking, so I assume this is a Travis problem. We are running a Python 3.6 container, with tests via docker-compose.
fwiw we also use docker/docker-compose in our tests... edit: verified that we're still pinned to the same docker-compose version, and docker-ce also has an identical version between old successful and now failing builds (
@beaugunderson do your tests run in multiple threads?
@zak10 yup, we use gnu edit: and also
I'm also experiencing this issue -- also in docker builds.
We're using pytest + xdist to run tests in parallel. I'm now attempting to remove xdist completely to see if that alleviates the issue. edit: I was unable to alleviate anything, and now the issue seems to have spread to a second repository with a similar configuration. Still no word from Travis - I think it may be time to pull the trigger and switch to a different CI provider.
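As an aside, a minimal sketch of what disabling xdist could look like; pytest's `-p no:xdist` option turns the plugin off without uninstalling it (the `tests/` path and worker count are placeholders, not this project's actual layout):

```python
# Minimal sketch (placeholder paths): run the same suite with and without
# pytest-xdist parallelism to compare behavior on CI.
import pytest

# Parallel run across 4 workers (requires the pytest-xdist plugin).
pytest.main(["-n", "4", "tests/"])

# Same suite with the xdist plugin disabled entirely.
pytest.main(["-p", "no:xdist", "tests/"])
```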
still no response from travis since last thursday :(
currently trying to force
i feel like this fundamentally misunderstands the issue, sadly
yep - got the same canned response from them. at least they're acknowledging the issue now. unfortunately for them, I've already begun the process of migrating to Circle. best of luck!
I tried the
awesome @traversaro; i'm trying that now as well (slightly more complicated with docker-compose, but it seems like
this is more of a pain than i thought, especially if you already use
Related issue: ros-industrial/industrial_ci#364.
travis is claiming that this issue is fixed but we're still getting 100% failures... around the time they were testing fixes we got a few passes, so it certainly improved for a small period of time, but now it's broken again and i'm having a hard time getting their support to do anything about it
On our side, in the last day all the builds that were affected by this (see https://travis-ci.org/robotology/robotology-superbuild/jobs/529722072) have been working correctly. Given @beaugunderson's input, it probably makes sense to monitor a bit longer for possible failures.
On our side, we have not observed Travis failures related to this in the past week. I think we can close the issue, at least as far as YCM and its bootstrap are concerned.
See https://travis-ci.org/robotology/robotology-superbuild/builds/527715806. Until yesterday, everything went fine. I already restarted a job, and the same problem appeared again.
I wonder if the problem is related to this commit: 912e0d6