[PRIORITY] Persistent failures on Windows #1005
ping @iojs/platform-windows: this needs fairly urgent attention from anyone capable of digging in and figuring these out. |
Two of these have been fixed by #1008. Last complete run: … That means two things: …
/cc @iojs/build |
I've had limited success reproducing many of these. Running them locally on a Windows machine, and even on the exact same server that is giving us the failures via Jenkins, I can only make 3 or 4 tests fail. The block of timeouts seems to be somehow Jenkins-specific. Can other Windows users verify against v1.x HEAD and report back? |
I get these failures: https://gist.github.com/domenic/ad191da152fc6632fa32 Unsure what to make of them compared to yours above. I guess the first one, maybe others, is due to not having the openssl binary installed. I'll go check my prereqs and report back if anything else changes. |
Install Git Bash and you'll get what you need. Google "git windows" and download and install that one. |
Yeah, I have Git Bash, but when I try to run .bat files from within it, all hell breaks loose. |
You should be able to run vcbuild.bat from cmd.exe, but Git Bash comes with curl and other Unix utilities that can go in your PATH. I think that's an install option, though. |
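For reference, a minimal sketch of that workflow (assuming a source checkout and a Visual Studio toolchain; the exact vcbuild.bat flags vary between releases):

```
rem From a plain cmd.exe prompt in the source directory:
rem (1) build
vcbuild.bat
rem (2) build if needed, then run the test suite
vcbuild.bat test
```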
OK yeah, getting the same failures minus the first one (the openssl one). |
I have 32-bit Windows at home, so I ran the tests there. After deleting tmp.0 manually, I re-ran parallel/test-fs-access and it passed. |
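For anyone reproducing this, a sketch of re-running a single test directly (the tmp.0 location is assumed here, it may differ per setup; tools\test.py is the runner behind the project's test targets):

```
rem Clear the leftover temp directory (path assumed), then re-run just the one test.
rmdir /s /q test\tmp.0
python tools\test.py parallel/test-fs-access
```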
right, sorry, I'm barking up the wrong tree, so ignore my comment for now! |
I haven't looked at the tests or the Jenkins config, so forgive me if this isn't helpful. The combination of Jenkins and Windows immediately brings to mind problems I've had with build console logs being garbled or out of order because console output is non-blocking. Depending on how the scripts are wired up, that could introduce timing issues. |
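As an illustration of that failure mode (a contrived sketch, not taken from the thread or the Jenkins config): when several processes share one console or pipe, their chunks can interleave, so log lines from one process can split lines from another.

```js
// Contrived sketch: two children writing to the same inherited stdout.
// Their output arrives in whatever order the pipe schedules it, which
// is how console logs can end up garbled or out of order.
const { spawn } = require('child_process');

for (let i = 0; i < 2; i++) {
  spawn(process.execPath,
        ['-e', `for (let j = 0; j < 1000; j++) console.log('child ${i} line ' + j)`],
        { stdio: 'inherit' });
}
```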
Restarting Jenkins seems to have fixed the timeouts. I was playing with running Jenkins from cmd rather than as a service, but soon discovered it was simply a matter of needing a restart. I also did a full cleanout of the build workspace, so that could potentially have helped too; I really don't know. These are the remaining persistent failures on both 2008 and 2012. If you want to contribute but don't feel confident enough to actually find and fix the underlying problem, then running a build with … |
test-http-content-length is from #1062. See also #1137. More info (edit): first CI run: https://jenkins-iojs.nodesource.com/view/iojs/job/iojs+any-pr+multi/250/ (CI was not run on the PR). Edit 2: fixed in 53e200a. |
test-regress-GH-io-1068 is from #1073. I reported that the test case was failing on Windows a week ago in the original issue: #1068 (comment). More info (edit): first CI: https://jenkins-iojs.nodesource.com/view/iojs/job/iojs+any-pr+multi/247/ CI was run on this PR; it did error, was unreproducible outside of CI, and further investigation has not yet happened. |
Fwiw, both of these have been failing since they were added. |
#1150 reverts preload. |
https://jenkins-iojs.nodesource.com/view/iojs/job/iojs+any-pr+multi/312/ |
As of recently, … |
Doesn't happen for me on my Windows box. |
I can repro |
@mathiask88 thanks for pointing that out. Given that it's a changelog commit, I believe it was committed in error from dev testing. #1198 ping @rvagg |
I restarted Jenkins again on the 2008 machines (and the 2012 ones as well) and we're back to parity between test runs on both: https://jenkins-iojs.nodesource.com/job/iojs+any-pr+multi/331/nodes=iojs-win2008r2/console So now it's just test-regress-GH-io-1068 and test-net-reconnect-error left. |
Improving, but we now have test-child-process-stdout-flush-exit in the mix occasionally. Not sure if this is a new failure, but it's shown up a few times in recent runs. |
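For context, here's a minimal sketch of the behavior a stdout-flush-on-exit test exercises (my reconstruction for illustration, not the actual test source): a child writes a large chunk and exits immediately, and the parent checks that everything written actually arrived.

```js
// Sketch: output written just before process.exit() must still be
// flushed through the pipe; if exit races the flush, bytes go missing.
const assert = require('assert');
const { spawn } = require('child_process');

const child = spawn(process.execPath, [
  '-e', "process.stdout.write('x'.repeat(100000)); process.exit(0);"
]);

let received = 0;
child.stdout.on('data', (chunk) => { received += chunk.length; });
child.on('close', () => {
  assert.strictEqual(received, 100000); // a short count means lost output
  console.log('all output flushed');
});
```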
Maybe it helps, but on my Windows machine … |
🍺 to everyone! Now to figure out some odd leak. |
woo! thanks all |
Thank you very much @indutny ! |
Same thing on OS X Yosemite. |
Same thing on OS X Yosemite; could we reopen this issue? |
Perhaps you have a server already listening on port 8080? |
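A quick way to check (a sketch, not from the thread): try to bind the port yourself and see whether the listen fails with EADDRINUSE.

```js
// Probe whether something is already listening on port 8080.
const net = require('net');

const probe = net.createServer()
  .once('error', (err) => {
    if (err.code === 'EADDRINUSE') console.log('port 8080 is already in use');
    else throw err;
  })
  .once('listening', () => {
    console.log('port 8080 is free');
    probe.close();
  })
  .listen(8080);
```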
I'm going to lock this; it's historical only. Please report new issues in a new issue thread if they arise. :) |
See https://jenkins-iojs.nodesource.com/job/iojs+any-pr+multi/215/nodes=iojs-win2012r2/console for sample output, consistent with the 2008 build and consistent across runs at the moment too.
10 failures in total. These have gone uncaught for a little while because of a combination of:
- the test-ci target still doesn't work on any platform, so we have to resort to more hacky means of making enough tests run

So we've been getting lots of blue when they really should have been red, and these were totally off everyone's radar.
I can't assess the severity of these failures at a glance, nor can I see a single common theme that would point to something to address. When I have time I'll go back and find a run where these passed, so we can at least start a manual bisect.
_test-child-process-stdio-big-write-end fixed in #1008_
_test-pipe-head fixed in #1008_