
Windows - inspector - intermittent ? - inspector/test-bindings #15558

Closed
mhdawson opened this issue Sep 22, 2017 · 24 comments · Fixed by #16472

Labels:
  • flaky-test: Issues and PRs related to the tests with unstable failures on the CI.
  • inspector: Issues and PRs related to the V8 inspector protocol.
  • test: Issues and PRs related to the tests.
  • windows: Issues and PRs related to the Windows platform.

Comments

@mhdawson
Member

Failure during a run on an unrelated PR

  • Version: master
  • Platform: windows
  • Subsystem: inspector

https://ci.nodejs.org/job/node-test-binary-windows/11280/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=1/console

not ok 441 inspector/test-bindings
  ---
  duration_ms: 0.227
  severity: crashed
  stack: |-
    oh no!
    exit code: CRASHED
  ...
@mhdawson mhdawson added the windows Issues and PRs related to the Windows platform. label Sep 22, 2017
@mhdawson
Member Author

@nodejs/platform-windows

@mscdex mscdex added inspector Issues and PRs related to the V8 inspector protocol test Issues and PRs related to the tests. labels Sep 22, 2017
@joaocgreis
Member

Looking at https://ci.nodejs.org/view/All/job/node-daily-master/, this has been happening for 3 days (since September 20).

There's also `inspector/test-debug-end`.

https://ci.nodejs.org/job/node-test-binary-windows/11275/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=1/console

not ok 442 inspector/test-debug-end
  ---
  duration_ms: 0.696
  severity: fail
  stack: |-
    Test there's no crash stopping server that was not started
    Test there's no crash stopping server without connecting
    [err] Debugger listening on ws://127.0.0.1:63103/542e4e93-0081-41fd-a597-bf147ce898b0
    [err] For help see https://nodejs.org/en/docs/inspector
    [err] 
    Test there's no crash stopping server after connecting
    [test] Connecting to a child Node process
    [test] Testing /json/list
    [err] Debugger listening on ws://127.0.0.1:63104/cc5fdd8a-9fb6-4229-853f-3361070018df
    [err] For help see https://nodejs.org/en/docs/inspector
    [err] 
    [err] Debugger attached.
    [err] Debugger listening on ws://127.0.0.1:63107/ede6f417-c23f-44e3-8f61-292d613b1dde
    [err] For help see https://nodejs.org/en/docs/inspector
    [err] 
    { AssertionError [ERR_ASSERTION]: 42 === 3221225477
        at testSessionNoCrash (c:\workspace\node-test-binary-windows\COMPILED_BY\vs2017\RUNNER\win2016\RUN_SUBSET\1\test\inspector\test-debug-end.js:35:3)
        at <anonymous>
        at process._tickCallback (internal/process/next_tick.js:188:7)
      generatedMessage: true,
      name: 'AssertionError [ERR_ASSERTION]',
      code: 'ERR_ASSERTION',
      actual: 42,
      expected: 3221225477,
      operator: '===' }
    1

cc @nodejs/testing and @eugeneo perhaps
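
For context, the 3221225477 in the assertion above is 0xC0000005, the Windows NTSTATUS code for STATUS_ACCESS_VIOLATION: the child process crashed with an access violation instead of exiting with the expected code 42. Below is a minimal sketch of how a harness could flag such exit codes when spawning a child; the `describeExitCode` helper is hypothetical and not part of the test suite.

```js
'use strict';
// Hypothetical helper: distinguish a Windows crash exit code from a normal one.
const { spawn } = require('child_process');

function describeExitCode(code) {
  // On Windows, exit codes >= 0xC0000000 are NTSTATUS error values;
  // 0xC0000005 (3221225477) is STATUS_ACCESS_VIOLATION.
  if (code !== null && code >= 0xC0000000) {
    return `crashed (NTSTATUS 0x${code.toString(16).toUpperCase()})`;
  }
  return `exited normally with code ${code}`;
}

const child = spawn(process.execPath, ['-e', 'process.exit(42)']);
child.on('exit', (code) => console.log(describeExitCode(code)));
```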

@refack refack added the flaky-test Issues and PRs related to the tests with unstable failures on the CI. label Sep 22, 2017
@BridgeAR
Member

Here is another stack trace:

not ok 441 inspector/test-bindings
  ---
  duration_ms: 0.234
  severity: fail
  stack: |-
    Expecting warning to be emitted
    (node:5800) Error: We expect this
    c:\workspace\node-test-binary-windows\COMPILED_BY\vs2017\RUNNER\win2016\RUN_SUBSET\1\test\common\index.js:789
                 (err) => process.nextTick(() => { throw err; }));
                                                   ^
    
    AssertionError [ERR_ASSERTION]: [ 'Iteration 1 variable: i expected: 1 actual: 0',
      'Iteration 2 variable: i expected: 2 actual: 1',
      'Iteration 2 variable: a deepStrictEqual []
        at testSampleDebugSession (c:\workspace\node-test-binary-windows\COMPILED_BY\vs2017\RUNNER\win2016\RUN_SUBSET\1\test\inspector\test-bindings.js:104:10)
        at doTests (c:\workspace\node-test-binary-windows\COMPILED_BY\vs2017\RUNNER\win2016\RUN_SUBSET\1\test\inspector\test-bindings.js:131:3)
        at <anonymous>
        at process._tickCallback (internal/process/next_tick.js:188:7)
        at Function.Module.runMain (module.js:643:11)
        at startup (bootstrap_node.js:187:16)
        at bootstrap_node.js:607:3
  ...

@BridgeAR
Member

@joaocgreis would it be a good idea to open a new issue for the other failure?

@joaocgreis
Member

@BridgeAR they started at about the same time in the same subsystem, so they might be related. We can track both here until we figure that out, but feel free to open another issue if you think it will help.

@Trott
Member

Trott commented Sep 26, 2017

Is anyone actively looking at this? (I saw another one today, and I'm looking at CI a lot less than usual, so there's a good chance there are lots more.) I'm hoping we can get CI reliably green in preparation for Code + Learn next week.

@refack
Contributor

refack commented Sep 26, 2017

> Is anyone actively looking at this? (I saw another one today, and I'm looking at CI a lot less than usual, so there's a good chance there are lots more.) I'm hoping we can get CI reliably green in preparation for Code + Learn next week.

I'm trying to repro locally.

@refack refack self-assigned this Sep 26, 2017
@Trott
Member

Trott commented Sep 28, 2017

ping @nodejs/v8-inspector: any idea what might be causing these tests to fail on Windows? It would be good to have CI back to green for Code + Learn next week.

@eugeneo
Contributor

eugeneo commented Sep 29, 2017

@refack - are you looking into it? I inspected the test case visually, and it feels like the test may not be deterministic (e.g. it relies on the order of the variables in the inspector response). I think V8 might reorder the variables for various reasons (JIT, etc.). I'm going to try to make the test more deterministic.
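
To illustrate, an order-independent check over a Runtime.getProperties-style response could look like the following sketch (hypothetical; not the actual test-bindings code):

```js
'use strict';
// Hypothetical order-independent assertion over inspector scope variables.
// `response.result` is assumed to be a Runtime.getProperties-style array of
// { name, value: { value } } property descriptors.
const assert = require('assert');

function assertScopeVariables(response, expected) {
  const byName = (a, b) => a.name.localeCompare(b.name);
  const actual = response.result
    .map(({ name, value }) => ({ name, value: value.value }))
    .sort(byName);
  // Sort the expectations the same way, so reporting order cannot matter.
  assert.deepStrictEqual(actual, [...expected].sort(byName));
}

// Passes regardless of the order in which the variables are reported.
assertScopeVariables(
  { result: [{ name: 'i', value: { value: 1 } },
             { name: 'accumulator', value: { value: 0 } }] },
  [{ name: 'accumulator', value: 0 }, { name: 'i', value: 1 }]);
```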

@eugeneo
Contributor

eugeneo commented Sep 29, 2017

(Sorry, disregard my comment; that does not seem to be the case. I would still like to take a look at this test case.)

@mhdawson
Member Author

@eugeneo
Contributor

eugeneo commented Sep 29, 2017

The stress test is unable to reproduce the failure: https://ci.nodejs.org/job/node-stress-single-test/1430/nodes=win2016/console

@refack
Contributor

refack commented Sep 30, 2017

@eugeneo I just built a binary with VS2017 that reproduces the failure quite often, which makes me think this might be a VS2017 issue. The two jobs mentioned above were also built with VS2017:

/11280/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=1/console
/11453/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=0/console

(I can send you the binary if you need it)

@Trott
Member

Trott commented Sep 30, 2017

More failures today, and yeah, it's vs2017 again.

https://ci.nodejs.org/job/node-test-binary-windows/11473/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=0/console

not ok 446 inspector/test-bindings
  ---
  duration_ms: 0.245
  severity: crashed
  stack: |-
    oh no!
    exit code: CRASHED

Same results in https://ci.nodejs.org/job/node-test-binary-windows/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=0/11473/console

@refack
Contributor

refack commented Oct 2, 2017

/11360/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=1/console

joaocgreis added a commit to JaneaSystems/node that referenced this issue Oct 3, 2017
refack pushed a commit that referenced this issue Oct 3, 2017
PR-URL: #15747
Refs: #15558
Reviewed-By: Rich Trott <[email protected]>
Reviewed-By: Yuta Hiroto <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
Reviewed-By: Colin Ihrig <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Refael Ackermann <[email protected]>
addaleax pushed a commit to addaleax/ayo that referenced this issue Oct 4, 2017
PR-URL: nodejs/node#15747
Refs: nodejs/node#15558
Reviewed-By: Rich Trott <[email protected]>
Reviewed-By: Yuta Hiroto <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
Reviewed-By: Colin Ihrig <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Refael Ackermann <[email protected]>
@MylesBorins
Contributor

MylesBorins commented Oct 7, 2017

MylesBorins pushed a commit that referenced this issue Oct 7, 2017
PR-URL: #15747
Refs: #15558
Reviewed-By: Rich Trott <[email protected]>
Reviewed-By: Yuta Hiroto <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
Reviewed-By: Colin Ihrig <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Refael Ackermann <[email protected]>
@ofrobots
Contributor

Reopening, as I think the flaky tests still need to be marked non-flaky once we confirm they are stable.

@ofrobots
Contributor

Reopening again.

@ofrobots ofrobots reopened this Oct 26, 2017
addaleax pushed a commit to ayojs/ayo that referenced this issue Oct 26, 2017
The inspector_agent depends on the context still being accessible
during the destructor execution.

PR-URL: nodejs/node#16472
Fixes: nodejs/node#15558
Reviewed-By: Ben Noordhuis <[email protected]>
Reviewed-By: Franziska Hinkelmann <[email protected]>
Reviewed-By: Colin Ihrig <[email protected]>
Reviewed-By: Eugene Ostroukhov <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Anna Henningsen <[email protected]>
@ofrobots
Contributor

Based on the stress test here, there is still flakiness despite #16472.

gibfahn pushed a commit that referenced this issue Oct 30, 2017
The inspector_agent depends on the context still being accessible
during the destructor execution.

PR-URL: #16472
Fixes: #15558
Reviewed-By: Ben Noordhuis <[email protected]>
Reviewed-By: Franziska Hinkelmann <[email protected]>
Reviewed-By: Colin Ihrig <[email protected]>
Reviewed-By: Eugene Ostroukhov <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Anna Henningsen <[email protected]>
gibfahn pushed a commit that referenced this issue Oct 31, 2017
The inspector_agent depends on the context still being accessible
during the destructor execution.

PR-URL: #16472
Fixes: #15558
Reviewed-By: Ben Noordhuis <[email protected]>
Reviewed-By: Franziska Hinkelmann <[email protected]>
Reviewed-By: Colin Ihrig <[email protected]>
Reviewed-By: Eugene Ostroukhov <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Anna Henningsen <[email protected]>
@eugeneo eugeneo assigned eugeneo and unassigned refack Nov 22, 2017
@eugeneo
Contributor

eugeneo commented Nov 22, 2017

@refack

> BTW: did you check that all accesses to libuv are thread safe?

I see this issue using the JS API (i.e. there is no second thread running, and libuv is not used for anything). What I am seeing is that the session receives repeated "Debugger.paused" notifications from V8. The number of repetitions equals the total number of previous sessions + 1; e.g. if I open and close 2 sessions, the third session will be notified three times.

It does not look like the problem is on the V8 side of the Inspector either: it is the debug listener that fires the event multiple times, while the Inspector has a single session and properly dispatches the event to it.

I suspect something funny is going on with object deregistration; could it be that some destructor is now being "optimized out"?

I will update this bug when I have more information.
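
To illustrate the repro shape (hypothetical, not the actual test): open and close inspector sessions one after another via the public `inspector` JS API and count the 'Debugger.paused' notifications each one receives.

```js
'use strict';
// Hypothetical repro sketch: per the description above, session N would
// receive N 'Debugger.paused' notifications for a single pause.
const inspector = require('inspector');

function triggerPause() {
  debugger; // Pauses only while a connected session has Debugger enabled.
}

function runSession(label) {
  const session = new inspector.Session();
  session.connect();
  let paused = 0;
  session.on('Debugger.paused', () => {
    paused++;
    session.post('Debugger.resume'); // Resume immediately; the pause is synchronous.
  });
  session.post('Debugger.enable');
  triggerPause();
  console.log(`${label}: ${paused} 'Debugger.paused' notification(s)`);
  session.disconnect();
}

for (let i = 1; i <= 3; i++)
  runSession(`session ${i}`);
```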

@Trott
Member

Trott commented Dec 4, 2017

This is now sequential/test-inspector-bindings, right?

https://ci.nodejs.org/job/node-test-binary-windows/13320/COMPILED_BY=vs2017,RUNNER=win2016,RUN_SUBSET=0/console

not ok 514 sequential/test-inspector-bindings # TODO : Fix flaky test
  ---
  duration_ms: 0.240
  severity: flaky
  stack: |-
    Expecting warning to be emitted
    (node:6608) Error: We expect this
    c:\workspace\node-test-binary-windows\test\common\index.js:797
                 (err) => process.nextTick(() => { throw err; }));
                                                   ^
    
    AssertionError [ERR_ASSERTION]: [ 'Iteration 1 variable: i expected: 1 actual: 0',
      'Iteration 2 variable: i expected: 2 actual: 1',
      'Iteration 2 variable: a deepStrictEqual []
        at testSampleDebugSession (c:\workspace\node-test-binary-windows\test\sequential\test-inspector-bindings.js:104:10)
        at doTests (c:\workspace\node-test-binary-windows\test\sequential\test-inspector-bindings.js:131:3)
        at <anonymous>
        at process._tickCallback (internal/process/next_tick.js:165:7)
        at Function.Module.runMain (module.js:703:11)
        at startup (bootstrap_node.js:195:16)
        at bootstrap_node.js:646:3
  ...

@refack
Contributor

refack commented Dec 7, 2017

Still open.

@refack refack reopened this Dec 7, 2017
@Trott
Member

Trott commented Oct 18, 2018

It seems like perhaps this should be closed. Feel free to re-open (or leave a comment requesting that it be re-opened) if you disagree. I'm just tidying up and not acting on a super-strong opinion or anything like that.

@Trott Trott closed this as completed Oct 18, 2018