jest-circus memory leak #8816

Closed
bcmarinacci opened this issue Aug 13, 2019 · 15 comments · Fixed by #9934

@bcmarinacci

🐛 Bug Report

First off... thanks heaps for creating and maintaining Jest. 🙏 JavaScript testing is in a better state because of all you do! 🃏

It looks like a regression was introduced between versions 22.4.4 and 23.1.0 of Jest Circus that is causing a rather substantial memory leak (I wasn't able to run the tests without errors on any version between those two):

$ npm run jest-circus

PASS  __tests__/index-21.jest.js (24 MB heap size)
...
PASS  __tests__/index-79.jest.js (251 MB heap size)
PASS  __tests__/index-49.jest.js (253 MB heap size)
PASS  __tests__/index-65.jest.js (256 MB heap size)
PASS  __tests__/index-73.jest.js (258 MB heap size)

Test Suites: 100 passed, 100 total
Tests:       100 passed, 100 total
Snapshots:   0 total
Time:        13.991s
Ran all test suites.

I had a look at #7274 (comment) as well as #6965. I added this.stack = this.stack to https://github.com/facebook/jest/blob/master/packages/jest-util/src/ErrorWithStack.ts (via https://github.com/facebook/jest/blob/master/packages/jest-circus/src/index.ts) and this.stack = this.stack after every new Error() invocation in https://github.com/facebook/jest/blob/master/packages/jest-circus/src/utils.ts, which didn't seem to impact the leaks at all.
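
For reference, a minimal sketch of what that eager-stack experiment looks like in ErrorWithStack; it mirrors the shape of the real class in jest-util but is not the shipped code:

// ErrorWithStack.ts - sketch of the experiment described above, not the shipped code
export default class ErrorWithStack extends Error {
  constructor(message: string | undefined, callsite: Function) {
    super(message);
    if (Error.captureStackTrace) {
      Error.captureStackTrace(this, callsite);
    }
    // Force V8 to materialize the lazily-computed stack right away, so the
    // error does not keep alive whatever it would otherwise need later to build it.
    this.stack = this.stack;
  }
}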

Other than narrowing down the version that introduced the regression and trying what was mentioned in #6965, I haven't really had time to make any more progress. I also noticed #7274 but that seemed to be focused on memory leaks from graceful-fs, which I confirmed independently, so I thought I would create a new issue that focuses purely on jest-circus.

To Reproduce

I created a sandbox repository with a simple test and 100 test files. I then ran the tests with jest-circus as well as the default runner for (matching) versions: 24.8.0, 23.1.0, and 22.4.4 and recorded the results: https://github.com/BuildingConnected/jest-circus-memory-leak.

Expected behavior

The memory footprint when running tests via jest-circus should match that of the default test runner.

Link to repl or repo (highly encouraged)

https://github.com/BuildingConnected/jest-circus-memory-leak

Run 100 test files:

npm run jest # default runner
npm run jest-circus # jest-circus

Run npx envinfo --preset jest

System:
  OS: macOS 10.14.6
  CPU: (8) x64 Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
Binaries:
  Node: 10.15.1 - ~/.nvm/versions/node/v10.15.1/bin/node
  Yarn: 1.17.3 - /usr/local/bin/yarn
  npm: 6.4.1 - ~/.nvm/versions/node/v10.15.1/bin/npm
npmPackages:
  jest: 24.8.0 => 24.8.0 
@SimenB
Member

SimenB commented Aug 13, 2019

Thanks for the reproduction! It might make it easier to track down - we currently OOM with circus on CI, so there's definitely an issue here.

@scotthovestadt did you look into it yet?

@bcmarinacci
Author

bcmarinacci commented Aug 13, 2019

No worries! Thanks heaps for looking into this!

I dug into this a bit more, used git bisect, and was able to confirm (by running the 100 tests referenced above) that this line: https://github.com/facebook/jest/blob/3341d162715679a596bfbbec8a7a5c5676c5f1b7/packages/jest-jasmine2/src/index.js#L148 (from commit 3341d16) caused an initial memory leak, and commenting that line out solved the problem.

Unfortunately, commenting out the equivalent line on master:
https://github.com/facebook/jest/blob/94067089dd25c77463e24e981fe979fb6cc0b2fb/packages/jest-runner/src/runTest.ts#L194 didn't solve the issue, so it seems like there could be multiple causes.

I tried a few other things cumulatively, but even with all of those changes combined, I was still seeing the same amount of memory leaked. 😕

@SimenB
Member

SimenB commented Aug 13, 2019

Yeah, that commit was reverted and re-landed in #6176 due to memory leaks.

@bcmarinacci
Author

bcmarinacci commented Aug 13, 2019

Ah, that is super helpful. Thanks for that!

I looked again after #6176 and, as you may already be aware, it looks like the next memory leak was introduced with this commit: c5994de.

I think it may be this line: https://github.com/facebook/jest/blob/94067089dd25c77463e24e981fe979fb6cc0b2fb/packages/jest-circus/src/legacy-code-todo-rewrite/jestAdapter.ts#L28

Calling global.gc() and then logging process.memoryUsage().heapUsed immediately before and after the runtime.requireInternalModule(FRAMEWORK_INITIALIZER) invocation shows that memory usage increases by almost exactly the amount it grows per test file, and then stays roughly constant until the jestAdapter function returns.

I apologize for the crudeness, but here is a simplified example of the logging pattern that I was using followed by some sample outputs:

// jestAdapter.ts

const FRAMEWORK_INITIALIZER = require.resolve('./jestAdapterInit');

const jestAdapter = async (
  // ...
): Promise<TestResult> => {
  // global.gc() requires running node with --expose-gc
  global.gc();
  console.log(`1. START ${filename}: ${process.memoryUsage().heapUsed / 1024 / 1024} MB`);

  const {
    initialize,
    runAndTransformResultsToJestFormat,
  } = runtime.requireInternalModule(FRAMEWORK_INITIALIZER);

  global.gc();
  console.log(`2. AFTER INVOCATION ${filename}: ${process.memoryUsage().heapUsed / 1024 / 1024} MB`);

  // ...

  global.gc();
  console.log(`3. END ${filename}: ${process.memoryUsage().heapUsed / 1024 / 1024} MB`);

  return _addSnapshotData(results, snapshotState);
};

Output:

1. START index-44.jest.js: 220.00968170166016 MB
2. AFTER INVOCATION index-44.jest.js: 222.1803207397461 MB
3. END index-44.jest.js: 222.3431625366211 MB
PASS  __sandbox__/__tests__/index-44.jest.js (222 MB heap size)

1. START index-28.jest.js: 222.53646850585938 MB
2. AFTER INVOCATION index-28.jest.js: 224.7444076538086 MB
3. END index-28.jest.js: 224.8832015991211 MB
PASS  __sandbox__/__tests__/index-28.jest.js (224 MB heap size)

@SimenB
Member

SimenB commented Aug 13, 2019

That's really interesting, thanks! It seems like jest-circus itself keeps some state around which is not cleaned up, since the jump happens after the require. Is it because we mess with globals? Would you be able to inspect what's added to the heap by the require call?
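
One way to answer that question (an editorial sketch, not something from this thread): write V8 heap snapshots on either side of the require call and diff them in Chrome DevTools. The file paths below are arbitrary.

// Sketch: capture heap snapshots around the require call and diff them.
// v8.writeHeapSnapshot() needs Node >= 11.13; global.gc() needs --expose-gc.
const v8 = require('v8');

global.gc();
v8.writeHeapSnapshot('/tmp/before-require.heapsnapshot');

const {initialize, runAndTransformResultsToJestFormat} =
  runtime.requireInternalModule(FRAMEWORK_INITIALIZER);

global.gc();
v8.writeHeapSnapshot('/tmp/after-require.heapsnapshot');
// Load both files in Chrome DevTools (Memory tab) and use the "Comparison"
// view to see which objects the require call added and still retains.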

@bcmarinacci
Author

I just started getting into the runtime.requireInternalModule() code as the sun was coming up, so I decided to stop there. 😅 I have some rather time-sensitive work to complete at the moment, so I might not be able to spend more time on this for the next few days, but I will post any new findings in this thread. 🙂

@scotthovestadt
Contributor

scotthovestadt commented Aug 14, 2019

@bcmarinacci Amazing findings! I also tried attacking error stacks and the sourcemap install but (also) wasn't able to see this resolve.

I have an idea what might be causing this with the framework install. I wonder if the runner state, which is placed on a global, needs to be cleaned up... It really wouldn't surprise me.
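
A rough sketch of the kind of teardown being floated here, assuming the circus run state hangs off the global object under some key; STATE_KEY below is a stand-in for however jest-circus actually keys its state, not code from the repo:

// Hypothetical teardown sketch - STATE_KEY is a placeholder, not the real identifier.
const STATE_KEY = Symbol.for('JEST_STATE_SYMBOL'); // assumed key, for illustration only

function resetCircusState(globalObject) {
  // Dropping the reference lets the per-file describe/test tree (and anything
  // it closes over, such as the test environment) become garbage collectable.
  delete globalObject[STATE_KEY];
}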

I'll spend some time tomorrow on this. Thank you!

@GuskiS

GuskiS commented Mar 4, 2020

Is there any progress on this one?

@SimenB
Member

SimenB commented Apr 26, 2020

I looked into this a bit now, as the leak is really starting to become an issue. Unfortunately, I think it has always leaked - the reproduction from the OP doesn't actually test circus on the non-leaking v22: the JEST_CIRCUS env variable didn't exist at the time, so to run circus you have to set testRunner in the config.

{
  "jest": {
    "testRunner": "jest-circus/build/legacy_code_todo_rewrite/jest_adapter.js"
  }
}

That's not actually enough on its own, since that file does exports.default =, which that version of Jest didn't support. So you need something like this:

// circus-runner.js

module.exports = require('jest-circus/build/legacy_code_todo_rewrite/jest_adapter.js').default;

And doing that shows that it leaks in 22.4 as well.

[screenshot: heap-size output showing the leak on 22.4]

@MarcoScabbiolo
Contributor

For each test that has completed, its Runtime and Environment are still allocated in memory.

@SimenB
Member

SimenB commented Apr 27, 2020

I've tried deleting a whole bunch of assignments (including removing the whole of jest-snapshot), which got the leak on master from ~4 MB to ~3 MB per test, so I don't think that actually made a difference. We could try to clean up the environment or runtime themselves, but seeing as we use both of those as-is in Jasmine, I don't think that's the issue - rather how jest-circus uses them? EDIT: I forgot to remove adding the snapshot matchers; doing so reduced the leaked memory per test to 1-2 MB. I still don't think that's the issue, though.

I also tried to reset the runner state as mentioned in #8816 (comment), without much luck at getting it to make a difference.

@MarcoScabbiolo
Contributor

MarcoScabbiolo commented Apr 27, 2020

The only way I found to stop the leak was to return a fake result at the top of jestAdapter https://github.com/facebook/jest/blob/faa52673f4d02ee1d273103ada9382a0805b54a4/packages/jest-circus/src/legacy-code-todo-rewrite/jestAdapter.ts#L25, before any other statement. Every other stripping of code either made the tests fail because of missing stuff in the Runtime, or didn't stop the leak at all.

It's really hard to recognize which assignment of which object is being leaked, because almost everything is being leaked and inter-related. My best guess is that an (indirect) reference to the Runtime is held by the previous run, either in its result (which would be very rare), or in a global, or in an event handler.
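
One way to test that guess (an editorial sketch, assuming a Node version with FinalizationRegistry support, i.e. >= 14.6, plus --expose-gc): register each finished Runtime with a FinalizationRegistry and see whether it is ever reclaimed.

// Sketch: check whether the Runtime of a finished test file ever gets collected.
const registry = new FinalizationRegistry(label => {
  console.log(`${label} was garbage collected`);
});

function trackRuntime(runtime, testPath) {
  // Registering does not keep the runtime alive; the callback only fires once
  // nothing else references it.
  registry.register(runtime, `Runtime for ${testPath}`);
}

// After each test file completes:
//   trackRuntime(runtime, testPath);
//   global.gc();
// If the log line never appears for earlier files, something is still holding
// an (indirect) reference to their Runtime.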

@SimenB
Member

SimenB commented Apr 27, 2020

That function should do essentially the same as this function in Jasmine, which doesn't leak:

https://github.com/facebook/jest/blob/faa52673f4d02ee1d273103ada9382a0805b54a4/packages/jest-jasmine2/src/index.ts#L25-L193

I was also unable to track down why circus causes some objects to be retained while the Jasmine runner lets them be GCed.

(Some memory increase per test is expected of course as we store test results as they complete.)

@SimenB
Member

SimenB commented Apr 30, 2020

The reproduction only reproduced the leak when running in band, which has now been fixed. I landed a few other leak fixes in 25.5.1, so hopefully things are better now. There might be other leaks, but new issues for those would be good.

I'm guessing runs with workers were fine since results are serialized from the workers to the parent process.

@github-actions

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 11, 2021