
GRPC-JS: Possible EventEmitter memory leak detected #661

Closed
schmidt-sebastian opened this issue May 28, 2019 · 33 comments · Fixed by #676 or #700
Labels
api: firestore · Issues related to the googleapis/nodejs-firestore API.
priority: p1 · Important issue which blocks shipping the next release. Will be fixed prior to next release.
🚨 · This issue needs some love.
triage me · I really want to be triaged.
type: bug · Error or flaw in code with unintended results or allowing sub-optimal usage patterns.

Comments

@schmidt-sebastian
Contributor

schmidt-sebastian commented May 28, 2019

See #623 (comment)

I just upgraded to @google-cloud/[email protected] and since upgrading, I've been getting all of these warnings that say:
(node:2870) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 connectivityStateChanged listeners added. Use emitter.setMaxListeners() to increase limit
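
For context, Node prints this warning when more than 10 listeners for the same event are attached to a single EventEmitter. A minimal standalone sketch of the mechanism (the event name simply mirrors the warning; this is not grpc-js internals):

const { EventEmitter } = require('events');

const channel = new EventEmitter(); // stands in for the shared channel object

// Each pending RPC registers its own listener on the shared emitter; the
// 11th one trips Node's default limit of 10 and prints the warning above.
for (let i = 0; i < 11; i++) {
  channel.on('connectivityStateChanged', () => {});
}

// Raising the limit silences the warning but does not fix an actual leak:
channel.setMaxListeners(20);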

@schmidt-sebastian
Contributor Author

I believe that this is an issue in grpc-js. See: grpc/grpc-node#694 (comment)

The user report indicates that this happens when a client repeatedly cannot connect to the backend.

@jakeleventhal

jakeleventhal commented May 28, 2019

@schmidt-sebastian this happens when I run a lot of operations at once. For example, I have a function to wipe my development database that looks something like this:

['Collection1', 'Collection2', 'Collection3'].map(async (collection) => {
    const data = await firestore.collection(collection).select().get();
    return data.docs.map((doc) => firestore.collection(collection).doc(doc.id).delete());
});

When my database is rather full, this error pops up hundreds of times.

This also happens when I onboard large users to my application: a lot of documents are created/updated (different from contention on a single document), often in parallel, as in the function above.
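
For comparison, a rough sketch of the same wipe using batched writes so far fewer delete RPCs are in flight at once (chunking at 500, Firestore's per-batch write limit; wipeCollections is just an illustrative name):

async function wipeCollections(firestore, collections) {
  for (const collection of collections) {
    const snapshot = await firestore.collection(collection).select().get();
    const docs = snapshot.docs;
    // Commit deletes in chunks of up to 500, the per-batch write limit.
    for (let i = 0; i < docs.length; i += 500) {
      const batch = firestore.batch();
      docs.slice(i, i + 500).forEach((doc) => batch.delete(doc.ref));
      await batch.commit();
    }
  }
}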

schmidt-sebastian changed the title from "Possible EventEmitter memory leak detected" to "GRPC-JS: Possible EventEmitter memory leak detected" on May 28, 2019
@schmidt-sebastian
Contributor Author

I believe this happens when there are more than 10 outstanding requests (with pending DeadlineFilters) on a single HTTP2 channel. For me, this does not reproduce over fast networks, but it reproduces reliably when I turn on packet loss (using Apple's Network Link Conditioner). A test that can be used to reproduce it is https://github.com/googleapis/nodejs-firestore/blob/master/dev/system-test/firestore.ts#L928
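
A minimal sketch of provoking that situation on purpose: fire well over 10 reads at once on one client over a slow or lossy network so their call streams overlap (the document paths here are made up):

const { Firestore } = require('@google-cloud/firestore');

async function reproduce() {
  const firestore = new Firestore();
  const reads = [];
  // 50 concurrent gets keep far more than 10 call streams pending on the
  // same underlying HTTP2 channel when the network is slow.
  for (let i = 0; i < 50; i++) {
    reads.push(firestore.doc(`repro/doc${i}`).get());
  }
  await Promise.all(reads);
}

reproduce().catch(console.error);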

I will ping the GRPC team directly to prioritize this issue.

@schmidt-sebastian
Contributor Author

grpc-js version 0.4.1 includes a fix for this issue. If you update your dependencies, you should receive this version (google-gax pulls in 0.4.x). You might have to update/delete your package lock files.

@schmidt-sebastian
Contributor Author

We had to revert grpc-js back to 0.4.0. I am re-opening this issue.

@jmacedoit

This has happened with my Firestore Node server setup, always on startup (where I begin listening to several collections), ever since I upgraded to "@google-cloud/firestore": "^2.1.0". The greater the number of listeners, the greater the number of messages. It looks like this message is output once for every 100 or so streams I'm listening to. My setup is exactly the same as described in #673.

yoshi-automation added the "triage me" and "🚨 This issue needs some love." labels on Jun 4, 2019
schmidt-sebastian added the "type: bug" label on Jun 4, 2019
@schmidt-sebastian
Contributor Author

@jmacedoit Sorry to hear about your troubles. We will get this fixed as soon as we sort out the remaining issues with @grpc/grpc-js v0.4.1.

@jakeleventhal

@schmidt-sebastian any update on this? I am having critical problems with Firestore related to this bug. Is there a fix in the meantime, or a version to revert to?

@schmidt-sebastian
Contributor Author

This has been fixed with a new release of @grpc/grpc-js, which any version of @google-cloud/firestore 2.x (with the exception of the bugfix release 2.1.1) should pull in automatically.

@guanzo

guanzo commented Jun 13, 2019

I'm still getting this error; here's the output of npm ls:

/file-processor # npm ls @grpc/grpc-js
file-processor@ /file-processor
+-- @google-cloud/[email protected]
| `-- [email protected]
|   `-- @grpc/[email protected] 
`-- @google-cloud/[email protected]
  `-- [email protected]
    `-- @grpc/[email protected]

Screenshot of error

Is the onSnapshot method able to listen forever? Or should I close the listener and resubscribe periodically?

@schmidt-sebastian
Contributor Author

@guanzo Are you using Datastore when you see these errors? The EventEmitter warning might come from the old version of grpc-js bundled with google-gax < 1.0.

I can certainly try to reproduce this issue again with the newest version of grpc-js, but for me the warnings went away as of 0.4.1.

You should be able to listen forever, and we reconnect for you. In general, we will only abort a listen stream if we fail to reconnect 10 times in a row.
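
For reference, a short sketch of a long-lived listener with an error callback attached, using the public onSnapshot(onNext, onError) signature (the collection name is made up):

const { Firestore } = require('@google-cloud/firestore');

const firestore = new Firestore();

const unsubscribe = firestore.collection('somecollection').onSnapshot(
  (snapshot) => {
    console.log(`received ${snapshot.size} documents`);
  },
  (err) => {
    // Only fires once the SDK has given up on reconnecting; you can
    // resubscribe here if the listener should outlive that.
    console.error('listen stream failed:', err);
  }
);

// Call unsubscribe() when updates are no longer needed.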

@jmacedoit

I'm seeing the same thing as @guanzo, and the only occurrences of @grpc/grpc-js in my dependency tree are of [email protected].

├─┬ @google-cloud/[email protected]
│ ├─┬ [email protected]
│ │ └─┬ [email protected]
│ │   ├── [email protected]
│ │   ├── [email protected] deduped
│ │   ├── [email protected] deduped
│ │   └── [email protected]
│ ├── [email protected]
│ ├── [email protected]
│ ├─┬ [email protected]
│ │ ├── @grpc/[email protected] deduped
...
├─┬ @grpc/[email protected]

Note that the warning messages are also slightly different. Now we get:

(node:48004) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit
(node:48004) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit
(node:48004) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit

as opposed to

(node:2870) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 connectivityStateChanged listeners added. Use emitter.setMaxListeners() to increase limit

@schmidt-sebastian
Contributor Author

I will work with the GRPC JS team to get this one fixed as well. Sorry for all the trouble with this new networking layer, but shipping a 50MB+ C-dependency (the old GRPC core) was never something we were proud of.

@jmacedoit

jmacedoit commented Jun 13, 2019

@schmidt-sebastian You people have been doing great. From what I've seen in the past few weeks, I believe that we will soon be able to enjoy an error- and warning-free, very robust SDK. Keep up the good work 🥇

@schmidt-sebastian
Contributor Author

@jmacedoit Are you able to help us debug over at grpc/grpc-node#916?

@schmidt-sebastian
Contributor Author

Re-opening this issue for additional visibility.

@jmacedoit

@schmidt-sebastian Yes I'm trying to reproduce it again.

@Tobb3

Tobb3 commented Jun 14, 2019

I've encountered this problem for a couple of weeks. My packages/dependencies are the same as @jmacedoit's.
To narrow it down, I created an application that subscribes to one single collection in Firestore, a collection that has no documents and to which no documents are ever added:
db.collection('somecollection').onSnapshot(...)

The subscription starts and everything looks fine, but after about 4 hours of idling I get this error:

MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit
    at _addListener (events.js:256:17)
    at DestroyableTransform.addListener (events.js:272:10)
    at DestroyableTransform.Readable.on (c:\repos\testworker\node_modules\through2\node_modules\readable-stream\lib\_stream_readable.js:786:33)
    at BunWrapper.Stream.pipe (internal/streams/legacy.js:28:8)
    at _firestore.readWriteStream.then.backendStream (c:\repos\testworker\node_modules\@google-cloud\firestore\build\src\watch.js:343:35)

That's all I have at the moment; I will try to find out more.

sduskis added the "priority: p1" label and removed the "🚨 This issue needs some love." and "triage me" labels on Jun 16, 2019
yoshi-automation added the "🚨 This issue needs some love." and "triage me" labels on Jun 16, 2019
@guanzo

guanzo commented Jun 27, 2019

Thanks for the fix 👍 , will upgrade now and report any errors.

@MarkChrisLevy

MarkChrisLevy commented Sep 15, 2019

@thebrianchen @schmidt-sebastian It is still an issue with the latest @google-cloud/firestore (2.3.0) and the latest @grpc/grpc-js (0.5.3). Stack trace:

node --trace-warnings dist/local-test.js
(node:15717) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit
    at _addListener (events.js:243:17)
    at TLSSocket.addListener (events.js:259:10)
    at TLSSocket.Readable.on (_stream_readable.js:799:35)
    at Http2CallStream.attachHttp2Stream (/cloud-functions/node_modules/@grpc/grpc-js/build/src/call-stream.js:275:35)
    at Http2SubChannel.startCallStream (/cloud-functions/node_modules/@grpc/grpc-js/build/src/subchannel.js:110:20)
    at finalMetadata.then.metadataValue (/cloud-functions/node_modules/@grpc/grpc-js/build/src/channel.js:205:28)

It happens not only in my local environment, but also in a production Firebase function.

@nip10

nip10 commented Sep 15, 2019

Also happening here, in a GCP Function with Firestore + KMS:

"@google-cloud/firestore": "2.3.0"
"@google-cloud/kms": "1.3.2"

MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit  
    at _addListener (events.js:280:19)  
    at TLSSocket.addListener (events.js:297:10)  
    at TLSSocket.Readable.on (_stream_readable.js:772:35)  
    at Http2CallStream.attachHttp2Stream (/srv/node_modules/@grpc/grpc-js/build/src/call-stream.js:275:35)  
    at Http2SubChannel.startCallStream (/srv/node_modules/@grpc/grpc-js/build/src/subchannel.js:110:20)  
    at finalMetadata.then.metadataValue (/srv/node_modules/@grpc/grpc-js/build/src/channel.js:205:28) 

@jakeleventhal

can confirm

@marcinolawski

It is still an issue:

@google-cloud/[email protected]
@grpc/[email protected]

MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit
    at _addListener (events.js:280:19)
    at TLSSocket.addListener (events.js:297:10)
    at TLSSocket.Readable.on (_stream_readable.js:772:35)
    at Http2CallStream.attachHttp2Stream (/srv/node_modules/@grpc/grpc-js/build/src/call-stream.js:275:35)
    at Http2SubChannel.startCallStream (/srv/node_modules/@grpc/grpc-js/build/src/subchannel.js:110:20)
    at finalMetadata.then.metadataValue (/srv/node_modules/@grpc/grpc-js/build/src/channel.js:205:28)
    at <anonymous>

@krufebird

krufebird commented Sep 16, 2019

Issue over here as well:

$ npm ls @grpc/grpc-js
└─┬ [email protected]
  └─┬ @google-cloud/[email protected]
    └─┬ [email protected]
      └── @grpc/[email protected]

(node:10569) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
at _addListener (events.js:261:17)
at TLSSocket.addListener (events.js:277:10)
at TLSSocket.Readable.on (_stream_readable.js:846:35)
at Http2CallStream.attachHttp2Stream (/Users/x/repos/x-app/server/node_modules/@grpc/grpc-js/build/src/call-stream.js:275:35)
at Http2SubChannel.startCallStream (/Users/x/repos/x-app/server/node_modules/@grpc/grpc-js/build/src/subchannel.js:110:20)
at /Users/x/repos/x-app/server/node_modules/@grpc/grpc-js/build/src/channel.js:205:28
at processTicksAndRejections (internal/process/task_queues.js:85:5)


@guanzo

guanzo commented Sep 16, 2019

FWIW, I'm still on @google-cloud/[email protected] and have no issues. Something must have regressed in later releases.

@schmidt-sebastian
Contributor Author

@MarkChrisLevy @krufebird @guanzo This is a regression in the latest release of @grpc/grpc-js. If you are blocked on this, you can manually pin to 0.5.2.

The corresponding issue is: grpc/grpc-node#1027

@lzl124631x

@schmidt-sebastian How do I pin the grpc version without adding it to package.json? Adding it to package-lock.json would be fine. I tried npm install @grpc/[email protected] --save-exact, but that saves grpc in package.json, which is unnecessary.

@MarkChrisLevy

@schmidt-sebastian Yep, [email protected] is ok. Thanks.

@schmidt-sebastian
Contributor Author

@lzl124631x If you have a package-lock.json, you can remove the dependency from package.json once it is in the lockfile.

@lzl124631x

@schmidt-sebastian ic. Thanks!

@glawson

glawson commented Sep 25, 2019

I found @grpc/[email protected] in firebase-admin:

functions@ 
├─┬ @google-cloud/[email protected]
│ ├── @grpc/[email protected]  deduped
│ └─┬ [email protected]
│   └── @grpc/[email protected]  deduped
├── @grpc/[email protected] 
└─┬ [email protected]
  └─┬ @google-cloud/[email protected]
    └─┬ [email protected]
      └── @grpc/[email protected] 

Upgrading to [email protected] fixed it:

functions
├─┬ @google-cloud/[email protected]
│ ├── @grpc/[email protected]  deduped
│ └─┬ [email protected]
│   └── @grpc/[email protected]  deduped
└── @grpc/[email protected] 

@jakeleventhal

jakeleventhal commented Sep 29, 2019

I'm still experiencing a memory leak potentially related to this. The following is from my yarn.lock:

"@grpc/grpc-js@^0.5.2":
  version "0.5.2"
  resolved "https://registry.yarnpkg.com/@grpc/grpc-js/-/grpc-js-0.5.2.tgz#8e39876033749e1611fe9fa181d9535077805abe"
  integrity sha512-NE1tP/1AF6BqhLdILElnF7aOBfoky+4ZOdZU/0NmKo2d+9F9QD8zGoElpBk/5BfyQZ3u1Zs+wFbDOFpVUzDx1w==
  dependencies:
    semver "^6.0.0"

"@grpc/proto-loader@^0.5.0", "@grpc/proto-loader@^0.5.1":
  version "0.5.1"
  resolved "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.5.1.tgz#48492b53cdda353110b51a4b02f465974729c76f"
  integrity sha512-3y0FhacYAwWvyXshH18eDkUI40wT/uGio7MAegzY8lO5+wVsc19+1A7T0pPptae4kl7bdITL+0cHpnAPmryBjQ==
  dependencies:
    lodash.camelcase "^4.3.0"
    protobufjs "^6.8.6"

I'm not actually seeing the error message described previously (MaxListenersExceededWarning); however, the memory usage of my application is slowly and steadily increasing. I've taken several heap snapshots from local tests over the course of several hours, and when comparing snapshots from a few hours apart, I notice that the top culprits for memory allocations are all related to grpc.

Here you can clearly see that the top 4 culprits for the memory delta between snapshots 12 and 4 account for vastly more than everything else.
Screen Shot 2019-09-28 at 7 08 24 PM

Here are the contents of each from largest to smallest:

  1. (string): This is by far the fastest-growing constructor. As you can see, there are 920k new allocations in the comparison between the two snapshots. Practically every single string here is "(function anonymous( ) { return function BeginTransactionRequest(p){ if(p)for(var ks=Object.keys(p),i=0;i<ks.length;++i)if(p[ks[i]]!=null) this[ks[i]]=p[ks[i]] } })", and it all comes from message.js:13, a file from the protobufjs package, which is a dependency of grpc.

Screen Shot 2019-09-28 at 7 47 17 PM

  2. (system): Essentially everything here seems to come from the same message.js:13.

Screen Shot 2019-09-28 at 7 29 51 PM

  3. (array): I can't pin down a consistent allocation here, but basically everything seems to be tied to grpc.

Screen Shot 2019-09-28 at 7 19 11 PM

  4. Array: There are 235k of these and they all look the same: an Array with an alloc size of 32 and some connection to grpc.

Screen Shot 2019-09-28 at 7 12 35 PM
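
For anyone trying to reproduce this kind of comparison, a rough sketch of writing periodic heap snapshots with Node's built-in v8 module (the interval and file names are arbitrary); the resulting files can be loaded and diffed in Chrome DevTools:

const v8 = require('v8');

let counter = 0;
// Write a snapshot every 30 minutes; compare early vs. late snapshots to see
// which constructors keep growing.
setInterval(() => {
  counter += 1;
  const file = v8.writeHeapSnapshot(`heap-${counter}.heapsnapshot`);
  console.log(`wrote ${file}`);
}, 30 * 60 * 1000);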

@jakeleventhal

I referenced this issue in a new issue: #768

It seems almost certainly related, so you might want to check it out.
