
Latest changes break logging in AWS lambda functions #61

Open
stuckj opened this issue Apr 12, 2020 · 1 comment

Comments

stuckj commented Apr 12, 2020

The latest changes break logging in AWS Lambda functions. In a Lambda function, all async processing must finish before the handler returns, otherwise that work effectively never happens (the process gets frozen). You'll lose log calls if you can't flush the stream before exiting. There's a lot of detail on this in this Winston issue: winstonjs/winston#1250

Previously, we could just call end and wait for the 'finish' event to flush the stream. See here for an example (from that issue): winstonjs/winston#1250 (comment). This is what winston specifically recommends for flushing the stream: https://github.com/winstonjs/winston#awaiting-logs-to-be-written-in-winston.

This no longer works with the most recent changes. Specifically, moving the call to callback() into the callback from node-loggly-bulk breaks this (7a6d534#diff-6aa8c2a6db3f74b4c5896963f56925feR138). I've included details below from a bunch of debugging of what fails, if you're interested.

Moving the callback does make sense to me; it seems like you should wait to call it until node-loggly-bulk has actually finished sending the logs. But there does seem to be a back pressure issue. I'm not sure whether it lies here or in winston, though, so I'm starting here.

BUG DETAILS:
There appears to be a back pressure issue. If you queue up a lot of logs in a short period of time, you'll trigger it. In my case, we write ~40 lines of logging in about 20ms and then call logger.end() to flush the stream since the Lambda function is ready to exit. The number of log events appears to exceed the high water mark for the stream, so not all of the log messages fit in the stream's buffer (I'm not sure where the remaining logs are buffered when there's back pressure; my guess is either here or in winston). The buffer gets flushed and then end is processed, but the remaining log events get pushed into the stream after end. This causes an exception to be thrown (write after end), which is caught in node-loggly-bulk here: https://github.com/loggly/node-loggly-bulk/blob/master/lib/loggly/client.js#L202.

The exception was actually thrown FROM the call to callback above that line, on line 196. The stream was flushing the buffer that had received more events, which is what triggers the write after end. The callback is then called again in the catch block, and that causes another exception (the one that ultimately kills execution), because the callback inside the stream is cleared after the error state. So you get a very unclear "cb is not a function" error from deep inside the streams code. The actual stack trace looks like this:

/app/node_modules/readable-stream/lib/_stream_writable.js:447
    cb(er);
    ^

TypeError: cb is not a function
    at onwriteError (/app/node_modules/readable-stream/lib/_stream_writable.js:447:5)
    at onwrite (/app/node_modules/readable-stream/lib/_stream_writable.js:470:11)
    at WritableState.onwrite (/app/node_modules/readable-stream/lib/_stream_writable.js:180:5)
    at logged (/app/node_modules/winston-loggly-bulk/lib/winston-loggly.js:148:17)
    at /app/node_modules/node-loggly-bulk/lib/loggly/client.js:204:9
    at Request._callback (/app/node_modules/node-loggly-bulk/lib/loggly/common.js:209:15)
    at Request.self.callback (/app/node_modules/request/request.js:185:22)
    at Request.emit (events.js:198:13)
    at Request.<anonymous> (/app/node_modules/request/request.js:1154:10)
    at Request.emit (events.js:198:13)

We can mask the error by adding an error handler on the stream (i.e., logger.on('error', ...)). That prevents the error from killing the function's result, but we still lose a bunch of logging, which stinks.


terrisgit commented May 2, 2020

I left a comment at winstonjs/winston#1250
