
Child process unbuffered output is lost #8009

Closed
bguezzie opened this issue Jul 28, 2014 · 2 comments

Comments

@bguezzie

Similar to issue #6595

When using unbuffered output on a child process (specifically "python -u my_child_script.py") and listening for 'data' events, my (parent) Node process receives 'data' events for a portion of the data on the child's stdout but drops the last several writes from the child process. We control the child, and we even flush stdout after each write just to make sure. For clarity: I attach a 'data' event listener immediately after the process is spawned, and I never remove a listener from the process.

Specifically, our child process outputs YAML documents over a period of time (status updates, essentially). When the process exits, not all of the YAML has been received by my 'data' listener prior to the 'close' event, even though I have confirmed that the data has all been written and flushed.

No data exists in child.stdout for reading (this is after the child's 'close' event, and is expected behavior per the API), so it appears that the data is permanently missing.
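Here is a minimal sketch of the setup described above, assuming the hypothetical script name my_child_script.py from the report and that the parent only needs to accumulate stdout until 'close':

```js
// Sketch only: my_child_script.py is the hypothetical script from the report.
const { spawn } = require('child_process');

const child = spawn('python', ['-u', 'my_child_script.py']);

let output = '';

// Attach the 'data' listener immediately after spawning; never remove it.
child.stdout.on('data', (chunk) => {
  output += chunk.toString();
});

child.on('close', (code) => {
  // 'close' fires after the child's stdio streams have ended, so all stdout
  // 'data' events should have been delivered by this point.
  console.log(`child exited with code ${code}, received ${output.length} bytes`);
});
```

Because 'close' is emitted only after the child's stdio streams have ended, any stdout 'data' events should have fired before the 'close' handler runs.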

@indutny
Member

indutny commented Jul 29, 2014

Could you please provide a test case for this problem?

@bguezzie
Author

bguezzie commented Aug 6, 2014

We actually resolved this in our code. False alarm, sorry.

@bguezzie bguezzie closed this as completed Aug 6, 2014