
Dev #30

Open · wants to merge 8 commits into master
Conversation

@paulpflug (Collaborator)

New version with all problems resolved 😄

andrenarchy and others added 6 commits July 23, 2015 23:20
With --first, all processes are terminated after the first process
terminates (regardless of its exit code). The overall exit code is the
exit code of the process that terminated first.
@keithamus (Collaborator)

Cool. I guess we tried exec and that didn't work so well, so let's do detached and see how that goes.

@paulpflug (Collaborator, Author)

I did thorough testing on Linux and Windows; it took quite a bit of tinkering 😄

But before releasing I would like to wait for some positive feedback. What do you think?

@keithamus (Collaborator)

Sounds like a good idea. I'll try it out on a couple of projects and let you know how it goes.

This was referenced Sep 21, 2015
@keithamus (Collaborator)

FYI for everyone coming here: to install this development version, run:

npm install git+ssh://github.com/keithamus/parallelshell.git#dev

If people report this as working we can merge it and cut a new version 😄

console.log('-w, --wait will not close sibling processes on error')
console.log('-v, --verbose verbose logging');
console.log('-w, --wait will not close sibling processes on error');
console.log('-f, --first close all sibling processes after first exits (succes/error)');

succes is spelled success.

@keithamus (Collaborator)

@paulpflug I've just had some time to try this out on a project, and noticed I'm getting errors due to the stdio option. It's quite odd, but I'll show you my findings:

with stdio: ['pipe', process.stdout, process.stderr]

$ parallelshell 'echo hello' -v
parallelshell: `echo hello` failed with exit code 1
parallelshell: exit code: 1

with stdio: ['pipe', null, null] (or 'pipe' for each one)

$ parallelshell 'echo hello' -v
parallelshell: `echo hello` ended successfully
parallelshell: exit code: 0

Finally, omitting stdio and instead adding child.stdout.pipe(process.stdout); and child.stderr.pipe(process.stderr);:

$ parallelshell 'echo hello' -v
hello
parallelshell: `echo hello` ended successfully
parallelshell: exit code: 0

Obviously the last one is the most desirable, but I don't know why spawn is erroring when I just hand off the streams to the stdio option, as it's a valid option...

stdio: ['pipe', process.stdout, process.stderr]
stdio: ['pipe', process.stdout, process.stderr],
windowsVerbatimArguments: process.platform === 'win32',
detached: process.platform != 'win32'

I don't see any output when I run commands now. Seems this line is the culprit. Why are we detaching the process?

Collaborator

So they can be killed properly and we don't end up with zombie processes. I actually think L122 is the culprit more than L124.

@paulpflug (Collaborator, Author)

If I remember correctly, piping will not preserve color and formatting.
The error does look strange, but currently I have no time to
investigate; maybe in 2 weeks or so.


@estk commented Sep 21, 2015

@keithamus In the child_process manual it says:

Stream object - Share a readable or writable stream that refers to a tty, file, socket, or a pipe with the child process. The stream's underlying file descriptor is duplicated in the child process to the fd that corresponds to the index in the stdio array. Note that the stream must have an underlying descriptor (file streams do not until the 'open' event has occurred).

Maybe the stream has not 'open'ed yet? Anyway, I think we just want to use integers, like so:

...
stdio: [ 'pipe', 1, 2 ],
...

Also, why are we opening a pipe?

@estk commented Sep 21, 2015

Just opened #35. This is working for me.

sublimino and others added 2 commits October 14, 2015 16:12
Return exit code 0 on delayed child process error