Consider implementing a duplex mode #143
Comments
I like it. I could see myself using this in some places. I would name it
@IssueHunt has funded $70.00 to this issue.
I was looking into working on this, but I ran into a small issue. If this syntax were to work, we'd have to return a duplex stream:

```js
new stream.Duplex({
	write(...writeArgs) {
		return spawned.stdin.write(...writeArgs);
	},
	read(...readArgs) {
		return spawned.stdout.read(...readArgs);
	}
});
```

but we can't close this stream once we're done writing the stdin data. How do we work around that?
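One way to close the stream once writing is done (a sketch using only Node built-ins; the `duplexFromChild` helper and the use of the `final` hook are my own suggestion, not code from this thread) is to end the child's `stdin` from the duplex's `final` callback, which runs when the writable side is ended:

```javascript
import {spawn} from 'node:child_process';
import {Duplex} from 'node:stream';

// Wrap a child process as a Duplex: writes go to stdin, reads come from stdout.
function duplexFromChild(command, args = []) {
  const spawned = spawn(command, args);
  const stream = new Duplex({
    write(chunk, encoding, callback) {
      spawned.stdin.write(chunk, encoding, callback);
    },
    final(callback) {
      // Runs when the writable side is ended: close the child's stdin
      // so the child sees EOF, flushes its output, and exits.
      spawned.stdin.end(callback);
    },
    read() {
      // Resume the child's stdout; the 'data' handler below pushes chunks.
      spawned.stdout.resume();
    },
  });
  spawned.stdout.on('data', chunk => {
    // Respect backpressure: pause the child's stdout when the buffer is full.
    if (!stream.push(chunk)) {
      spawned.stdout.pause();
    }
  });
  spawned.stdout.on('end', () => stream.push(null));
  spawned.on('error', error => stream.destroy(error));
  return stream;
}
```

With this, `duplex.end(data)` writes the last chunk, `final` closes the child's `stdin`, and the duplex's readable side ends once the child's `stdout` ends.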
Maybe use a
Trying to work on it too. Some questions:

Rough, mostly untested first try:
I am wondering whether the new pipe methods might already be fixing the underlying problem of this issue?

```js
// stdout -> stdin
await execa(...).pipeStdout(execa(...))

// stdout -> stream
await execa(...).pipeStdout(otherStream)

// stdout -> file
await execa(...).pipeStdout('path');

// stderr -> stdin/stream/file
await execa(...).pipeStderr(...);

// stdout + stderr -> stdin/stream/file
await execa(...).pipeAll(...);

// stream -> stdin
await execa(..., {stdin: stream})

// file -> stdin
await execa(..., {inputFile: 'path'})

// string -> stdin
await execa(..., {input: 'value'})
```

So it is actually quite easy now to pipe multiple processes:

```js
const {stdout: topTenFilenames} = await execa('ls', ['exampleDir'])
	.pipeStdout(execa('sort'))
	.pipeStdout(execa('head', ['-n', '10']))
```
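For comparison, here is roughly what those pipe methods abstract away, sketched with only Node built-ins (the `runPipeline` helper and the example commands are mine, not execa's API): each child's `stdout` is piped into the next child's `stdin`, and the last child's output is collected.

```javascript
import {spawn} from 'node:child_process';

// Run `commands` as a shell-style pipeline and resolve with the last
// command's stdout.
function runPipeline(commands) {
  const children = commands.map(([command, args = []]) => spawn(command, args));
  for (let i = 0; i < children.length - 1; i++) {
    // Ignore EPIPE in case a downstream command (like `head`) exits early.
    children[i + 1].stdin.on('error', () => {});
    children[i].stdout.pipe(children[i + 1].stdin);
  }
  const last = children[children.length - 1];
  return new Promise((resolve, reject) => {
    let stdout = '';
    last.stdout.on('data', chunk => { stdout += chunk; });
    last.on('close', () => resolve(stdout));
    for (const child of children) {
      child.on('error', reject);
    }
  });
}

// Equivalent of `printf 'c\na\nb\n' | sort | head -n 2`:
const sorted = await runPipeline([
  ['printf', ['c\na\nb\n']],
  ['sort'],
  ['head', ['-n', '2']],
]);
```

The chained `pipeStdout()` calls above do this wiring (plus error handling and cleanup) for you.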
If anyone wants to work on this, see the feedback in: #424
It's taken me quite some time, but I finally got this working. I have a PR at #912 which implements this.

### Motivation

The purpose is to use a subprocess as a stream, to pass it to APIs which accept streams. This contrasts with:

### Readable vs Writable vs Duplex

Users might want to use only the readable/writable side of the subprocess, but still convert it to a stream. If a user wants a read-only stream, it is improper to return a `Duplex`. I actually started implementing this as a single method, but ended up exposing three:

```js
const streamOne = subprocess.readable()
const streamTwo = subprocess.writable()
const streamThree = subprocess.duplex()
```

### File descriptors

Users should be able to choose the file descriptor. My PR does it by re-using a `from` option:

```js
const stream = subprocess.readable({from: 'stderr'})
```

### Method vs property

Exposing this as a property (as I previously suggested in #591) does not work because:

So it needs to be called on-demand via a method.

### Error propagation

The stream awaits the subprocess completion. If the subprocess fails, an error is emitted on the stream.

In the other direction, the subprocess does not wait for the stream: there is no need to. However, if the subprocess takes some input, its `stdin` must be ended for it to complete.

If the stream errors and/or is destroyed, it destroys the subprocess.

For duplexes, we purposely do not propagate errors between the readable and the writable side. There is no reason to, and it creates race conditions and subtle bugs.

### Multiple consumers

The PR also allows calling these methods multiple times on the same subprocess.

### Hard parts

Solving this issue was very difficult. I tried many approaches and was close to giving up a few times, especially with tests that failed in CI but not locally due to hard-to-debug race conditions. I am pretty confident in the current PR though, as it has lots of automated tests covering edge cases. Some of the things that made this hard to implement:

### Web streams

My PR implements this with Node.js streams. All of this work should be done for web streams too; I have opened #913 to track this. I believe we should let users choose whether they want a Node.js stream or a web stream. While Node.js streams are legacy, most Node modules still only accept them. Also, from having worked with both a lot, I expect the implementation with web streams to be more complicated, slower and less stable. Finally, there are a few things that only Node.js streams can do, such as having multiple readers at once, which might be the only option in specific cases. That being said, we definitely should implement this for web streams too.

### Method names

Because of the above, my PR uses the following names, which are the same names as the stream class itself and the TypeScript type:
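A minimal sketch of what a `readable()` method with a `from` option and subprocess-to-stream error propagation could look like, using only Node built-ins (this is my own illustration under stated assumptions, not the PR's actual implementation):

```javascript
import {spawn} from 'node:child_process';
import {PassThrough} from 'node:stream';

// Expose one of the child's output file descriptors as a readable stream
// that also reflects the subprocess's success or failure.
function readableFromChild(command, args = [], {from = 'stdout'} = {}) {
  const spawned = spawn(command, args);
  const source = from === 'stderr' ? spawned.stderr : spawned.stdout;
  const stream = new PassThrough();
  // Forward the data, but keep the stream open until the process exits.
  source.pipe(stream, {end: false});
  spawned.on('close', exitCode => {
    if (exitCode === 0) {
      stream.end();
    } else {
      // Propagate subprocess failure as a stream error.
      stream.destroy(new Error(`Command failed with exit code ${exitCode}`));
    }
  });
  spawned.on('error', error => stream.destroy(error));
  return stream;
}
```

For example, `readableFromChild('ls', [], {from: 'stderr'})` exposes only the stderr side, and consumers see an `'error'` event when the command fails.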
This is impressive research and work. I'm pretty sure I would have given up long before you. 👏
I wonder if there is anything Node.js could improve that would have made this work easier. If so, definitely open some Node.js issues. For example, a version of
One of the issues is that it propagates state changes between the write and the read side. In most cases this is good, but it was a real problem for this specific issue.

The problem with streams is that they combine (by their own nature) multiple things that lead to complexity: they are performance-sensitive, memory-sensitive, stateful, async, and extendable by users. Then this issue combines them with OS processes, which are complex too. :)
@sindresorhus has rewarded $63.00 to @ehmicky. See it on IssueHunt
This feature has just been released in Execa 9.0.0. Please see (and share!) the release post and the changelog.
Basically, a way to make a predictable duplex stream from `stdin` and `stdout` (or `stderr`), so you can use a child process as a transform stream. It'd emit an error if the process errors, and it'd kill the process if the stream is destroyed.

It's not terribly difficult to do something like this as `execa` is now, but this can behave in unexpected ways.
IssueHunt Summary
Backers (Total: $70.00)
Submitted pull requests

- `.readable()`, `.writable()` and `.duplex()` methods
methodsTips
The text was updated successfully, but these errors were encountered: