Working with stdout and stdin of a child process in Node.js

[2018-05-05] dev, javascript, nodejs, async

In this blog post, we run shell commands as child processes in Node.js. We then use async language features to read the stdouts of those processes and write to their stdins.

Running commands in child processes  

Let’s start with running a shell command in a child process:

const {onExit} = require('@rauschma/stringio');
const {spawn} = require('child_process');

async function main() {
  const filePath = process.argv[2];
  console.log('INPUT: '+filePath);

  const childProcess = spawn('cat', [filePath],
    {stdio: [process.stdin, process.stdout, process.stderr]}); // (A)

  await onExit(childProcess); // (B)

  console.log('### DONE');
}
main();

Observations:

  • We are using spawn() because, later on, it lets us access stdin, stdout and stderr of the command while it is running.
  • In line A, we connect the stdin, stdout and stderr of the child process to those of the current process (a shorthand for this configuration is shown below).
  • In line B, we wait until the process has completely finished.
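
As an aside, Node.js also accepts the string 'inherit' for the stdio option, which forwards all three standard streams to and from the parent process. It is effectively the same as the configuration in line A:

// Equivalent to the stdio configuration in line A
const childProcess = spawn('cat', [filePath], {stdio: 'inherit'});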

Waiting for a child process to exit via a Promise  

Function onExit() comes from stringio and looks as follows (the code is TypeScript).

function onExit(childProcess: ChildProcess): Promise<void> {
  return new Promise((resolve, reject) => {
    childProcess.once('exit', (code: number, signal: string) => {
      if (code === 0) {
        resolve(undefined);
      } else {
        reject(new Error('Exit with error code: '+code));
      }
    });
    childProcess.once('error', (err: Error) => {
      reject(err);
    });
  });
}
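
Caveat: if the child process is terminated by a signal, code is null and the signal parameter contains the name of that signal. A more defensive variant (a sketch, not the actual stringio implementation) could report this case separately:

function onExit(childProcess: ChildProcess): Promise<void> {
  return new Promise((resolve, reject) => {
    childProcess.once('exit', (code: number|null, signal: string|null) => {
      if (code === 0) {
        resolve(undefined);
      } else if (signal !== null) {
        // code is null if the child was killed by a signal
        reject(new Error('Child process was killed with signal: '+signal));
      } else {
        reject(new Error('Exit with error code: '+code));
      }
    });
    childProcess.once('error', (err: Error) => {
      reject(err);
    });
  });
}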

Promisified writing to a child process  

The following code uses @rauschma/stringio to asynchronously write to the stdin of a child process running a shell command:

const {streamWrite, streamEnd, onExit} = require('@rauschma/stringio');
const {spawn} = require('child_process');

async function main() {
  const sink = spawn('cat', [],
    {stdio: ['pipe', process.stdout, process.stderr]}); // (A)

  writeToWritable(sink.stdin); // (B)
  await onExit(sink);

  console.log('### DONE');
}
main();

async function writeToWritable(writable) {
  await streamWrite(writable, 'First line\n');
  await streamWrite(writable, 'Second line\n');
  await streamEnd(writable);
}

We spawn a separate child process, called sink, for the shell command. writeToWritable() writes to sink.stdin. It does so asynchronously and pauses via await, to avoid buffering too much data.

Observations:

  • In line A, we tell spawn() to let us access stdin via sink.stdin ('pipe'). stdout and stderr are forwarded to process.stdout and process.stderr, as previously.
  • We don’t await in line B for the writing to finish; instead, we await until the child process sink is done. A variant that also surfaces write errors is sketched below.
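
If writing fails, the rejection of the Promise returned by writeToWritable() in line B goes unnoticed. One way of surfacing such errors (a sketch, not part of the original code) is to await both Promises together:

async function main() {
  const sink = spawn('cat', [],
    {stdio: ['pipe', process.stdout, process.stderr]});

  // If either Promise rejects, this await throws,
  // so write errors are not silently ignored
  await Promise.all([
    writeToWritable(sink.stdin),
    onExit(sink),
  ]);

  console.log('### DONE');
}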

Read on for an explanation of how streamWrite() works.

Promisified writing to streams  

Writing to Node.js streams usually involves callbacks (see docs). It can be promisified as follows.

function streamWrite(
  stream: Writable,
  chunk: string|Buffer|Uint8Array,
  encoding='utf8'): Promise<void> {
  return new Promise((resolve, reject) => {
    // Register a listener so that stream errors reject the Promise
    const errListener = (err: Error) => {
      stream.removeListener('error', errListener);
      reject(err);
    };
    stream.addListener('error', errListener);
    const callback = () => {
      // The write succeeded: unregister the error listener,
      // so that it does not affect later writes to the same stream
      stream.removeListener('error', errListener);
      resolve(undefined);
    };
    stream.write(chunk, encoding, callback);
  });
}

streamEnd() works similarly.
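
For completeness, here is a sketch of what streamEnd() might look like (based on the callback that writable.end() accepts; the actual stringio implementation may differ):

function streamEnd(stream: Writable): Promise<void> {
  return new Promise((resolve, reject) => {
    const errListener = (err: Error) => {
      stream.removeListener('error', errListener);
      reject(err);
    };
    stream.addListener('error', errListener);
    stream.end(() => {
      // Invoked once the stream is finished
      stream.removeListener('error', errListener);
      resolve(undefined);
    });
  });
}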

Reading from a child process  

The following code uses asynchronous iteration (line C) to read content from the stdout of a child process:

const {chunksToLinesAsync, chomp} = require('@rauschma/stringio');
const {spawn} = require('child_process');

async function main() {
  const filePath = process.argv[2];
  console.log('INPUT: '+filePath);

  const source = spawn('cat', [filePath],
    {stdio: ['ignore', 'pipe', process.stderr]}); // (A)

  await echoReadable(source.stdout); // (B)

  console.log('### DONE');
}
main();

async function echoReadable(readable) {
  for await (const line of chunksToLinesAsync(readable)) { // (C)
    console.log('LINE: '+chomp(line));
  }
}

Observations:

  • Line A: we ignore stdin, access stdout via a stream ('pipe') and forward stderr to process.stderr.
  • Line B: We await until echoReadable() is completely done. Without this await, DONE would be printed before the first line of source.stdout.
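
chunksToLinesAsync() consumes an AsyncIterable over chunks and returns an AsyncIterable over lines; chomp() removes a trailing line terminator. The following simplified sketches convey the idea (the actual stringio implementations may differ, e.g. in how they handle Buffers):

async function* chunksToLinesAsync(chunks) {
  let previous = '';
  for await (const chunk of chunks) {
    previous += chunk;
    let eolIndex;
    while ((eolIndex = previous.indexOf('\n')) >= 0) {
      // Yield each line including its trailing newline
      yield previous.slice(0, eolIndex+1);
      previous = previous.slice(eolIndex+1);
    }
  }
  if (previous.length > 0) {
    yield previous; // Last line, without a newline
  }
}

function chomp(line) {
  const match = /\r?\n$/.exec(line);
  return match ? line.slice(0, match.index) : line;
}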

Piping between child processes  

In the following example, the function transform():

  • Reads content from the stdout of a source child process.
  • Writes content to the stdin of a sink child process.

In other words, we are implementing something similar to Unix piping:

cat someFile.txt | transform() | cat

This is the code:

const {chunksToLinesAsync, streamWrite, streamEnd, onExit}
  = require('@rauschma/stringio');
const {spawn} = require('child_process');

async function main() {
  const filePath = process.argv[2];
  console.log('INPUT: '+filePath);

  const source = spawn('cat', [filePath],
    {stdio: ['ignore', 'pipe', process.stderr]});
  const sink = spawn('cat', [],
    {stdio: ['pipe', process.stdout, process.stderr]});

  transform(source.stdout, sink.stdin);
  await onExit(sink);

  console.log('### DONE');
}
main();

async function transform(readable, writable) {
  for await (const line of chunksToLinesAsync(readable)) {
    await streamWrite(writable, '@ '+line);
  }
  await streamEnd(writable);
}
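
Observations:

  • As in line B of the previous example, transform() is deliberately not awaited; we await the exit of sink instead.
  • streamWrite() does not append a newline, because each line delivered by chunksToLinesAsync() still contains its own.

Assuming someFile.txt contains the two lines 'First line' and 'Second line', the output is:

INPUT: someFile.txt
@ First line
@ Second line
### DONE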

Further reading