Introduction to streams

In Node.js, we rely on the core fs module to work with files. fs provides basic APIs for reading and writing files, and for large files, or when memory usage is hard to predict, we can also work with files through open, read, write, close and other lower-level methods. fs also provides readable streams and writable streams, which let us read and write files through streams conveniently.

Readable streams

1. createReadStream creates a readable stream

The createReadStream method takes two arguments: the first is the path of the file to read, and the second is an options object with eight properties:

  • flags: the file system flag, default r;
  • encoding: character encoding, default null;
  • fd: file descriptor, default null;
  • mode: permission bits, default 0o666;
  • autoClose: whether to close the file automatically, default true;
  • start: the position to start reading from;
  • end: the position to stop reading at (inclusive);
  • highWaterMark: the maximum number of bytes to read at a time, default 64 * 1024.

The return value of createReadStream is a fs.ReadStream object. If encoding is not specified, the data read from the file is a Buffer by default.
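
For example, here is a minimal sketch (assuming a 1.txt file like the one used in the examples below) showing how the encoding option changes the chunks delivered by the stream from Buffers to strings:

const fs = require("fs");

// With encoding set, each data chunk is a string instead of a Buffer
let rs = fs.createReadStream("1.txt", {
    encoding: "utf8",
    highWaterMark: 2
});

rs.on("data", chunk => {
    console.log(typeof chunk, chunk); // e.g. "string 01"
});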

Create a readable stream

const fs = require("fs"); // Create a readable stream and read the 1.txt filelet rs = fs.creatReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});Copy the code

By default, no content is read after a readable stream is created. While a file is being read, the readable stream has two states: paused and flowing.

Note: the readable streams in this article work in flowing mode, which covers the paused and flowing states; this is not paused mode, which is the other kind of readable stream, driven by the readable event.
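
For comparison, here is a minimal sketch of the other kind of readable stream mentioned above, paused mode driven by the readable event, where data is pulled manually with rs.read() (assuming the same 1.txt file):

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});

// In paused mode the readable event signals that data is available,
// and we pull it explicitly with rs.read()
rs.on("readable", () => {
    let chunk;
    while ((chunk = rs.read()) !== null) {
        console.log(chunk);
    }
});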

2. Flowing state

In the flowing state, once the file starts being read, it is read chunk by chunk, highWaterMark bytes at a time, until it has been read through, like an open faucet from which water keeps flowing until it runs dry. Flowing mode is triggered by listening for the data event.

Suppose 1.txt contains the digits 0 through 9. We will now create a readable stream and read it in the flowing state.

Flowing state

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});

// Read the file
rs.on("data", data => {
    console.log(data);
});

// Listen for the end of reading
rs.on("end", () => {
    console.log("Finished reading.");
});

// <Buffer 30 31>
// <Buffer 32 33>

In the code above, the returned rs object listens for two events:

  • data: emitted each time highWaterMark bytes are read, over and over until reading is complete; the callback parameter is the Buffer read each time;
  • end: emitted when reading is complete; its callback is then executed.

If we want the final result to be the complete content, we need to concatenate the chunks each time the data event fires. In the past we might have done it like this.

The wrong way to concatenate data

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});

let str = "";

rs.on("data", data => {
    str += data;
});

rs.on("end", () => { console.log(str); }); / / 0123Copy the code

In the code above, if the file contained Chinese text, a highWaterMark of two bytes could not hold a complete Chinese character (which takes three bytes in UTF-8). Each += operation implicitly calls toString on the Buffer, so the incomplete characters end up as garbled output.
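
Here is a small sketch of why that happens, splitting a UTF-8 string in the middle of a character (the three-byte encoding of common Chinese characters is a general UTF-8 fact, not something specific to this file):

// "你" is encoded as 3 bytes in UTF-8, so a 2-byte chunk cuts it in half
const buf = Buffer.from("你好");    // <Buffer e4 bd a0 e5 a5 bd>
const firstChunk = buf.slice(0, 2); // <Buffer e4 bd>, an incomplete character
console.log("" + firstChunk);       // implicit toString produces garbled output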

When streaming files we mostly work with Buffers, so the final result should be assembled the following way.

The correct way to concatenate data

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});

// Store the Buffer from each read
let bufArr = [];

rs.on("data", data => {
    bufArr.push(data);
});

rs.on("end", () => { console.log(Buffer.concat(bufArr).toString()); }); / / 0123Copy the code

3. Paused state

In the flowing state, once reading starts, the data event keeps firing until the file has been read through. In the paused state, each read pauses right afterwards and does not continue, that is, the data event stops firing, unless we actively resume reading. It is like turning the faucet on briefly and shutting it off right away, opening it again the next time we need water.

Pausing and resuming reading, much like turning the tap off and on, correspond to two methods on the rs object returned by the readable stream: pause and resume.

In the following example, we change end to 9 when creating the readable stream, read two bytes at a time, and pause for one second before resuming, until the digits 0 through 9 have all been read.

Pause and resume

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 9,
    highWaterMark: 2
});

let bufArr = [];

rs.on("data", data => { bufArr.push(data); rs.pause(); // Pause reading console.log("Pause", new Date());

    setTimeout(() => { rs.resume(); // resume reading}, 1000)}); rs.on("end", () => { console.log(Buffer.concat(bufArr).toString()); }); 2018-07-03 T23:52:54.436Z // Suspend 2018-07-03T23:52:53.439Z // Suspend 2018-07-03T23:52:54.440Z // Suspend 2018-07-03T23:52:54.440Z // 2018-07-03T23:52:55.442z // suspend 2018-07-03T23:52:56.443z // 0123456789Copy the code

4. Error listening

Reading a file through a readable stream is asynchronous, so errors are reported asynchronously as well. The rs object returned by the readable stream can listen for them through the error event, whose callback is triggered when reading the file fails.

Listening for errors

const fs = require("fs"); // Read a file that does not existlet rs = fs.createReadStream("xxx.js", {
    highWarterMark: 2
});

let bufArr = [];

rs.on("data", data => {
    bufArr.push(data);
});

rs.on("err", err => {
    console.log(err);
});

rs.on("end", () => {
    console.log(Buffer.concat(bufArr).toString());
});

// { Error: ENOENT: no such file or directory, open '... xxx.js'. }

5. Listening for file open and close

Streams have a wide range of uses beyond file reading and writing, such as HTTP requests and responses, but file streams have two events of their own on rs for listening to the file being opened and closed.

The open event listens for the file being opened; its callback runs after the file is opened. The close event listens for the file being closed; if the readable stream was created with autoClose set to true, it fires when the file is automatically closed, and its callback runs after the file is closed.

Listening for open and close on a readable stream

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});

rs.on("open", () => {
    console.log("open");
});

rs.on("close", () => {
    console.log("close");
});

// open

As the code above shows, the file is opened as soon as the readable stream is created, triggering the open event. Since the stream starts out paused by default and nothing is read, the file is never closed, so the close event is not triggered.

Open and close listening in the flowing state

const fs = require("fs");

let rs = fs.createReadStream("1.txt", {
    start: 0,
    end: 3,
    highWaterMark: 2
});

rs.on("open", () => {
    console.log("open");
});

rs.on("data", data => {
    console.log(data);
});

rs.on("end", () => {
    console.log("end");
});

rs.on("close", () => {
    console.log("close");
});

// open
// <Buffer 30 31>
// <Buffer 32 33>
// end
// close

As the printed output of the example above shows, the file is closed, and the close event fires, only after the file has been read through, and the end event fires before close.

Writable streams

1. createWriteStream creates a writable stream

The createWriteStream method takes two arguments: the first is the path of the file to write, and the second is an options object:

  • flags: the file system flag, default w;
  • encoding: character encoding, default utf8;
  • fd: file descriptor, default null;
  • mode: permission bits, default 0o666;
  • autoClose: whether to close the file automatically, default true;
  • start: the position to start writing from;
  • highWaterMark: a threshold to compare the number of bytes written against, default 16 * 1024.

The return value of createWriteStream is a fs.WriteStream object; the file is not actually written until the first write.

Create writable streams

const fs = require("fs"); // Create a writable stream and write to 2.txtlet ws = fs.createWriteStream("2.txt", {
    start: 0,
    highWaterMark: 3
});Copy the code

2. The write method of writable streams

To write content to a file through a writable stream, use the write method of ws. Its parameter is the content to write, and its return value is a Boolean indicating whether highWaterMark still has room for the current write: true if it does, false otherwise. In other words, once the total length written reaches or exceeds highWaterMark, write returns false.

Writing with the write method

const fs = require("fs");

let ws = fs.createWriteStream("2.txt", {
    start: 0,
    highWaterMark: 3
});

let flag1 = ws.write("1");
console.log(flag1);

let flag2 = ws.write("2");
console.log(flag2);

let flag3 = ws.write("3");
console.log(flag3);

// true
// true
// false

Note that if start is not 0, the default flags value of w does not let the write land at that position as expected; to modify a file in place rather than replace it, flags needs to be set to r+ instead of the default w.
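
Here is a hedged sketch of how to make a non-zero start behave as expected, following the Node.js documentation's advice to use r+ when modifying a file in place (this assumes 2.txt already exists):

const fs = require("fs");

// Overwrite bytes starting at position 3 without truncating the rest of the file
let ws = fs.createWriteStream("2.txt", {
    flags: "r+",
    start: 3
});

ws.end("abc");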

3. Writable stream drain event

drain means to "drain dry". When the amount written has reached or exceeded highWaterMark (that is, write has returned false), the drain event fires once everything in the cache has been flushed to the file, and its callback is then executed.

Listening for the drain event

const fs = require("fs");

let ws = fs.createWriteStream("2.txt", {
    start: 0,
    highWaterMark: 3
});

let flag1 = ws.write("1");
console.log(flag1);

let flag2 = ws.write("2");
console.log(flag2);

let flag3 = ws.write("3");
console.log(flag3);


ws.on("drain", () => {
    console.log("Suck");
});

// true
// true
// false
// Drained

4. The end method of writable streams

The end method takes the last chunk to write as its parameter; it flushes any unwritten data remaining in the cache to the file and then closes the file.

End method

const fs = require("fs");

let ws = fs.createWriteStream("2.txt", {
    start: 0,
    highWaterMark: 3
});

let flag1 = ws.write("1");
console.log(flag1);

let flag2 = ws.write("2");
console.log(flag2);

let flag3 = ws.write("3");
console.log(flag3);

ws.on("drain", () => {
    console.log("Suck");
});

ws.end("Finished");

// true
// true
// false

After end is called, the drain event is no longer triggered even though the amount written reached highWaterMark again, and if you open the 2.txt file you will find it contains "123Finished".

Common error

const fs = require("fs");

let ws = fs.createWriteStream("2.txt", {
    start: 0,
    highWaterMark: 3
});

ws.write("1");
ws.end("Finished");
ws.write("2");

// Error [ERR_STREAM_WRITE_AFTER_END]: write after end...

Do not call write again after calling end, or the common error "write after end" is reported. When that happens, the file's original contents are cleared and the new content is not written.
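
If you would rather handle this error than let it crash the process, you can listen for the error event on the writable stream; a minimal sketch:

const fs = require("fs");

let ws = fs.createWriteStream("2.txt", {
    start: 0,
    highWaterMark: 3
});

// A writable stream reports write-after-end through its error event
ws.on("error", err => {
    console.log(err.message); // write after end
});

ws.write("1");
ws.end("Finished");
ws.write("2"); // triggers the error event instead of an uncaught exception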

Using writable and readable streams together

Writable and readable streams are often used together. If the amount read exceeds the writable stream's highWaterMark, pause is called to suspend reading until the content in memory has been written to the file; once the unwritten content drops below the writable stream's highWaterMark, resume is called to continue reading. The pipe method on rs, the return value of createReadStream, is designed exactly for connecting a readable stream to a writable stream, writing what is read from one file into another.

Using the pipe method

const fs = require("fs"); // Create readable and writable streamslet rs = fs.createReadStream("1.txt", {
    highWaterMark: 3
});
let ws = fs.createWriteStream("2.txt", { highWaterMark: 2 }); Rs.pipe (ws); // Stream the contents of 1.txt to 2.txt.Copy the code

Like a pipe, this lets data flow from one file into another, automatically pacing writes according to the highWaterMark of each stream, so we do not have to worry about memory consumption.
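
As a rough sketch of what pipe does internally, here is a simplified version built only from the events and methods covered above (the real implementation also handles errors and other edge cases):

const fs = require("fs");

let rs = fs.createReadStream("1.txt", { highWaterMark: 3 });
let ws = fs.createWriteStream("2.txt", { highWaterMark: 2 });

rs.on("data", data => {
    // If the writable stream's buffer is full, pause reading
    if (!ws.write(data)) {
        rs.pause();
    }
});

// Once the buffered content has been flushed, resume reading
ws.on("drain", () => {
    rs.resume();
});

// When reading finishes, close the writable stream
rs.on("end", () => {
    ws.end();
});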

Conclusion

This article covered the basic usage of readable and writable streams. In everyday development, pipe is the API used most often, whether for file reading and writing or for request and response handling; the other APIs come up far less, but as a competent programmer you should still have some understanding of them.