Node's fs documentation covers every operation on the file system. It is well organized: the APIs fall broadly into file operations, directory operations, file information, and streams, and each is offered in synchronous, asynchronous (callback), and Promise styles.

This article covers several topics that the documentation does not explain in detail, as a companion to the fs docs:

  • File descriptors
  • Synchronous, asynchronous, and Promise APIs
  • Directories and directory entries
  • File information
  • Streams


File descriptor

A file descriptor is a non-negative integer: an index value that the operating system uses to find the corresponding open file.

Many of the lower-level fs APIs take a file descriptor, usually written as fd in the documentation. For example: fs.read(fd, buffer, offset, length, position, callback). Its path-based counterpart is fs.readFile(path[, options], callback).

Because the operating system limits the number of open file descriptors, don't forget to close the file when you finish a file operation:

const fs = require("fs");

fs.open("./db.json", "r", (err, fd) => {
    if (err) throw err;
    // File operation...
    // When done, close the file
    fs.close(fd, err => {
        if (err) throw err;
    });
});

Synchronous, asynchronous, and Promise APIs

All file system APIs come in both synchronous and asynchronous forms.

Synchronous style

Using the synchronous API is not recommended, because it blocks the thread.

try {
    const buf = fs.readFileSync("./package.json");
    console.log(buf.toString("utf8"));
} catch (error) {
    console.log(error.message);
}

Asynchronous style

The asynchronous style easily leads to callback hell.

fs.readFile("./package.json", (err, data) => {
    if (err) throw err;
    console.log(data.toString("utf8"));
});

Promise style

Before Node v12, you had to wrap the Promise yourself:

function readFilePromise(path, encoding = "utf8") {
    const promise = new Promise((resolve, reject) => {
        fs.readFile(path, (err, data) => {
            if (err) return reject(err);
            return resolve(data.toString(encoding));
        });
    });
    return promise;
}

readFilePromise("./package.json").then(res => console.log(res));

Node v12 introduced the fs Promises API. Its methods return Promise objects instead of taking callbacks, and it is accessible via require('fs').promises. That keeps development cost lower.

const fsPromises = require("fs").promises;

fsPromises
    .readFile("./package.json", {
        encoding: "utf8",
        flag: "r"
    })
    .then(console.log)
    .catch(console.error);

Directories and directory entries

The fs.Dir class encapsulates operations on a file system directory.

The fs.Dirent class encapsulates a directory entry, for example determining its type (file, directory, character device, block device, FIFO, and so on).

The relationship between them is shown in code:

const fsPromises = require("fs").promises;

async function main() {
    const dir = await fsPromises.opendir(".");
    let dirent = null;
    while ((dirent = await dir.read()) !== null) {
        console.log(dirent.name);
    }
}

main();

File information

The fs.Stats class encapsulates file information. An instance is passed to the callback of fs.stat():

fs.stat("./package.json", (err, stats) => {
    if (err) throw err;
    console.log(stats);
});

Note about checking for file existence:

  • Using fs.stat() to check for a file's existence before calling fs.open(), fs.readFile(), or fs.writeFile() is not recommended. Instead, open, read, or write the file directly and handle the error raised if the file is not available.
  • To check for a file's existence without operating on it afterwards, fs.access() is recommended.

ReadStream and WriteStream

Stream is a very important module in Node.js, and many library APIs are wrapped around streams, such as the ReadStream and WriteStream in fs below.

fs itself provides readFile and writeFile. They are convenient, but at a cost: they load the entire content into memory at once, which is clearly a problem for files of several gigabytes.

The solution for large files is, of course, to read them a little at a time, and that is where streams come in. Taking ReadStream as an example, the code looks like this:

const rs = fs.createReadStream("./package.json");
let content = "";

rs.on("open", () => {
    console.log("start to read");
});

rs.on("data", chunk => {
    content += chunk.toString("utf8");
});

rs.on("close", () => {
    console.log("finish read, content is:\n", content);
});

With a stream pipe, a large-file copy function fits in a single line:

function copyBigFile(src, target) {
    fs.createReadStream(src).pipe(fs.createWriteStream(target));
}

