The role of streams:

A stream is a means of transmitting data. At the lowest level, all reads and writes happen in binary, so what a stream really controls is the size and speed of data transmission.

So when do we actually need to control the size and speed of data transmission, that is, when should we transfer data as a stream? Let's walk through a few small examples.

Suppose we need to copy the data in one file to another:

let fs = require('fs');
let data = fs.readFileSync('read.txt');
fs.writeFileSync('write.txt', data);

As you can see, reading and writing files with Node.js is easy: we read all the data into memory at once and then write it out to the target file, without using a stream at all. That is fine for a small text file like this one. But faced with a video file of several gigabytes, such a one-shot read would take up a huge amount of memory, or even exhaust it.

That's where streams come in: they control how much data is in transit at once. A stream splits a resource into small chunks and transfers them one by one, like flowing water, instead of loading all the data into memory first and paying the memory cost up front. Assuming the text file is large, let's rewrite the functionality above using a Node.js stream.

let readStream = fs.createReadStream('read.txt');
let writeStream = fs.createWriteStream('write.txt');
readStream.on('data', function (chunk) {
  // A chunk of data has arrived; write it to the write stream
  writeStream.write(chunk);
});
readStream.on('end', function () {
  // No more data will arrive; close the write stream
  writeStream.end();
});

The problem with this implementation is that if reading is faster than writing, writeStream cannot keep up: the unwritten chunks accumulate in its internal buffer, so memory usage grows without bound (this is the backpressure problem; the data is buffered rather than lost, but the memory benefit of streaming is gone). Therefore, we can use pipe, a method of readable streams, to coordinate the transfer: it pauses reading whenever the writable's buffer is full and resumes once the buffer has drained. As follows:

fs.createReadStream('read.txt').pipe(fs.createWriteStream('write.txt'));

Now that we know what streams are for, let's look at how Node.js implements them.

Stream implementation:

Node.js's stream functionality is provided by the stream core module, which defines an abstract interface implemented by many other modules.

We said that streams chop a resource into chunks. What are those chunks? They can be Buffers, strings, or (in object mode) arbitrary objects. By default they are Buffers, which operate on raw bytes, i.e. binary data, with no conversion applied; setting an encoding makes the stream hand you strings instead, converting the data to characters for you.

The examples above only use readable and writable streams. There are two more stream types, Duplex and Transform, both built by combining or extending the readable and writable streams. A Duplex stream implements both the read and write interfaces; a Transform stream is a Duplex stream whose output is computed from its input. For more detail, see the article Node.js Streams: All you need to know.

Sample code shown in this article

Reference:

Node.js Stream is easy to understand and fully parsed

Hungry for power? The young! The principle of flow