What is a stream?

As its name suggests, a stream can be thought of like water flowing from one place to another at a certain rate, sometimes fast and sometimes slow. A stream in Node.js moves data from a source to a destination in the same way. At its core, a stream is an event emitter: the base Stream class inherits from EventEmitter, and on top of it Node.js implements Readable, Writable, and Duplex streams. A Readable stream lets you read data from a source, while a Writable stream lets you write data to a destination. The most common example of streams in Node.js is an HTTP server, where the request is a Readable stream and the response is a Writable stream.


Why use streams?

Having said all that, why use streams at all? What good do they do us? Let's start with an example. First, create a large file to act as a test case:


const fs = require('fs');
const file = fs.createWriteStream('./big.file');

for(let i=0; i<= 1e6; i++) {
    file.write('Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.\n');
}

file.end();

I used a writable stream to create this big file. The fs module implements the stream interfaces, so it can be used both to read from and to write to files. In the example above, we wrote a million lines to the file by calling write() on the writable stream in a loop. Running the script produces a file of roughly 400 MB. Next, here is a simple Node web server that serves big.file:

const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
    fs.readFile('./big.file', (err, data) => {
    if (err) throw err;
    res.end(data);
    });
});

server.listen(8000);

When the server gets a request, it reads the large file with the asynchronous fs.readFile and sends it back to the client. Just a few simple lines of code, with no event loops or other complexity in sight. So, let's run the server, request the file, and watch the memory usage. When I start the service, it sits at a normal size of about 8.7 MB. After serving big.file, memory jumps to around 434.8 MB, because fs.readFile buffers the entire file in memory before writing it to the response. We can do much better: instead of readFile, create a readable stream for big.file with fs.createReadStream and pipe it straight into the response:


const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
const src = fs.createReadStream('./big.file');
src.pipe(res);
});

server.listen(8000);

Now, when you make the request again, something amazing happens (watch the memory consumption):



Memory usage stays at roughly 25 MB. That's because we never load the whole file into memory: the stream reads big.file chunk by chunk and writes each chunk to the response as it goes, so whether the file is small or large, memory usage stays flat.


Readable streams

A readable stream lets you read data from a source, which can be a regular file, a system file, an in-memory buffer, or even another stream. Being an EventEmitter, a stream emits events at various stages of its life cycle, and we control the stream by listening for those events.

Basic usage

One of the simplest ways to read data from a stream is to listen for its data event and attach a callback. When a chunk of data is available, the readable stream emits a data event and runs the callback. Like this:

let fs = require('fs');
let readableStream = fs.createReadStream('file.txt');
let data = '';

readableStream.on('data', function(chunk) {
    data += chunk;
});

readableStream.on('end', function() {
    console.log(data);
});

Calling fs.createReadStream gives you a readable stream. Initially the stream is paused; once you listen for the data event and attach a callback, it starts flowing, and chunks are pushed to your callback (64 KB at a time by default for fs streams). When all the data has been read, the stream emits an end event. In the code above, that is the point at which we log the full contents of the file.

There is another way to read: listen for the readable event and then call the stream's read() method repeatedly until all the data has been consumed:

let fs = require('fs');
let readableStream = fs.createReadStream('file.txt');
let data = '';
let chunk;

readableStream.on('readable', function() {
    while ((chunk = readableStream.read()) !== null) {
        data += chunk;
    }
});

readableStream.on('end', function() {
    console.log(data);
});

The read() method pulls some data out of the stream's internal buffer and returns it, by default as a Buffer. When there is no more data, it returns null, which is how the while loop above knows to stop. When the stream ends, the end event fires and we log all the data. Note that the logged output may not be the string you expected, because without an encoding the chunks are Buffers. read() also accepts an optional size argument; if you omit it, it returns all the data in the internal buffer at once.

Setting the encoding

Since data read from a stream defaults to a Buffer, you must set an encoding if you want strings. You do that by calling setEncoding() on the stream, like this:

let fs = require('fs');
let readableStream = fs.createReadStream('file.txt');
let data = '';

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
    data += chunk;
});

readableStream.on('end', function() {
    console.log(data);
});

Above we set the encoding to utf8, so when end fires, the logged data is a utf8 string.


Writable streams

Writable streams are where data gets written. Like readable streams, they are EventEmitters, so they emit events at various stages of execution, and we can hook into those events directly. How do we write? To write data, you call the stream's write() method. For example:

let fs = require('fs');
let readableStream = fs.createReadStream('file1.txt');
let writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
    writableStream.write(chunk);
});

The code above simply reads data from a readable stream and writes it to the target file by calling the writable stream's write() method. write() returns a boolean: true means you can keep writing, false means the internal buffer is full and you should stop writing until the buffered data has been flushed. Once it has drained, the writable stream emits a drain event to signal that it is ready to accept data again. The size of that internal buffer (the highWaterMark) can be configured via the options object passed to fs.createWriteStream(), and read back afterwards from the writable.writableHighWaterMark property.

Pipe

readableSrc.pipe(writableDest)

pipe() is a method on readable streams that connects a readable stream to a writable stream. It takes the output of the readable stream (the data source) and feeds it into the writable stream, so the source must be readable and the destination must be writable. Duplex and transform streams qualify on both counts. It is used like this:

let fs = require('fs');
let readableStream = fs.createReadStream('file1.txt');
let writableStream = fs.createWriteStream('file2.txt');

readableStream.pipe(writableStream);

The code above copies the contents of file1.txt into file2.txt using pipe(). The advantage of pipe() is that it manages the flow of data for you, so you don't have to care whether the source is fast or slow. It is also worth noting that pipe() returns the destination stream, which means that when the destination is itself readable (a duplex or transform stream), pipe() calls can be chained. We can do this:

a.pipe(b).pipe(c).pipe(d)

// Equivalent to:
a.pipe(b)
b.pipe(c)
c.pipe(d)

The pipe() method is the simplest way to consume a stream. The usual advice is to consume a stream either with pipe() or with events, but to avoid mixing the two. Normally when you use pipe() you don't need events, but if you need to consume a stream in a custom way, events may be the way to go.

Conclusion

This post has only scratched the surface of Node.js streams; their inner workings deserve further exploration later.

