Streams are, in my opinion, a relatively advanced topic that cannot be easily mastered or summed up. This is only an introduction to streams; the problems you encounter in a real project cannot be solved by it alone, and you will still need to read the documentation in depth.

  • What is a stream?

Streams are collections of data, like arrays and strings.

The difference is that a stream's data is not available all at once, which is why it is called a stream.

The variables we encountered before live in a partition of memory allocated to hold them. A stream, however, does not have to map onto a block of memory; it is more like a pipe with data flowing through it. This makes it particularly powerful when dealing with large volumes of data (especially for the memory savings).

Streams are not only for handling large volumes of data; they also give our code composability. pipe(), for example, can connect one stream to another so that data flows between them.

Just as most core modules inherit from EventEmitter, streams are a cornerstone that many core modules build on.

There are four fundamental stream types in Node:

  1. Readable streams, e.g. fs.createReadStream
  2. Writable streams, e.g. fs.createWriteStream
  3. Duplex streams, e.g. TCP sockets
  4. Transform streams, e.g. zlib.createGzip

A readable stream is an abstraction of a source from which data can be consumed, and a writable stream is an abstraction of a destination to which data can be written.

A duplex stream is both readable and writable.

A transform stream is a special kind of duplex stream. You can think of it as a middleman: data is written into it, it processes that data, and the processed data can then be read out the other side.

All streams are instances of EventEmitter. You can read and write data by listening for and emitting their events, but the simplest way to connect streams is pipe().

fs.createReadStream(src).pipe(fs.createWriteStream(dest))

pipe() behaves just like a real pipe, connecting two streams, and even better, the calls can be chained, which makes it look even more like a pipeline! We need to be careful not to mix pipe() with manual event handling on the same streams, or things can become unmanageable.








With that basic introduction out of the way, let's dive into the official documentation and explore some of the API.

1. APIs needed to use streams in your application

  • The buffer

Both readable and writable streams store pending data in an internal buffer.

It can be inspected through writable.writableBuffer or readable.readableBuffer respectively.

fs.createWriteStream().writableBuffer

The amount of data that can be buffered depends on the highWaterMark option passed to the stream's constructor.

For normal streams, highWaterMark specifies a total number of bytes. For streams in object mode, it specifies a total number of objects.

When stream.push(chunk) is called, the data is buffered in the readable stream's internal buffer. If the consumer does not call stream.read(), the data sits in that buffer until it is read.

Once the amount of data in the internal buffer reaches the highWaterMark value, the stream temporarily stops reading from the underlying resource until the currently buffered data is consumed.

When writable.write(chunk) is called, the data is buffered in the writable stream. writable.write() returns true while the total size of the internal write buffer is below the highWaterMark value; once it reaches or exceeds highWaterMark, false is returned.

Here, writable and readable each stand for some instance that inherits from the Stream class.

This buffering exists to limit how much data a Stream caches at once and to keep your memory from being exhausted.

  • Your Node application can’t live without a Stream

const http = require('http');

const server = http.createServer((req, res) => {
  // req is an http.IncomingMessage instance, which is a readable stream.
  // res is an http.ServerResponse instance, which is a writable stream.

  let body = '';
  // The received data will be UTF-8 strings; if no character encoding
  // is set, Buffer objects are received instead.
  req.setEncoding('utf8');

  // Once a listener is added, the readable stream emits 'data' events.
  req.on('data', (chunk) => {
    body += chunk;
  });

  // The 'end' event indicates that the entire request body has been received.
  req.on('end', () => {
    try {
      const data = JSON.parse(body);
      // Respond to the user.
      res.write(typeof data);
      res.end();
    } catch (er) {
      // JSON parsing failed.
      res.statusCode = 400;
      return res.end(`error: ${er.message}`);
    }
  });
});

server.listen(1337);

Writable streams (such as res in the example) expose methods such as write() and end() for writing data to the stream.

A readable stream emits events when data is available to be read, and there are several different ways to consume that data.

Both writable and readable streams use EventEmitter events in a variety of ways to communicate the current state of the stream.

Duplex and Transform streams are both writable and readable.

Applications that simply write data to or consume data from a stream generally do not need to call require('stream'), because the modules they use have already implemented the stream interfaces for them.

For developers who need to implement new types of streams, see the API section for Implementing streams.