Introduction to Node.js

Node.js is a cross-platform JavaScript runtime environment built on Google's V8 engine; it is not a language.

Installation and operation

Node.js installers can be downloaded from nodejs.org/zh-cn/

Select the build for your platform and install it. Once the installation is complete, you can create a Node.js program in VS Code.

Now let's create a Node program that reads the contents of a file.

Create package.json and index.js files in the root directory. index.js is our Node program. As mentioned above, Node is not a language but a JS runtime environment, so our Node programs are written in JavaScript. Write the following content into the package.json file:

{
  "name": "ts",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "typescript": "^4.4.2"
  }
}

The content can be whatever you like; this file exists only to be read by the Node program we are writing and does not affect anything else.

Then write the Node code

const { readFile } = require('fs')

readFile('./package.json', { encoding: 'utf-8' }, (err, data) => {
    if (err) {
        throw err
    }
    console.log(data)
})

First we require the module called 'fs'. This module is dedicated to file operations: creating, modifying, reading, deleting files, and so on.

The first parameter of readFile is the file path, the second is the encoding options, and the third is a callback function that receives both err and data.

Then open the terminal and execute the following command

node index.js

We can get the contents of the file that we’re reading

Note that the log output here is in the terminal, not in the browser

In the past, JS files were attached to HTML files: the HTML was parsed in the browser, and the JS was parsed only when the browser reached the script content. Node essentially takes the V8 engine and parses JS files directly, giving JS its own runtime environment.

Version management

During development, different projects often require different Node versions. Version management tools let you switch the Node.js version quickly:

n: an open-source npm package installed and used globally, so it depends on npm itself

fnm: fast and simple, compatible with .node-version and .nvmrc files

nvm: a standalone tool, the Node Version Manager

Features

Node was born in 2009 and is thriving today. It has the following three characteristics.

Asynchronous I/O

I/O is not just about reading and writing files, but also network requests and database reads and writes

When Node.js performs an I/O operation, execution continues and the callback runs once the response returns, rather than blocking the thread and wasting CPU cycles while waiting.

As a result, the order in which code is written does not necessarily match the order in which it executes.

Take the Node program that reads the file and add a line of output after the readFile call:

const { readFile } = require('fs')

readFile('./package.json', { encoding: 'utf-8' }, (err, data) => {
    if (err) {
        throw err
    }
    console.log(data)
})
console.log(123456)

Running this, 123456 is printed before the file contents. This is because Node hands fs.readFile off as an asynchronous task, keeps executing the remaining code, and notifies the main thread through the callback once the operation completes.

Single thread

Node.js preserves the single-threaded nature of JavaScript in the browser

Advantages:

  • No need to worry about state synchronization; no deadlocks
  • There is no performance overhead associated with thread context switching

Disadvantages:

  • Unable to utilize multi-core CPUs
  • Errors can cause the entire application to quit, resulting in poor robustness
  • Heavy computation occupies the CPU and prevents it from continuing with other work

Take the browser as an example: the browser itself is multi-threaded, but the JS engine is single-threaded. It is not that our code fails to work; the JS engine simply parses and executes it on a single thread.

A browser has a browser process, plug-in processes, a GPU process, and renderer processes; the renderer process covers page rendering, JS execution, and event handling.

Cross-platform

Node.js is compatible with Windows and *nix platforms, mainly thanks to a platform abstraction layer built between the operating system and Node's upper-level module system. Node is a JS runtime environment, but its underlying code is actually written in C and C++, which smooths over the differences between platforms; modules like fs belong to the application-level API, are exposed to JS code, and behave consistently everywhere.

Application scenarios

Node.js has a place in most areas, especially I/O-intensive ones:

Web applications: Express/Koa

Front-end build: Webpack

GUI client software: VS Code / NetEase Cloud Music

Others: real-time communication, crawler, CLI…

But it is not suitable for computationally intensive applications

Modularization mechanism

  1. What is modularity?

    Breaking a large program into small, interdependent files by function or business domain, and combining them in a simple way

  2. Why modularity? Problems without it:

    All script tags must appear in the correct order, otherwise dependencies break

    For example, an HTML file may import several JS files, and the browser reads them in order. If 1.js references content from 2.js, then 2.js must be loaded before 1.js or an error occurs

    Global variables suffer from name conflicts and cannot be reclaimed

    When several people develop together, identical variable names can clash; memory held by globals cannot be freed and stays occupied, and in strict mode redeclarations even throw errors that stop the program from running

    IIFE/namespace workarounds hurt code readability (a sketch of that pattern follows)
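
Before module systems, the common workaround was an IIFE attached to a namespace object. A minimal sketch of that pattern (the names myApp and increment are only illustrative):

// Pre-module pattern: an IIFE exposing methods on a shared global namespace
var myApp = myApp || {}

;(function (ns) {
    var counter = 0                 // private: invisible outside the IIFE
    ns.increment = function () {
        counter++
        return counter
    }
})(myApp)

myApp.increment()  // 1

This does keep counter out of the global scope, but once dozens of such blocks depend on each other, readability suffers in exactly the way described above.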

CommonJS specification

Node.js supports the CommonJS module specification and loads modules synchronously

//greeting.js
const prefix = 'hello'
const sayHi = function () {
    return prefix + ' world'
}
module.exports = {
    sayHi,
}

//index.js
const { sayHi } = require('./greeting')
sayHi()

Here we create the greeting.js file, define the sayHi method, and export it via module.exports.

It can then be pulled into index.js with require and called there.

prefix is not exported through module.exports, so it is not accessible from index.js; it remains a private variable of the module.

Synchronous loading is used here because Node reads files locally on the server side, which is very fast.

//greeting.js
const prefix = 'hello'
const sayHi = function () {
    console.log(`${prefix} world`)
}
exports.sayHi = sayHi

//index.js
const { sayHi } = require('./greeting')
sayHi()

Here exports.sayHi is used instead of module.exports. The two approaches are equivalent; both end up pointing to the sayHi method.

module.exports = ... is assigned once and exports a single object; exports.name = ... can be used multiple times because each assignment adds a named property to that same object.
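
One related pitfall, shown as a small sketch below: exports starts out as a reference to module.exports, so adding properties works, but reassigning exports itself silently breaks the link.

exports.a = 1              // works: adds a property to module.exports
module.exports.b = 2       // works: same object

exports = { c: 3 }         // does NOT export c: this only rebinds the local
                           // exports variable; require() still returns module.exports

module.exports = { d: 4 }  // works: replaces the exported object entirely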

CommonJS wraps every module in a function that provides the exports, require, module, __filename, and __dirname variables:

function (exports, require, module, __filename, __dirname) {
    const m = 1
    module.exports.m = m
}

Loading methods

  1. Loading a built-in module

    require('fs')

  2. Loading a file module by relative or absolute path

    require('/User/... /file.js')

    require('./file.js')

  3. Loading an npm package

    require('lodash')

npm package lookup rules

  1. Current directory node_modules
  2. If not, go to the parent node_modules
  3. If not, recurse up the path to node_modules in the root directory
  4. Once the package directory is found, read its package.json and load the file given by the main field; if package.json or the main field is missing, look for index.js, index.json, and index.node in sequence. The lookup paths can be inspected directly, as shown below
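
A quick sketch of inspecting that lookup chain from inside any module (the paths printed here are just examples, and the last line assumes lodash is actually installed):

// node_modules directories Node will search, from the current directory up to the root
console.log(module.paths)
// e.g. [ '/home/me/project/src/node_modules',
//        '/home/me/project/node_modules',
//        '/home/me/node_modules',
//        '/node_modules' ]

// require.resolve returns the absolute path of the file that would be loaded
console.log(require.resolve('lodash'))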

Because a package may be required many times in a real project, looking it up from scratch on every require would be time-consuming, so Node has a caching mechanism.

require.cache holds the modules that have been loaded. Caching exists because loading is synchronous:

  1. File-module lookup takes time; if every require had to traverse the search paths again, performance would suffer
  2. In real development, modules may contain side-effect code that should only run once

In a real project a module may be updated while the process is running, and you need to read the new version rather than the old cached one, so you need a cache-free require:

// cached
const mod1 = require('./foo')
const mod2 = require('./foo')
console.log(mod1 === mod2) // true

// no cache
function requireUncached(module) {
    delete require.cache[require.resolve(module)]
    return require(module)
}

const mod3 = requireUncached('./foo')
console.log(mod1 === mod3)  // false

Other module specifications

AMD was standardized out of RequireJS as it gained adoption; it loads modules asynchronously and advocates declaring dependencies up front

CMD was standardized out of SeaJS as it gained adoption; it also loads asynchronously but advocates requiring dependencies where they are used

The UMD specification is a compatibility format that works with both AMD and CommonJS

ES Modules is a language-level module specification, independent of any host environment, and can be transpiled with Babel

ES Modules

ESM is a modular standard proposed at the ES6 language level

ESM only needs two keywords: import and export.

// export
export default 1
export const name = "cola"
export { age }
export { name as nickname }
export { foo } from './foo'
export * from './foo'

// import
import Vue from 'vue'
import * as Vue from 'vue'
import { Component } from 'vue'
import { default as a } from 'vue'
import { Button as Btn } from 'Element'
import 'index.less'

CommonJS vs ESM

A CommonJS module exports a copy of a value; an ESM module exports a reference to the value.

CommonJS modules are loaded at runtime; ESM modules are resolved at compile time (statically, ahead of execution).

The two can be mixed (importing CommonJS from ESM, or requiring what ESM outputs), but this is not recommended.

// CommonJS
//lib.js
let counter = 3
function addCounter() {
    counter++
}
module.exports = {
    counter,
    addCounter
}

//main.js
const { counter, addCounter } = require('./lib')
console.log(counter)  // 3
addCounter()
console.log(counter)  // 3
//ES Modules
//lib.mjs
export let counter = 3
export function addCounter() {
    counter++
}

//main.js
import { counter, addCounter } from './lib.mjs'
console.log(counter)  // 3
addCounter()
console.log(counter)  // 4
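
As noted above, mixing is possible even if it is not recommended. A sketch of loading CommonJS from an ESM file, assuming lib.js is the CommonJS file from the example above and that this code lives in an .mjs file:

// main.mjs
import { createRequire } from 'module'

// createRequire gives an ESM file its own require() for CommonJS modules
const require = createRequire(import.meta.url)
const cjs = require('./lib.js')

// dynamic import() also works and resolves to a module namespace object
const esm = await import('./lib.mjs')

console.log(cjs.counter, esm.counter)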

Package management mechanism

Introducing npm

npm is the package manager for Node.js and provides commands to install, remove, and otherwise manage packages.

Common commands:

  • npm init initializes a project and generates the package.json configuration file
  • npm config manages configuration
  • npm run runs a script
  • npm install (alias npm i) installs packages
  • npm uninstall removes packages
  • npm update updates packages
  • npm info displays package information
  • npm publish publishes your own package

package.json information

Take the package.json file of Webpack as an example

  • name: the package name
  • version: the version number
  • main: the entry file
  • scripts: commands run via npm run, e.g. npm run serve, npm run build
  • dependencies: runtime dependencies
  • devDependencies: development-only dependencies
  • repository: where the code is hosted

More package.json configurations

Asynchronous programming

In real development, many tasks must run only after a previous one finishes. The usual answer is callbacks: fn1 calls fn2 when it completes. Once there are multiple layers of such calls (fn3 inside fn2, fn4 inside fn3), the code becomes not only verbose but hard to read and extend. This is callback hell.

Callback

const fs = require('fs')

fs.readFile('./package.json', { encoding: 'utf-8' }, (err, data) => {
    if (err) throw err
    const { main } = JSON.parse(data)
    fs.readFile(main, { encoding: 'utf-8' }, (err, data) => {
        if (err) throw err
        console.log(data)
    })
})

Continuing with the file-reading example, suppose we now want to read the file that the main field points to. Because JS handles this on a single thread and the second read depends on data from the first, the second readFile has to sit inside the first callback. One or two levels are fine, but as requirements pile up, one layer inside another, you end up in callback hell.

Promise

A Promise can be modeled as a finite state machine. It has three core states: Pending, Fulfilled, and Rejected, plus the initial not-yet-started state.
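
A small sketch of those state transitions (the values and timings here are only illustrative):

// pending -> fulfilled
const ok = new Promise((resolve, reject) => {
    setTimeout(() => resolve('done'), 100)    // settles as fulfilled after 100 ms
})
ok.then(value => console.log(value))          // 'done'

// pending -> rejected
const fail = new Promise((resolve, reject) => {
    reject(new Error('boom'))                 // settles as rejected immediately
})
fail.catch(err => console.error(err.message)) // 'boom'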

Using Promise, read the file that the main field in package.json points to:

const { readFile } = require('fs/promises')

readFile('./package.json', { encoding: 'utf-8' }).then(res => {
    return JSON.parse(res)
}).then(data => {
    return readFile(data.main, { encoding: 'utf-8' })
}).then(res => {
    console.log(res)
})

As the code above shows, Promise keeps the flow clean and concise; the chained calls avoid callback hell. A callback-style API can also be converted into a Promise-based one with a promisify helper:

const fs = require('fs')

function promisify(fn, receiver) {
    return (...args) => {
        return new Promise((resolve, reject) => {
            fn.apply(receiver, [...args, (err, res) => {
                return err ? reject(err) : resolve(res)
            }])
        })
    }
}

const readFilePromise = promisify(fs.readFile, fs)

await

Inside an async function, exceptions from awaited promises are caught with try/catch (and independent operations should be run in parallel rather than awaited one by one). async/await is really syntactic sugar over Promise and reads even more concisely than the chained version:

const { readFile } = require('fs/promises');

(async () => {
    const { main } = JSON.parse(await readFile('./package.json', { encoding: 'utf-8' }))
    const data = await readFile(main, { encoding: 'utf-8' })
    console.log(data)
})()
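
A sketch of the two points mentioned above, error handling and parallel processing: rejections from awaited promises land in try/catch, and independent reads can run in parallel with Promise.all (README.md is just an illustrative second file):

const { readFile } = require('fs/promises');

(async () => {
    try {
        // independent reads run in parallel instead of being awaited one by one
        const [pkg, readme] = await Promise.all([
            readFile('./package.json', { encoding: 'utf-8' }),
            readFile('./README.md', { encoding: 'utf-8' })
        ])
        console.log(JSON.parse(pkg).name, readme.length)
    } catch (err) {
        // a rejection from either read ends up here
        console.error(err)
    }
})()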

Event

The publish-subscribe pattern, provided by Node.js's built-in events module.

For example, an HTTP server's on('request') event listener.

// Publish-subscribe pattern
const EventEmitter = require('events')

class MyEmitter extends EventEmitter {}
const myEmitter = new MyEmitter()

myEmitter.on('event', () => {
    console.log('an event occurred!')
})
myEmitter.emit('event')

// server listens for request events
const http = require('http')

const server = http.createServer((req, res) => {
    res.end('hello')
})
server.on('request', (req, res) => {
    console.log(req.url)
})
server.listen(3000)

Web Application Development

The HTTP module

Node.js's built-in http module lets us build a simple HTTP service:

const http = require('http')

http.createServer((req, res) => {
    res.end('hello world\n')
}).listen(3000, () => {
    console.log('App running at http://127.0.0.1:3000/')
})

Introducing Koa

Koa: the next-generation web development framework based on the Node.js platform.

Koa simply provides a lightweight, elegant set of functions that makes it easy to write web applications, without bundling any middleware into its core.

Koa is not a built-in component of Node, so it must be installed with npm before use.

const Koa = require('koa')
const app = new Koa()

app.use(async ctx => {
    ctx.body = 'Hello World'
})

app.listen(3000, () => {
    console.log('App started at http://localhost:3000 ...')
})

Implementation process

  • Service startup

    • Instantiate the application
    • Register middleware
    • Create the server and listen on a port
  • Accepting/processing a request

    • Get the raw req and res objects
    • Wrap req -> Request and res -> Response
    • Combine request & response -> context
    • Execute the middleware chain
    • Set the output via ctx.body (a rough sketch of this flow follows)
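
A rough sketch of that flow (not Koa's actual source, and deliberately simplified; MiniKoa is a made-up name):

const http = require('http')

class MiniKoa {
    constructor() { this.middleware = [] }
    use(fn) { this.middleware.push(fn) }                      // register middleware
    listen(...args) {
        // create the server and listen on a port
        return http.createServer(this.callback()).listen(...args)
    }
    callback() {
        return (req, res) => {
            const ctx = { req, res, url: req.url, body: '' }  // request & response -> context
            const dispatch = (i) => {
                const fn = this.middleware[i]
                if (!fn) return Promise.resolve()
                return Promise.resolve(fn(ctx, () => dispatch(i + 1)))
            }
            dispatch(0).then(() => res.end(String(ctx.body))) // output ctx.body
        }
    }
}

const app = new MiniKoa()
app.use(async ctx => { ctx.body = 'Hello MiniKoa' })
app.listen(3001)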

Middleware

A Koa application is an object containing a set of middleware functions that are organized and executed according to the Onion model

The execution order of the middleware

The actual execution order of the middleware is equivalent to callbacks nested within callbacks, but without the callback hell.

A simple code implementation of middleware composition:

// Middleware simulation
const fn1 = async (ctx, next) => {
    console.log('before fn1')
    ctx.name = 'codecola'
    await next()
    console.log('after fn1')
}

const fn2 = async (ctx, next) => {
    console.log('before fn2')
    ctx.age = 23
    await next()
    console.log('after fn2')
}

const fn3 = async (ctx, next) => {
    console.log(ctx)
    console.log('in fn3 ...')
}

const compose = (middlewares, ctx) => {
    const dispatch = (i) => {
        let fn = middlewares[i]
        return Promise.resolve(fn(ctx, () => {
            return dispatch(i + 1)
        }))
    }
    return dispatch(0)
}

compose([fn1, fn2, fn3], {})

Based on the middleware principle, we can measure how long the handler function takes to execute:

const Koa = require('koa')
const app = new Koa()

// logger middleware
app.use(async (ctx, next) => {
    await next()
    const rt = ctx.response.get('X-Response-Time')
    if (ctx.url !== '/favicon.ico') {
        console.log(`${ctx.method} ${ctx.url} - ${rt}`)
    }
})

// x-response-time middleware
app.use(async (ctx, next) => {
    const start = Date.now()
    await next()
    const ms = Date.now() - start
    ctx.set('X-Response-Time', `${ms}ms`)
})

app.use(async ctx => {
    let sum = 0
    for (let i = 0; i < 1e9; i++) {
        sum += i
    }
    ctx.body = `sum=${sum}`
})

app.listen(3000, () => {
    console.log('App started at http://localhost:3000 ...')
})

Common middleware

  • koa-router: route resolution
  • koa-body: request body parsing
  • koa-logger: request logging
  • koa-views: template rendering
  • koa2-cors: cross-origin (CORS) handling
  • koa-session: session handling
  • koa-helmet: security-related HTTP headers
  • ...

Koa middleware varies widely in quality, so it needs to be chosen sensibly and combined efficiently.
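
As an illustration of combining them, a minimal sketch using koa-router and koa-logger from the list above (both need to be installed with npm first):

const Koa = require('koa')
const Router = require('koa-router')
const logger = require('koa-logger')

const app = new Koa()
const router = new Router()

router.get('/', ctx => {
    ctx.body = 'home'
})

app.use(logger())                // request logging
app.use(router.routes())         // route resolution
app.use(router.allowedMethods())

app.listen(3000)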

Frameworks built on Koa

Open source: ThinkJS/Egg…

Internal: Turbo, Era, Gulu…

What do they do?

  • Extensions to Koa's response/request/context/application objects
  • A library of common Koa middleware
  • Support for company-internal services
  • Process management
  • Scaffolding
  • ...

Online deployment

Node.js maintains the single-threaded nature of JavaScript in browsers (one thread per process)

Node.js is single-threaded, but its event-driven, asynchronous, non-blocking model suits high-concurrency scenarios while avoiding the resource overhead of thread creation and context switching between threads.

Disadvantages:

  • Unable to utilize multi-core CPUs
  • Errors can cause the entire application to quit, resulting in poor robustness
  • Heavy computation occupies the CPU and prevents other work from executing

Utilizing multi-core CPUs

Implement the simplest HTTP Server

const http = require('http')

http.createServer((req, res) => {
    // simulate some CPU-bound work before responding
    for (let i = 0; i < 1e7; i++) {}
    res.end(`handled by process ${process.pid}`)
}).listen(8080)

So how do we take advantage of multi-core CPUs?

Node.js provides the cluster and child_process modules.

const cluster = require('cluster')
const os = require('os')

if (cluster.isMaster) {
    const cpulen = os.cpus().length
    for (let i = 0; i < cpulen; i++) {
        cluster.fork()
    }
} else {
    require('./server.js')
}
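
For this to run, ./server.js must contain the actual HTTP server. A minimal version, reusing the earlier example, might look like this (each forked worker loads it, and the workers share port 8080):

// server.js
const http = require('http')

http.createServer((req, res) => {
    res.end(`handled by process ${process.pid}\n`)
}).listen(8080)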

Guarding worker processes

const http = require('http')
const numCPUs = require('os').cpus().length
const cluster = require('cluster')

if (cluster.isMaster) {
    console.log('Master process id is', process.pid)
    for (let i = 0; i < numCPUs; i++) {
        cluster.fork()
    }
    cluster.on('exit', function (worker, code, signal) {
        console.log('worker process died, id', worker.process.pid)
        cluster.fork()
    })
} else {
    const server = http.createServer()
    server.on('request', (req, res) => {
        res.writeHead(200)
        console.log(process.pid)
        res.end('hello world\n')
    })
    server.listen(8080)
}