Preface

What might you learn after reading this article?

① Implementing a terminal command line with Node ② Terminal command-line interaction ③ Deep-copying an entire folder ④ Executing terminal commands such as npm install from Node.js ⑤ Establishing communication with child processes ⑥ Low-level webpack operations: starting webpack and merging configuration items ⑦ Writing a plugin and understanding webpack's stages ⑧ Using require.context to automate front-end work

1 Demonstrating the effect

What the finished project does:

mycli create — create a project

mycli start — run the project

mycli build — package the project

Steps to try it out

This article builds mycli, which I haven't published to npm. It is, however, based on rux-cli, a scaffolding prototype from my earlier work; interested readers can install it locally to try out the effect.

Install the rux-cli scaffolding globally

windows

npm install rux-cli -g 

mac

sudo npm install rux-cli -g 

One command creates the project, installs dependencies, compiles the project, and runs the project.

rux create 

2 Setting goals

Set goals, break them down

From the command line we want to create a project, download dependencies, run the project, collect dependencies, and so on. Designing the entire feature in one sitting would leave most of us blank, so we break it down. The overall process divides into a create-files phase, a build phase, a webpack-integration phase, and a run phase. Let's sort out what needs to happen at each stage.

Create file phase

1 Terminal CLI interaction

① Declaring bin in package.json

We want to create a project through a custom command line, just like vue-cli's vue create. First we need the terminal to recognize our custom command.

Example:

mycli create 

We want the terminal to recognize mycli so that mycli create creates a project. The process boils down to this: the mycli command directs the terminal to execute a specified node file. Let's go through the steps.

Goal: executing the terminal command runs the current node file.

Set up project

As shown in the figure above, when we execute the command line in the terminal, we go to mycli.js file under the bin folder.

The mycli.js file

#!/usr/bin/env node
'use strict';
console.log('hello,world')

Then declare bin in package.json.

{
  "name": "my-cli",
  "version": "0.0.1",
  "description": "",
  "main": "index.js",
  "bin": {
    "mycli": "./bin/mycli.js"
  },
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "👽",
  "license": "ISC",
  "dependencies": {
    "chalk": "^4.0.0",
    "commander": "^5.1.0",
    "inquirer": "^7.1.0",
    "which": "^2.0.2"
  }
}

Everything is ready. For local debugging, run npm link in the my-cli folder (on macOS you may need sudo npm link).

Then create a new folder and run mycli. If it prints hello,world, that's a success. The next step is to make the node file (mycli.js in the demo project) read our terminal commands: mycli create creates a project, mycli start runs it, and mycli build packages it. To manipulate the command line fluently in the terminal, we introduce the commander module.

② commander — the Node.js terminal command line

To print colored output in the terminal, we introduce the chalk library.

const chalk = require('chalk')
const colors = ['green', 'blue', 'yellow', 'red']
const consoleColors = {}
/* console colors */
colors.forEach(color => {
    consoleColors[color] = function (text, isConsole = true) {
        return isConsole ? console.log(chalk[color](text)) : chalk[color](text)
    }
})
module.exports = consoleColors

Next we need to declare our terminal command with commander.

An introduction to the common commander APIs

commander.js is a complete solution for node.js command-line interfaces, inspired by Ruby's commander, and an essential skill for front-end node CLI development.

1 version

var program = require('commander');

program
    .version('0.0.1')
    .parse(process.argv);

// Run: node index.js -V
// Output: 0.0.1
2 option

Use the .option() method to define commander options. Example: .option('-n, --name [items2]', 'name description', 'default value')

program
  .option('-d, --debug', 'output extra debugging')
  .option('-s, --small', 'small pizza size')
program.parse(process.argv)
if (program.debug) {
    blue('option is debug')
} else if (program.small) {
    blue('option is small')
}

The input terminal

mycli -d

Terminal output

3 command — custom commands (key)

Example:.command(‘add

1 Command name <required>: the command may take arguments wrapped in <> (required) or [] (optional). The last argument of a command can be variadic, marked by appending ... as in the example. Arguments passed after the command are forwarded to the action callback and to the program.args array.

2 Command description <optional>: if action(fn) is attached, the description may be omitted; without an action, commander starts a stand-alone executable subcommand, and reports an error if it cannot be found. Configuration options <optional>: noHelp and isDefault can be configured.
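To build intuition for what commander is doing, here is a toy sketch (my own illustration, not commander's real implementation) of dispatching process.argv onto command handlers:

```javascript
// Toy dispatcher: maps argv[2] onto a handler and passes the remaining
// arguments along, roughly the way commander routes .command() actions.
function dispatch(argv, commands) {
  // argv looks like ['node', '/path/to/mycli.js', 'create', ...rest]
  const [, , cmd, ...args] = argv
  const handler = commands[cmd]
  if (!handler) return 'unknown command: ' + cmd
  return handler(args)
}
```

Real commander adds option parsing, help text, and variadic-argument handling on top of this routing.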

Use Commander to add custom commands

Since we are building scaffolding, we add the three most basic commands: create the project, run the project (development environment), and package the project (production environment). The code is as follows:

/* mycli create : create a project */
program
    .command('create')
    .description('create a project')
    .action(function () {
        green('👽👽👽 Welcome to mycli, easily build react + ts projects ~ 🎉🎉🎉')
    })

/* mycli start : run the project */
program
    .command('start')
    .description('start a project')
    .action(function () {
        green('-------- run project -------')
    })

/* mycli build : package the project */
program
    .command('build')
    .description('build a project')
    .action(function () {
        green('-------- build project -------')
    })

program.parse(process.argv)

The effect

mycli create

The first step is done.

③ Inquirer module command line interaction

We expect terminal interaction like vue-cli, dva-cli, or taro-cli. This requires another node.js module, inquirer. Inquirer.js provides a user interface for interactive question sessions.

Get started:

var inquirer = require('inquirer');
inquirer
  .prompt([
    /* Your questions */
  ])
  .then(answers => {
    /* Handle the user's answers */
  })
  .catch(error => {
    /* An error occurred */
  });

Since we are building react scaffolding, our dialogue with the user is: Create a new project? (yes/no) -> Please enter a project name? (text input) -> Please enter the author? (text input) -> Please select a state management library? (single choice: mobx or redux). The first parameter of prompt takes the basic configuration for these questions, which looks something like this:

const question = [
    {
        name: 'conf',              /* key */
        type: 'confirm',           /* confirmation */
        message: 'Create a new project?'  /* prompt */
    },
    {
        name: 'name',
        message: 'Please enter a project name?',
        when: res => Boolean(res.conf)   /* whether to ask this question */
    },
    {
        name: 'author',
        message: 'Please enter the author?',
        when: res => Boolean(res.conf)
    },
    {
        type: 'list',              /* select box */
        message: 'Please select a state management library?',
        name: 'state',
        choices: ['mobx', 'redux'],    /* options */
        filter: function (val) {       /* filter */
            return val.toLowerCase()
        },
        when: res => Boolean(res.conf)
    }
]

We then add the following code to the action() callback of command('create').

program
    .command('create')
    .description('create a project')
    .action(function () {
        green('👽👽👽 Welcome to mycli, easily build react + ts projects ~ 🎉🎉🎉')
        inquirer.prompt(question).then(answer => {
            console.log('answer=', answer)
        })
    })

run

mycli create 

The result looks like this:

Next we copy the project files based on the information the user provided. There are two ways to obtain the template: keep the project template inside the scaffolding itself, or pull it from github. We create a template folder in the scaffolding project and put the react-typescript template in it; the task is then to copy the entire template project.

2 Deep copy files

Since our template project may have a deeply nested folder -> file structure, we need to deep-copy both files and folders, using node's native fs module. Most of the fs API performs asynchronous I/O, so a few tricks are needed to coordinate these asynchronous operations; we'll cover them later.

1 Preparation: Understand the asynchronous I/O and FS modules

Pu Ling's book on Node.js contains a passage describing asynchronous I/O.

const fs = require('fs')
fs.readFile('/path', () => {
    console.log('Read file complete')
})
console.log('Initiate read file')

'Initiate read file' is printed before 'Read file complete', showing that readFile is asynchronous. The implication is that in node, parallel I/O comes naturally at the language level: no call has to wait for the previous I/O call to finish, which is a great efficiency win in the programming model. Back in our scaffolding, we need to read and copy template files in bulk, which means many such asynchronous I/O operations at once.

We use node's fs module to copy the whole project. Anyone who has used node.js will find fs familiar; it covers essentially every file operation. For reasons of length we won't enumerate them here; interested readers can consult the fs section of the node.js documentation.

2 Recursively copy the project file

Implementation approach

Ideas:

(1) Selecting project templates: First, analyze the project configuration selected by the user under the Inquirer interaction module in the first step. Our project may have multiple sets of templates. For example, choosing state management mobx or Redux, or choosing JS or TS projects, the architecture and configuration of projects are different, and one set of templates cannot satisfy all cases. In our demo, we use one of these templates, the most common react TS project template. Here, we refer to the project template under the template file.

(2) Modify configuration: For the configuration items provided by us in the Inquirer stage, such as project name, author, etc., we need to process the project template separately and modify the configuration items. This information is usually stored in package.json.

(3) Copy the template to generate the project: after selecting the template, traverse all files under the template folder and determine each entry's type. If it is a file, copy it directly; if it is a folder, create the folder and recursively traverse its children, repeating the above until every file is copied.

(4) Notify the main program to perform the next operation.

We created create.js under the mycli project's src folder specifically for creating the project. Without further ado, straight to the code.

The core code

const create = require('../src/create')

program
    .command('create')
    .description('create a project')
    .action(function () {
        green('👽👽👽 Welcome to mycli, easily build react + ts projects ~ 🎉🎉🎉')
        /* Interact with the developer to gather project information */
        inquirer.prompt(question).then(answer => {
            if (answer.conf) {
                /* Create the files */
                create(answer)
            }
        })
    })

Here’s the first core:

Step 1: Select a template

The create method


module.exports = function (res) {
    /* Create the files */
    utils.green('------ start building -------')
    /* Find the template project in the template folder */
    const sourcePath = __dirname.slice(0, -3) + 'template'
    utils.blue('Current path: ' + process.cwd())
    /* Modify package.json */
    revisePackageJson(res, sourcePath).then(() => {
        copy(sourcePath, process.cwd(), npm())
    })
}

Here we need to make sense of two paths:

In node.js, __dirname always points to the directory containing the file being executed, so inside /d1/d2/mycli.js its value is /d1/d2.

process.cwd(): returns the current working directory of the node.js process.

The first step is simple: pick the path of the folder we want to copy, then modify package.json based on the user's input.

Step 2: Modify the configuration

In the template project's package.json we simply replace demoName and demoAuthor with the project name and author the user entered.

{
  "name": "demoName",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "mycli start",
    "build": "mycli build"
  },
  "author": "demoAuthor",
  "license": "ISC",
  "dependencies": {
    "@types/react": "^16.9.25",
    "react": "^16.13.1",
    "react-dom": "^16.13.1",
    "react-router": "^5.1.2",
    "react-router-dom": "^5.1.2"
    // ... more entries
  }
}

revisePackageJson — modifying package.json

function revisePackageJson(res, sourcePath) {
    return new Promise((resolve) => {
        /* Read the file */
        fs.readFile(sourcePath + '/package.json', (err, data) => {
            if (err) throw err
            const { author, name } = res
            let json = data.toString()
            /* Replace the template placeholders */
            json = json.replace(/demoName/g, name.trim())
            json = json.replace(/demoAuthor/g, author.trim())
            const path = process.cwd() + '/package.json'
            /* Write the file (Buffer.from replaces the deprecated new Buffer) */
            fs.writeFile(path, Buffer.from(json), () => {
                utils.green('Create file: ' + path)
                resolve()
            })
        })
    })
}

As shown above, this step is as simple as reading the template's package.json, applying the replacements, and writing package.json into the target directory. The actual file copying then happens inside the promise returned by revisePackageJson.
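The replacement step on its own can be sketched as a pure function (my own distillation of revisePackageJson's string handling):

```javascript
// Replace the demoName / demoAuthor placeholders with the user's answers,
// trimming stray whitespace from the prompt input.
function fillTemplate(json, { name, author }) {
  return json
    .replace(/demoName/g, name.trim())
    .replace(/demoAuthor/g, author.trim())
}
```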

Step 3: Copy the files

let fileCount = 0  /* number of files */
let dirCount = 0   /* number of folders */
let flat = 0       /* number of pending readdir calls */

/**
 * @param {*} sourcePath  // template resource path
 * @param {*} currentPath // current project path
 * @param {*} cb          // callback invoked when the whole copy completes
 */
function copy(sourcePath, currentPath, cb) {
    flat++
    /* Read the entries under the folder */
    fs.readdir(sourcePath, (err, paths) => {
        flat--
        if (err) {
            throw err
        }
        paths.forEach(path => {
            if (path !== '.git' && path !== 'package.json') fileCount++
            const newSourcePath = sourcePath + '/' + path
            const newCurrentPath = currentPath + '/' + path
            /* Inspect the entry's type */
            fs.stat(newSourcePath, (err, stat) => {
                if (err) {
                    throw err
                }
                /* A regular file that is not package.json */
                if (stat.isFile() && path !== 'package.json') {
                    /* Create read/write streams */
                    const readStream = fs.createReadStream(newSourcePath)
                    const writeStream = fs.createWriteStream(newCurrentPath)
                    readStream.pipe(writeStream)
                    color.green('Create file: ' + newCurrentPath)
                    fileCount--
                    completeControl(cb)
                /* A folder: delegate to dirExist */
                } else if (stat.isDirectory()) {
                    if (path !== '.git' && path !== 'package.json') {
                        dirCount++
                        dirExist(newSourcePath, newCurrentPath, copy, cb)
                    }
                }
            })
        })
    })
}

/**
 * @param {*} sourcePath   // template resource path
 * @param {*} currentPath  // current project path
 * @param {*} copyCallback // the copy function above
 * @param {*} cb           // callback invoked when the whole copy completes
 */
function dirExist(sourcePath, currentPath, copyCallback, cb) {
    fs.exists(currentPath, (ext) => {
        if (ext) {
            /* The folder exists: recurse into it */
            copyCallback(sourcePath, currentPath, cb)
        } else {
            fs.mkdir(currentPath, () => {
                fileCount--
                dirCount--
                copyCallback(sourcePath, currentPath, cb)
                color.yellow('Create folder: ' + currentPath)
                completeControl(cb)
            })
        }
    })
}

Readdir reads the entries of the template folder, and fs.stat tells us each entry's type. If it is a regular file, we copy it through fs.createReadStream piped into fs.createWriteStream; if it is a folder, we check whether the target folder exists: if it does, we recursively call copy on its contents, and if not, we create it first and then recurse. Note that because package.json is handled separately, every file operation here excludes package.json. We also need to download dependencies automatically once the whole project has been copied.

Tip: three counters control the asynchronous I/O

Everything above rests on asynchronous fs operations, and our copy is deeply recursive, which raises a question: how do we know when every file has finished copying? With a file tree of unknown depth and size, this is hard to express with promises and the usual async tools. Rather than pull in a third-party async flow library, we use counters to decide whether all files have been copied.

flat: readdir reads the entries of a folder. We record each readdir call with flat++ and execute flat-- when it completes.

fileCount: each time we encounter an entry (file or folder) we record it with fileCount++; when the file (or folder) has been created, we execute fileCount--.

dirCount: each time a new folder is needed we record it with dirCount++; when the folder has been created, we execute dirCount--.


let isInstall = false /* has the install step been triggered? */

function completeControl(cb) {
    /* If all three counters are 0, all asynchronous I/O has completed. */
    if (fileCount === 0 && dirCount === 0 && flat === 0) {
        color.green('------ build complete -------')
        if (cb && !isInstall) {
            isInstall = true
            color.blue('----- begin to install -----')
            cb(() => {
                color.blue('----- finish the install -----')
                /* Next: check that webpack exists and run the project */
                runProject()
            })
        }
    }
}

After every file or folder creation event we call completeControl; when flat, fileCount, and dirCount are all 0, the whole copy process is known to be complete and we can move on to the next step.
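The counting idea generalizes to a tiny latch; a sketch (not the article's code) that fires a callback once every started operation has finished:

```javascript
// Counter latch: call start() before kicking off an async operation and
// finish() in its callback; done fires when the count returns to zero.
function makeLatch(done) {
  let pending = 0
  return {
    start() { pending++ },
    finish() {
      pending--
      if (pending === 0) done()
    }
  }
}
```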

The effect

The create-project phase is complete.

3 Build and integration phase

In this stage we complete two pieces of work:

Part 1: having copied the whole project, we now need to download dependencies and run the project.

Part 2: we have only finished the mycli create flow; mycli start (run the project) and mycli build (package and compile) are still to come. Let's take them step by step.

1 Parsing commands and running them automatically

Earlier we started our program by declaring bin and letting the commander module react to terminal input. Now we need the opposite direction: executing terminal commands from node.js code. The motivation is that after copying the whole project directory we want to run npm install and npm start automatically.

First we created npm.js under the src folder of the mycli scaffolding project to handle dependency installation and project startup.

① The which module helps find npm

Like the Unix which utility, it finds the first instance of the specified executable in the PATH environment variable. Results are not cached, so no hash -r is needed when PATH changes. In other words, we can locate the npm executable and drive it from code.

Example 🌰 🌰 🌰 :

var which = require('which')

// Asynchronous usage
which('node', function (er, resolvedPath) {
  // er is set if no "node" was found on the PATH
  // otherwise resolvedPath is the absolute path of the executable
})
// Synchronous usage
const resolved = which.sync('node')

npm.js

const which = require('which')
/* Find npm */
function findNpm() {
  var npms = process.platform === 'win32' ? ['npm.cmd'] : ['npm']
  for (var i = 0; i < npms.length; i++) {
    try {
      which.sync(npms[i])
      console.log('use npm: ' + npms[i])
      return npms[i]
    } catch (e) {
      /* not found, try the next candidate */
    }
  }
  throw new Error('please install npm')
}

② child_process.spawn Runs the terminal command

Having found npm, we run the actual command with child_process.spawn.

child_process.spawn(command[, args][, options])

command: the command to run.

args: a list of string arguments.

options: configuration options.

/**
 * @param {*} cmd  the command to run
 * @param {*} args the argument list
 * @param {*} fn   callback invoked when the command finishes
 */
/* Run a terminal command */
function runCmd(cmd, args, fn) {
  args = args || []
  var runner = require('child_process').spawn(cmd, args, {
    stdio: 'inherit'
  })
  runner.on('close', function (code) {
    if (fn) {
      fn(code)
    }
  })
}


③ Assembling the npm method

Next we combine steps ① and ② so that npm.js exposes a single npm method.

/**
 * @param {*} installArg an array of command-line arguments; defaults to ['install']
 */
module.exports = function (installArg = ['install']) {
  /* From step ①, the closure keeps hold of npm */
  const npm = findNpm()
  return function (done) {
    /* Execute the command */
    runCmd(which.sync(npm), installArg, function () {
      /* Success callback */
      done && done()
    })
  }
}

Example 🌰🌰

const npm = require('./npm')

/* npm install */
const install = npm()
install()

/* npm start */
const start = npm(['start'])
start()

④ Completing automatic installation and starting the project

So what exactly was the callback cb in our earlier copy of the project? Careful readers will have spotted it.

const npm = require('./npm')
copy(sourcePath, process.cwd(), npm())

The cb passed to copy is the function returned by npm(), i.e. the one that executes npm install.

Following on from a successful copy: once the three counters confirm the project was created, we start installing it.

function completeControl(cb) {
    if (fileCount === 0 && dirCount === 0 && flat === 0) {
        color.green('------ build complete -------')
        if (cb && !isInstall) {
            isInstall = true
            color.blue('----- begin to install -----')
            /* Install the project's dependencies */
            cb(() => {
                color.blue('----- finish the install -----')
                runProject()
            })
        }
    }
}

In the callback that fires when the dependencies install successfully, we call runProject to start the project.

function runProject() {
    try {
        /* npm start */
        const start = npm(['start'])
        start()
    } catch (e) {
        color.red('Automatic start failed, please start the project manually via npm start')
    }
}

Effect: the run phase is not shown in the video because installing dependencies takes too long.

The runProject code is simple: it calls npm again to execute the npm start command.

So far we have implemented the whole flow of creating a project, installing dependencies, and running the project through mycli create. Details such as webpack integration and process communication are covered next.

2 Create child processes to communicate with each other

We have now covered the details and implementation of mycli create. Next we implement mycli start and mycli build.

(1) Dual-process solution

We intend to use webpack as the scaffold's build tool, so the mycli main process creates a child process to manage webpack: merging webpack configuration items, running webpack-dev-server, and so on. Note that the main process lives in the globally installed mycli scaffolding, while the child process runs from node_modules inside the react project newly created by mycli create. For that we wrote a plugin that, on one hand, communicates with the mycli process and, on the other, manages the react project's configuration and webpack.

To help you understand, here is a flow chart.

mycli-react-webpack-plugin is installed into the new project's node_modules when the dependencies are installed.

mycli start and mycli build

Step 1: flesh out mycli start and mycli build

Next we create start.js under the mycli scaffolding's src folder to establish process communication with the plugin mentioned above. Since both mycli start and mycli build need to drive webpack, we handle them together. Let's continue to flesh out the mycli start and mycli build commands in mycli.js.

Let’s continue to improve mycli start and mycli build directives in mycli.js.

const start = require('../src/start')

/* mycli start : run the project */
program
    .command('start')
    .description('start a project')
    .action(function () {
        green('-------- run project -------')
        /* Run the project */
        start('start').then(() => {
            green('------- ✅✅ run complete -------')
        })
    })

/* mycli build : package the project */
program
    .command('build')
    .description('build a project')
    .action(function () {
        green('-------- build project -------')
        /* Package the project */
        start('build').then(() => {
            green('------- ✅✅ build complete -------')
        })
    })

Step 2: Start. js process communication

An introduction to child_process.fork

modulePath: the module the child process runs.

Parameter notes (parameters already covered are not repeated here):

execPath: the executable used to create the child process, /usr/local/bin/node by default; in other words, execPath lets you point at a specific node binary. execArgv: a list of string arguments passed to the executable, defaulting to process.execArgv, i.e. the same as the parent process. silent: false by default, meaning the child's stdio is inherited from the parent; if true, stdin/stdout are piped directly to child.stdin, child.stdout, and so on. stdio: if stdio is declared, it overrides the silent option.

Running the child program

We start the child process in start.js to establish communication with the mycli-react-webpack-plugin described above. Let's look at start.js.

start.js

'use strict';
/* Start the project */
const child_process = require('child_process')
const chalk = require('chalk')
const fs = require('fs')
/* Locate mycli-react-webpack-plugin */
const currentPath = process.cwd() + '/node_modules/mycli-react-webpack-plugin'

/**
 * @param {*} type type = 'start' runs the project locally; type = 'build' packages it for production
 */
module.exports = (type) => {
    return new Promise((resolve, reject) => {
        /* Check that mycli-react-webpack-plugin exists */
        fs.exists(currentPath, (ext) => {
            if (ext) { /* It exists: fork the child process */
                const children = child_process.fork(currentPath + '/index.js')
                /* Listen for messages from the child process */
                children.on('message', (message) => {
                    const msg = JSON.parse(message)
                    if (msg.type === 'end') {
                        /* Close the child process */
                        children.kill()
                        resolve()
                    } else if (msg.type === 'error') {
                        /* Close the child process */
                        children.kill()
                        reject()
                    }
                })
                /* Send the cwd path and the operation type: start or build */
                children.send(JSON.stringify({
                    cwdPath: process.cwd(),
                    type: type || 'build'
                }))
            } else { /* It does not exist: warn the user to install it */
                console.log(chalk.red('mycli-react-webpack-plugin does not exist, please install mycli-react-webpack-plugin'))
            }
        })
    })
}

This step is fairly simple and has roughly two parts:

1 Check whether mycli-react-webpack-plugin exists. If it does, fork index.js inside mycli-react-webpack-plugin as a child process; if not, print a warning asking the user to install the plugin.

2 Bind the child process's message event, and send it an instruction saying whether to start or build the project.
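The message protocol itself is tiny; a sketch (my own restatement of the JSON messages used above) of how the parent interprets what the child sends:

```javascript
// Child messages are JSON strings; 'end' means resolve the start/build
// promise, 'error' means reject it, anything else is ignored.
function interpretChildMessage(message) {
  const msg = JSON.parse(message)
  if (msg.type === 'end') return 'resolve'
  if (msg.type === 'error') return 'reject'
  return 'ignore'
}
```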

③ mycli-react-webpack-plugin

Now we let mycli-react-webpack-plugin finish the project's configuration and build process.

1 Project Structure

The mycli-react-webpack-plugin project file structure

The project directory looks something like this: the config folder holds the base configuration files for the different build environments. During a build, the production or development mycli.config.js configuration items of the newly created project are read and merged in.

Our newly created project's mycli.config.js

2 Entry File

const RunningWebpack = require('./lib/run')

/* Create a runner that executes webpack with the configuration for each environment */

/* Instantiate RunningWebpack */
const runner = new RunningWebpack()

process.on('message', message => {
    const msg = JSON.parse(message)
    if (msg.type && msg.cwdPath) {
        runner.listen(msg).then(
            () => {
                /* Build complete: notify the main process, which terminates the child process */
                process.send(JSON.stringify({ type: 'end' }))
            },
            (error) => {
                /* An error occurred: notify the main process, which terminates the child process */
                process.send(JSON.stringify({ type: 'error', error }))
            }
        )
    }
})

Here RunningWebpack performs the series of webpack startup and packaging operations.

Merge configuration items to automatically start WebPack.

① RunningWebpack, based on EventEmitter

RunningWebpack is built on node.js's EventEmitter module; it handles asynchronous I/O by firing the appropriate webpack command (start or build) in the right scenario.

An introduction to EventEmitter

Many asynchronous I/O operations in node.js emit an event onto the event queue when they complete.

Many objects in Node.js distribute events: a net.Server object fires an event every time a new connection is made, and a fs.readStream object fires an event every time a file is opened. All of these event-generating objects are instances of Events.EventEmitter.

Simple usage

// event.js
var EventEmitter = require('events').EventEmitter;
var event = new EventEmitter();
event.on('some_event', function () {
    console.log('some_event triggered');
});
setTimeout(function () {
    event.emit('some_event');
}, 1000);

② Merging webpack configuration items

Having introduced EventEmitter as the event model for running webpack, let's look at what it takes to run the entry file.

runner.listen(msg).then

const merge = require('./merge')
const webpack = require('webpack')
const runMergeGetConfig = require('../config/webpack.base')

/* Accept webpack commands in different states and merge configurations */
listen({ type, cwdPath }){
   this.path = cwdPath
   this.type = type
   /* Merge configuration items to get the final webpack configuration */
   this.config = merge.call(this, runMergeGetConfig(cwdPath)(type))
   return new Promise((resolve, reject) => {
       this.emit('running', type)
       this.once('error', reject)
       this.once('end', resolve)
   })
}

type is the webpack command sent from the main process, either start or build. cwdPath is the absolute path of the directory where the terminal command was entered. The next thing we need to do is read the mycli.config.js of the newly created project and merge it with our default configuration.

runMergeGetConfig

runMergeGetConfig returns the corresponding webpack base configuration for the environment we pass in (start or build). Let's take a look at what runMergeGetConfig does.

const merge = require('webpack-merge')
module.exports = function(path){
  return type => {
    if (type === 'start') {
      return merge(Appconfig(path), devConfig(path))
    } else {
      return merge(Appconfig(path), proConfig)
    }
  }
}

runMergeGetConfig simply merges the base configuration with the dev or pro environment configuration to get the scaffold's base configuration, which is then merged with the custom configuration items in the mycli.config.js file. Let's see how.
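webpack-merge merges configurations from left to right: later values win for scalars, arrays such as plugins are concatenated, and nested objects are merged recursively. A simplified hand-rolled sketch of that rule (not the real webpack-merge implementation) looks like this:

```javascript
// Simplified sketch of webpack-merge's rule (not the real implementation):
// later configs win for scalars, arrays are concatenated, objects recurse.
function simpleMerge(...configs){
  return configs.reduce((acc, conf) => {
    for (const key of Object.keys(conf)) {
      const a = acc[key]
      const b = conf[key]
      if (Array.isArray(a) && Array.isArray(b)) {
        acc[key] = a.concat(b)       // e.g. plugins from both configs
      } else if (a && b && typeof a === 'object' && typeof b === 'object') {
        acc[key] = simpleMerge(a, b) // recurse into devServer, resolve, ...
      } else {
        acc[key] = b                 // later config wins
      }
    }
    return acc
  }, {})
}

const base = { mode: 'none', devServer: { port: 3000 }, plugins: ['A'] }
const dev  = { mode: 'development', devServer: { port: 8888 }, plugins: ['B'] }
console.log(simpleMerge(base, dev))
// → { mode: 'development', devServer: { port: 8888 }, plugins: [ 'A', 'B' ] }
```

This is why the order of arguments to merge matters: the environment-specific config is passed last so it overrides the base.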

merge

Let's look at merge.js in the lib folder of mycli-react-webpack-plugin.

const fs = require('fs')
const merge = require('webpack-merge')


/* Merge configurations */
function configMegre(Pconf, config){
   const {
      dev = Object.create(null),
      pro = Object.create(null),
      base = Object.create(null)
   } = Pconf
   if (this.type === 'start') {
      return merge(config, base, dev)
   } else {
      return merge(config, base, pro)
   }
}

/**
 * @param {*} config the scaffold base configuration obtained by runMergeGetConfig
 */
function megreConfig(config){
   const targetPath = this.path + '/mycli.config.js'
   const isExi = fs.existsSync(targetPath)
   if (isExi) {
      /* Get the developer's custom configuration */
      const perconfig = require(targetPath)
      const mergeConfigResult = configMegre.call(this, perconfig, config)
      return mergeConfigResult
   }
   /* Return the final webpack configuration */
   return config
}

module.exports = megreConfig

This step is actually quite simple: take the developer's custom configuration, merge it with the scaffolding's default configuration to get the final configuration, and return it to our running instance.

③ Automatically starting webpack

The next thing we do is launch webpack. The production environment is relatively simple: just webpack(config). In the development environment, we need webpack-dev-server to set up a server and serve the project, so we handle it separately. First the development config is passed to webpack to get a compiler, then the dev-server service is started with that compiler as a parameter; it listens on the port we configured, completing the whole process.

    const Server = require('webpack-dev-server/lib/Server')
    const webpack = require('webpack')
    const processOptions = require('webpack-dev-server/lib/utils/processOptions')
    const yargs = require('yargs')

    /* Run production webpack */
    build(){
        try {
            webpack(this.config, (err) => {
               if (err) {
                  /* An error occurred */
                  this.emit('error')
               } else {
                  /* Finished */
                  this.emit('end')
               }
            })
        } catch (e) {
            this.emit('error')
        }
    }
    /* Run development-environment webpack */
    start(){
        const _this = this
        processOptions(this.config, yargs.argv, (config, options) => {
            /* Obtain the webpack compiler */
            const compiler = webpack(config)
            /* Create the dev-server service */
            const server = new Server(compiler, options)
            /* options.port is the listening port set in webpack.dev.js */
            server.listen(options.port, options.host, (err) => {
              if (err) {
                _this.emit('error')
                throw err
              }
            })
        })
    }

④ Effect display

mycli start

mycli build

The complete code



const EventEmitter = require('events').EventEmitter
const Server = require('webpack-dev-server/lib/Server')
const processOptions = require('webpack-dev-server/lib/utils/processOptions')
const yargs = require('yargs')

const merge = require('./merge')
const webpack = require('webpack')
const runMergeGetConfig = require('.. /config/webpack.base')

/** * run webpack */ in different environments
class RunningWebpack extends EventEmitter{
    
    /* Bind the running method */
    constructor(options){
       super(a)this._options = options
       this.path = null
       this.config = null
       this.on('running'.(type,... arg) = >{
           this[type] && this[ type ](... arg) }) }/* Accept webpack commands in different states
    listen({ type,cwdPath }){
       this.path = cwdPath
       this.type = type
       this.config = merge.call(this,runMergeGetConfig( cwdPath )(type))
       return new Promise((resolve,reject) = >{
           this.emit('running',type)
           this.once('error',reject)
           this.once('end',resolve)
       })
    }
    /* Run production webpack */
    build(){
        try{
            webpack(this.config,(err) = >{
               if(err){
                  this.emit('error')}else{
                  this.emit('end')}}}catch(e){
            this.emit('error')}}/* Run development environment webpack */
    start(){
        const _this = this
        processOptions(this.config,yargs.argv,(config,options) = >{
            const compiler = webpack(config)
            const server = new Server(compiler , options )

            server.listen(options.port, options.host, (err) = > {
              if (err) {
                _this.emit('error')
                throwerr; }})})})module.exports = RunningWebpack
Copy the code

4. Run the project, implement a plugin, and collect models automatically

Next we will talk about the project run phase, some additional configuration items, and other operations.

1 Implement a simple terminal loading-bar plugin

We write a webpack plugin for the mycli scaffolding tool that shows developers the changed files and the webpack build time; the entire plugin works during webpack's compile phase. First we need a brief introduction to webpack plugins.

① Compiler and Compilation

The two most commonly used objects when developing a Plugin are Compiler and Compilation, which are the bridge between the Plugin and Webpack. Compiler and Compilation have the following meanings:

The Compiler object contains all configuration information for the webpack environment, including options, loaders, and plugins. It is instantiated when webpack starts, is globally unique, and can be loosely understood as the webpack instance.

The Compilation object contains the current module resources, the compiled generated assets, and the changed files. When webpack runs in development mode, a new Compilation is created each time a file change is detected. The Compilation object also provides many event callbacks for plugin extension, and the Compiler object can be read through a Compilation.

The difference between Compiler and Compilation is that Compiler represents the entire webpack lifecycle from startup to shutdown, whereas Compilation represents just one new compilation.
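Webpack's hooks are built on the tapable library. A minimal re-implementation of a synchronous hook, written here purely for illustration (this is not tapable's actual code), shows how tap registration and the lifecycle calls fit together:

```javascript
// Minimal re-implementation of a synchronous tapable-style hook,
// for illustration only (webpack really uses the tapable library).
class MiniSyncHook {
  constructor(){ this.taps = [] }
  // plugins register callbacks under a plugin name
  tap(pluginName, fn){ this.taps.push({ pluginName, fn }) }
  // the build tool calls the hook at the matching lifecycle stage
  call(...args){ this.taps.forEach(t => t.fn(...args)) }
}

// A toy "compiler" exposing hooks the way webpack does:
const compiler = { hooks: { compile: new MiniSyncHook(), done: new MiniSyncHook() } }

// A plugin taps the hooks it cares about:
compiler.hooks.compile.tap('DemoPlugin', () => console.log('compile starts'))
compiler.hooks.done.tap('DemoPlugin', () => console.log('compile done'))

// Webpack fires each hook when the corresponding stage is reached:
compiler.hooks.compile.call()
compiler.hooks.done.call()
```

This is the shape behind compiler.hooks.compile.tap(...) in the plugin we write below: a plugin's apply method taps hooks, and webpack calls them at each stage.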

② Compiler compilation phases

We need to understand what each stage of the Compiler does, so that we can use the right hooks to complete our custom plugin at a specific stage.

1 run

Start a new build

2 watch-run

This is similar to run, except that it starts compilation in listening mode. In this event, you can obtain which files have changed to cause a new compilation to be restarted.

3 compile

This event tells the plug-in that a new build is about to start and brings the compiler object to the plug-in.

4 compilation

When Webpack is running in development mode, a new Compilation is created each time a file change is detected. A Compilation object contains current module resources, compiled generated resources, changed files, and so on. The Compilation object also provides a number of event callbacks for plug-in extensions.

5 make

After a new Compilation is created, files are read from the entry and compiled with the configured loader according to file type. After compilation, the dependencies of each file are found and recursively compiled and parsed.

6 after-compile

The Compilation operation is complete.

7 invalid

This event is triggered when an exception such as a nonexistent file or a file compilation error is encountered and does not cause Webpack to exit.

③ Write plug-ins

The WebPack plug-in we wrote needed to print out the current changed file and use a progress bar to show the compile time.

The code:

const chalk = require('chalk')
var slog = require('single-line-log');

class MycliConsolePlugin {

    constructor(options){
       this.options = options
    }
    apply(compiler){
        /* Listen for file changes */
        compiler.hooks.watchRun.tap('MycliConsolePlugin', (watching) => {
            const changeFiles = watching.watchFileSystem.watcher.mtimes
            for (let file in changeFiles) {
                console.log(chalk.green('Currently changed file: ' + file))
            }
        })
        /* Before a new compilation is created */
        compiler.hooks.compile.tap('MycliConsolePlugin', () => {
            this.beginCompile()
        })
        /* One compilation is done */
        compiler.hooks.done.tap('MycliConsolePlugin', () => {
            this.timer && clearInterval(this.timer)
            console.log(chalk.yellow('Compile done'))
        })
    }
    /* Start timing the compilation */
    beginCompile(){
       const lineSlog = slog.stdout
       let text = 'Start compiling:'

       this.timer = setInterval(() => {
          text += '█'
          lineSlog(chalk.green(text))
       }, 50)
    }
}
module.exports = MycliConsolePlugin

Usage

Since this plugin is in a development environment, we only need to add MycliConsolePlugin to webpack.dev.js.

const webpack = require('webpack')
const MycliConsolePlugin = require('../plugins/mycli-console-pulgin')
const devConfig = (path) => {
  return {
    devtool: 'cheap-module-eval-source-map',
    mode: 'development',
    devServer: {
      contentBase: path + '/dist',
      open: true, /* Automatically open the browser */
      hot: true,
      historyApiFallback: true,
      publicPath: '/',
      port: 8888, /* Server port */
      inline: true,
      proxy: { /* Proxy server */ }
    },
    plugins: [
      new webpack.HotModuleReplacementPlugin(),
      new MycliConsolePlugin({
        dec: 1
      })
    ]
  }
}
module.exports = devConfig

The effect

2 require.context implements front-end automation

Front-end automation goes beyond mycli itself, but to give you an idea of the front-end automation process, let's use the webpack API require.context as an example.

require.context

require.context(directory, useSubdirectories = true, regExp = /^\.\/.*$/, mode = 'sync');

You can give this function three arguments: ① directory — the directory to search; ② useSubdirectories — whether to search subdirectories; ③ regExp — a regular expression to match files.

Webpack parses calls to require.context() during the build.

Examples from the official site:

/* (creates) a context with files from the test directory whose request ends with `.test.js` */
require.context('./test', false, /\.test\.js$/);

/* (creates) a context with all files in the parent folder and its descendant folders whose request ends with `.stories.js` */
require.context('../', true, /\.stories\.js$/);

Automate

We will use a project created by mycli as a demo. We create a model folder under the project's src folder, containing demo.ts, demo1.ts, and demo2.ts. The next thing we do is automatically collect data from these files.

Project directory

demo.ts

const a = 'demo'

export default a

demo1.ts

const b = 'demo1'

export default b

demo2.ts

const b = 'demo2'

export default b

Exploring require.context

const file = require.context('./model', false, /\.tsx?|jsx?$/)
console.log(file)

Printing file, we find a webpack-generated function. Next we get the array of file names.

const file = require.context('./model', false, /\.tsx?|jsx?$/)
console.log(file.keys())

Next, we automatically collect the default exports of these files.

/* Used to collect files */
const model = {}
const file = require.context('./model', false, /\.tsx?|jsx?$/)

/* Traverse the files */
file.keys().map(item => {
    /* Collect data */
    model[item] = file(item).default
})

console.log(model)

Here we implement the automatic collection process. For deeper recursive collection, we can set the second argument of require.context to true.

require.context('./model', true, /\.tsx?|jsx?$/)

Project directory

demo3.ts

const d = 'demo3'

export default d

Printing model shows that files in subfolders are now recursively collected as well.

5. Summary

Technical summary

The techniques included in the entire custom scaffold are the ones listed in the preface.

The source address

rux-cli scaffold

rux-react-webpack-plugin

Interested students can try to write their own scaffolding, the process will learn a lot of knowledge.

As the saying goes, give someone roses and the fragrance stays on your hand. If you enjoyed this article, please give the author a like and a follow. I will keep updating front-end articles.

Friends who found this useful can follow the author's public account, Front-end Sharing, for more good articles.

Reference documentation

1. commander.js Chinese documentation (essential for CLIs)

2. Webpack official documentation

3. Webpack Chinese documentation