An automated build converts development-phase source code into code that can run in a production environment. The whole process is usually called an automated build workflow. Its role is to keep the development phase as far as possible from the constraints of the runtime environment, so that during development we can use more efficient syntax, specifications, and standards.

NPM Scripts is the simplest way to automate the build workflow

"Scripts ": {"build": "sass SCSS /main/ SCSS CSS /style. CSS --watch", // watch monitor file changes "preserve":" NPM run build" "serve": "Browser-sync. --files \" CSS /*.css\", // browser-sync starts a Web server "start": "run-p build serve" // need npm-run-all},Copy the code

Common automated build tools

Grunt

Grunt's build process is relatively slow because it is based on temporary files. Each step of the build reads from and writes to disk: every operation generates a temporary file, which is then read by the next operation.

Basic use of grunt

1. Create an empty folder, run `npm init -y` to initialize the directory, then `npm i grunt` to add the grunt module, and `npm install -g grunt-cli` for the command-line tool (if you use yarn, this global install is not required).

2. gruntfile.js is Grunt's entry file. It defines the tasks that Grunt should run automatically. The file exports a function that receives a `grunt` parameter, which provides the APIs used to create tasks.

By default, `grunt.registerTask` is used to register a task. The first parameter is the task name and the second is the corresponding callback function.

If three arguments are passed, the second one is a description of the task. If the task name is `default`, it becomes Grunt's default task, and it runs when you simply type `grunt` on the command line. We can also use `default` to map to other tasks: pass an array as the second argument, and Grunt will execute the named tasks in turn.

3. Grunt task code is synchronous by default. If you want to perform asynchronous operations, you must call `this.async()` to get a callback and invoke it when the work finishes (see the combined sketch below).
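A minimal gruntfile.js sketch pulling the points above together; the task names and the timeout are illustrative:

```js
// a minimal gruntfile.js sketch; task names are illustrative
module.exports = grunt => {
  // two arguments: task name + callback
  grunt.registerTask('foo', () => {
    console.log('foo task working')
  })

  // three arguments: the second one is a description of the task
  grunt.registerTask('bar', 'a task with a description', () => {
    console.log('bar task working')
  })

  // 'default' runs when you type `grunt` with no task name;
  // an array as the second argument maps it to other tasks, executed in turn
  grunt.registerTask('default', ['foo', 'bar'])

  // asynchronous task: use a regular function so that `this` is the task context
  grunt.registerTask('async-task', function () {
    const done = this.async()   // tells grunt the task is asynchronous
    setTimeout(() => {
      console.log('async task working')
      done()                    // marks the task as finished
    }, 1000)
  })
}
```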

Marking a Grunt task as failed

When an error occurs in a task's logic, the task can be marked as failed by returning false from the function body.
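A minimal sketch of a failing synchronous task (the task name is illustrative):

```js
grunt.registerTask('bad', () => {
  console.log('bad task working')
  return false   // returning false marks the task as failed
})
```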

  • If the failed task is part of a task list, the subsequent tasks will not be executed. Running `grunt default --force` makes subsequent tasks continue even when a task fails.
  • For an asynchronous task, we cannot use `return false` to indicate failure; instead, pass `false` to the `done` callback. The specific code is as follows:
```js
grunt.registerTask('bad-async', function () {
  const done = this.async()
  setTimeout(() => {
    console.log('async task working')
    done(false)   // passing false marks the async task as failed
  }, 1000)
})
```
How to configure Grunt

Grunt also provides an API for adding configuration options: initConfig.

```js
module.exports = grunt => {
  grunt.initConfig({
    foo: 'bar'
  })

  grunt.registerTask('foo', () => {
    console.log(grunt.config('foo'))   // => 'bar'
  })
}
```

The execution result: running `grunt foo` prints `bar`.

More often, we fetch the entire config object via `grunt.config()` and then read the value from it.
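A minimal sketch of that pattern, using the same configuration as above:

```js
module.exports = grunt => {
  grunt.initConfig({ foo: 'bar' })

  grunt.registerTask('foo', () => {
    const config = grunt.config()   // fetch the whole config object
    console.log(config.foo)         // then read the value from it => 'bar'
  })
}
```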

Grunt multi-target tasks

In addition to the normal task mode, Grunt also supports a multi-target task mode, which can be understood as sub-tasks: the multi-target mode lets a task form multiple sub-tasks (targets) according to its configuration.

```js
module.exports = grunt => {
  grunt.initConfig({
    build: {
      options: {       // the information under options is used as configuration options for the task
        foo: 'bar'
      },
      css: '2',        // apart from options, every other key becomes a target (sub-task)
      js: '2'
    }
  })

  grunt.registerMultiTask('build', function () {
    // a regular function is needed here so that `this` points to the task context
    console.log('build task')
    console.log(`target: ${this.target}, data: ${this.data}`)
    console.log(this.options())
  })
}
```

The execution result: running `grunt build` executes the task once for each target (css and js), printing the target name, its data, and the options each time.

Using Grunt plugins

1. Install the plugin first, for example `npm i grunt-contrib-clean --dev`; official plugins follow the grunt-contrib-xxxx naming convention.

```js
module.exports = grunt => {
  grunt.initConfig({
    clean: {
      temp: 'temp/app.js'   // a wildcard such as 'temp/*.txt' also works
    }
  })

  grunt.loadNpmTasks('grunt-contrib-clean')   // load the tasks provided by this plugin
}

// run `grunt clean`, and app.js is deleted
```

2. In gruntfile.js, `grunt.loadNpmTasks('plugin-name')` loads the tasks provided by the plugin. Add configuration options for that task in `grunt.initConfig()`, and the plugin will work.

Popular Grunt plugins
  1. grunt-sass: install with `npm i grunt-sass sass --dev` (a combined configuration sketch follows this list)
  2. grunt-babel (with @babel/core and @babel/preset-env): provides a babel task that can be used in the gruntfile
  3. As the gruntfile becomes more and more complicated and the loadNpmTasks calls pile up, the load-grunt-tasks module can reduce the use of loadNpmTasks

Method of use

```js
const loadGruntTasks = require('load-grunt-tasks')

loadGruntTasks(grunt)
```

All tasks provided by the installed Grunt plugins are loaded automatically — this is what load-grunt-tasks does.

  4. grunt-contrib-watch: watches for file modifications and recompiles automatically
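As an illustration, a hedged sketch that combines the plugins listed above; the file paths and target names are assumptions, not part of the original article:

```js
const sass = require('sass')
const loadGruntTasks = require('load-grunt-tasks')

module.exports = grunt => {
  grunt.initConfig({
    sass: {
      options: { implementation: sass },   // grunt-sass requires a Sass implementation to be passed in
      main: {
        files: { 'dist/css/main.css': 'src/scss/main.scss' }   // output : input
      }
    },
    babel: {
      options: { presets: ['@babel/preset-env'] },
      main: {
        files: { 'dist/js/app.js': 'src/js/app.js' }
      }
    },
    watch: {
      js: { files: ['src/js/*.js'], tasks: ['babel'] },
      css: { files: ['src/scss/*.scss'], tasks: ['sass'] }
    }
  })

  loadGruntTasks(grunt)   // loads grunt-sass, grunt-babel and grunt-contrib-watch automatically

  grunt.registerTask('default', ['sass', 'babel', 'watch'])
}
```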

That's all for Grunt. Since it is not used much any more, this is only a quick introduction; if you want more, see the API documentation on the official website.

Gulp

Gulp is a streaming build system: files are processed as streams in memory rather than through temporary files on disk.

This avoids Grunt's slow temporary-file builds, and Gulp allows multiple tasks to be executed at the same time by default.

Basic use of Gulp

  1. `npm init -y` and `npm i gulp --dev`: first install gulp as a development dependency of your project
  2. Add a gulpfile.js to the project root. It is gulp's entry file, and the build tasks to be executed are written in it
  3. Run these tasks from the command line using the commands provided by the gulp CLI
```js
// gulpfile.js
const gulp = require('gulp')

// first form: gulp.task (older API)
gulp.task('bar', done => {
  console.log('bar working')
  done()
})

// the second, currently preferred form: exported functions
exports.foo = done => {
  console.log('foo working')
  done()
}

exports.default = done => {
  console.log('default working')
  done()
}
```
Creating composite tasks with Gulp

Gulp provides two functions, series and parallel, which handle serial and parallel task composition respectively.

```js
const { series, parallel } = require('gulp')

const task1 = done => {
  setTimeout(() => {
    console.log('task1 working');
    done()
  }, 1000);
}
const task2 = done => {
  setTimeout(() => {
    console.log('task2 working');
    done()
  }, 1000);
}
const task3 = done => {
  setTimeout(() => {
    console.log('task3 working');
    done()
  }, 1000);
}

exports.foo = series(task1,task2,task3)
exports.bar = parallel(task1,task2,task3)
```

Three ways to do Gulp asynchronous tasks

  1. Callback functions, promises, and async/await. The callback is error-first: if the current task fails, subsequent tasks will not be executed
  2. stream
```js
const fs = require('fs')

// callback-style asynchronous tasks (error-first)
exports.callback = done => {
  console.log('callback')
  done()
}

exports.callback_error = done => {
  console.log('callback')
  done(new Error('task failed'))
}

// promise-based tasks
exports.promise = () => {
  console.log('promise')
  return Promise.resolve()   // the resolved value is ignored by gulp
}

exports.promise_error = () => {
  console.log('promise_error')
  return Promise.reject(new Error('task failed'))
}

// async/await
const timeout = time => {
  return new Promise(resolve => {
    setTimeout(resolve, time)
  })
}

exports.async = async () => {
  await timeout(1000)
  console.log('async task')
}

// stream-based task
exports.stream = () => {
  const readStream = fs.createReadStream('package.json')
  const writeStream = fs.createWriteStream('temp.txt')
  // pipe is similar to pouring water from one pool into another
  readStream.pipe(writeStream)
  // return the readStream: gulp receives this stream and registers an 'end' event on it
  return readStream
}

// roughly what gulp does internally with the returned stream
exports.stream2 = done => {
  const readStream = fs.createReadStream('package.json')
  const writeStream = fs.createWriteStream('temp.txt')
  readStream.pipe(writeStream)
  readStream.on('end', () => {
    done()
  })
}
```

Gulp receives the stream and registers an end event for the readStream to listen for the end of the task, so there is no need to call done to finish the task

Core working principle of the gulp build process

The build process is mostly just reading files, doing some transformation, and finally writing the result to another location. There are three core concepts: the read stream, the transform stream, and the write stream.

We read the files we need through a read stream, convert them to the desired result with the transform logic of a transform stream, and then write the output to the target location through a write stream.

```js
const fs = require('fs')
const { Transform } = require('stream')

exports.default = () => {
  // create a file read stream
  const read = fs.createReadStream('normalize.css')
  // create a file write stream
  const write = fs.createWriteStream('normalize.min.css')
  // file transform stream
  const transform = new Transform({
    transform: (chunk, encoding, callback) => {
      // the core conversion logic is implemented here
      // chunk: the content read by the read stream (a Buffer)
      const input = chunk.toString()
      // strip whitespace and comments
      const output = input.replace(/\s+/g, '').replace(/\/\*.+?\*\//g, '')
      // the first argument is an error object; pass null when there is no error
      callback(null, output)
    }
  })

  return read
    .pipe(transform)   // transform
    .pipe(write)       // write
}
```
Use of file manipulation APIs and plugins

Gulp provides APIs for creating file read and write streams — src and dest — which are more powerful and easier to use than the low-level Node APIs. In most build tasks, we create a read stream with src, process the files through the transform streams provided by plugins, and finally create a write stream with dest to write to the target location.

```js
// gulp's src and dest APIs are more powerful and easier to use than the underlying Node APIs
const { src, dest } = require('gulp')
const cleanCss = require('gulp-clean-css')   // npm i gulp-clean-css --dev
const rename = require('gulp-rename')        // npm i gulp-rename --dev

exports.default = () => {
  return src('src/*.css')                    // create the file read stream
    .pipe(cleanCss())                        // transform stream: minify the css
    .pipe(rename({ extname: '.min.css' }))   // extname specifies the extension used when renaming
    .pipe(dest('dist'))                      // write stream created by dest, pointing at the target directory
}
```
Use of gulp plugins

Style compilation: gulp-sass

```js
const style = () => {
  return src("src/assets/styles/*.scss", { base: "src" })
    .pipe(plugins.sass({ outputStyle: "expanded" }))   // 'expanded' fully expands the output style
    .pipe(dest("temp"))
    .pipe(bs.reload({ stream: true }))   // reload pushes the file changes to the browser; 'stream' pushes them as a stream
};
```

Script compilation: gulp-babel. Babel itself is just an ECMAScript transformation platform and by default does nothing; the actual conversion is done by @babel/core together with presets such as @babel/preset-env.

```js
const script = () => {
  return src("src/assets/scripts/*.js", { base: "src" })
    // babel is only an ECMAScript transformation platform by default; presets do the actual conversion
    .pipe(plugins.babel({ presets: ["@babel/preset-env"] }))
    .pipe(dest("temp"))
    .pipe(bs.reload({ stream: true }))
};
```

Page template compilation: gulp-swig

```js
const page = () => {
  return src("src/*.html", { base: "src" })
    .pipe(plugins.swig({ data }))   // data: the template variables, defined elsewhere in the gulpfile
    .pipe(dest("temp"))
    .pipe(bs.reload({ stream: true }))
};
```

Image and font file compression: gulp-imagemin

Return SRC (" SRC /assets/images/**", {base: {return SRC (" SRC /assets/images/**", {base: "src" }) .pipe(plugins.imagemin()) .pipe(dest("dist")) }; const font = () => { return src("src/assets/fonts/**", { base: "src" }) .pipe(plugins.imagemin()) .pipe(dest("dist")); };Copy the code

Clearing files: del

const del = require("del");

const clean = () => {
  return del(["dist", "temp"]);
};
Copy the code

Auto-loading plugins: as more and more plugins are introduced, we do not want to require every plugin individually at the top of the file; gulp-load-plugins handles this for us.

const loadPlugins = require("gulp-load-plugins"); // loadPlugins is a method const plugins = loadPlugins(); Plugins are objects on which all plugins become attributesCopy the code
browser-sync

browser-sync provides a web server that supports hot updates after code changes, giving us a WYSIWYG experience during development. The watch task can monitor file path wildcards and, based on what it detects, decide which tasks to run. Think about which tasks need to be executed during development (style, script) and which do not (image, font, etc.).

```js
const browserSync = require("browser-sync");
const bs = browserSync.create();
// watch is imported from gulp; style, script and page are the tasks defined above

const serve = () => {
  watch("src/assets/styles/*.scss", style)
  watch("src/assets/scripts/*.js", script)
  watch("src/*.html", page)
  // images, fonts and public files do not need to be rebuilt during development;
  // just reload the browser when they change
  watch(["src/assets/images/**", "src/assets/fonts/**", "public/**"], bs.reload)
  // watch("src/assets/images/**", image)
  // watch("src/assets/fonts/**", font)
  // watch("public/**", extra)

  bs.init({
    notify: false,
    port: 2080,
    // open: false,
    // files: 'dist/**',
    server: {
      baseDir: ["temp", "src", "public"],
      routes: {
        "/node_modules": "node_modules"   // original path : the location it maps to
      },
    },
  });
};
```
Useref file reference processing

Build comments in HTML files are automatically processed and converted accordingly

Here we use the temp directory as an intermediate directory. Originally, we read files from the dist directory with src, processed them, and then wrote them back into dist through dest. Reading and writing the same files at the same time creates a conflict: without separating reads from writes, files can easily fail to be written. So we need to redesign how files are generated and where they are placed.

In fact, before Useref, all the generated files were intermediates, so it would be appropriate to put these intermediates in a temporary directory, temp, and then at useref time, pull the files out of the temporary directory, do some conversion operations, and finally put them in the dist directory

```js
const useref = () => {
  return src('temp/*.html', { base: 'temp' })
    // useref merges all the resources referenced inside each pair of build comments into a single file,
    // modifies the HTML accordingly, and emits the newly generated files alongside the HTML
    .pipe(plugins.useref({ searchPath: ['temp', '.'] }))
    // the stream now contains html, js and css files; gulp-if applies a transform only to matching files
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({
      collapseWhitespace: true,
      minifyCSS: true,   // compress the css inside <style> tags
      minifyJS: true     // compress the js inside <script> tags in the HTML file
    })))
    .pipe(dest('dist'))
}
```

In general, gulp and Grunt are only task runners: they do not themselves provide the concrete build functionality, which comes from plugins.

Encapsulating the automated build workflow

Reusing the workflow by copying code snippets between projects is not conducive to overall maintenance… more on this to come.
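As a taste of what that encapsulation can look like, a minimal sketch of a project consuming a shared workflow package — the package name `my-build-workflow` is purely hypothetical:

```js
// gulpfile.js in a project that reuses an encapsulated workflow module
// 'my-build-workflow' is a hypothetical package that exports its gulp tasks
module.exports = require('my-build-workflow')
```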