In the concluding chapter of my front-end core knowledge course, “The Community of Bluster and the Self-cultivation of a Programmer”, I mentioned that Spanish has a very special word, “sobremesa”: the short but pleasant time spent lingering and talking at the table after a meal. So in the final part of the course, instead of more “hard, dry” technical content, I talked about maintaining community etiquette, staying active in the open source world, and focusing on being a contributor to the open source community.

This article continues the exploration of open source and engineering. I would like to focus on the topic of “modern libraries and project writing”, which I believe will inspire you both technically and mentally.

A library is not only for using

With the National Day holiday behind us, we are in the final quarter of 2019, and new front-end technologies and solutions keep emerging. The topic of “how to write a modern open source library” is always worth discussing, though it may not be easy for a beginner. For example, we need to think about:

  • How to choose an open source license
  • How to write documentation so users can get started quickly
  • What conventions TODO and CHANGELOG files should follow
  • How to achieve a smooth build with zero errors and zero warnings
  • How to determine the compilation scope and process
  • How to design a reasonable modularization scheme
  • How to package output for multiple target environments
  • How to design an automated normalization pipeline
  • How version and commit conventions are enforced
  • How to test
  • How to introduce continuous integration
  • How to introduce best practices for tool usage and configuration
  • How to design APIs, and so on

Any one of these points can lead into front-end language specifications, design knowledge, engineering infrastructure, and more. For example, let’s think about the build and packaging process. If I were a library developer, my expectations would be:

  • I want to write the library code elegantly in ES Next, so it must be transpiled with Babel or Buble
  • I have specific compatibility requirements: the library output must run in both browser and Node.js environments
  • The output should support modularization schemes such as AMD or CMD, since different environments consume modules differently
  • The output needs to work seamlessly with tools like Webpack, Rollup, Gulp, etc.
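The multi-environment packaging expectation above can be sketched with a minimal Rollup configuration. The file names and the global name `MyLib` here are hypothetical assumptions for illustration; a real config would also wire in Babel and other plugins:

```javascript
// rollup.config.js (sketch): one entry, three output formats
const config = {
  input: 'src/index.js',
  output: [
    // UMD: runs in browsers via <script>, and under AMD-style loaders
    { file: 'dist/mylib.umd.js', format: 'umd', name: 'MyLib' },
    // CommonJS: consumed by require() in Node.js
    { file: 'dist/mylib.cjs.js', format: 'cjs' },
    // ES module: picked up by Webpack/Rollup, enables Tree shaking
    { file: 'dist/mylib.esm.js', format: 'es' },
  ],
};

module.exports = config;
```

Rollup then emits one bundle per entry in `output`, so a single source tree serves browser globals, Node.js, and modern bundlers.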

In light of these expectations, I wrestle with topics like “Rollup versus Webpack”, “how to really make Tree shaking work”, “how to choose among and compare different tools”, and “how to use Babel and its plug-ins wisely”.

All of these issues have been covered in some detail in our previous article: How to Write a modern JavaScript library in 2020, and there are more detailed knowledge and practical examples in my course.

A “library for writing libraries”: design is an art

Whether you’re starting from scratch on an application project or an open source library, infrastructure is critical. Next, I will discuss project organization and infrastructure design through the evolution of jslib-base.

A little over six months ago, we wrote jslib-base to help developers quickly build a standard JavaScript library:

jslib-base is the most useful scaffold for JavaScript third-party libraries, enabling their open-sourcing and making the development of a JavaScript library simpler and more professional

That’s right, it’s a “library for writing libraries”. The early approach of jslib-base (jslib for short) was fairly primitive: it integrated templates embodying various best practices. As a library developer, you first forked the project on GitHub, then ran custom initialization via jslib’s built-in npm scripts. This initialization includes, but is not limited to:

  • Replacing the library project name based on template variables
  • Swapping in a JavaScript or TypeScript scaffolding sandbox environment, again via template replacement
  • Generating README.md, TODO.md, CHANGELOG.md, docs, etc.
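Template-variable replacement of this kind can be sketched in a few lines. The `{{name}}` delimiter and the `renderTemplate` helper are assumptions for illustration, not jslib’s real implementation:

```javascript
// Replace every {{key}} in a template with vars[key]; leave unknown keys untouched
function renderTemplate(tmpl, vars) {
  return tmpl.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? vars[key] : match
  );
}

// e.g. filling a README template with the user's project name
const readme = renderTemplate('# {{libName}}\n\nby {{author}}', {
  libName: 'mylib',
  author: 'lucas',
});
```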

For example, rename the project name:

"scripts": {
    "rename": "node rename.js",
    // ...
  },
  

The core code of the corresponding script (abridged) is:

const path = require('path');
const cdkit = require('cdkit');

function getFullPath (filename) {
    return path.join(__dirname, filename)
}

const map = [
    getFullPath('package.json'),
    getFullPath('README.md'),
    getFullPath('config/rollup.js'),
    getFullPath('test/browser/index.html'),
    getFullPath('demo/demo-global.html'),
];

const config = [
    {
        root: '.',
        rules: [
            {
                test: function (pathname) {
                    return map.some(function (u) {
                        return pathname.indexOf(u) > -1;
                    });
                },
                replace: [
                    {
                        from,
                        to,
                    },
                ],
            },
        ],
    },
];

cdkit.run('replace', config);

This earlier design largely met library developers’ initialization needs: fork the project to get scaffolding code that integrates best practices, then run the npm scripts to customize it.

In my opinion, the real significance of the first version of jslib was to “clarify best practices”. For example, we argued that library development should use Rollup while other scenarios (such as application development) should use Webpack; see How to Write a Modern JavaScript Library in 2020 for details. jslib also compiles and packages with the latest version of Babel (readers of the source code should pay special attention to the core differences between Babel 6 and Babel 7), while, to maximize compatibility, we use a lower version of Rollup. Of course, users can fully customize the configuration. The overall build and design process is shown below:

I won’t go into further detail here, but readers are welcome to discuss it with us.

Now consider this: the points above are all “best practices” that the community and we are still exploring, but I was not entirely satisfied with how the first release of jslib was used:

  • Git fork + clone is expensive and relatively “wild”
  • The template + npm script approach makes initializing the library scaffold feel “odd” and leaves redundant code behind
  • That approach also relies on many runtime file operations; it is not a black box, and it is neither concise nor elegant
  • There is still much room for improvement on customization needs

Given these disadvantages, the solution I propose is a command-line interface plus a Monorepo transformation. So a redesign began. In fact, jslib’s makeover is a microcosm of modernizing a project’s organization. Read on.

Command-line technology is simple these days

Now that Node.js has matured, writing command-line tools is quite common, and there is plenty of material about it in the community; it is actually fairly simple, so I won’t dwell on it. In general, the new version of jslib is used as follows:

After typing one simple command, we get a complete library scaffold: best-practice packaging, Babel configuration, test case running, demo and docs. All necessary environments are integrated and ready to run; it even includes the GitHub banner content for the library. The sandbox is shown below:

All that’s left is for the user to write the code directly!

If the user needs to update something after the project has been initialized and library development is happily under way, jslib provides a jslib update command. It relies on file copying and supports:

  • Merging template files
  • Merging JSON files
  • Replacing content
  • Deleting files
  • Upgrading dependencies
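The “merging JSON files” step (for example, folding new template fields into the user’s existing package.json) can be sketched as a small recursive merge. This is a toy illustration of the idea, not jslib’s actual implementation:

```javascript
// Deep-merge newObj into oldObj: plain objects merge recursively,
// everything else (strings, arrays, etc.) is overwritten by the new value
function mergeJson(oldObj, newObj) {
  const result = {...oldObj};
  for (const key of Object.keys(newObj)) {
    if (
      typeof result[key] === 'object' && result[key] !== null &&
      typeof newObj[key] === 'object' && newObj[key] !== null &&
      !Array.isArray(newObj[key])
    ) {
      result[key] = mergeJson(result[key], newObj[key]);
    } else {
      result[key] = newObj[key];
    }
  }
  return result;
}
```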

Of course, that’s not what I want to focus on. I’m going to focus on the implementation of Monorepo and other technologies.

Thinking about modern project organization

Modern project organizations manage code in two main ways:

  • Multirepo
  • Monorepo

As the names imply, Multirepo manages each module in its own repository, while Monorepo keeps every module in the same repository. With a Monorepo, modules no longer need to be published and tested separately: all code is managed in one project and deployed together, sharing the core build and configuration scripts, and bugs surface and problems are exposed earlier during development.

These are the different organizational philosophies of project code: one that advocates divide and conquer, and one that advocates centralized management. Whether to put all eggs in one basket or advocate diversity depends on the style of the team and the actual situation it faces.

Babel and React are typical Monorepos: issues and pull requests converge on a single repository, and a CHANGELOG can easily be assembled from the commit list. Looking at the React repository, its directory structure shows a strong Monorepo style:

react-16.2.0/
  packages/
    react/
    react-dom/
    react-art/
    ...

Thus react and react-dom live together in the codebase, yet they are two different libraries on npm; that is, react and react-dom are simply managed together by Monorepo within the React project. As to why react and react-dom are two separate packages, I’ll leave that to the reader.

Monorepoization of Jslib

Based on the above knowledge, we realized the advantages of Monorepo:

  • All projects share consistent lint, build, test, and release processes, with one consistent core build
  • Easy debugging and collaboration across projects
  • Convenient issue handling
  • Easy initialization of the development environment
  • Easy bug hunting

So why is Jslib suitable for Monorepo, and how do we do Monorepo?

When the user enters the jslib new mylib command, we collect the developer’s intent from interactive prompts or command-line arguments, including:

  • The project name
  • The npm package name to publish
  • The author’s GitHub account name
  • Whether to build the library in JavaScript or TypeScript
  • Whether the project uses English or Chinese for documentation and other content
  • Whether to maintain the project with npm or yarn, or to skip installing dependencies automatically

With this information, we initialize the entire library scaffold. The essence of initialization is filling templates with the input. For example, if the developer chooses to build the project with TypeScript in an English environment, the core flow initializes rollup.config.js by reading rollup.js.tmpl and filling the template with the relevant information (such as TS compilation), and likewise generates tsconfig.json, package.json, CHANGELOG.en.md, README.en.md, DOC.en.md, and so on. The generation of each of these files needs to be pluggable, ideally as a standalone runtime. We therefore treat the initialization of each scaffold file (that is, each template file) as an independent application, supervised and scheduled by the CLI application. We also created a util application that provides a basic library of shared functions. In other words, we made every template an application of its own, taking full advantage of Monorepo support for independent packages.

The final project is organized as follows:

jslib-base/
  packages/
    changelog/
    cli/
    compiler/
    config/
    demo/
    doc/
    eslint/
    license/
    manager/
    readme/
    rollup/
    root/
    src/
    test/
    todo/
    util/
    ...

Relevant schematic diagram:

The corresponding architecture is as follows:

The relevant core codes are as follows:

const fs = require('fs');
const path = require('path');
const ora = require('ora');
const spinner = ora();

const root = require('@js-lib/root');
const eslint = require('@js-lib/eslint');
const license = require('@js-lib/license');
const package = require('@js-lib/package');
const readme = require('@js-lib/readme');
const src = require('@js-lib/src');
const demo = require('@js-lib/demo');
const rollup = require('@js-lib/rollup');
const test = require('@js-lib/test');
const manager = require('@js-lib/manager');

function init(cmdPath, option) {
    root.init(cmdPath, option.pathname, option);
    package.init(cmdPath, option.pathname, option);
    license.init(cmdPath, option.pathname, option);
    readme.init(cmdPath, option.pathname, option);
    demo.init(cmdPath, option.pathname, option);
    src.init(cmdPath, option.pathname, option);
    eslint.init(cmdPath, option.pathname, option);
    rollup.init(cmdPath, option.pathname, option);
    test.init(cmdPath, option.pathname, option);
    manager.init(cmdPath, option.pathname, option).then(function() {
        spinner.succeed('Create project successfully');
    });
}

We call the init method provided by each application, passing the project path, the initialization options generated from the user’s command-line interaction, and other parameters. The core job of each init method is to generate the relevant scaffold files and copy them into the user’s project directory. The final call, manager.init, automatically installs dependencies according to the user’s npm/yarn/none choice, which is why it is asynchronous.

When development reaches a release point, we can rely on Lerna to issue a unified publish command, as shown below:

Lerna is a great tool for managing a Monorepo, and yarn workspaces can make the process even smoother. For how to use these tools, refer to their documentation; we won’t cover more here.

In general, we find that jslib, like Babel and Webpack, adopts a microkernel architecture style to accommodate complex customization requirements and frequent functional changes. “Microkernel” means the core code follows the principle of simplicity, while the real functionality is implemented through plug-in extensions. As shown below:

The operation flow chart is as follows:

Poetry and the distance: there is more to learn and do

Unlike the earlier article How to Write a Modern JavaScript Library in 2020, which focused on best practices for writing libraries and their various configurations, this article has so far introduced the project’s design ideas and transformation process. So how do we do more and better? As developers, how do we continuously improve a library, and how do we analyze a good library’s source to learn more? For example, I mentioned that yarn workspaces and Lerna both cooperate with the build process. How do you reconcile the two?

I’m not going to answer that question right now, but let’s start with something more basic and core.

Parsing a library’s infrastructure

I’ll continue this discussion with the example scenario of developing a React component library. You are probably familiar with mature React component libraries such as Ant Design and React Bootstrap. My intent is obviously not to teach you how to reuse components and write common “wheels” with HoC, render props, or even Hooks patterns, but to give you a better idea of how such libraries organize their projects and design their builds.

There are more than 50 files under Ant Design’s components directory (I haven’t counted precisely), and there are bound to be references between components. If these components were independent of each other and could ship separately (so users could install a single component on its own), that would be better than only shipping all the components together. At the same time, the developers of these libraries would enjoy much easier debugging. All of these transformation goals point to Monorepo-ization; yes, such needs suit a Monorepo even better than jslib’s do.

Of course, this more modern way of organizing is already in use. Unfortunately, Ant Design doesn’t adopt such a design, but readers can still learn component encapsulation from Ant Design while learning project infrastructure and organization from reach-ui. I think reach-ui, a relatively small open source project, does better in this respect, as shown and annotated below:

Let’s study further through the code, picking the alert component (see the file reach-ui/packages/alert/package.json):

"scripts": {
    "build": "node ../../shared/build-package",
    "lint": "eslint . --max-warnings=0"
},

The same lines appear in the package.json of the other components; they are what the project calls “shared build scripts”. The build-package content is very simple:

const execSync = require("child_process").execSync;
const path = require("path");

let babel = path.resolve(__dirname, "../node_modules/.bin/babel");

const exec = (command, extraEnv) =>
  execSync(command, {
    env: Object.assign({}, process.env, extraEnv),
    stdio: "inherit"
  });

console.log("\nBuilding ES modules ...");
exec(`${babel} src -d es --ignore src/*.test.js --root-mode upward`, {
  MODULE_FORMAT: "esm"
});

console.log("Building CommonJS modules ...");
exec(`${babel} src -d . --ignore src/*.test.js --root-mode upward`, {
  MODULE_FORMAT: "cjs"
});

The library thus ships two module formats, ESM and CJS, for use in different environments.
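For consumers to pick up the right build, each component’s package.json then presumably points main at the CommonJS output and module at the ES output, matching the two target directories of the build script above (a sketch, not quoted from the repository):

```json
{
  "main": "index.js",
  "module": "es/index.js"
}
```

Node’s require() follows main, while bundlers like Webpack and Rollup prefer module, which is what enables Tree shaking.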

In the project root directory, package.json has something like this:

"scripts": {
    "build:changed": "lerna run build --parallel --since origin/master",
    "build": "lerna run build --parallel",
    "release": "lerna run test --since origin/master && yarn build:changed && lerna publish --since origin/master",
    "lint": "lerna run lint"
},

lerna run build runs the build command of every component under packages in parallel, building all the components at once.

In the project root directory lerna.json, there is something like this:

{
  "version": "independent",
  // ...
}

We can see that version is set to “independent” mode: when publishing, Lerna asks for a new version for each changed package one by one, based on that package’s own package.json, so every component package keeps its own version number.

Of all the component-library projects I have observed, this one has the best infrastructure (in my personal aesthetic and cognitive judgment, not an objective claim), and I recommend studying it. I will publish a detailed reading of reach-ui and more related content (such as the complete build of a UI library, automated documentation, and component encapsulation) in subsequent courses or articles. I hope this article serves as an inspiration.

Parsing a library’s npm scripts

Earlier we analyzed the build-package file in reach-ui. npm scripts in fact play a crucial role in a project: they encode its core processes.

As we build more and more projects from scratch, we notice that npm scripts share commonalities: project A’s lint script looks like project B’s; projects B and C have similar pre-commit scripts. An ambitious developer might therefore build a personal “scripts world”: when starting project D, just pull in the existing scripts and add any custom behavior. Converging and abstracting the scripts also makes them easier for everyone to learn and master.
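The core of such an xxx-scripts tool is a tiny bin entry that maps a flag like --test or --release to a script module. A toy sketch of the dispatch (the resolveScript helper is hypothetical, not from any real package):

```javascript
// Map the first --flag argument to a script name; default to 'help'
function resolveScript(argv) {
  const flag = argv.find((arg) => arg.startsWith('--'));
  return flag ? flag.slice(2) : 'help';
}

// In a real bin entry we would then require(`./scripts/${name}`);
// here we only resolve the name
const name = resolveScript(['--test', '--watch']);
```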

For example, since I’m used to Jest for unit testing, the Jest-related npm scripts can be abstracted out and pulled into a new project’s package.json:

"scripts": {
    "test": "lucas-script --test",
    // ...
},

Here lucas-script is my own abstraction, modeled on kentcdodds/kcd-scripts:

process.env.BABEL_ENV = 'test'
process.env.NODE_ENV = 'test'

const isCI = require('is-ci')
const {hasPkgProp, parseEnv, hasFile} = require('../utils')

const args = process.argv.slice(2)

const watch =
  !isCI &&
  !parseEnv('SCRIPTS_PRE-COMMIT', false) &&
  !args.includes('--no-watch') &&
  !args.includes('--coverage') &&
  !args.includes('--updateSnapshot')
    ? ['--watch']
    : []

const config =
  !args.includes('--config') &&
  !hasFile('jest.config.js') &&
  !hasPkgProp('jest')
    ? ['--config', JSON.stringify(require('../config/jest.config'))]
    : []

// eslint-disable-next-line jest/no-jest-import
require('jest').run([...config, ...watch, ...args])

This script is abstracted away from any specific project’s business, and the code is fairly simple: for the test flow it sets the corresponding environment variables, decides whether Jest should run in watch mode (the watch arguments), resolves the Jest configuration, and finally runs Jest.

As another example, with Travis CI for continuous integration, the actions to run after a successful build can be abstracted as well:

const spawn = require('cross-spawn')
const {
  resolveBin,
  getConcurrentlyArgs,
  hasFile,
  pkg,
  parseEnv,
} = require('../utils')

console.log('installing and running travis-deploy-once')

const deployOnceResults = spawn.sync('npx', ['travis-deploy-once@5'], {
  stdio: 'inherit',
})

if (deployOnceResults.status === 0) {
  runAfterSuccessScripts()
} else {
  console.log(
    'travis-deploy-once exited with a non-zero exit code',
    deployOnceResults.status,
  )
  process.exit(deployOnceResults.status)
}

// eslint-disable-next-line complexity
function runAfterSuccessScripts() {
  const autorelease =
    pkg.version === '0.0.0-semantically-released' &&
    parseEnv('TRAVIS', false) &&
    process.env.TRAVIS_BRANCH === 'master' &&
    !parseEnv('TRAVIS_PULL_REQUEST', false)

  const reportCoverage = hasFile('coverage') && !parseEnv('SKIP_CODECOV', false)

  if (!autorelease && !reportCoverage) {
    console.log(
      'No need to autorelease or report coverage. Skipping travis-after-success script...',
    )
  } else {
    const result = spawn.sync(
      resolveBin('concurrently'),
      getConcurrentlyArgs(
        {
          codecov: reportCoverage
            ? `echo installing codecov && npx -p codecov@3 -c 'echo running codecov && codecov'`
            : null,
          release: autorelease
            ? `echo installing semantic-release && npx -p semantic-release@15 -c 'echo running semantic-release && semantic-release'`
            : null,
        },
        {killOthers: false},
      ),
      {stdio: 'inherit'},
    )

    process.exit(result.status)
  }
}

This code decides, once the continuous-integration phase has finished, whether an automatic release or a test-coverage report is needed, using semantic-release and codecov respectively.

To use:

"scripts": {
    "after-release": "lucas-script --release",
    // ...
},

Finally, whether it’s react-scripts, lucas-scripts, or any other xxx-scripts, these infrastructure utility scripts must support user customization. Unlike Create React App’s react-scripts, though, I believe such script design should be more open: xxx-scripts should just work, but the default configurations also need to be exposed so developers can master them.

This is especially well illustrated by the Babel and Webpack plug-in architectures and by ESLint configuration. Taking ESLint as an example, an ideal design lets developers add the following to their custom .eslintrc file:

{"extends": "./node_modules/lucas-scripts/eslint.js"}

This single line merges in the default lint rules. The same design appears in the Babel configuration; we just need:

{"presets": ["lucas-scripts/babel"]}

Jest configuration:

const {jest: jestConfig} = require('lucas-scripts/config')

module.exports = Object.assign(jestConfig, {
  // your overrides here

  // for tests written in TypeScript, add:
  transform: {
    '\\.(ts|tsx)$': '<rootDir>/node_modules/ts-jest/preprocessor.js',
  },
})

Of course, I have encapsulated more scripts and more engineering-related util functions; those interested in digging deeper can keep an eye on my subsequent courses. If you would rather start from the basics and work your way up, my online course is introduced at the beginning of this article.

Conclusion

jslib, mentioned repeatedly in this article, helps developers create the scaffolding and base code of a library that “just works” with a few simple commands. If you want to write a library, I suggest you consider it for your first step. But I don’t want to “sell” this project. What really matters is this: if you want to understand how to design a project from scratch, you can learn a lot from it.

In this article we went from a “library for creating libraries” to some best practices of modern front-end development, then to organizing projects as a Monorepo, and finally to the npm-script build process. The infrastructure of an application or library involves many aspects, and many details in this article deserve deeper analysis. We will produce more content in the future; welcome to discuss and learn together.

To share

My course: Advanced core knowledge of front-end development


Happy coding!