This article is original content. When reposting, please credit the author: LMjben

Project address: github.com/lmjben/cdfa…

Preface

In real-world development, initializing a project from scratch is often tedious, which is why so many scaffolding tools exist. Tools like create-react-app, vue-cli, and @angular/cli generate a project structure and a working development environment with a single command.

Scaffolding tools are genuinely convenient: developers can focus on their business logic without worrying too much about environment setup. But the author argues that it is also important to understand the process behind the scaffolding, so that we can still build a project by hand when no scaffolding is available. For that purpose, the author built the CDFang-Spider project from scratch.

Let’s take this project as an example and build it from scratch.

Project selection

  1. Which of the three major frameworks?

    • React, out of personal preference.
    • react-router for route definitions.
    • React Context for state management.
    • React Hooks for componentization.
  2. Introduce a strongly typed language?

    • TypeScript. It provides type support for JS, is editor-friendly, and improves code maintainability, giving real peace of mind.
    • When using third-party libraries, you can write more correct code and avoid API misuse.
    • Downside: the project comes to depend on a large number of @types/xxx packages, which quietly increases project size.
    • When the editor type-checks TS files, it has to traverse all the @types files in node_modules, which can make the editor lag.
    • Many libraries still lack @types support and are less convenient to use.
  3. CSS selection?

    • The Less preprocessor. The project only uses variable definitions, selector nesting, and selector reuse, so Less is sufficient.
    • To resolve naming conflicts, use CSS Modules rather than CSS-in-JS.
    • Use the BEM naming convention.
    • Use the PostCSS plugin autoprefixer to improve CSS compatibility.
  4. Which build tool?

    • Webpack. Built-in Tree Shaking and Scope Hoisting, efficient bundling, active community.
    • webpack-merge merges the configuration files for different environments.
    • Configure externals so that larger packages are loaded from a CDN instead of being bundled from node_modules.
    • Gulp, used to build the Node-side code.
  5. Code style checking?

    • ESLint. It helps enforce coding standards and effectively controls code quality, and it also supports linting TypeScript syntax.
    • Configure the eslint-config-airbnb rules.
    • Configure eslint-config-prettier to disable the ESLint rules that conflict with Prettier.
  6. Test framework?

    • Jest. Large and complete, bundling a test runner, assertion library, data mocking, coverage, and more.
    • Enzyme, for testing React components.
  7. Back-end framework?

    • Koa. Simple and easy to use, with a powerful middleware mechanism.
    • apollo-server, which helps set up the GraphQL back end.
  8. Database?

    • MongoDB. Its JSON-like document format is convenient to store and friendly to the front end.
    • Configure Mongoose to make modeling the MongoDB database easier.
  9. API style?

    • GraphQL. Clients can fetch data in exactly the shape they need, reducing redundant fields in responses.
    • The GraphQL schema defines the parameters, operations, and return types of the back-end API, so no separate interface documentation is needed.
    • The front end can start development as soon as the schema is defined and control the data format itself.
    • Schemas can be stitched together: multiple GraphQL APIs can be composed and connected for cascading queries and more.
    • The community is friendly, with many excellent libraries ready to use: Apollo, Relay, etc.
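
As an aside on the BEM naming convention chosen in item 3, the idea can be illustrated with a tiny helper. This helper is hypothetical and not part of the project:

```javascript
// A tiny sketch of the BEM convention: block__element--modifier.
// Hypothetical helper, not part of the project.
function bem(block, element, modifier) {
  const name = element ? `${block}__${element}` : block;
  return modifier ? `${name} ${name}--${modifier}` : name;
}

console.log(bem('card'));                    // "card"
console.log(bem('card', 'title'));           // "card__title"
console.log(bem('card', 'title', 'active')); // "card__title card__title--active"
```

Because each class name encodes its block and element, BEM avoids naming collisions even without CSS Modules, and the two techniques can be combined.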

After selecting the basic framework, we started to build the project environment.

Build the TypeScript environment

TypeScript is a superset of JavaScript, meaning it is fully compatible with JavaScript files. However, TypeScript files do not run directly in the browser, and need to be compiled to generate JavaScript files.

1. Create the tsconfig.json file.

  • tsc --init generates an initial tsconfig.json file.
  • VS Code reads tsconfig.json to perform live type checking, flag syntax errors, and so on.
  • The tsc command compiles TS code to JS code according to the rules configured in tsconfig.json.
  • TSLint reads the rules in tsconfig.json to assist with coding-standard checks.
    • TSLint is being deprecated in favor of ESLint.
    • ESLint also makes use of the contents of tsconfig.json.

2. Configure ESLint.

Configure ESLint support for TypeScript by following the typescript-eslint guide.

  • @typescript-eslint/parser parses TS syntax.
  • @typescript-eslint/eslint-plugin provides ESLint rules for TS files.

3. Choose a TypeScript compiler: tsc or Babel?

Use Babel. The benefits:

  • The Babel community has many excellent plugins; @babel/preset-env can target specific browser versions for compatibility, which the tsc compiler cannot.
  • Babel can compile JS and TS at the same time, so there is no need for a separate tsc step for TS files. Managing a single compiler is more maintainable.
  • Babel compiles faster. It simply strips the type annotations, whereas the tsc compiler has to traverse all the type definition files (*.d.ts), including those in node_modules, to make sure types are used correctly; too many types can cause slowdowns.

Babel process analysis

Babel is a JS syntax compiler whose work has three stages: parse, transform, and generate.

  • Parse stage: parse the JS code into an abstract syntax tree (AST).
  • Transform stage: modify the AST to produce a transformed AST.
  • Generate stage: print the transformed AST back out as JS code.
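
The three stages can be sketched with a miniature "compiler" for a one-operator expression language. This is a toy illustration of the idea, not Babel's real API:

```javascript
// Toy illustration of Babel's three phases. NOT Babel's real API.

// 1. Parse: turn "a + b" into an AST.
function parse(code) {
  const [left, right] = code.split('+').map((s) => s.trim());
  return {
    type: 'BinaryExpression',
    operator: '+',
    left: { type: 'Identifier', name: left },
    right: { type: 'Identifier', name: right },
  };
}

// 2. Transform: visit the AST and rename identifiers,
//    the way a Babel plugin rewrites nodes.
function transform(ast, rename) {
  const visit = (node) => {
    if (node.type === 'Identifier' && rename[node.name]) node.name = rename[node.name];
    if (node.left) visit(node.left);
    if (node.right) visit(node.right);
  };
  visit(ast);
  return ast;
}

// 3. Generate: print the AST back out as code.
function generate(ast) {
  if (ast.type === 'Identifier') return ast.name;
  return `${generate(ast.left)} ${ast.operator} ${generate(ast.right)}`;
}

console.log(generate(transform(parse('a + b'), { a: 'x' }))); // "x + b"
```

Real Babel plugins work the same way conceptually: they receive visitor callbacks for AST node types and mutate or replace nodes during the transform stage.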

Plugins and presets

  • Plugin: parses, transforms, and outputs JS. For example, @babel/plugin-proposal-object-rest-spread outputs JS that supports the {...} rest/spread syntax.
  • Preset: a curated set of plugins. For example, @babel/preset-env lets code use the latest ES syntax by automatically pulling in the plugins needed for new features.
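
A minimal babel.config.js for this stack might look like the sketch below. The preset names are real; the `targets` value is illustrative:

```javascript
// babel.config.js — a minimal sketch; option values are illustrative.
module.exports = {
  presets: [
    // Compile the latest ES syntax down to the targeted browsers.
    ['@babel/preset-env', { targets: '> 0.25%, not dead' }],
    // Compile JSX.
    '@babel/preset-react',
    // Strip TypeScript type annotations.
    '@babel/preset-typescript',
  ],
};
```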

4. Collect all .ts and .tsx files (webpack for the front-end environment, gulp for the Node project) and compile them into JS files via Babel.

Setting up the React Environment

React is a component-based library. The project uses the following syntax:

  • ES6 modules.
  • JSX syntax.
  • TypeScript syntax.
  • A CSS preprocessor.

None of this syntax runs directly in today's browsers; it has to be bundled and compiled, and that is the main work of setting up the React environment.

Specific steps

1. Create a new HTML file with a root node in the body on which to mount the DOM generated by React.

2. Create a new index.tsx file that imports all the components in the project and calls the render method to mount them on the root node.

3. React project layout.

  • containers directory, which holds the individual pages.
  • components directory, which holds components; a component contains both a JSX part and a CSS part.
  • context directory, which holds the shared React context.
  • config directory, for common configuration files.
  • utils directory, for common utility functions.
  • constants directory, for static variables.

4. Configure webpack with index.tsx as the entry file, then bundle and compile.

  • Because different environments bundle differently, separate configuration files for the development, production, and optimization environments are extracted and merged with webpack-merge.
  • Configure the CSS preprocessor with less-loader.
  • Configure TS compilation with babel-loader.
    • @babel/preset-env: compiles the latest ES syntax.
    • @babel/preset-react: compiles the React (JSX) syntax.
    • @babel/preset-typescript: converts TypeScript syntax.
  • Configure url-loader to bundle the image assets in the project.
  • Configure html-webpack-plugin to inject the generated JS and CSS into the HTML.
    • Use an EJS template to configure the CDN scripts included in the development and production environments.
  • For the development environment, use webpack-dev-server out of the box.
    • webpack-dev-server automatically watches for file changes, refreshes the page, and provides source maps by default.
    • Configure hot module replacement with react-hot-loader.
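
The per-environment split can be sketched as follows. The config objects are illustrative; the real project uses the webpack-merge package, while the tiny hand-rolled merge below only illustrates the idea:

```javascript
// Sketch of splitting webpack config by environment.
// The real project uses webpack-merge; this shallow merge
// is only an illustration, not a full implementation.
const base = {
  entry: './src/index.tsx',
  module: { rules: [{ test: /\.tsx?$/, use: 'babel-loader' }] },
};

const dev = {
  mode: 'development',
  devtool: 'source-map',
};

function merge(a, b) {
  // shallow merge is enough for this illustration
  return { ...a, ...b };
}

const devConfig = merge(base, dev);
console.log(devConfig.mode); // "development"
```

webpack-merge itself merges nested keys like `module.rules` intelligently, which is why the project uses it instead of a spread.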

Webpack packaging principles

The webpack build works like an assembly line: starting from the entry file, it collects the dependencies of every file in the project. Whenever it meets a module it cannot understand, it uses the corresponding loader to convert it into one it can. Plugins can hook into custom events in this pipeline's lifecycle to control webpack's output.
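
The loader part of that pipeline can be sketched as a chain of transform functions applied right to left, which is the order webpack applies a `use` array. This is a toy illustration, not webpack's real internals:

```javascript
// Toy sketch of webpack's loader chain: loaders in a `use` array
// run right to left, each turning the source into something the
// next loader (and finally webpack) can understand.
const loaders = [
  (src) => `/* banner */\n${src}`,      // applied last
  (src) => src.replace('let ', 'var '), // applied first
];

function applyLoaders(source, chain) {
  // webpack runs the chain right to left
  return chain.reduceRight((code, loader) => loader(code), source);
}

console.log(applyLoaders('let a = 1;', loaders));
// "/* banner */
// var a = 1;"
```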

5. Write an npm script to start development mode with one command.

// use cross-env to set the environment variable
"scripts": {
  "dev:client": "cross-env NODE_ENV=development webpack-dev-server --open"
}

6. Now run npm run dev:client and happily write client code.

Build the NodeJs environment

Because the Node side also uses TypeScript and the latest ES syntax, it too needs to be compiled.

  • Configure gulp to iterate over every TS file and call gulp-babel to convert the TS code to JS.
  • Configure supervisor to automatically restart the Node service. (nodemon cannot watch directories that do not yet exist.)
  • Write an npm script to start the Node-side development environment with one command.
"scripts": {
  "dev:server": "cross-env NODE_ENV=development gulp & cross-env NODE_ENV=development supervisor -i ./dist/client/ -w ./dist/ ./dist/app.js",
}

Once gulp is configured, you can run npm run dev:server to start the server-side development environment with one command.
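
A gulpfile for this step might look roughly like the sketch below. The source and output paths are assumptions, and gulp plus gulp-babel must be installed:

```javascript
// gulpfile.js — a rough sketch; the paths are assumptions.
const gulp = require('gulp');
const babel = require('gulp-babel');

// Compile every server-side .ts file to JS with Babel.
gulp.task('default', () =>
  gulp
    .src('server/**/*.ts')
    .pipe(babel()) // picks up the project's Babel config
    .pipe(gulp.dest('dist'))
);
```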

Hierarchical structure division

The project is layered using the traditional MVC pattern.

Model layer

The Model layer's main job is to connect to the database and encapsulate database operations: creating, deleting, querying, and updating data.

  • Create a model folder in which each file corresponds to a table (collection) in the database.
  • Each model file contains the create, read, update, and delete operations for one table.
    • Using Mongoose makes reading and writing the MongoDB database easier.
  • Each model file exports the encapsulated object for the Controller layer to use.
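
The shape of such a model module can be sketched with an in-memory store. The real project backs this with Mongoose and MongoDB; the method names here are assumptions:

```javascript
// Sketch of a Model-layer module. The real project uses Mongoose;
// an in-memory array stands in for the table, and the method names
// are illustrative.
const rows = [];
let nextId = 1;

const HouseModel = {
  create(doc) {
    const row = { id: nextId++, ...doc };
    rows.push(row);
    return row;
  },
  findAll() {
    return [...rows];
  },
  findById(id) {
    return rows.find((r) => r.id === id) || null;
  },
  updateById(id, patch) {
    const row = HouseModel.findById(id);
    if (row) Object.assign(row, patch);
    return row;
  },
  removeById(id) {
    const i = rows.findIndex((r) => r.id === id);
    return i >= 0 ? rows.splice(i, 1)[0] : null;
  },
};

// The Controller layer consumes the exported object:
const created = HouseModel.create({ name: 'area-1' });
console.log(HouseModel.findById(created.id).name); // "area-1"
```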

The Controller layer

The Controller layer's main job is to receive and respond to HTTP requests: based on the front-end request, it calls the Model layer to fetch data and returns it to the front end.

Traditional back ends usually also contain a Service layer dedicated to business logic.

  • Based on the front-end request, find the corresponding Model layer to fetch data, process it, and return it to the front end.
  • Write middleware for system logging, error handling, 404 pages, and so on.
  • Support BrowserRouter from the front-end React Router: back-end routes that match a front-end route return the index.html file.
  • The GraphQL used in the project is fairly basic and is handled directly in the Controller layer.
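
Koa's middleware mechanism behind these tasks (logging, error handling, 404) composes async functions into an "onion". Below is a toy sketch of that composition, not Koa's real implementation:

```javascript
// Toy sketch of Koa-style onion middleware composition —
// not Koa's real implementation, just the idea behind the
// logging / error-handling / 404 middleware above.
function compose(middleware) {
  return async function run(ctx) {
    let index = -1;
    async function dispatch(i) {
      if (i <= index) throw new Error('next() called multiple times');
      index = i;
      const fn = middleware[i];
      if (fn) await fn(ctx, () => dispatch(i + 1));
    }
    await dispatch(0);
  };
}

const app = compose([
  async (ctx, next) => {
    // error-handling middleware wraps everything downstream
    try { await next(); } catch (e) { ctx.status = 500; }
  },
  async (ctx, next) => {
    // logging middleware
    ctx.log = `-> ${ctx.path}`;
    await next();
  },
  async (ctx) => {
    // 404 fallback when nothing else handled the request
    ctx.status = 404;
  },
]);

app({ path: '/missing' }); // ctx ends up with status 404 and a log entry
```

Each middleware can run code both before and after `await next()`, which is what makes cross-cutting concerns like logging and error handling so natural in Koa.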

The View layer

The View layer's main job is to provide the front-end page templates. With server-side rendering, data from the Model layer is injected into the View layer and returned to the client through the Controller layer. Since this project renders with React on the front end, the View layer is simply the pages produced by the webpack build.

  • Use koa-static to serve static files, giving access to the HTML files generated by the front-end build.

Build the GraphQL environment

GraphQL is a query language for APIs. It requires GraphQL support to be configured on the server side, and the client then makes requests in the GraphQL syntax.

Using Apollo makes setting up the GraphQL environment faster.

  • apollo-server is configured on the server.
    • Use a schema to define the request types and response formats.
    • Use resolvers to handle the corresponding schema fields.
  • apollo-client is configured on the client.
    • Request data according to the schema defined by apollo-server.
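
A minimal schema/resolver pair might look like the sketch below. The type and field names are assumptions, and the resolver is invoked directly here (apollo-server would normally execute it) so the example stays self-contained:

```javascript
// Sketch of the schema / resolvers split used with apollo-server.
// Type and field names are assumptions; the resolver is called
// directly instead of through apollo-server to stay self-contained.
const typeDefs = `
  type House {
    area: String
    price: Int
  }
  type Query {
    houses(area: String): [House]
  }
`;

const data = [
  { area: 'north', price: 100 },
  { area: 'south', price: 120 },
];

const resolvers = {
  Query: {
    // args mirrors the schema's (area: String) argument
    houses: (_parent, args) =>
      args.area ? data.filter((h) => h.area === args.area) : data,
  },
};

console.log(resolvers.Query.houses(null, { area: 'north' }).length); // 1
```

In the real setup, `typeDefs` and `resolvers` are passed to the ApolloServer constructor, which validates queries against the schema and dispatches them to these functions.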

Setting up the MongoDB Environment

MongoDB is a document-oriented database that is very simple to operate.

Mongoose provides MongoDB with a straightforward, schema-based way to define your data models. It has built-in data validation, query building, business-logic hooks, and more, right out of the box.

  • Use Mongoose to establish a connection with the local MongoDB.
  • Create models, each corresponding to a table (collection) in MongoDB.
  • Wrap create, read, update, and delete functions around each model and expose them to the Controller layer.

All that remains is to install MongoDB and start the service, and you're done.

Setting up the Test Environment

This project uses Jest as the testing framework. Jest bundles an assertion library, test runner, data mocking, and other features; it is a large and comprehensive testing library. Because the front end uses React, the Enzyme library is also introduced specifically for testing React components.

1. Create the jest.config.js file.

  • Initialize the setup.ts file.
    • Configure the Enzyme adapter matching the React version.
    • Mock global variables such as fetch, canvas, etc.
  • Configure which files to test.
  • Configure the mock data files.
  • Configure how the test files are compiled.
    • TS code is compiled with ts-jest.
  • Configure the code coverage files.
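
A minimal jest.config.js along these lines might look like the sketch below. The option keys are real Jest options; the file paths and values are illustrative:

```javascript
// jest.config.js — a minimal sketch; paths are illustrative,
// option keys are real Jest options.
module.exports = {
  preset: 'ts-jest',                          // compile TS test files
  setupFiles: ['<rootDir>/test/setup.ts'],    // enzyme adapter, global mocks
  testMatch: ['**/__tests__/**/*.(ts|tsx)'],  // which files to test
  collectCoverageFrom: ['src/**/*.{ts,tsx}'], // coverage files
};
```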

2. Write test files.

  • Create __mocks__ and __tests__ directories to store mock data files and test files.
  • Inside them, mirror the directory structure of src for the corresponding test files.

3. Write test scripts and upload coverage scripts.

"scripts": {
  "test": "jest --no-cache --colors --coverage --forceExit --detectOpenHandles",
  "coverage": "codecov"
}

Configuring the Online Environment

With the various environments installed, the next step is to consider bringing the project online.

Configuring the Server Environment

  • Install the Node.js environment (install Node via nvm).
  • Install the PM2 daemon: npm i pm2 -g
  • Install MongoDB (see the MongoDB official documentation).
  • Install a free HTTPS certificate (see the Let's Encrypt website).
    • The domain name must be registered with the authorities first (filed through Alibaba Cloud; with the materials ready, approval takes about 10 days).

Code release

Releasing this project is very simple and takes only one command, which is the payoff of the continuous-integration setup.

# clone with Git Bash
git clone https://github.com/yhlben/cdfang-spider.git

# change directory
cd cdfang-spider

# install dependencies
npm i

# build for production with minification
npm run build

Everything happens inside the build command, so let's look at what npm run build does.

  • ESLint syntax and style checks.
  • Unit tests.
    • Upload test coverage.
  • Bundle the client code.
    • The generated HTML file becomes the View layer on the Node side, tying front end and back end together.
    • Other static assets are uploaded to a CDN in one step via qiniu-upload-plugin after the webpack build.
  • Bundle the server-side code.

All of this is wired up with npm scripts, but the commands shouldn't be typed by hand every time. Instead, Travis CI is configured to run them automatically on every commit to the master branch.

TravisCI configuration

Travis CI is a continuous-integration platform. When code is pushed to GitHub, Travis CI is notified, performs the operations described in its configuration, and reports the results promptly. The configuration lives in the .travis.yml file at the project root, and its core is the script section.

script:
  - npm run build
  - npm run test
after_success: npm run coverage

As you can see, on every GitHub push Travis CI runs the build and test commands, and on success uploads the coverage report. Travis CI notifies us by email if anything fails along the way. When it's time to go live, check the CI status first: if every step has passed, you don't have to worry about releasing broken code.

Conclusion

That covers the selection and construction process for the whole project. Of course, some details were left out; if anything is unclear, you can open an issue or add me on WeChat: YHL2016226.

A short summary from the following four angles:

  • Development: the project connects the front end, back end, and database into a small full-stack project, deepening my understanding of the whole development process.
  • Testing: writing unit tests, UI tests, and API tests built up experience with automated testing.
  • Operations: continuous integration, daemons, nginx, HTTPS, and so on let me handle the deployment of small projects myself.
  • Technology: the project uses some relatively new technologies, such as the Hooks API and GraphQL, but only at a basic level, mainly for practice and further study.

Future updates to the project will mainly revolve around: GraphQL, Docker, K8s, microservices, Serverless, and so on. There is a lot out there; we just have to keep learning 😂

References

  • TypeScript and Babel
  • Front end decision tree