Introduction

The birth of Docker

There is no doubt that Docker has become one of the hottest, even most disruptive, technologies of recent years. First, let's briefly walk through how a project is traditionally released:

Develop and test locally; if everything looks good, compile, package, and publish to the test environment.

Test in the test environment; release to production once testing passes.

Do a final round of testing in production; if nothing breaks, everything is OK.

If you have worked as a developer for more than a year, I am sure you have experienced this firsthand: the code passes every local test, you release it to the test environment and then to production, and suddenly there is a problem. It is distressing and baffling. What went wrong? It is the same code everywhere.

There are actually many possible causes, but let me point out a common one:

The software environments in local development, testing, and production may be configured inconsistently, so the same code sometimes behaves differently in different environments.

Problem: software environments are inconsistent across machines. (This is the core issue.)

Say the company buys a new server on Aliyun. Before a project can be released on it, the required software environment (Node, MySQL, Nginx, and so on) must be installed from scratch. Configuration errors are very likely during installation, and more complex environments involve many setup steps; few people remember every step and command, so you end up searching for them again and again.

Problem: software environment configuration varies widely, and the commands are hard to remember.

Even for basics like the JDK and Tomcat, which we can set up in our sleep, every new machine means building the same environment yet again. The result is repetitive work, low efficiency, tedious configuration, and plenty of opportunities for error.

Problem: software environments are rebuilt over and over; efficiency is low.

Of course there are other problems too, which I will not elaborate on here.

So if we want to deploy 10 projects, the traditional way requires finding 10 virtual machines and then configuring the deployment environment on each of those 10 machines. Besides the repetitive work, the deployment process spawns many other problems; the ones above are only a sample. All of this seriously hinders project delivery and makes it extremely inefficient. Docker solves exactly the pain points described above, which is good news for programmers. Next, let's go over the basic concepts of Docker and how it addresses these problems.

Docker is essentially a containerization technology, a form of virtualization. It repackages Linux containers so that users no longer need to manage containers by hand, which greatly simplifies application operations.

Traditional virtualization works at the hardware level: to deploy 10 applications you need to create 10 virtual machines, each with its own guest operating system. Docker virtualizes at the operating-system level, reusing the host's OS, so deploying and running 10 applications only requires 10 isolated containers on the same host.
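As a minimal illustration (the image and ports here are just examples), starting several isolated applications on one host takes seconds, with no guest OS per application:

```sh
# Each container is an isolated process sharing the host's kernel,
# not a full virtual machine with its own guest operating system.
docker run -d --name app1 -p 8081:80 nginx
docker run -d --name app2 -p 8082:80 nginx
docker run -d --name app3 -p 8083:80 nginx
```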

Previously we had to configure the environment on every server by hand; with Docker, we simply pull a ready-made image from the public registry (Docker Hub) using a Docker command.
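For example, pulling ready-made environments from Docker Hub (the version tags are illustrative):

```sh
docker pull mysql:8.0   # a complete MySQL environment, nothing installed by hand
docker pull nginx:1.25  # a ready-to-run Nginx web server
```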

Then a single container-launch command makes that environment available on your server. Developers can use an environment without knowing its installation commands, and containers also consume far fewer server resources than virtual machines.
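For instance, a MySQL instance can be started with one command (the password and port mapping here are illustrative):

```sh
# -e sets the root password understood by the official mysql image;
# -p publishes the container's database port on the host.
docker run -d --name db \
  -e MYSQL_ROOT_PASSWORD=example \
  -p 3306:3306 \
  mysql:8.0
```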

Developers only need to add a Dockerfile to the project structure to containerize the code together with its environment on the server. The project's environment is then built exactly as the Dockerfile specifies, which solves the problem of a project failing to run because of environment differences.
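As a rough sketch only (the base images, paths, and commands below are assumptions, not a file from this article), a Dockerfile for a Vue project might build the app with Node and serve the static bundle with Nginx:

```Dockerfile
# Build stage: compile the Vue app (base image and paths are illustrative)
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Serve stage: ship only the static bundle behind Nginx
FROM nginx:1.25
COPY --from=build /app/dist /usr/share/nginx/html
COPY nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
```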

Once you have deployed a project with Docker commands, the crux of the remaining problem is this: every time we create a new project, we have to log in to the server and execute the same commands to start the container and launch the project. That is repetitive work. For 10 projects, a developer has to run the same command sequence on the server 10 times (a sketch of that sequence follows). This kind of repetition can be fatal.
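Concretely, the sequence repeated for every project looks roughly like this (the project name is a placeholder):

```sh
docker build -t my-project .                  # build the image from the Dockerfile
docker rm -f my-project 2>/dev/null || true   # remove the old container, if any
docker run -d --name my-project -p 80:80 my-project   # start the new version
```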

So the next thing to do is to “automate the repetitive”!

DevOps automates the repetitive: it enables project management, automated deployment, automated release, and automated testing, with container clouds supporting continuous integration, continuous delivery, and continuous deployment.

Docker and DevOps automated deployment in practice

DevOps (a portmanteau of Development and Operations) is a collective term for a set of processes, methods, and systems. It enables continuous integration, continuous delivery, and continuous deployment.

Continuous integration:

In the industry we usually call it CI, and it is a development and deployment practice. Developers push their changes to the main branch for integration frequently, often many times a day. From a higher perspective, CI lets developers find bugs in modules or features more quickly. Regardless of a commit's size, CI lets the whole team see the impact of each commit on the overall project. When something goes wrong, CI tells you exactly which commit caused the problem, and the error messages often point to the offending line of code.

Continuous delivery, continuous deployment:

Continuous delivery and continuous deployment, or CD for short in the industry, mean that once everything is automated, deployment happens directly from the code. As the final step of the process, set up monitoring and alerting to make sure the deployed service runs smoothly, and notify developers the moment a problem arises.

Now that you know what DevOps is, let's look at how the process is implemented.

The DevOps process

1. After finishing development locally, the developer commits the code to a local branch and uses Docker to simulate the production environment for testing; once the tests pass, the code is merged into the remote main branch.

2. As soon as the main branch is updated, the continuous integration software is triggered for packaging and integration (common tools: GitLab CI, Travis CI, or Jenkins); it automatically builds a Docker image and pushes it to a remote repository (Docker Cloud or the enterprise's own Docker Registry).

3. A continuous deployment service such as Docker Cloud then deploys the image to the web server.

4. The configured release job pulls the image from the repository, runs it, and stops the old version. One automated integration-and-deployment cycle is complete; a command-level sketch follows this list.
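At the command level, a sketch of what such a pipeline does (the registry address, image name, and tag are placeholders):

```sh
# CI side: build the image and push it to the registry
docker build -t registry.example.com/team/my-project:1.0.0 .
docker push registry.example.com/team/my-project:1.0.0

# Server side: pull the new image and replace the old container
docker pull registry.example.com/team/my-project:1.0.0
docker rm -f my-project 2>/dev/null || true
docker run -d --name my-project -p 80:80 registry.example.com/team/my-project:1.0.0
```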

Next, I built an NPM package with Node and combined it with Worker Bee and Tencent Cloud's container service CODING to implement automated deployment. The result: without understanding any of the underlying principles, you can automatically deploy a project purely through command-line operations and experience the charm of the technology.

The flow chart is as follows

Here is a brief description of the process.

Step 1: Run the bluej initVue <project name> command and configure the domain name to deploy to.

After the command runs, it fetches the Vue template from the Git repository. At this point the project root contains three extra files: Dockerfile, Jenkinsfile, and nginx.conf. These are generated ahead of time for the automated deployment that comes later.
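For example (bluej is the author's own NPM package; the project name and listing below are illustrative):

```sh
bluej initVue my-app   # fetch the Vue template from the Git repository
ls my-app
# Dockerfile  Jenkinsfile  nginx.conf  ...plus the usual Vue project files
```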

Step 2: Run the bluej run <project name> command.

The project enters development mode, which is just our normal development workflow; we simply do whatever development work we need to do here.

Step 3: Use Git commands to connect the project to a new remote repository.

These are basic Git operations, so I will not dwell on them (the usual sequence is shown below); once the remote repository is connected, move on to Step 4.
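For reference, the usual sequence (the remote URL is a placeholder):

```sh
git init
git add .
git commit -m "init project"
git remote add origin git@git.example.com:team/my-app.git
git push -u origin master
```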

Step 4: Use Tencent Cloud's container service CODING to implement automated deployment and configuration.

Before deploying, you need to manually create an artifact repository to hold the project's images and set its permissions so that everyone on the team can use it.

Create a new build plan in the team project.

Choose a custom build process.

Select the associated Worker Bee project, set the pipeline source to the project's own pipeline configuration, then click OK and build immediately.

Next, the automated deployment is performed step by step in a custom pipeline.

After the deployment is complete, you can access the domain name.

At the same time, we can connect to the server from a remote terminal to check whether the current project's container is running. The docker ps command lists the running containers; bluej_test is the project that was just deployed automatically.
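What this looks like in practice (the container ID and timings below are illustrative):

```sh
docker ps
# CONTAINER ID   IMAGE        COMMAND                  STATUS         PORTS                NAMES
# 3f2a1b9c0d4e   bluej_test   "nginx -g 'daemon of…"   Up 2 minutes   0.0.0.0:80->80/tcp   bluej_test
```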

Conclusion

1. Automated deployment standardizes the team's internal development process and improves development efficiency.

2. Automated deployment makes better use of the server, maximizing resource utilization.

3. Automated deployment enables real-time monitoring of each project's status and promptly alerts developers to take action, preventing a project from silently dying mid-run.

4. Front-end engineering is currently trending toward Docker; learning and mastering this technology improves your competitiveness.

5. A Docker-oriented front-end workflow helps front-end engineers focus on development itself, de-emphasizes environment concerns, and reduces the difficulty of project deployment.