
Congratulations to FPX, the new kings. LPL, we are the champions!

The original link: segmentfault.com/a/119000002…

Recently, while rolling out a Docker Swarm cluster at my company, we needed Jenkins for automated deployment. There are many ways to automate deployment with Jenkins: you can create a Job directly on the Jenkins page and configure the steps and scripts through the UI, or you can write a Pipeline script for every project. None of these, however, would satisfy the brilliant, perfectionist programmer that I am. In this article I will show how I used Pipeline scripts to automate deployment, support multi-branch builds, and share a single Pipeline script across all projects.

Some setup before using Jenkins

To get Jenkins up and running quickly, I use Docker to install and run it:

$ sudo docker run -it -d \
  --rm \
  -u root \
  -p 8080:8080 \
  -v jenkins-data:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v "$HOME":/home \
  --name jenkins jenkinsci/blueocean

The first time you use Jenkins, you must pass password verification before you can reach the Jenkins page. We need to enter the Docker container to look up the password:

$ sudo docker exec -it jenkins /bin/bash
$ vi /var/jenkins_home/secrets/initialAdminPassword
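
If you would rather not open the file in vi, the same password can also be read directly (the container name jenkins comes from the run command above):

$ sudo docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword
$ sudo docker logs jenkins    # the initial admin password is also printed in the startup log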

Jenkins installed via Docker has a slight defect: the shell inside the container is not the same as the shell on the CentOS host, so we need to enter the Jenkins container and manually relink the shell:

$ sudo docker exec -it jenkins /bin/bash
$ ln -sf /bin/bash /bin/sh
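
You can confirm that the link now points at bash; it should show /bin/sh -> /bin/bash:

$ sudo docker exec jenkins ls -l /bin/sh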

Since our Pipeline also needs to run tasks on a remote server and connect to it via SSH, we need to generate an SSH key pair inside Jenkins:

$ sudo docker exec -it jenkins /bin/bash
$ ssh-keygen -C "root@jenkins"

Then add Jenkins’ public key (id_rsa.pub) to ~/.ssh/authorized_keys on the remote node.
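
One way to do that, run from the CentOS host, is sketched below; the dev user and the placeholder <remote-node-address> are assumptions (dev matches the remote.user used later in the Pipeline script), and the command prompts once for the remote user's password:

$ sudo docker exec jenkins cat /root/.ssh/id_rsa.pub | \
    ssh dev@<remote-node-address> 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'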

You also need to install a couple of necessary plugins:

  1. Pipeline Maven Integration
  2. SSH Pipeline Steps

After installing the plugins, you also need to go to Global Tool Configuration and add Maven:

The name you give Maven here will be referenced later in the Pipeline script.

Multibranch Pipeline builds

Since our development is multi-branch, with one branch per development environment, an ordinary Pipeline build could not meet our development and deployment needs, so we need Jenkins' Multibranch Pipeline.

Create a Multibranch Pipeline job:

Next, set the branch source, which is your project's Git address, and specify the path to the Jenkinsfile within the project.

Next, Jenkins scans every branch in the branch source for a Jenkinsfile; a branch job is created for each branch that contains one.

The branch jobs of this job are as follows:

Note that the Jenkinsfile should only be added to branches that need to be deployed; otherwise Jenkins will create branch jobs for the other branches as well.

Generalize the Pipeline script

Up to this point, we can already deploy automatically with Pipeline scripts. But if you are a perfectionist, unwilling to be a mediocre programmer, you will notice that as projects multiply, the number of Pipeline scripts grows with them, and so does the maintenance cost. As the business evolves, something in the scripts inevitably has to change, which means touching a great many Pipeline scripts that are all essentially the same. As a good programmer, how could I not think about this problem? My first thought was to make the Pipeline script generic. Fortunately, Jenkins had already sensed my restlessness and handed me Shared Libraries.

What are Shared Libraries? As the name implies, they are shared libraries: their main purpose is to keep common Pipeline scripts in one place so that other projects can pull a globally shared Pipeline script from it, passing project-specific values in through variable parameters and thereby making the script generic.

Let’s start by creating a Git repository for storing generic Pipeline scripts:

The repository's directory structure cannot be laid out arbitrarily; Jenkins imposes a strict convention. The following is the official layout:
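
For reference, the standard layout from the Jenkins Shared Libraries documentation looks like this:

(root)
+- src                     # Groovy source files
|   +- org
|       +- foo
|           +- Bar.groovy  # for org.foo.Bar class
+- vars
|   +- foo.groovy          # for global 'foo' variable
|   +- foo.txt             # help for 'foo' variable
+- resources               # resource files (external libraries only)
|   +- org
|       +- foo
|           +- bar.json    # static helper data for org.foo.Bar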

The vars directory holds the generic Pipeline scripts, and resources holds non-Groovy files. So I concentrate the scripts and orchestration files needed by the Pipeline build here, completely hidden from the business engineers. The point is to prevent business engineers who don't understand the scripts from making casual changes and introducing bugs.

After creating the Git repository, we also need to define the Global library in Jenkins’ Manage Jenkins » Configure System » Global Pipeline Libraries:

The name given here can be referenced in a Jenkinsfile with the following statement:

library 'objcoding-pipeline-library'

Now let's look at how the generic Pipeline script is written:

#!groovy

// Build a remote-server descriptor for the SSH Pipeline Steps plugin
def getServer() {
    def remote = [:]
    remote.name = 'manager node'
    remote.user = 'dev'
    remote.host = "${REMOTE_HOST}"
    remote.port = 22
    remote.identityFile = '/root/.ssh/id_rsa'
    remote.allowAnyHosts = true
    return remote
}

def call(Map map) {
    pipeline {
        agent any

        environment {
            REMOTE_HOST = "${map.REMOTE_HOST}"
            REPO_URL = "${map.REPO_URL}"
            BRANCH_NAME = "${map.BRANCH_NAME}"
            STACK_NAME = "${map.STACK_NAME}"
            COMPOSE_FILE_NAME = "docker-compose-" + "${map.STACK_NAME}" + "-" + "${map.BRANCH_NAME}" + ".yml"
        }

        stages {
            stage('Fetch code') {
                steps {
                    git([url: "${REPO_URL}", branch: "${BRANCH_NAME}"])
                }
            }
            stage('Compile code') {
                steps {
                    withMaven(maven: 'Maven 3.6') {
                        sh "mvn -U -am clean package -DskipTests"
                    }
                }
            }
            stage('Build image') {
                steps {
                    sh "wget -O build.sh https://git.x-vipay.com/docker/jenkins-pipeline-library/raw/master/resources/shell/build.sh"
                    sh "sh build.sh ${BRANCH_NAME}"
                }
            }
            stage('init-server') {
                steps {
                    script {
                        server = getServer()
                    }
                }
            }
            stage('Execute release') {
                steps {
                    writeFile file: 'deploy.sh', text: "wget -O ${COMPOSE_FILE_NAME} " +
                        "https://git.x-vipay.com/docker/jenkins-pipeline-library/raw/master/resources/docker-compose/${COMPOSE_FILE_NAME} \n" +
                        "sudo docker stack deploy -c ${COMPOSE_FILE_NAME} ${STACK_NAME}"
                    sshScript remote: server, script: "deploy.sh"
                }
            }
        }
    }
}
  1. Since we need to run tasks on a remote server, we define the remote server's information; remote.identityFile is the path of the key we generated in the container above.
  2. We define a call() method that each project's Jenkinsfile will invoke; it must be named call.
  3. Inside the call() method we define the pipeline.
  4. The environment block holds the generic variable parameters; their values come from the Map argument, which is populated with the parameters defined in each project.
  5. The "Compile code" stage uses the Maven tool we configured above in Global Tool Configuration. The "Build image" stage cleverly pulls its build script from this remote repository with wget, and the orchestration (docker-compose) file used by the "Execute release" stage is fetched the same way; the "init-server" stage simply initializes a server object for "Execute release" to use. (A hypothetical sketch of such a build script follows this list.)
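
For illustration only, here is a minimal, hypothetical sketch of what such a build.sh might look like; the image name, registry address, and use of the project's Dockerfile are assumptions, not the contents of the author's actual script:

#!/bin/bash
# Hypothetical build script: build an image tagged with the branch name and push it
# so the Swarm nodes can pull it during "docker stack deploy".
set -e

BRANCH="$1"                                   # branch name passed in via: sh build.sh ${BRANCH_NAME}
IMAGE="registry.example.com/vipay:${BRANCH}"  # assumed registry and image name

docker build -t "${IMAGE}" .                  # build from the project's Dockerfile
docker push "${IMAGE}"                        # push to the registry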

This script embodies the idea Jenkins wants to promote: configuration as code.

Now that the generic Pipeline script is written, we need to create a Jenkinsfile in the root directory of the deployable branch of every project that requires automated deployment:

Next I’ll explain the contents of Jenkinsfile:

#!groovy

// Under multibranch builds, the Jenkinsfile is strictly required to exist only on branches that can be released.

// Reference the library defined globally in Jenkins
library 'objcoding-pipeline-library'

def map = [:]

// Address of the remote manager node (used to execute the release)
map.put('REMOTE_HOST', 'xxx.xx.xx.xxx')

// GitLab address of the project's code
map.put('REPO_URL', 'https://github.com/objcoding/docker-jenkins-pipeline-sample.git')

// Branch name
map.put('BRANCH_NAME', 'master')

// Service stack name
map.put('STACK_NAME', 'vipay')

// Call the build.groovy script in the vars directory of the shared library
build(map)
  1. library 'objcoding-pipeline-library' references the global library we defined in Jenkins, and we define a map of parameters;
  2. Next, the project-specific parameters are put into the map and passed to the generic Pipeline script by calling the build() method.

Shared Libraries greatly improve the reusability of Pipeline scripts and avoid the problems caused by an ever-growing number of scripts, which suits the aesthetics of a good programmer. If you are an ambitious programmer, you will definitely love them.
