The author’s CI/CD pipeline is implemented mainly with a Jenkins Shared Library

The name Jenkins Shared Library may sound confusing, but the idea is simple: think of it as a common module, similar to a binary package, used to reuse pipeline logic.

In the early days of Jenkins, each code repository typically had to carry its own Jenkinsfile, and those files defined a lot of duplicated CI/CD logic. That duplicated logic can be extracted into a common module, and that module is the so-called Jenkins Shared Library

Shared Library structure

In my opinion, a Shared Library is not only a common module but also a development framework: code is written against a prescribed layout. The official structure of a Shared Library is:

(root)
+- src                     # Groovy source files
|   +- org
|       +- foo
|           +- Bar.groovy  # for org.foo.Bar class
+- vars
|   +- foo.groovy          # for global 'foo' variable
|   +- foo.txt             # help for 'foo' variable
+- resources               # resource files (external libraries only)
|   +- org
|       +- foo
|           +- bar.json    # static helper data for org.foo.Bar
  • vars holds global variables, i.e. common functions that implement the main logic of a pipeline (see the minimal example after this list)
  • resources holds resource files, equivalent to the resources directory in a Java project
  • src holds common utility classes, such as alerting and message-notification helpers
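
As a minimal illustration (the sayHello step and its message are invented for this example), a file placed under vars exposes its call() method as a pipeline step named after the file:

vars/sayHello.groovy

// The file name becomes the step name, so a pipeline can simply write sayHello('world')
def call(String name = 'world') {
    echo "Hello, ${name}!"
}

An accompanying vars/sayHello.txt would hold the help text shown in Jenkins' Global Variable Reference.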

“Degradation” of Jenkinsfile

The early common practice was to put the pipeline implementation logic directly in the Jenkinsfile. With the Shared Library, the reusable logic can instead be defined as global variables in the vars directory, and the original Jenkinsfile “degrades” into little more than a configuration file

As shown below, we define a global variable, javaDeploy.groovy, in the vars directory and implement the pipeline logic in its call() method:

def call(Object WorkflowScript) {
    // Accept the parameters configured in the Jenkinsfile
    def application = WorkflowScript.application ?: ''
    def port = WorkflowScript.port ?: ''
    def project = WorkflowScript.project ?: ''
    def receivers = WorkflowScript.dingTalkRecivers ?: ['defaultReceviers']
    def jvmOptions = WorkflowScript.jvmOptions ?: '-server -Xms512m -Xmx1g -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=512m'
    
    pipeline {
      agent any
      stages {
        stage('Git Clone') {
            steps {
                checkout scm
            }
        }
        
        stage('testing') {
            steps {
             ...
            }
        }

        stage('compiled') {
            steps {
              ...
            }
        }
        stage('Push image') {
            steps {
              ...
            }
        }
        stage('deployment') {
            steps {
              ...
            }
        }
      }
    }
}

So, how do we reference this global variable? A Jenkinsfile is placed in the root of the code repository, the relevant release parameters are configured in it, and they are passed along by calling javaDeploy at the end:

cat Jenkinsfile

import groovy.transform.Field

// Application name
@Field
String application = 'demo'

// Application port
@Field
String port = '9903'

// Default JVM memory options
@Field
String jvmOptions = '-server -Xms512m -Xmx1g -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=512m'

// DingTalk names or nicknames to notify of the release result; multiple users are separated by commas
@Field
List dingTalkRecivers = ['a', 'b']

// Call the global variable
javaDeploy(this)

Like the business code, the Shared Library logic is stored in a repo of its own and is loaded when the Jenkins job actually runs

This reduces the Jenkinsfile to little more than a configuration file: developers no longer have to wrestle with the quirky Pipeline syntax, and the operations team no longer has to worry about developers making arbitrary changes
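
How the library gets pulled in depends on the Jenkins configuration: a Global Pipeline Library configured to load implicitly needs no import at all, otherwise the Jenkinsfile must reference it explicitly. A minimal sketch, assuming the library is registered under the hypothetical name xxx-shared-library:

// Explicitly load the shared library; the name and branch are placeholders.
// The trailing underscore is required when the annotation has no import to attach to.
@Library('xxx-shared-library@master') _

javaDeploy(this)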

Introduction of the K8S plug-in

The company's original Jenkins setup was one master with three agent nodes, but it could not keep up with hundreds of releases a day and often went down with OOM errors; after several hardware upgrades it barely coped. So, weighing cost against performance, we decided to introduce the Kubernetes plugin for elastic, on-demand agents: when a build task arrives, a Pod is created to run it and is destroyed as soon as the build finishes

So how is the plugin used in a Pipeline? Take the official Declarative Pipeline example:

pipeline {
  agent {
    kubernetes {
      yaml '''
        apiVersion: v1
        kind: Pod
        metadata:
          labels:
            some-label: some-label-value
        spec:
          containers:
          - name: maven
            image: maven:alpine
            command:
            - cat
            tty: true
          - name: busybox
            image: busybox
            command:
            - cat
            tty: true
        '''
    }
  }
}

As you can see, a large chunk of the agent Pod definition is unrelated to the main logic, which makes the overall pipeline code a bit bloated

To keep the code concise, we can move the Pod definition into a separate file and then load it into the main pipeline logic

As shown below.

jenkinsAgent.yaml

apiVersion: v1
kind: Pod
metadata:
  labels:
    some-label: some-label-value
spec:
  containers:
  - name: maven
    image: maven:alpine
    command:
    - cat
    tty: true
  - name: busybox
    image: busybox
    command:
    - cat
    tty: true

In the Jenkinsfile, load jenkinsAgent.yaml with libraryResource, which reads the static file from the library's resources and returns it as a string:

 @Field
 def jenkinsAgentYaml = libraryResource 'com/xxxx/java/jenkinsAgent.yaml'

 pipeline {
     agent {
       // enable kubernetes plugin
       kubernetes{
         yaml jenkinsAgentYaml
       }
     }
   ...
}

This makes the code look much cleaner

The introduction of Dockerfile

The usual convention is to keep the Dockerfile in the same repo as the business code. However, since the Dockerfiles used by most of the company's projects are essentially identical, this common logic is also extracted into the Jenkins Shared Library, placed uniformly in the resources directory, and invoked from the main pipeline logic

Java project Dockerfile

FROM BASE_IMAGE
MAINTAINER beiguo <[email protected]>

ARG module_name

RUN cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && echo "Asia/Shanghai" >/etc/timezone

ADD ${module_name}.jar /
ADD entrypoint.sh /
ENV module=${module_name}
ENTRYPOINT ["sh", "/entrypoint.sh"]

The startup script entrypoint.sh is also placed in the resources directory

Then use the Dockerfile in the Jenkinsfile:

// Load the Java application Dockerfile
@Field 
def dockerFile = libraryResource 'com/xxx/java/Dockerfile'

pipeline {
      ...
        stage('Build') {
            steps {
                    ...
                    writeFile file: "Dockerfile", text: dockerFile
                    sh "docker build --build-arg module_name=${params.module_name} -t xxx.com/java/demo ."
            }
        }
}
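
Because the Dockerfile also ADDs entrypoint.sh, that script has to end up in the build context as well. A minimal sketch of doing so in the same way, assuming the script is stored at com/xxx/java/entrypoint.sh in the library's resources:

// Load the startup script from the library resources (the path is an assumption)
// and write it next to the Dockerfile before running docker build
def entrypoint = libraryResource 'com/xxx/java/entrypoint.sh'
writeFile file: "entrypoint.sh", text: entrypoint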

Dynamic generation of JVM parameters

JVM parameters also differ between environments such as production, test, and pre-release, so they need to be generated dynamically in the pipeline

Utility functions like this can be placed in the src directory

src/com/xxx/Utils.groovy

package com.xxx

class Utils implements Serializable{

    /**
     * Generate JVM parameters based on the deployment environment
     * @param deployEnv deployment environment
     * @param module Java application module name
     * @param jvmOpts JVM memory options, e.g. -server -Xms512m -Xmx512m -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=512m
     * @param naocsConfigServer Nacos configuration server address
     * @return the assembled JVM options string
     */
    def generateJavaOpts(String deployEnv,String module,String jvmOpts,String naocsConfigServer) {
        
        def logDir="/log/${module}"
        def sentinelService="sentinel:8280"
        def skywalkingAgent="/usr/skywalking/agent/skywalking-agent.jar"
        def skywalkingBackend="oap.skywalking:11800"
        def skywalkingServiceName="${module}"

        // JVM GC options
        def gcOpts=("-verbose:gc"
                       +" -XX:+PrintGCDetails"
                       +" -XX:+PrintGCDateStamps"
                       +" -XX:+PrintGCTimeStamps"
                       +" -XX:+UseGCLogFileRotation"
                       +" -XX:NumberOfGCLogFiles=100"
                       +" -XX:GCLogFileSize=100K"
                       +" -Xloggc:${logDir}/gc.log"
                       +" -DLOG_PATH=${logDir}/log.log")
        // HeapDump options
        def dumpOpts="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=${logDir}"
        // sentinel options
        def sentinelOpts="-Dcsp.sentinel.api.port=8719 -Dcsp.sentinel.dashboard.server=${sentinelService}"
        // nacos options
        def nacosOpts="-DNACOS_CONFIG_SERVER=${naocsConfigServer}"
        // skywalking options
        def skywalkingOpts=("-Dskywalking.agent.service_name=${skywalkingServiceName}"
                          + " -Dskywalking.collector.backend_service=${skywalkingBackend}"
                          + " -javaagent:${skywalkingAgent}")
        def javaOpts = ' '
        switch(deployEnv){
            case "test":
               javaOpts = "${jvmOpts} ${gcOpts} ${dumpOpts} ${nacosOpts} -Dproject.name=${module}"
               break
            case "production":
               javaOpts = "${jvmOpts} ${gcOpts} ${dumpOpts} ${sentinelOpts} ${nacosOpts} ${skywalkingOpts} -Dproject.name=${module}"
               break
            ...
        }
        return javaOpts
    }
}
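
A class under src is not a pipeline step by itself; it has to be imported and instantiated, for example inside the javaDeploy global variable. A minimal sketch of calling it (the deployment environment and Nacos address are placeholders):

import com.xxx.Utils

// Instantiate the utility class and build the JVM options for this deployment;
// application and jvmOptions come from the Jenkinsfile parameters shown earlier
def utils = new Utils()
def javaOpts = utils.generateJavaOpts('production', application, jvmOptions, 'nacos.example.com:8848')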

Conclusion

In addition to what has been covered here, the Jenkins Shared Library enables other flexible features, such as dynamically generating the K8s YAML manifest files
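
For instance, the agent manifest loaded with libraryResource can carry a placeholder that each job substitutes before handing the YAML to the kubernetes agent. This is a sketch only, not the author's exact implementation; the BUILD_IMAGE token in the template is an assumption:

import groovy.transform.Field

// Load the Pod template from the library resources and substitute the build image;
// the template is assumed to contain a literal BUILD_IMAGE token
@Field
def podYaml = libraryResource('com/xxxx/java/jenkinsAgent.yaml').replace('BUILD_IMAGE', 'maven:alpine')

pipeline {
    agent {
        kubernetes {
            yaml podYaml
        }
    }
    stages {
        stage('Build') {
            steps {
                container('maven') {
                    sh 'mvn -v'
                }
            }
        }
    }
}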