**【Key reading】**
This article explains how an advanced interface-test solution evolved: starting from a typical Java + TestNG + Jenkins stack, it shows, step by step and in combination with business practice, how we built GoAPI, an efficient interface-centered team collaboration platform, around the goal of improving interface-test ROI (reducing investment cost and increasing utilization). The aim is to make interface testing something everyone can do, everyone can use, and that is available everywhere, and to show how testing and management of 10,000+ business interfaces is achieved.

I. Interface testing and management, historically

1.1 A “living” interface

Interfaces are a natural product of application development. Whether you are a developer, tester, or operations engineer, you deal with interfaces constantly. Developers are the creators of interfaces: they define them and give them flesh and blood. Testers are the interfaces' health keepers, silently hunting, at every stage, for the bugs that harm their health. After this flight of fancy of mine, does the interface feel alive to you? If not, take a look at the following figure.

The “Growth History” of interfaces

In this sense, an interface has a life cycle, and when this is combined with the software development process, different roles do different things around the interface at different stages. Below is a “typical” interface life cycle.

A “typical” interface lifecycle

Each phase of the interface life cycle requires “careful care,” which involves testing and managing interfaces. Next, take a look at how interfaces have been “taken care of” historically.

1.2 A snapshot of “historical” interface management and testing

Back in 2014 (a year of …), our interface management and testing looked like this: interface documentation was managed on a wiki; manual interface testing used Postman/JMeter; interface automation was built on the TestNG test framework; and Jenkins CI drove online interface monitoring.

Interface testing and management in 2014

Take a look at the interface automation testing framework packaged at the time

Java+TestNG interface testing framework

As for the framework's features, the ones I was “proud” of were: support for concurrent test execution, automatic rerun of failed tests, test parameterization, test grouping, and detailed test reports, all managed through testng.xml. It was also easy to extend with secondary development to fit our own needs. At the same time, I investigated a Python + Nose scheme; the comparison between the two is as follows:
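For reference, the features listed above map onto testng.xml configuration roughly as follows (a minimal sketch; the suite, class, and listener names are illustrative, and `RetryListener` stands in for whatever IRetryAnalyzer wiring the framework actually used):

```xml
<!-- Minimal testng.xml sketch; class and listener names are illustrative. -->
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="api-suite" parallel="methods" thread-count="5">
  <!-- parameterization: values injected into @Parameters-annotated tests -->
  <parameter name="env" value="test"/>
  <listeners>
    <!-- rerun of failed tests via a custom IRetryAnalyzer listener -->
    <listener class-name="com.example.RetryListener"/>
  </listeners>
  <test name="smoke">
    <!-- test grouping: run only cases tagged with the "smoke" group -->
    <groups>
      <run><include name="smoke"/></run>
    </groups>
    <classes>
      <class name="com.example.UserApiTest"/>
    </classes>
  </test>
</suite>
```

`parallel="methods"` with `thread-count` gives the concurrent execution, while groups and parameters cover grouping and parameterization; reports come from TestNG's built-in or custom reporters.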

In the process of business development, there are many problems and bottlenecks in interface testing and management, which may be similar to the following:
**The same thing is done over and over again:**
When a developer finishes an API, deploys it to the development environment, and then writes a quick automated script or uses Postman to verify that the API meets expectations, they have in effect already done a simple API test. At handover, development writes up the API documentation for the testers; the testers deploy the code to the test environment and then write test cases with TestNG or another interface automation framework. For the positive cases, the developer has tested the API once and the tester has tested it again in a different way. Could the tester simply take what the developer already did and reuse it in the test environment?
**Development-side test quality is unmeasurable:**
Developer testing usually covers only smoke cases: the developer submits the interface documentation and verbally states that the interfaces have been verified in the development environment and meet the entry criteria for testing. For testers, however, such a verbal statement gives no way to measure test quality. Testers lack objective data to judge whether the interface quality meets expectations, which later leads to version rollbacks caused by quality problems and delays the development cycle. Is there a way to measure development-side quality objectively?
**Stacking effect caused by API changes:**
API changes are common, but in the existing process the change of one interface triggers a whole series of changes: the interface documentation, the interface test cases, the interface test code, the continuous-integration setup... The time cost rises instantly. Is there a way to make the change in one place and have everything else fall into place?

The additive effect of “change”

All of the above problems lead to inefficiency in interface testing and management, which is often referred to as low ROI. An intuitive analysis is shown below

Interface testing and management ROI analysis

The ROI of interface testing is a function f(cost, benefit); to increase ROI, you need to do two things: reduce investment cost and increase utilization. To reduce investment cost, we can start from the following aspects:

  • Reduce the cost of case writing (1 interface / 10 cases / 2 hours)
  • Reduce the cost of case optimization (unclear case descriptions, difficult case layering)
  • Reduce the cost of case maintenance (one interface change costs about an hour of maintenance on average)
  • Reduce tool development costs (common test services are not shared, so wheels are reinvented)
To increase the utilization rate, you can start from the following aspects:

  • Anyone can write
  • Anyone can use
  • Tools available anytime
  • Usable at every stage
For this reason, a series of ROI-focused activities around interface testing and management had to be carried out. Below is a full-life-cycle analysis of the problems and challenges facing interface testing and management.

Problems and Challenges

II. The evolution of the interface testing and management solution

2.1 Six elements of interface test writing

To really reduce the cost of “case writing, optimization, and maintenance,” we should analyze what parts an interface test case consists of, extract the elements that can be optimized, analyze each in detail, and prescribe the right remedy. Here are the six elements of interface test writing:

Six elements of interface test writing

Two of these elements, parameterization and generators, are taken as examples below to show the problem analysis and the solutions.
**[Problem]**
First, the definitions. Parameterization is the process of turning a specific value into a parameter. A generator produces values with specific characteristics on demand. When constructing an interface test scenario, an interface input parameter may be a fixed value or a variable, and variables must be generated according to the needs of the scenario: a random number to create an account, parameter signing and encryption/decryption, timestamp construction, and so on. On the other hand, parameters have location and scope properties. The location may be the path, the URL query string, the request body, or a request header. The scope may be a public parameter that is globally valid across all interface test scenarios, a parameter valid within a single scenario, or even a temporary one. Solving the parameterization and generator problems therefore has to account for all of these characteristics.
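To make this concrete, here is a minimal Java sketch of such generators (illustrative only, not GoAPI's actual implementation; the key-sorted MD5 signing convention is one common scheme, chosen here as an assumption):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Map;
import java.util.TreeMap;
import java.util.UUID;

// Sketch of generators for the cases mentioned above: random account names,
// timestamps, and parameter signing. Names and scheme are illustrative.
public class ParamGenerators {

    // Random account name, e.g. for account-creation scenarios.
    public static String randomAccount(String prefix) {
        return prefix + UUID.randomUUID().toString().replace("-", "").substring(0, 8);
    }

    // Millisecond timestamp parameter.
    public static long timestamp() {
        return System.currentTimeMillis();
    }

    // Sign parameters: sort keys, concatenate key=value pairs, append a
    // shared secret, and hash with MD5 (one common signing convention).
    public static String sign(Map<String, String> params, String secret) {
        try {
            StringBuilder sb = new StringBuilder();
            for (Map.Entry<String, String> e : new TreeMap<>(params).entrySet()) {
                sb.append(e.getKey()).append('=').append(e.getValue()).append('&');
            }
            sb.append(secret);
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(sb.toString().getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (java.security.NoSuchAlgorithmException ex) {
            throw new IllegalStateException(ex);
        }
    }
}
```

On the platform, the same ideas are exposed as built-in generator functions rather than hand-written code, so that scope (global, per-scenario, temporary) can be managed centrally.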

**[Solution]**
Parameterization and generators are applied across different interface test scenarios and test stages. Parameterization is implemented in two categories: general parameterization and business parameterization. General parameterization is implemented within GoAPI, the full-life-cycle interface test platform, while business parameterization is implemented by a self-developed test-data service platform.

2.2 The five “multiples” of interface test execution

Analyzing interface test execution across dimensions involves five “multiples”:

  • Multi-granularity: single interface, interface scenario, and execution set
  • Multiple environments: test environment, pre-release environment, production environment
  • Multiple roles: development, testing, operation and maintenance
  • Multiple data: global data, local data, and temp data
  • Multi-stage: development and debugging, self-test smoke, functional test, release verification, online regression, continuous integration, online monitoring

Five “multiples” of interface test execution

Analyzing the problems and challenges facing interface test execution involves four aspects:

  • The execution environment is constrained
  • The usage stages are limited
  • The roles that can use it are limited
  • Online monitoring is not systematic

2.2.1 Limited Execution environment

Analyzing the historical, typical Java + TestNG + Jenkins interface testing framework, the execution environment had three problems: low node resource utilization, resource configuration conflicts, and inconvenient node scaling and migration. So in the first stage the idea of “containerization” was adopted: Jenkins was integrated with Docker to containerize the slaves, slave node image templates were built for the different environments, and for each run the corresponding container was pulled up automatically and destroyed automatically afterwards.

In the second stage, once the interface test platform was in place, the same idea was used to solve the problem: the concept of executors was introduced into GoAPI. Each executor is a container, and a service started inside the container interacts with the GoAPI platform. Whatever environment you are in, the hosts can be modified dynamically during an interface test task, ensuring seamless environment switching and completely solving the problem of the constrained execution environment. Container management is handled by Rancher + K8s, as shown below.

This solution has six characteristics:

  • Multi-node: stable and reliable
  • Multi-environment: covers daily functional testing, regression testing, and pre-release verification
  • Multi-datacenter: quickly verifies the high-availability architecture across remote datacenters
  • Controllable: online-monitoring executors can be privately deployed so that online test requests stay under security control
  • Schedulable: integrates with automated releases; executors are scheduled immediately after a release to trigger automated verification
  • Shareable: executor information can be shared, letting developers self-test quickly and collaborate with QA
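As an illustration of the “dynamically modify the hosts” idea under a Rancher + K8s stack, host mappings can be injected into an executor container through Kubernetes `hostAliases` (a sketch; the pod name, image, IP, and hostname are placeholders, not the actual GoAPI setup):

```yaml
# Sketch: inject environment-specific host mappings into an executor pod.
# Image, IP, and hostnames are placeholders.
apiVersion: v1
kind: Pod
metadata:
  name: goapi-executor
spec:
  hostAliases:
    - ip: "10.0.0.12"            # e.g. test-environment gateway
      hostnames:
        - "api.example.internal"
  containers:
    - name: executor
      image: example/goapi-executor:latest
```

Pointing the same hostname at a different IP per environment lets one set of test cases run unchanged against test, pre-release, or production.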

2.2.2 Limited usage stages of interface tests

Once the problems of writing and executing interface tests are solved and interface coverage for each business line exceeds 95%, the next task is to maximize the value of the existing automation: how to make the most of the existing automated cases across the whole software development cycle, including development, testing, operations, and so on.

Combining the analysis in the figure above, the existing problems are an incomplete set of usage stages and limited roles. The solution can be implemented along the following three lines:

  • Implement development self-test verification with continuous integration
  • Implement PE release process verification with release platform
  • Implement complete online interface monitoring with GoAPI
Let’s look at how these three aspects are implemented.

**Implement development self-test verification with continuous integration**

Implement development self-test verification with continuous integration

The figure above shows the continuous-integration pipeline currently in place for our internal business, which hooks the existing automation in GoAPI into the CI chain. It only takes the very friendly OpenAPI that GoAPI provides; likewise, in the time you would spend watching an episode of a show, you can write a GoAPI execution-driver plug-in. The pipeline above breaks down as follows:

  • A code commit by development automatically triggers a build via webhook, and the build succeeds;
  • Unit-test quality gates are executed: all tests pass and the coverage gate passes;
  • Static code checks are executed, with Sonar making the gate decision;
  • Images are packaged and uploaded to a private registry, and the service is deployed;
  • The GoAPI interface test platform's OpenAPI is called to run the corresponding interface automation;
  • The Smartauto intelligent UI automation platform's OpenAPI is called to run the corresponding UI automation;
  • The NPT performance testing platform's OpenAPI is called to run the corresponding automated performance regression;
  • The CR changed-code coverage platform is called to obtain this round's changed-code coverage;
  • An overall quality test report is produced, covering multi-dimensional quality measurements.
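The gated steps above could be sketched as a Jenkins declarative pipeline (illustrative only; the shell commands, environment variables such as `$GOAPI`, and endpoint paths are assumptions, since the real OpenAPI routes are not shown here):

```groovy
// Sketch of the CI pipeline described above; commands and URLs are placeholders.
pipeline {
    agent any
    stages {
        stage('Build')        { steps { sh 'mvn -B package' } }
        stage('Unit tests')   { steps { sh 'mvn -B test' } }           // gate: all pass + coverage
        stage('Static check') { steps { sh 'mvn sonar:sonar' } }       // gate: Sonar quality gate
        stage('Deploy')       { steps { sh './push_image_and_deploy.sh' } }
        stage('API tests')    { steps { sh 'curl -fsS -X POST "$GOAPI/openapi/executions?plan=smoke"' } }
        stage('UI tests')     { steps { sh 'curl -fsS -X POST "$SMARTAUTO/openapi/runs"' } }
        stage('Perf regress') { steps { sh 'curl -fsS -X POST "$NPT/openapi/regressions"' } }
        stage('Coverage')     { steps { sh 'curl -fsS "$CR/openapi/diff-coverage"' } }
    }
    post { always { echo 'publish aggregated quality report' } }
}
```

The `-f` flag makes `curl` fail the stage on a non-2xx response, which is what turns each platform call into a gate.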
**Implement PE release process verification with release platform**

An online application cluster can have hundreds of machines, and online regression runs cannot fully cover the availability of the application instance on every machine, so an instance may go online with problems caused by unknown factors and trigger an online failure. For PE, the requirement is therefore that every application instance be verified by automated regression as it is released. Based on this, the plan implemented together with the internal release platform is as follows.

Publish automatic regression schemes

A typical release platform has four steps: offline, deploy, check, and online. The current instance is first taken offline, then deployed, then checked for service availability, and finally brought back online to serve traffic. However, the check step is only a health check, not a functional verification of the service. The check step can therefore be extended to call GoAPI's OpenAPI, run the automation, and bring the instance online automatically once it passes. After this solution was implemented, PE no longer had to be “on tenterhooks” at every release, because every application instance now undergoes strict functional regression before going online. Of the software development cycle above, solutions have been given for development self-testing and for release, but the most important link is online interface monitoring: detecting in time that an interface has become unavailable is vital to limiting the impact on users' usage and experience. Next, let's look at the problems online interface monitoring faces and their solutions.
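The gating logic of the extended check step can be sketched as follows (a minimal sketch; the status value and pass-count inputs are assumed fields, not GoAPI's documented OpenAPI response):

```java
// Sketch of the release-gate decision in the extended "check" step: the
// release platform polls the interface-test execution result and only lets
// the instance go online when the run finished with a 100% pass rate.
// The status string and count fields are illustrative assumptions.
public class ReleaseGate {

    public static final String FINISHED = "FINISHED";

    // Decide whether the freshly deployed instance may go online.
    public static boolean mayGoOnline(String executionStatus, int totalCases, int passedCases) {
        if (!FINISHED.equals(executionStatus)) {
            return false;                                  // still running or aborted
        }
        return totalCases > 0 && passedCases == totalCases; // require a full pass
    }
}
```

Requiring `totalCases > 0` also guards against the degenerate case where no regression was actually scheduled, which should block rather than silently pass.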

2.3 Online interface monitoring closed-loop

Back in 2014, the implementation of online interface monitoring based on Java +TestNG+ Jenkins faced four major challenges:

  • Multiple factors affect execution stability (environmental factors, use case dependency issues, etc.)
  • The alarm mechanism is not perfect (only a single alarm notification and no alarm policy can be configured)
  • Failure analysis is time-consuming (log analysis based on TestNG, plus the data dependencies between linked steps in use-case scenarios, take a long time)
  • Closed loop is not well formed (low alarm accuracy and low recall rate of online interface problems)
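As one example of tightening the alarm mechanism above, a consecutive-failure threshold is a common way to raise alarm accuracy by filtering out transient noise (a generic technique, not necessarily the exact policy that was implemented):

```java
// Sketch: fire an alarm only after `threshold` consecutive failed probes,
// so a single transient failure does not page anyone. Illustrative only.
public class ConsecutiveFailureAlert {
    private final int threshold;
    private int consecutiveFailures = 0;

    public ConsecutiveFailureAlert(int threshold) {
        this.threshold = threshold;
    }

    // Record one probe result; returns true when an alarm should fire.
    public boolean record(boolean success) {
        if (success) {
            consecutiveFailures = 0;   // any success resets the window
            return false;
        }
        consecutiveFailures++;
        return consecutiveFailures >= threshold;
    }
}
```

Such a policy trades a little detection latency (threshold − 1 extra probes) for far fewer false alarms, which is exactly the accuracy/recall balance the closed loop needs.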

In view of the above problems with interface monitoring, analysis shows that monitoring should have six elements:

  • Continuous, stable 24×7 monitoring
  • A complete alarm notification mechanism
  • Interface return-code monitoring
  • Interface exception monitoring
  • Interface performance monitoring
  • Interface business-logic monitoring

Interface monitoring six elements and solutions

The internal monitoring system covers interface exceptions, return codes, and performance, while targeted business-logic monitoring relies on the GoAPI interface management platform, forming an effective closed loop of monitoring, alarming, handling, archiving, and statistics. The following figure shows the monitoring implementation for one business line:

2.4 Benefits of interface testing and management

A series of practical activities was carried out around the two goals of reducing investment cost and increasing utilization, which greatly improved the overall ROI. The following is a statistical analysis based on one internal business:

ROI of interface testing and management

III. An industry-leading interface testing platform

GoAPI is a team collaboration platform focused on interface life-cycle management and on improving R&D and testing efficiency. The platform provides convenient interface management, zero-threshold multi-dimensional automated testing, complete OpenAPI extensions, and other rich capabilities, greatly reducing enterprise R&D and testing costs. Try it at: [GoAPI interface test platform](https://www.163yun.com/product/goapi?fromyice=M_juejin_6857026426718453768)

Interface based team collaboration platform

Let's look at the details of one XX business application to get a feel for how 10,000+ interface tests are done:

An overview of the product

Interface testing

Scenario testing

Execution sets

**【Key summary】**

  • A platform is born from the continuous evolution of business problems and challenges;
  • The core of how to improve ROI is to reduce investment costs and increase utilization;