Isn’t ApiTestEngine an interface testing framework? How can it also do performance testing?

Yes, you read that right. ApiTestEngine integrates with the Locust performance testing framework: with a single set of test cases you can run both interface automation tests and interface performance tests. It is even easier to use than Locust itself, and none of Locust’s existing features are changed.

If you’re new to Locust, this article is probably not for you, but I highly recommend that you check Locust out. Locust is an open-source performance testing tool written in Python. It is simple, lightweight, and efficient; its concurrency mechanism is based on gevent coroutines, which lets a single machine generate high concurrent load. I’ve written about Locust’s features and tutorials before, and you can find those articles on my blog.

If you’re not interested in the implementation process, you can skip straight to the “Final effect” section at the end of the article.

Inspiration

In the testing tools currently on the market, interface testing and performance testing are two distinct domains. This means that when we want to run both interface automation tests and interface performance tests against the same system’s server-side interfaces, we usually have to use different tools and maintain two separate sets of test scripts or cases.

That’s what I used to do. But after doing it for a while, I started thinking: whether it is an interface functional test or an interface performance test, the core work is the same, namely simulating requests to the interface and then parsing and verifying the interface response. The only difference is that interface performance testing adds the concept of concurrency, which is equivalent to a large number of simulated users running the interface tests at the same time.

If that is the case, interface automation test cases and interface performance test scripts should be merged into a single set, avoiding repetitive script development effort.

As mentioned in previous articles, during the development of ApiTestEngine I implemented its HTTP request handling entirely on top of the python-requests library, so all features of python-requests can be reused when writing interface test cases. As it happens, Locust’s HTTP client is also built entirely on python-requests.
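
To make that relationship concrete, here is a minimal sketch (an illustrative stand-in, not ApiTestEngine’s actual Runner) of why the reuse is possible: Locust’s self.client is an HttpSession, which is a subclass of requests.Session, so any runner written against the requests.Session interface works the same whether it is driven by a functional test or by Locust.

import requests

class DemoRunner(object):
    """Illustrative stand-in for a request runner; not ApiTestEngine's actual Runner."""

    def __init__(self, http_client=None):
        # a plain requests.Session for functional tests,
        # or Locust's HttpSession (a requests.Session subclass) for load tests
        self.client = http_client or requests.Session()

    def run_request(self, method, url, **kwargs):
        resp = self.client.request(method, url, **kwargs)
        resp.raise_for_status()
        return resp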

Based on this relationship, a bold idea occurred to me: could there be some way to have the YAML/JSON interface test cases written for ApiTestEngine be invoked directly by Locust as well?

Preliminary exploration

Once you have an idea, you start exploring ways to implement it.

First, we can look at the scripting form of Locust. The following example is a simple scenario (taken from the front page of the official website).

from locust import HttpLocust, TaskSet, task

class WebsiteTasks(TaskSet):
    def on_start(self):
        self.client.post("/login", {
            "username": "test_user",
            "password": ""
        })

    @task
    def index(self):
        self.client.get("/")

    @task
    def about(self):
        self.client.get("/about/")

class WebsiteUser(HttpLocust):
    task_set = WebsiteTasks
    min_wait = 5000
    max_wait = 15000

In a Locust script, we describe the behavior of a single user in a TaskSet subclass, and each method with the @task decorator corresponds to one HTTP request scenario. One of the great things about Locust is that all test case scripts are plain Python files, so you can implement complex scenarios in Python.

Wait a minute! Simulating a single user’s requests in pure Python: isn’t that something we have already implemented in our interface tests?

For example, the following code is a test case captured from a unit test.

def test_run_testset(self):
    testcase_file_path = os.path.join(
        os.getcwd(), 'examples/quickstart-demo-rev-3.yml')
    testsets = utils.load_testcases_by_path(testcase_file_path)
    results = self.test_runner.run_testset(testsets[0])

test_runner.run_testset is a method already implemented in ApiTestEngine: together with utils.load_testcases_by_path, we only need the path of a test case file (YAML/JSON) to load the test cases and run the entire test scenario. Also, since the response validation is already described by the validators in the test case YAML/JSON, we don’t need to write any extra code to verify the interface responses.

Next, the implementation is very simple.

All we need to do is create a locustfile.py template file, which looks like this.

#coding: utf-8
import zmq
import os
from locust import HttpLocust, TaskSet, task
from ate import utils, runner

class WebPageTasks(TaskSet):
    def on_start(self):
        self.test_runner = runner.Runner(self.client)
        self.testset = self.locust.testset

    @task
    def test_specified_scenario(self):
        self.test_runner.run_testset(self.testset)

class WebPageUser(HttpLocust):
    host = ''
    task_set = WebPageTasks
    min_wait = 1000
    max_wait = 5000

    testcase_file_path = os.path.join(os.getcwd(), 'skypixel.yml')
    testsets = utils.load_testcases_by_path(testcase_file_path)
    testset = testsets[0]

As you can see, in the entire file, only the path of the test case file is relevant to the specific test scenario, and everything else can remain unchanged.

Therefore, for different test scenarios, you only need to replace testcase_file_path with the path of the corresponding interface test case file to run the interface performance test for that scenario.

➜  ApiTestEngine git:(master) locust -f locustfile.py
[2017-08-27 11:30:01,829] bogon/INFO/locust.main: Starting web monitor at *:8089
[2017-08-27 11:30:01,831] bogon/INFO/locust.main: Starting Locust 0.8a2

From this point on, everything else works exactly the same way as with Locust itself.

Optimization 1: Generate locustfile.py automatically

Through the previous exploration and practice, we have basically achieved a single set of test cases that serves both interface automation testing and interface performance testing.

However, it is not convenient enough to use, for two main reasons:

  • You need to manually modify the testcase_file_path in the template file;
  • The locustfile.py template file must sit in the root directory of the ApiTestEngine project.

So I came up with the idea of having the ApiTestEngine framework generate the locustfile.py file automatically.

In the process of implementing this idea, I thought about two ways.

The first: Locust has a load_locustfile method in main.py that loads a Python file and extracts locust_classes (subclasses of Locust); these locust_classes are then passed as parameters to the Locust runner.

Following this approach, we could implement our own load_locustfile method that dynamically generates locust_classes from the contents of the YAML/JSON file and then passes them to the Locust runner. This involves creating classes and adding methods on the fly. The benefit is that no intermediate locustfile.py needs to be generated, and it offers maximum flexibility; the drawback is that it requires reimplementing several functions from Locust’s main.py, in other words changing part of Locust’s startup code. Although this is not too difficult, I abandoned this option considering the maintenance effort needed to stay in sync with Locust updates.
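
Purely as an illustration of this abandoned option, here is a rough, hypothetical sketch of what building the classes on the fly could look like, reusing the same Runner and utils interfaces as the template above (the function and class names here are made up, not ApiTestEngine code):

from locust import HttpLocust, TaskSet, task
from ate import utils, runner

def build_locust_classes(testcase_file_path):
    """Dynamically build a Locust class from a YAML/JSON test case file."""
    testset = utils.load_testcases_by_path(testcase_file_path)[0]

    def on_start(self):
        self.test_runner = runner.Runner(self.client)
        self.testset = self.locust.testset

    @task
    def test_specified_scenario(self):
        self.test_runner.run_testset(self.testset)

    # create the TaskSet subclass and the HttpLocust subclass on the fly
    tasks_cls = type("DynamicTasks", (TaskSet,), {
        "on_start": on_start,
        "test_specified_scenario": test_specified_scenario,
    })
    locust_cls = type("DynamicLocust", (HttpLocust,), {
        "host": "",
        "task_set": tasks_cls,
        "min_wait": 1000,
        "max_wait": 5000,
        "testset": testset,
    })
    return [locust_cls]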

The second option is to generate an intermediate locustfile.py file and pass its path to Locust. The advantage is that we can use Locust directly without changing anything inside it. The only difference from the traditional way of using Locust is that instead of starting Locust with command-line parameters in the terminal, we now start Locust from Python code inside the ApiTestEngine framework.

Specifically, I added the locusts command to entry_points in setup.py and bound it to the corresponding entry function.

entry_points={
    'console_scripts': [
        'ate=ate.cli:main_ate',
        'locusts=ate.cli:main_locust'
    ]
}

The main_locust function was added in ate/cli.py as the entry point for the locusts command.

def main_locust():
    """ Performance test with locust: parse command line options and run commands. """
    try:
        from locust.main import main
    except ImportError:
        print("Locust is not installed, exit.")
        exit(1)

    sys.argv[0] = 'locust'
    if len(sys.argv) == 1:
        sys.argv.extend(["-h"])

    if sys.argv[1] in ["-h", "--help", "-V", "--version"]:
        main()
        sys.exit(0)

    try:
        testcase_index = sys.argv.index('-f') + 1
        assert testcase_index < len(sys.argv)
    except (ValueError, AssertionError):
        print("Testcase file is not specified, exit.")
        sys.exit(1)

    testcase_file_path = sys.argv[testcase_index]
    sys.argv[testcase_index] = parse_locustfile(testcase_file_path)
    main()

If you run locusts -V or locusts -h, you will find that the effect is exactly the same as with locust.

$ locusts -V
[2017-08-27 12:41:27,740] bogon/INFO/stdout: Locust 0.8a2
[2017-08-27 12:41:27,740] bogon/INFO/stdout:

In fact, as can be seen from the main_locust code above, the locusts command is only a thin wrapper around locust, and its usage is essentially the same. The only difference is that when the -f argument specifies a use case file in YAML/JSON format, the file is first converted into a Python-format locustfile.py, whose path is then passed on to Locust.

The parsing function parse_locustfile is also simple to implement. All we need to do is keep a locustfile.py template (ate/locustfile_template) inside the framework, with testcase_file_path replaced by a placeholder. The parsing function then reads the whole template, replaces the placeholder with the actual path of the YAML/JSON use case file, saves the result as locustfile.py, and returns its path.

The specific code is not posted here; if you are interested, you can check it out in the repository yourself.
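
That said, a minimal sketch of what such a parsing function might look like is shown below; the placeholder name $TESTCASE_FILE_PATH and the file handling are assumptions for illustration, not the actual implementation:

import os

def parse_locustfile(testcase_file_path):
    """Render the locustfile template with the YAML/JSON test case path."""
    template_path = os.path.join(os.path.dirname(__file__), "locustfile_template")
    with open(template_path) as f:
        content = f.read()

    # substitute the placeholder with the actual test case file path
    content = content.replace("$TESTCASE_FILE_PATH", testcase_file_path)

    locustfile_path = os.path.join(os.getcwd(), "locustfile.py")
    with open(locustfile_path, "w") as f:
        f.write(content)

    return locustfile_path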

With this round of optimization, ApiTestEngine inherits all the functionality of Locust and can directly specify a YAML/JSON file to start Locust for performance testing.

$ locusts -f examples/first-testcase.yml
[2017-08-18 17:20:43,915] Leos-MacBook-Air.local/INFO/locust.main: Starting web monitor at *:8089
[2017-08-18 17:20:43,918] Leos-MacBook-Air.local/INFO/locust.main: Starting Locust 0.8a2

Optimization 2: Start multiple Locust instances with one click

After the first round of optimization, the story should have ended there, because it had already become easy to switch between interface automation testing and interface performance testing.

Until one day, in a reply to a TesterHome discussion about Locust, @KeithMork said this.

Look forward to ApiTestEngine becoming more popular than Locust itself

I nearly burst into tears when I saw this. I had been working hard to maintain ApiTestEngine, but I had never dared to hope for that.

But on the flip side, why not? ApiTestEngine already inherits all the functionality of Locust, lets you write and maintain test cases in YAML/JSON format, does not affect any of Locust’s existing features, and the same test cases can be used for both interface automation testing and interface performance testing.

These are all features that Locust does not have, and they are practical features for users.

So the new goal became to encapsulate Locust even better in ApiTestEngine and make the Locust user experience more enjoyable.

Then I thought of an open-source project I had worked on earlier, debugtalk/stormer. The motivation for that project was that, to make full use of all CPU cores of the load generator machine when running Locust, you need to use master-slave mode. Because Locust runs as a single process by default, it can only use one CPU core of the load generator. With master-slave mode and multiple slaves, the slaves can run on different CPU cores, making full use of the load generator’s multi-core processor.

When actually using Locust, you have to start the master manually and then start multiple slaves one by one. Whenever the test script is adjusted, you need to kill all the Locust processes one by one and repeat the whole startup procedure. Anyone who has used Locust this way knows how painful the experience is. With that in mind, I developed debugtalk/stormer, whose goal was to start or destroy multiple Locust instances with a single command. Once the script was done I found it very pleasant to use, and some friends on GitHub liked it too.

Now that I wanted to improve the ease of using Locust through ApiTestEngine, this feature clearly had to be included. So the debugtalk/stormer project was retired, and its functionality was officially merged into debugtalk/ApiTestEngine.

Once the idea is clear, it’s easy to implement.

The principle remains the same: do not change anything inside Locust itself; only operate in the intermediate layer that handles parameter passing.

Specifically, we can add a --full-speed argument. Without this argument, the behavior is the same as before; with --full-speed, multiple Locust instances are started in separate processes (the number of instances equals the number of processor cores on the load generator machine).

def main_locust():
    # do original work

    if "--full-speed" in sys.argv:
        locusts.run_locusts_at_full_speed(sys.argv)
    else:
        locusts.main()

The implementation logic is in ate/locusts.py:

import multiprocessing
import sys
from locust.main import main

def start_master(sys_argv):
    sys_argv.append("--master")
    sys.argv = sys_argv
    main()

def start_slave(sys_argv):
    sys_argv.extend(["--slave"])
    sys.argv = sys_argv
    main()

def run_locusts_at_full_speed(sys_argv):
    sys_argv.pop(sys_argv.index("--full-speed"))
    slaves_num = multiprocessing.cpu_count()

    processes = []
    for _ in range(slaves_num):
        p_slave = multiprocessing.Process(target=start_slave, args=(sys_argv,))
        p_slave.daemon = True
        p_slave.start()
        processes.append(p_slave)

    try:
        start_master(sys_argv)
    except KeyboardInterrupt:
        sys.exit(0)

The key point here is the use of multiprocessing.Process, which calls Locust’s main() function in separate processes.

Final effect

After the previous optimizations, using ApiTestEngine for performance testing is now very straightforward.

After ApiTestEngine is installed, a locusts command is available on the system. It is used in almost exactly the same way as the locust command of the Locust framework, so you can simply use locusts wherever you would use the native locust command.

For example, the following commands behave exactly the same as with locust.

$ locusts -V
$ locusts -h
$ locusts -f locustfile.py
$ locusts -f locustfile.py --master -P 8088
$ locusts -f locustfile.py --slave &

The difference is that locusts is richer in functionality.

Interface test case files written for ApiTestEngine in YAML/JSON format can be run directly to start Locust for a performance test.

$ locusts -f examples/first-testcase.yml
[2017-08-18 17:20:43,915] Leos-MacBook-Air.local/INFO/locust.main: Starting web monitor at *:8089
[2017-08-18 17:20:43,918] Leos-MacBook-Air.local/INFO/locust.main: Starting Locust 0.8a2

With the --full-speed argument, multiple Locust instances are started at the same time (the number of instances equals the number of processor cores) to make full use of the load generator’s multi-core processor.

$ locusts -f examples/first-testcase.yml --full-speed -P 8088
[2017-08-26 23:51:47,071] bogon/INFO/locust.main: Starting web monitor at *:8088
[2017-08-26 23:51:47,075] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,078] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,080] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,083] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,084] bogon/INFO/locust.runners: Client 'bogon_656e0af8e968a8533d379dd252422ad3' reported as ready. Currently 1 clients ready to swarm.
[2017-08-26 23:51:47,085] bogon/INFO/locust.runners: Client 'bogon_09f73850252ee4ec739ed77d3c4c6dba' reported as ready. Currently 2 clients ready to swarm.
[2017-08-26 23:51:47,084] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,085] bogon/INFO/locust.runners: Client 'bogon_869f7ed671b1a9952b56610f01e2006f' reported as ready. Currently 3 clients ready to swarm.
[2017-08-26 23:51:47,085] bogon/INFO/locust.runners: Client 'bogon_80a804cda36b80fac17b57fd2d5e7cdb' reported as ready. Currently 4 clients ready to swarm.

In the future, ApiTestEngine will continue to be optimized; your feedback and suggestions for improvement are welcome.

Enjoy!