After some reflection and accumulation, I have finally started the first round of updates. Besides fixing bugs, I made a number of functional changes, and local deployment is now complete; if you are interested, try it out for yourself. For background, see the earlier article: Distributed performance testing framework single-node internal test.

Deciding on the name

The project name is DCS_FunTester: DCS stands for Distributed Control System, and FunTester is the suffix. The current project is the slave project; the master project is still being designed. This does not affect multi-node deployment of the slave project, though: the slave project does not depend on the master and can be deployed independently, on one node or many. To run a task, you call the HTTP interfaces of all slave nodes yourself.
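Since slaves are addressed directly, a caller has to fan the same request out to every node itself. A minimal sketch of that idea, where the class name and the example hosts are illustrative assumptions (only the endpoint paths come from this article):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical helper: build the per-node URLs a caller would hit when
// fanning a request out to all slave nodes. Hosts are made-up examples.
public class SlaveFanout {
    public static List<String> buildUrls(List<String> slaveHosts, String path) {
        return slaveHosts.stream()
                .map(host -> "http://" + host + path)
                .collect(Collectors.toList());
    }
}
```

A caller would then iterate over the returned URLs and issue the same HTTP request against each one.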

Swagger support

After trying a number of approaches, Swagger is still the best solution. The interface documentation address is http://124.70.188.11:8080/swagger-ui/#. The server will go offline when it expires, so replace the domain name and port with your own after deploying locally.

Local deployment

Local deployment was planned earlier, but progress fell a little behind. This week I pushed the code to Gitee, so you can download and deploy it as you like. The deployment documentation has not been written yet, because some things are still being tweaked during the beta. The deployment procedure is as follows:

Build FunTester

Gitee address: https://gitee.com/fanapi/tester, latest branch oker; simply pull the latest code. The build tool is Gradle; run the build or jar task. No Groovy SDK is required.

Build DCS_FunTester

This project depends on the FunTester jar. The Gradle build references it with compile files('/Users/oker/Library/groovy-3.0.8/lib/FunTester-1.0.jar'); change the path to your own. Gitee address: https://gitee.com/fanapi/dcs, default branch master; pull the latest code. The build tool is Gradle; run the build or jar task. No Groovy SDK is required.

Start the service

Docker is not used yet; the service is started with java -jar **** &.

Changing the Authentication Mode

Due to careless design earlier, the key authentication was placed in the interface parameters, which caused a series of problems. I have now changed the approach: the key is validated from the request header, and GET requests are ignored. This function only exists because the internal test service runs on the public network (the cloud server will go offline when it expires); it is not required for local deployment.

The code is located at org.funtester.dcs.common.wapper.WrappingFilter#doFilter, with the following content:

        String headerKey = req.getHeader(DcsConstant.HEADER_KEY);
        String method = requestWrapper.getMethod();
        if (!method.equalsIgnoreCase("get") && (StringUtils.isEmpty(headerKey) || !headerKey.equalsIgnoreCase(DcsConstant.HEADER_VALUE))) {
            response.getOutputStream().write(Result.fail("Verification failed!").toString().getBytes());
            return;
        }

Local deployment can remove this section.
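Stripped of the servlet plumbing, the filter boils down to a small predicate: GET requests pass through, and everything else must carry the expected header key. A standalone sketch of that logic, where the expected value "FunTester" is an assumption (the real value lives in DcsConstant.HEADER_VALUE):

```java
// Hypothetical standalone version of the check in WrappingFilter#doFilter.
public class AuthCheck {

    static final String HEADER_VALUE = "FunTester"; // assumed; really DcsConstant.HEADER_VALUE

    // GET requests are ignored; other methods need the correct header key.
    public static boolean isAuthorized(String method, String headerKey) {
        if (method.equalsIgnoreCase("get")) return true;
        return headerKey != null && headerKey.equalsIgnoreCase(HEADER_VALUE);
    }
}
```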

Execute the use case asynchronously

Before

The test interface used to run test cases serially, which in practice led to long interface response times. The internal test interface therefore limited the number of threads and requests, so that execution stayed within roughly 10 s. This update relaxes that constraint and adopts asynchronous execution of test cases.

Specific restrictions are as follows:

class HttpRequest extends AbstractBean implements Serializable {

    private static final long serialVersionUID = 324324327948379L;

    @NotNull
    JSONObject request

    // Number of executions
    @Range(min = 1L, max = 2000L)
    Integer times

    // Pressure test mode
    String mode
    // Number of threads; defaults to fixed-thread mode
    @Range(min = 1L, max = 100L)
    Integer thread

    // Soft start (ramp-up) time
    Integer runup

    // Use case description
    @Length(min = 1, max = 100)
    String desc

}

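For reference, a request body that satisfies the validation above might look like this; the shape of the inner request object depends on FunRequest's serialized form, so that part is an assumption:

```json
{
  "request": { "method": "GET", "url": "http://localhost:12345/demo" },
  "mode": "ftt",
  "times": 100,
  "thread": 10,
  "runup": 5,
  "desc": "demo case"
}
```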

After

The idea behind asynchronous execution is simple: once the com.funtester.frame.execute.Concurrent object has been assembled, call an asynchronous execution method, which returns a mark value. Usage is as follows:

    @PostMapping(value = "/post")
    public Result tests(@Valid @RequestBody HttpRequest request) {
        if (!ThreadBase.needAbort()) return Result.fail();
        JSONObject r = request.getRequest();
        HttpRequestBase re = FunRequest.initFromJson(r).getRequest();
        Integer times = request.getTimes();
        String mode = request.getMode();
        Integer thread = request.getThread();
        Integer runup = request.getRunup();
        String desc = request.getDesc();
        if (mode.equalsIgnoreCase("ftt")) {
            Constant.RUNUP_TIME = runup;
            RequestThreadTimes task = new RequestThreadTimes(re, times);
            Concurrent concurrent = new Concurrent(task, thread, desc);
            return Result.success(execute(concurrent));
        }
        return Result.fail();
    }

The contents of the execute method are as follows:

    /**
     * Asynchronously execute a use case
     *
     * @param concurrent
     * @return
     */
    public int execute(Concurrent concurrent) {
        int mark = SourceCode.getMark();
        new Thread(new Runnable() {
            @Override
            public void run() {
                PerformanceResultBean start = concurrent.start();
                FunData.results.put(mark, start);
            }
        }).start();
        return mark;
    }

Since a slave node is currently designed to run only one task at a time, thread safety is not considered here, and the mark is treated as unique.

org.funtester.dcs.common.FunData#results is a ConcurrentHashMap<Integer, PerformanceResultBean> that stores test case execution results, and an interface is provided to access them. Due to limited resources (lack of money), there is no database storage or per-user separation; results live directly in this map, and a scheduled task periodically cleans up expired data.

    /**
     * Periodically delete expired stored data
     * @return
     */
    @Scheduled(cron = "0 0 0/3 * * ?")
    def saveRequestBean() {
        def mark = SourceCode.getMark()
        List<Integer> dels = []
        FunData.results.keySet().each {
            if (mark - it > 7200) {
                dels << it
            }
        }
        dels.each {FunData.results.remove(it)}
        logger.info("Scheduled task completed! Time: {}", Time.getDate())
    }



For the same reason, thread safety is not a concern here.

Interface for obtaining test results:

    @GetMapping(value = "/get/{id}")
    public Result getRunResult(@PathVariable(name = "id") int id) {
        PerformanceResultBean performanceResultBean = FunData.results.get(id);
        return Result.success(performanceResultBean);
    }

Multiple request support

This update adds support for pressure testing with multiple requests. When a test task is created, the list of requests is stored, and requests are then taken from it in order to be sent. You can also pick requests at random, or fetch them with a simple i++ counter; if precision is not required, either way works. Traffic shaping (sending each request in a set proportion) will hopefully be added next time.
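The two fetch strategies mentioned above can be sketched like this; the class and method names are illustrative, not the project's API:

```java
import java.util.List;
import java.util.Random;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the two pick strategies: sequential round-robin
// via an incrementing counter, or an imprecise random pick.
public class RequestPicker<T> {

    private final List<T> requests;
    private final AtomicInteger index = new AtomicInteger();
    private final Random random = new Random();

    public RequestPicker(List<T> requests) {
        this.requests = requests;
    }

    // Take requests in order, wrapping around (the i++ approach).
    public T next() {
        return requests.get(Math.floorMod(index.getAndIncrement(), requests.size()));
    }

    // Pick a request at random (fine when precision is not required).
    public T any() {
        return requests.get(random.nextInt(requests.size()));
    }
}
```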

The implementation code is as follows:

    @PostMapping(value = "/post2")
    public Result tests2(@Valid @RequestBody HttpRequest2 request) {
        if (!ThreadBase.needAbort()) return Result.fail();
        JSONArray requests = request.getRequests();
        request.print();
        List<HttpRequestBase> res = new ArrayList<>();
        requests.forEach(f -> {
            res.add(FunRequest.initFromString(JSON.toJSONString(f)).getRequest());
        });
        Integer times = request.getTimes();
        String mode = request.getMode();
        Integer thread = request.getThread();
        Integer runup = request.getRunup();
        String desc = request.getDesc();
        if (mode.equalsIgnoreCase("ftt")) {
            Constant.RUNUP_TIME = runup;
            ListRequestMode task = new ListRequestMode(res, times);
            Concurrent concurrent = new Concurrent(task, thread, desc);
            return Result.success(execute(concurrent));
        }
        return Result.fail();
    }

The code of the custom class org.funtester.dcs.template.ListRequestMode is as follows:

package org.funtester.dcs.template

import com.funtester.base.constaint.FixedThread
import com.funtester.base.constaint.ThreadBase
import com.funtester.httpclient.FunLibrary
import org.apache.http.client.methods.HttpRequestBase

class ListRequestMode extends FixedThread<List<HttpRequestBase>> {

    ListRequestMode(List<HttpRequestBase> res, int times) {
        super(res, times, true)
    }

    @Override
    protected void doing() throws Exception {
        // FunLibrary.executeSimlple(res.get(index.getAndDecrement() % res.size()))
        FunLibrary.executeSimlple(random(f))
    }

    @Override
    ThreadBase clone() {
        return new ListRequestMode(f, limit)
    }
}

Getting test progress

The premise here is that a node can run only one use case at a time, so I added an attribute, com.funtester.base.constaint.ThreadBase#progress, that records the running state for external access.

HTTP access interface:

    @GetMapping(value = "/progress")
    public Result progress() {
        String s = ThreadBase.progress == null ? "No task running" : ThreadBase.progress.runInfo;
        return Result.success(s);
    }

The format of com.funtester.frame.execute.Progress#runInfo is: runInfo = String.format("%s: %s %s, current QPS: %d", taskDesc, getManyString(ONE, (int) (pro * LENGTH)), getPercent(pro * 100), getQPS());
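Reading that format string, the line is a task description, a textual progress bar, a percentage, and the current QPS. A hypothetical re-creation, where the bar length, bar character, and percent formatting are assumptions and only the format string itself comes from the source:

```java
// Hypothetical sketch of building a runInfo-style progress line.
public class ProgressLine {

    static final int LENGTH = 50; // assumed total bar length

    // pro is the fraction of the task completed, in [0, 1].
    public static String runInfo(String taskDesc, double pro, int qps) {
        StringBuilder bar = new StringBuilder();
        for (int i = 0; i < (int) (pro * LENGTH); i++) bar.append("#"); // getManyString(ONE, ...) analogue
        String percent = String.format("%.2f%%", pro * 100); // getPercent analogue
        return String.format("%s: %s %s, current QPS: %d", taskDesc, bar, percent, qps);
    }
}
```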


FunTester: Tencent Cloud Author of the Year, Boss Zhipin contract author, official GDevOps media partner, non-famous test developer. Welcome to follow.

  • FunTester test framework architecture diagram
  • Soft start of performance test
  • How to become a Full stack automation engineer
  • FunTester Moco Server framework architecture diagram
  • Use Case Scheme of Distributed Performance Testing Framework (PART 1)
  • Use Case Scheme of Distributed Performance Testing Framework (PART II)
  • Appium 2.0 quick reference
  • Probe into branch problems in link pressure measurement
  • Agile testing is two or three things
  • A complete guide to automated Testing Frameworks
  • Moco framework interface hit ratio statistics practice
  • Distributed performance testing framework single-node internal test