In my last blog post, JMeter Performance Testing 3.0 Era – New JMeter Plug-in Management, I said I would write next about the genuinely new features in JMeter 3.0. Two weeks later, I am finally picking the series back up this weekend. This article introduces a feature new in JMeter 3.0: the Dashboard Report, a graphical, multi-dimensional test report in HTML format. With its help, presenting results becomes much easier when building a performance testing platform on top of JMeter, and we can spend our energy on back-end platform features rather than first having to learn a front-end charting library.

1. Why this new feature

Before JMeter 3.0, the tool itself only offered graphical display of a few dimensions of the test results in its UI, which caused me two problems:

  1. In practice, after integrating JMeter into a platform, we need to display charts such as TPS curves and average response time curves on our pages, which means manually wiring up a front-end chart library such as Highcharts or ECharts.
  2. To view historical test results, JMeter must start its graphical interface and import the saved CSV results. The process is tedious, and when the result set is large, JMeter takes a considerable amount of time to render the graphical report.

The new features discussed in this article bring a better solution to both of these problems:

  • The new feature provides excellent visualization of the result data, generating reports as HTML pages that contain most of the metrics of interest in a real test; these can easily be embedded into a platform so that each test run can be viewed from the browser.
  • As long as the generated HTML pages are retained, viewing past test results later is as simple as opening them in a browser.

2. Introducing the new feature

JMeter 3.0 provides an extension module for generating graphical reports as HTML pages. The module supports generating multi-dimensional graphical test reports in two ways:

  1. At the end of the JMeter performance test, an HTML graphical report of the test is automatically generated
  2. Use an existing result file, such as a CSV file, to generate an HTML graphical report of the result

The default measurement dimensions are:

  1. APDEX (Application Performance Index)
  2. Aggregated report
  3. Errors report
  4. Response time curve
    • Shows how the average response time changes over time
    • Similar to the jp@gc – Response Times Over Time chart from JMeter Plugins
  5. Data throughput time curve
    • Shows how the data throughput per second changes over time
    • Similar to the jp@gc – Bytes Throughput Over Time chart from JMeter Plugins
  6. Latency time curve
    • Shows how latency changes over time
    • Similar to the jp@gc – Response Latencies Over Time chart from JMeter Plugins
  7. Hits per second curve
    • Similar to the jp@gc – Hits per Second chart from JMeter Plugins
  8. HTTP status code time distribution curve
    • Displays the distribution of response status codes over time
    • Similar to the jp@gc – Response Codes per Second chart from JMeter Plugins
  9. Transaction throughput time curve (TPS)
    • Shows the number of transactions processed per second over time
    • Similar to the jp@gc – Transactions per Second chart from JMeter Plugins
  10. Graph of average response time versus requests per second
    • Shows the average response time in relation to the number of requests per second (QPS)
  11. Latency versus requests per second
  12. Percentile plot of response time
  13. Curve of active threads over time
  14. Average response time versus number of threads
    • Shows the average response time in relation to the number of threads
    • Similar to the jp@gc – Response Times vs Threads chart from JMeter Plugins
  15. Histogram of response time distribution

Note 1: Latency = the time at which the first byte of the response is received – the time at which the request is sent

From just before sending the request to just after the first response has been received — Apache JMeter Glossary

Response time (Elapsed Time in JMeter terms) = the time at which the last byte of the response is received – the time at which the request is sent

From just before sending the request to just after the last response has been received — Apache JMeter Glossary

Note 2: The Apdex standard transforms application response-time performance, from the user's point of view, into a user-satisfaction rating of application performance, quantified as a value in the range 0–1.

Apdex (Application Performance Index) is an open standard developed by an alliance of companies. It defines a standardized method for reporting and comparing the performance of software applications in computing. — Wikipedia

3. Quick start

1. Confirm basic configurations

  • Verify the following configuration items in jmeter.properties or user.properties:
    jmeter.save.saveservice.bytes = true
    jmeter.save.saveservice.label = true
    jmeter.save.saveservice.latency = true
    jmeter.save.saveservice.response_code = true
    jmeter.save.saveservice.response_message = true
    jmeter.save.saveservice.successful = true
    jmeter.save.saveservice.thread_counts = true
    jmeter.save.saveservice.thread_name = true
    jmeter.save.saveservice.time = true
    # the timestamp format must include the time and should include the date.
    # For example the default, which is milliseconds since the epoch: 
    jmeter.save.saveservice.timestamp_format = ms
    # Or the following would also be suitable:
    # jmeter.save.saveservice.timestamp_format = yyyy/MM/dd HH:mm:ss
  • If you want to display more detailed data in the Errors report, also make sure that:
    • jmeter.save.saveservice.assertion_results_failure_message = true
    • If a Transaction Controller is used, Generate parent sample is not selected
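Before kicking off a run it can be handy to check these settings programmatically. Below is a minimal sketch of such a check — my own helper, not an official JMeter tool — that parses a properties file's text and reports which of the required saveservice keys are missing or mis-set:

```python
# Hypothetical helper: verify that jmeter.properties / user.properties text
# contains the saveservice settings the dashboard generator requires.
REQUIRED = {
    "jmeter.save.saveservice.bytes": "true",
    "jmeter.save.saveservice.label": "true",
    "jmeter.save.saveservice.latency": "true",
    "jmeter.save.saveservice.response_code": "true",
    "jmeter.save.saveservice.response_message": "true",
    "jmeter.save.saveservice.successful": "true",
    "jmeter.save.saveservice.thread_counts": "true",
    "jmeter.save.saveservice.thread_name": "true",
    "jmeter.save.saveservice.time": "true",
}

def missing_settings(properties_text: str) -> list:
    """Return the required keys that are absent or not set to the expected value."""
    props = {}
    for line in properties_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if "=" in line:
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return [k for k, v in REQUIRED.items() if props.get(k) != v]
```

For example, feeding it the full block above returns an empty list, while a file with only `jmeter.save.saveservice.bytes = true` reports the other eight keys as missing.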

2. Generate reports

A. Report at the end of the stress test

  • Basic command format:

    jmeter -n -t <test JMX file> -l <test log file> -e -o <output folder>
  • Sample:

    jmeter -n -t F:\PerformanceTest\TestCase\script\getToken.jmx -l testLogFile -e -o ./output
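When driving JMeter from a platform, the command is typically assembled in code and handed to a process runner. The sketch below (the function name is my own invention) builds the argument list for the command format above:

```python
# Hypothetical helper: build the non-GUI JMeter command line that runs a
# test plan and generates the HTML dashboard at the end of the run.
def jmeter_dashboard_cmd(jmx_path: str, log_file: str, output_dir: str) -> list:
    return [
        "jmeter",
        "-n",             # non-GUI mode
        "-t", jmx_path,   # test plan (.jmx file)
        "-l", log_file,   # results log file (CSV)
        "-e",             # generate the HTML report after the load test
        "-o", output_dir, # report output folder (must be empty or not yet exist)
    ]
```

The resulting list can then be passed to, e.g., `subprocess.run(jmeter_dashboard_cmd("getToken.jmx", "testLogFile", "./output"))`.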

B. Use the existing stress test CSV log file to generate a report

  • Basic command format:

    jmeter -g <test log file> -o <output folder>
  • Sample:

    jmeter -g D:\apache-jmeter-3.0\bin\testLogFile -o ./output

Both samples produce the following files (and folders) in the output directory under apache-jmeter-3.0\bin:

Open the index.html file in your browser to view various graphical reports:



4. Custom configuration

JMeter 3.0 adds a reportgenerator.properties file to the bin directory, which holds all of the default configuration for the graphical HTML report generation module. To change the configuration, it is best not to edit this file directly; instead, override the relevant settings in user.properties.

1. General configuration

Overall configuration items use the prefix jmeter.reportgenerator., for example: jmeter.reportgenerator.overall_granularity=60000

  • overall_granularity: defines the sampling-point granularity; the default is 60000 ms. In tests other than stability tests we often need a finer granularity, such as 1000 ms; we can add the following configuration at the end of user.properties:
    # Change this parameter if you want to change the granularity of over time graphs.
    jmeter.reportgenerator.overall_granularity=6000
  • report_title: defines the title of the report, which we may want to set to the actual test project name
  • apdex_satisfied_threshold: defines the threshold, in ms, for the Apdex "satisfied" rating
  • apdex_tolerated_threshold: defines the threshold, in ms, for the Apdex "tolerated" rating

    Apdex_t = (Satisfied Count + Tolerating Count / 2) / Total Samples
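The formula above can be sketched in a few lines of Python. The default thresholds here (satisfied ≤ 500 ms, tolerated ≤ 1500 ms) match JMeter's report generator defaults:

```python
# Sketch of the Apdex formula: samples at or under the "satisfied" threshold
# count fully, samples between the two thresholds count half, slower samples
# (and errors, not modeled here) count zero.
def apdex(samples_ms, satisfied=500, tolerated=1500):
    satisfied_count = sum(1 for t in samples_ms if t <= satisfied)
    tolerating_count = sum(1 for t in samples_ms if satisfied < t <= tolerated)
    return (satisfied_count + tolerating_count / 2) / len(samples_ms)

# e.g. apdex([100, 400, 800, 2000]) -> (2 + 1/2) / 4 = 0.625
```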

Also, in jmeter.properties, there are default values for the three percentiles in the collection report:

aggregate_rpt_pct1 : Defaults to 90
aggregate_rpt_pct2 : Defaults to 95
aggregate_rpt_pct3 : Defaults to 99

These defaults can be overridden in user.properties, for example: aggregate_rpt_pct1=70. The effect is as follows:
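To make the percentile settings concrete, here is a small nearest-rank percentile sketch (an illustration of the idea, not JMeter's exact implementation) showing what the aggregate report computes over the raw response times:

```python
import math

def percentile(sorted_ms, pct):
    """Nearest-rank percentile of an ascending-sorted list of response times."""
    rank = math.ceil(pct / 100 * len(sorted_ms))
    return sorted_ms[max(rank - 1, 0)]

# With 100 samples of 1..100 ms, the default aggregate_rpt_pct1/2/3
# correspond to the 90th, 95th and 99th values respectively.
```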

2. Chart configuration

Each chart's configuration uses the prefix jmeter.reportgenerator.graph.<chart name>.

  • classname: the implementation class of the chart; if you have your own custom implementation, set this value to the class name of the custom implementation class
  • title: the chart title; for example, if you want a Chinese title, configure it here
  • property.set_granularity: sets the sampling-point granularity for the chart; if not specified, the overall_granularity value is used by default

3. Output configuration

The output configuration uses the prefix jmeter.reportgenerator.exporter.

  • property.output_dir: configures the default report output path. You can override this configuration with the -o option on the command line.
  • html.series_filter: used to filter which series are displayed. Add the following configuration to user.properties:

    jmeter.reportgenerator.exporter.html.series_filter=(^Login)(-success|-failure)?

    The final report then shows only the data for the sampler named Login. The filter consists of two parts: the first part is a regular expression that selects the sampler names to keep, and the (-success|-failure)? suffix is required by the Transactions per second chart.
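As a rough illustration of how that filter behaves (using Python's re module rather than JMeter's own matching code), the pattern keeps the Login sampler and its -success/-failure series while dropping other samplers:

```python
import re

# The series_filter pattern from the configuration above.
series_filter = re.compile(r"(^Login)(-success|-failure)?")

names = ["Login", "Login-success", "Login-failure", "Logout", "GetToken"]
# Keep only the series whose names match at the start of the string.
kept = [n for n in names if series_filter.match(n)]
# kept -> ["Login", "Login-success", "Login-failure"]
```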

Summary

The Dashboard Report feature is essentially a long-overdue update to how Apache JMeter visualizes test result data. It is not flashy, but at the very least it is a boon for anyone who needs to run performance tests with JMeter. Finally, thanks to the contributors of the Apache JMeter project for continuing to improve it.

References

  1. Apache JMeter Dashboard Report
  2. Apache JMeter Glossary