Why performance testing

Look at performance through the eyes of the end user

Performance measurement

  • Availability
  • Response time
  • Throughput
  • Utilization
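
To make these four measurements concrete, here is a small illustrative sketch showing how each could be derived from a request log. The `Request` records and the ten-second observation window are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Request:
    start: float      # seconds since the test started
    duration: float   # seconds spent servicing the request
    ok: bool          # True if the request succeeded

def metrics(requests, window):
    """Derive the four basic measurements from a request log.
    `window` is the length of the observation period in seconds."""
    total = len(requests)
    succeeded = sum(1 for r in requests if r.ok)
    availability = succeeded / total                      # fraction of successful requests
    avg_response = sum(r.duration for r in requests) / total
    throughput = total / window                           # requests per second
    busy = sum(r.duration for r in requests)
    utilization = busy / window                           # fraction of time spent working
    return availability, avg_response, throughput, utilization

log = [Request(0.0, 0.2, True), Request(1.0, 0.4, True),
       Request(2.0, 0.3, False), Request(3.0, 0.1, True)]
print(metrics(log, window=10.0))  # roughly (0.75, 0.25, 0.4, 0.1)
```

Real tools compute the same quantities, only from thousands of requests captured during a timed test run rather than a hand-written log.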

Performance criteria

  • More than 15 seconds: rules out interactive use. If such delays do occur, the system should be designed so that the user can move on to other tasks and request the response later.
  • More than 4 seconds: delays of this length impede problem-solving activity and disrupt data entry. However, after a transaction has completed, a delay of 4-15 seconds can be tolerated.
  • Less than 2 seconds: when the user has to remember information across several responses, response time must stay under 2 seconds.
  • Subsecond: thought-intensive work, especially in graphical applications, requires very short response times to keep the user interested and hold their attention.
  • 0.1 seconds: the expected response to on-screen typing, such as echoing a keystroke.

Commercial value

Performance problems have a nasty habit of waiting until it's too late to find them, and the later you find them, the more expensive they are to fix. As planned: an application that deploys successfully, with few or no problems after going live, begins to deliver benefits to the enterprise immediately. In practice: actual deployment very often falls short of the plan because of the time and money spent solving performance problems. This is bad news for the enterprise, because the application fails to deliver its expected benefits.

Performance testing maturity

  • Fire-fighting: performance testing is rarely or never performed, so performance problems surface in production and must be fixed immediately.
  • Performance validation: the company schedules a dedicated period for performance testing, but a significant share of performance defects (around 30%) still slip through development.
  • Performance-driven: system performance is considered at every stage of the application life cycle, and few performance issues (around 5%) reach production.

Lack of performance considerations during system design

  • How many users the system must support is never established
  • The popularity of the website is underestimated

Automated testing

It is almost impossible to perform effective performance testing without automated testing tools.
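
As a minimal sketch of what such a tool automates, the snippet below simulates several concurrent virtual users and records per-transaction response times. `fake_transaction` is a hypothetical stand-in for a real request to the system under test:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_transaction():
    """Stand-in for a real transaction; a real tool would issue an
    HTTP request or similar here."""
    time.sleep(0.01)

def run_load(users, iterations):
    """Run `users` concurrent virtual users, each performing
    `iterations` transactions, and collect response times."""
    timings = []
    def virtual_user():
        for _ in range(iterations):
            t0 = time.perf_counter()
            fake_transaction()
            timings.append(time.perf_counter() - t0)
    with ThreadPoolExecutor(max_workers=users) as pool:
        for _ in range(users):
            pool.submit(virtual_user)
    return timings  # one response time per completed transaction

timings = run_load(users=5, iterations=3)
print(len(timings))  # 5 users x 3 iterations = 15 samples
```

Commercial and open-source load tools add the parts this sketch omits: realistic workload models, ramp-up schedules, result aggregation, and server-side monitoring.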

Basic principles for effective application performance testing

There are a number of issues you need to address before implementing an effective performance testing strategy.

  • Choose an appropriate performance testing tool
  • Design an appropriate performance test environment
  • Design realistic performance test objectives
  • Ensure that the application under test is sufficiently stable
  • Obtain a code freeze
  • Identify and script the business-critical transactions
  • Provide sufficient, good-quality test data
  • Ensure a realistic performance test design
  • Identify key performance indicators (KPIs) for monitoring servers and networks
  • Schedule adequate time for effective performance testing

Environmental audit

  • Number of Servers
  • Load Balancing Policy
  • Hardware information
  • Software listing
  • List of application components
  • External connections

Key Performance Indicators

  • Availability or uptime
  • Concurrency, scalability, and throughput
  • Response time
  • Network capacity (data volume, data throughput, data error rate)
  • Server capacity (CPU, memory, disk I/O, disk space)
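
As an illustration of how two of these KPIs might be derived from raw results, the sketch below computes a nearest-rank 90th-percentile response time and an error rate; the sample numbers are invented for the example:

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the smallest value with at least
    p% of the samples at or below it."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[max(rank - 1, 0)]

# Hypothetical measurements from one test run.
response_times = [0.2, 0.25, 0.3, 0.3, 0.35, 0.4, 0.45, 0.5, 1.8, 2.1]
failed, total = 3, 100

print(percentile(response_times, 90))  # 90th-percentile response time
print(failed / total)                  # error rate
```

Percentiles are usually more informative than averages here: a single slow outlier barely moves the mean but shows up clearly at the 90th or 95th percentile.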

Ensure that the application is stable enough that every request succeeds

Focus on the following potential hidden problems

  • Displaying very large volumes of data
  • SQL statements that are not executed efficiently
  • Massive network data interaction
  • Unknown error in application

Code freeze

Ensure that the code does not change once testing begins, and notify the testers of any change that does occur.

Identify and validate business-critical transactions

You need to identify the small set of critical transactions that the average user performs most often during a working day. These transactions will form the core of your performance testing.