The original article was published on the WeChat official account: Jzman-blog

Time complexity and space complexity help us choose an appropriate algorithm for a specific platform and teach the design ideas of trading space for time and time for space. On a microcontroller, where memory is usually tight, the optimal algorithm should trade time for space; on a device with ample memory, it can instead trade space for time. Time and space complexity can therefore serve as a way to judge how fast an algorithm or block of code runs under given constraints. They can be understood and learned mainly from the following aspects:

  1. The relationship between data structures and algorithms
  2. Time complexity
  3. Space complexity
  4. Conclusion

The relationship between data structures and algorithms

A data structure is a way of storing a set of data, and an algorithm is a set of methods for manipulating that data; a data structure serves an algorithm, and an algorithm operates on a particular data structure.

Big O complexity notation

Big O notation gives a rough idea of the time efficiency of a piece of code. Take this function:

 int cal(int n) {
   int sum = 0;
   int i = 1;
   for (; i <= n; ++i) {
     sum = sum + i;
   }
   return sum;
 }


If each line of effective code (such as an assignment) takes one unit of time to execute, then the running time of the code above is 2n + 2 units: two units for the initial assignments to sum and i, plus 2n units for the n executions each of ++i and sum = sum + i. The execution time of the code is proportional to the number of times each line of effective code executes.

If this is the code:

 int cal(int n) {
   int sum = 0;
   int i = 1;
   int j = 1;
   for (; i <= n; ++i) {
     j = 1;
     for (; j <= n; ++j) {
       sum = sum + i * j;
     }
   }
   return sum;
 }

Under the same assumption, the running time of the code above is 2n*n + 2n + 3 units: three initial assignments, n executions each of ++i and j = 1, and n*n executions each of ++j and sum = sum + i * j. Again, the execution time is proportional to the number of times each line of effective code executes, which can be expressed by the general formula:

T(n) = O(f(n))

where T(n) is the code's execution time, n is the data size, and f(n) is the total number of statement executions as a function of n.

Then the relationship between execution time and execution count for the two pieces of code above can be expressed as:

T(n) = O(2n + 2)
T(n) = O( 2n*n + 2n + 3)

As the data size grows, the constant terms and coefficients do not affect the growth trend of the execution time. Of the two functions above, one is linear and the other is quadratic; as n increases, the value of the quadratic function will eventually exceed that of the linear one. So we keep only the highest-order term, drop the coefficients and lower-order terms, and simplify as follows:

T(n) = O(n)
T(n) = O(n*n)

Now we can say that the time complexities of the two pieces of code above, expressed in big O notation, are O(n) and O(n*n), i.e. O(n^2).
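To see concretely why the lower-order terms vanish, here is a minimal sketch (written for this article, not part of the original) that counts the loop iterations of the two functions above:

```java
// Sketch: count the loop iterations of the O(n) and O(n*n) functions
// to see why only the highest-order term matters as n grows.
public class GrowthDemo {
    // Iterations of the single loop: exactly n
    static long linearOps(int n) {
        long ops = 0;
        for (int i = 1; i <= n; ++i) ops++;
        return ops;
    }

    // Iterations of the nested loops: exactly n * n
    static long quadraticOps(int n) {
        long ops = 0;
        for (int i = 1; i <= n; ++i)
            for (int j = 1; j <= n; ++j) ops++;
        return ops;
    }

    public static void main(String[] args) {
        // For n = 1000 the quadratic count already dwarfs the linear one,
        // so constants like the "+2" or "+3" are irrelevant.
        System.out.println(linearOps(1000));    // 1000
        System.out.println(quadraticOps(1000)); // 1000000
    }
}
```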

Time complexity

Time complexity: from the above we know that big O notation expresses the growth trend of a piece of code's execution time as the data size increases. Since big O assumes that only the highest-order term matters as data grows, complexity analysis focuses on the term with the largest order of magnitude.

The common time complexity orders, from smallest to largest, are:

Constant order O(1) < logarithmic order O(logn) < linear order O(n) < linear-logarithmic order O(nlogn) < square order O(n^2) < cubic order O(n^3) < factorial order O(n!) < exponential order O(n^n)

Of the orders above, the factorial order and the exponential order are non-polynomial. As the data size grows, the execution time of a non-polynomial algorithm increases sharply, making such algorithms the least efficient. The polynomial orders are described point by point below.

O(1): no matter how many lines of code there are, as long as the number of executions is a fixed constant, the time complexity is O(1); there is no such thing as O(2) or O(3).
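As a minimal illustration (a sketch written for this article), the following function executes the same fixed number of statements no matter how large the input array is, so its time complexity is O(1):

```java
// O(1) sketch: the number of statements executed is fixed
// and does not depend on the length of the array.
public class ConstantDemo {
    static int head(int[] a) {
        int first = a[0];        // one read
        int doubled = first * 2; // one multiply
        return doubled;          // same fixed steps for any array length
    }

    public static void main(String[] args) {
        System.out.println(head(new int[]{5, 6, 7})); // 10
    }
}
```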

O(logn): computing a logarithmic time complexity mainly means finding how many times the loop body runs before the loop condition fails. For example:

 i=1;
 while (i <= n)  {
   i = i * 2;
 }

For the code above, we only need to know how many times the loop body executes, that is, the relationship between i and n. The values of i are 1, 2, 4, 8, 16, and so on, i.e. 2^0, 2^1, 2^2, 2^3, 2^4, so after t iterations 2^t = n, which gives t = log2(n). Since logarithms of different bases differ only by a constant factor, the base is dropped and the time complexity is the logarithmic O(logn). Linear-logarithmic order O(nlogn) is simply the result of running this O(logn) code n times.
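The two counts can be checked directly. Below is a small sketch (written for this article) that counts the doublings in the loop above and then repeats that loop n times to produce the O(nlogn) pattern:

```java
public class LinearithmicDemo {
    // Count how many times the loop body runs while i doubles up to n:
    // roughly log2(n) iterations.
    static int doublings(int n) {
        int count = 0;
        for (int i = 1; i <= n; i = i * 2) count++;
        return count;
    }

    // Run the O(logn) loop n times: total work is on the order of n * logn.
    static long nLogN(int n) {
        long total = 0;
        for (int k = 0; k < n; ++k) total += doublings(n);
        return total;
    }

    public static void main(String[] args) {
        System.out.println(doublings(16)); // 5 (i takes the values 1,2,4,8,16)
        System.out.println(nLogN(16));     // 16 * 5 = 80
    }
}
```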

O(m+n): for the following code, the complexity cannot simply be reduced to a single highest-order term:

int cal(int m, int n) {
  int sum_1 = 0;
  int i = 1;
  for (; i < m; ++i) {
    sum_1 = sum_1 + i;
  }

  int sum_2 = 0;
  int j = 1;
  for (; j < n; ++j) {
    sum_2 = sum_2 + j;
  }

  return sum_1 + sum_2;
}


Here m and n represent two independent data sizes, and we cannot tell which of them is larger, so the time complexity is expressed as O(m) + O(n), i.e. O(m+n). If the two loops were nested instead, so that the sum_1 and sum_2 work multiplied rather than added, the corresponding time complexity would be O(m) * O(n), i.e. O(m*n).
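The multiplicative case can be sketched as follows (an illustration written for this article, not code from the original):

```java
public class NestedDemo {
    // Nested loops over two independent sizes: the inner body runs
    // m * n times, so the time complexity is O(m) * O(n) = O(m*n).
    static int cal(int m, int n) {
        int sum = 0;
        for (int i = 1; i <= m; ++i) {
            for (int j = 1; j <= n; ++j) {
                sum = sum + i * j; // executes m * n times
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // Sum over i=1..3, j=1..2 of i*j = (1+2+3) * (1+2) = 18
        System.out.println(cal(3, 2));
    }
}
```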

That covers the basics of time complexity analysis. Now consider a special case. Analyze the time complexity of the following code:

// n indicates the length of the array
int find(int[] array, int n, int x) {
  int i = 0;
  int pos = -1;
  for (; i < n; ++i) {
    if (array[i] == x) pos = i;
  }
  return pos;
}


Analysis: the assignments to i and pos execute twice in total, which does not affect the growth trend of the execution time as the data size increases and can be ignored. Inside the for loop, even if the if condition is satisfied at some iteration m, the loop does not exit, so the loop body always runs n times. The time complexity of this code is therefore always O(n).

// n indicates the length of the array
int find(int[] array, int n, int x) {
  int i = 0;
  int pos = -1;
  for (; i < n; ++i) {
    if (array[i] == x) {
      pos = i;
      break;
    }
  }
  return pos;
}

With the break added, there are two possibilities. If a value equal to x is found at some position m, the loop exits after m + 1 iterations; when m is a constant (x is near the front of the array), that constant can be ignored and the complexity can be as low as O(1). If no element satisfies the if condition, the loop runs all n times and the time complexity is O(n). So the same code can have different time complexities under different conditions. For this situation, time complexity is refined into the following cases:

  • Best-case time complexity: the complexity of executing a piece of code in the most favorable case; for the code above, O(1) when the first element of the array already satisfies the if condition;
  • Worst-case time complexity: the complexity of executing a piece of code in the least favorable case; for the code above, O(n) when no element in the array satisfies the condition and the for loop never exits early.
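The two cases can be made visible by instrumenting the early-exit search above with a comparison counter (a sketch written for this article; the counter is not part of the algorithm):

```java
public class FindCasesDemo {
    static int comparisons; // instrumentation only, not part of the algorithm

    // Same early-exit search as above, with the comparisons counted.
    static int find(int[] array, int n, int x) {
        comparisons = 0;
        for (int i = 0; i < n; ++i) {
            comparisons++;
            if (array[i] == x) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 1};
        find(a, 4, 7);
        System.out.println(comparisons); // best case: 1 comparison -> O(1)
        find(a, 4, 42);
        System.out.println(comparisons); // worst case: 4 comparisons -> O(n)
    }
}
```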

Space complexity

Space complexity reflects the growth trend of storage space as the data size increases, as in this code:

void print(int n) {
  int i = 0;            // stack memory
  int[] a = new int[n]; // heap memory
  for (; i < n; ++i) {
    a[i] = i * i;
  }
}

The code above requests memory in only two places. The memory for the variable i is a fixed size and can be ignored; the array declaration requests memory for n ints, which grows with n, so the space complexity of this code is O(n).
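For contrast, here is a sketch (written for this article) placing an O(1)-space function next to the O(n)-space case analyzed above:

```java
public class SpaceDemo {
    // O(1) space: only a fixed number of local variables, regardless of n.
    static int sumTo(int n) {
        int sum = 0;                          // fixed-size local
        for (int i = 1; i <= n; ++i) sum += i;
        return sum;
    }

    // O(n) space: the array grows with n (the case analyzed above).
    static int[] squares(int n) {
        int[] a = new int[n];                 // n ints allocated on the heap
        for (int i = 0; i < n; ++i) a[i] = i * i;
        return a;
    }

    public static void main(String[] args) {
        System.out.println(sumTo(10));         // 55, using constant space
        System.out.println(squares(4).length); // 4, space proportional to n
    }
}
```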

Conclusion

Time complexity reflects the growth trend of execution time with data size, focusing on constructs such as nested loops; space complexity reflects the growth trend of storage space with data size. Time complexity is discussed more often than space complexity, and when developers say "complexity" without qualification they generally mean time complexity. In addition, for a specific piece of code, time complexity can be refined into best-case, worst-case, average-case, and amortized time complexity; the analysis idea is basically the same, only the constraints differ.

See the WeChat official account for more content.