First, description

Time complexity and space complexity are two criteria used to evaluate the efficiency of an algorithm. As a developer, I’m sure you hear these two concepts a lot, but what do they mean?

These two concepts can be understood quite literally:

Time complexity: the amount of time it takes to execute an algorithm; the faster, the better. For example, if you open the calculator on your computer, who would use it if it took a minute to do an ordinary calculation? You might as well do it yourself. Time complexity is a very important metric for algorithms, often even more important than space complexity, because in most situations today computers have plenty of memory and storage, while the user experience is better when results are produced quickly.

Space complexity: the amount of storage required to execute the algorithm; the smaller, the better. A computer's storage resources are limited, and if your algorithms always consume a lot of storage, that puts a heavy burden on the machine. In the field of embedded development in particular, memory and storage space are very limited, so the space complexity of an algorithm matters a great deal.

Stability: a sorting algorithm is stable if, when a equals b and a appears before b in the input, a still appears before b after sorting. It is unstable if, under the same conditions, a may end up behind b.
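A quick sketch of what stability means in practice (the class name and the "key:label" encoding below are mine, made up for illustration): Java's Arrays.sort on object arrays is documented to be stable, so entries that compare equal keep their input order.

```java
import java.util.Arrays;

public class StabilityDemo {
    // Sort by the digit before the colon only; the label after it is ignored.
    static String[] sortByKey(String[] items) {
        String[] copy = items.clone();
        // Arrays.sort on objects is a stable merge sort (TimSort), so the two
        // entries with key 2 keep their original relative order.
        Arrays.sort(copy, (a, b) -> Integer.compare(a.charAt(0), b.charAt(0)));
        return copy;
    }

    public static void main(String[] args) {
        String[] sorted = sortByKey(new String[] { "2:first", "1:x", "2:second" });
        // "2:first" stays in front of "2:second" after sorting.
        System.out.println(Arrays.toString(sorted));
    }
}
```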

Second, time complexity calculation

Notation

We generally use "big O notation" to express time complexity: T(n) = O(f(n)), where n is the input size that drives the change in complexity and f(n) describes how the running time grows with n.

Common orders of time complexity

  • Constant order O(1)
  • Linear order O(n)
  • Logarithmic order O(log n)
  • Linear-logarithmic order O(n log n)
  • Square order O(n²)
  • Cubic order O(n³)
  • k-th power order O(n^k)
  • Exponential order O(2^n)
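To get a feel for how quickly these orders diverge, here is a small sketch (the class and method names are mine, chosen for illustration) that evaluates log₂ n, n log₂ n and n² at a few sizes:

```java
public class GrowthDemo {
    // Exact integer log base 2, rounded up: the smallest k with 2^k >= n.
    static int log2(int n) {
        int k = 0;
        for (long v = 1; v < n; v <<= 1) k++;
        return k;
    }

    public static void main(String[] args) {
        System.out.printf("%8s %8s %10s %12s%n", "n", "log n", "n log n", "n^2");
        for (int n : new int[] { 8, 64, 1024 }) {
            // At n = 1024: log n is only 10, n log n is 10240, n^2 is already 1048576.
            System.out.printf("%8d %8d %10d %12d%n",
                    n, log2(n), (long) n * log2(n), (long) n * n);
        }
    }
}
```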

Let’s take a look at the types of algorithms that correspond to different complexities.

Constant order O(1)

int a = 1;
int b = 2;
int c = 3;

Assuming that each line of code takes one unit of time to execute, the above three lines of code take three units of time.

So is the time complexity of this code O(3)? Not really, because big O notation is not used to represent the actual execution time of an algorithm; it is used to represent the growth trend of the code's execution time.

The running time of the code above does not grow with any variable, so no matter how long such code is, even if it has tens of thousands of lines, it can be expressed as O(1).

Linear order O(n)

for(i = 1; i <= n; i++) {
   j = i;
   j++;
}

How many times does this code run? Line 1 is executed once, line 2 is executed n times, and line 3 is executed n times, so the total execution time is 2n + 1 units. Does that make the time complexity O(2n + 1)? No! Again: big O notation is not used to represent the actual execution time of an algorithm, but the growth trend of the code's execution time. So the time complexity is actually O(n).

Logarithmic order O(log n)

int i = 1;
while(i < n) {
    i = i * 2;
}

You can see that i is multiplied by 2 on each pass through the loop, so the loop runs log₂ n times in total, and this code has time complexity O(log n).

So the question is: why write O(log n) when it should be O(log₂ n)? In fact, the base does not matter when studying how efficiency grows: logarithms with different bases differ only by a constant factor, and big O notation ignores constant factors. What matters when writing code is how the data size n affects running time. By the same reasoning, if two running times differ only by a constant multiple, they can be treated as the same order of complexity.
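The log₂ n claim can also be checked empirically. The sketch below (the method name is mine) reuses the loop from above and counts how many times its body runs; for a power of two n, the count is exactly log₂ n:

```java
public class LogOrderDemo {
    // The same loop as above, instrumented to count body executions.
    static int countSteps(int n) {
        int i = 1, steps = 0;
        while (i < n) {
            i = i * 2;  // i doubles each pass: 1, 2, 4, 8, ...
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        // For n = 1024 the body runs exactly log2(1024) = 10 times.
        System.out.println(countSteps(1024));
    }
}
```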

Linear-logarithmic order O(n log n)

for(m = 1; m < n; m++) {
    i = 1;
    while(i < n) {
        i = i * 2;
    }
}

Linear-logarithmic order O(n log n) is easy to understand: if you run a loop of O(log n) code n times, the total is n times O(log n), which is O(n log n).

Square order O(n²)

for(x = 1; x <= n; x++) {
    for(i = 1; i <= n; i++) {
        j = i;
        j++;
    }
}

If you nest a loop of O(n) code inside another O(n) loop, the result is O(n²).

Cubic order O(n³) and k-th power order O(n^k) can be understood by analogy with O(n²): O(n³) is equivalent to three nested loops over n, and so on.
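As a concrete sketch of the three-nested-loop case (the class name and the counter are mine, added only to make the count visible), the innermost statement below runs exactly n³ times:

```java
public class CubicDemo {
    // Three nested loops over n: the innermost statement runs n * n * n times.
    static long countOps(int n) {
        long count = 0;
        for (int x = 1; x <= n; x++) {
            for (int y = 1; y <= n; y++) {
                for (int z = 1; z <= n; z++) {
                    count++;  // O(1) work, executed n^3 times in total
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countOps(10));  // 1000 = 10^3
    }
}
```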

Third, space complexity calculation

Space complexity O(1)

If the temporary space required by an algorithm does not change with the size of the variable n, that is, its space complexity is a constant, it can be expressed as O(1).

int i = 1;
int j = 2;
++i;
j++;
int m = i + j;

The space allocated for i, j and m in this code does not change with the amount of data processed, so its space complexity is S(n) = O(1).

Space complexity O(n)

int[] m = new int[n];
for(i = 1; i <= n; ++i) {
   j = i;
   j++;
}

In this code, the first line allocates a new array of size n; although there is a loop, it allocates no additional space, so the space complexity of this code depends mainly on the first line, that is, S(n) = O(n).

Fourth, summary

The efficiency of an algorithm is mainly evaluated by its time complexity and space complexity. Some developers may not deal with this kind of optimization much (especially on the client side), but it is widely relevant on the server side: under massive concurrency, even a small improvement in time or space complexity can bring a huge performance gain, so it is well worth understanding.

If there are any mistakes or omissions, corrections are welcome~

Note: this article draws on several other articles; if there is any infringement, contact me and I will delete it immediately.