1. Overview

1. Sorting, also known as a sort algorithm, is the process of arranging a group of data in a specified order.
2. Sorting classification:
1) Internal sorting:

All the data to be processed is loaded into internal memory and sorted there.

2) External sorting:

The amount of data is too large to load into memory all at once, so external storage must be used during sorting.
3) Classification of common algorithms:

Internal sorts are commonly divided into insertion sorts (straight insertion sort, Shell sort), selection sorts (simple selection sort, heap sort), exchange sorts (bubble sort, quicksort), merge sort, and radix sort; external sorting is used when the data set exceeds memory.

3. Time complexity of the ten common sorting algorithms

[The original table, an image, listed each algorithm's average, worst, and best time complexity, its space complexity, and whether it is stable.]

4. Algorithm time complexity

There are two ways to measure the execution time of a program (algorithm):

1) Post-hoc statistics. This method is feasible, but it has two problems:

First, to evaluate the performance of the designed algorithm, the program must actually be run.

Second, the measured time depends on the computer's hardware, software, and other environmental factors; to conclude that one algorithm is faster, both must be run on the same computer in the same state.

2) Prior estimation

The time complexity of each algorithm is analyzed in advance to determine which algorithm is better.

Time frequency

1) Basic introduction

Time frequency: the time an algorithm takes is directly proportional to the number of statements it executes; an algorithm that executes more statements takes more time. The number of statements executed in an algorithm is called its statement frequency or time frequency, denoted T(n).

2) Example – ignoring the constant term

  

    

Conclusion: as n increases, the execution curves of 2n+20 and 2n become infinitely close, so the constant 20 can be ignored; likewise, 3n+10 and 3n become infinitely close, so the 10 can be ignored.

3) Example – ignoring low-order terms

  

  

Conclusion: as n gets larger, the execution curves of 2n^2+3n+10 and 2n^2 become infinitely close, so 3n+10 can be ignored; likewise, n^2+5n+20 and n^2 become infinitely close, so 5n+20 can be ignored.

4) Example – ignoring coefficients

  

  

Conclusion: as n increases, the execution curves of 5n^2+7n and 3n^2+2n converge, which shows that the coefficients 5 and 3 can be ignored. The curves of n^3+5n and 6n^3+4n, however, separate, which shows that the order (the power of n) is what matters.
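These three conclusions can also be checked numerically. The original illustrated them with graphs; as a stand-in, here is a small, hypothetical Java sketch that prints the relevant ratios. The first two ratios approach 1 (the discarded terms stop mattering), and the third approaches the constant 5/3 (the two functions grow at the same order):

```java
public class GrowthDemo {
    public static void main(String[] args) {
        for (long n = 10; n <= 1_000_000; n *= 10) {
            // -> 1: the constant +20 fades as n grows
            double constant = (2.0 * n + 20) / (2.0 * n);
            // -> 1: the low-order term 3n+10 fades
            double lowOrder = (2.0 * n * n + 3 * n + 10) / (2.0 * n * n);
            // -> 5/3, a nonzero constant: same order of magnitude
            double coeff = (5.0 * n * n + 7 * n) / (3.0 * n * n + 2 * n);
            System.out.printf("n=%8d  %.5f  %.5f  %.5f%n", n, constant, lowOrder, coeff);
        }
    }
}
```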

Time complexity

1) In general, the number of times the basic operation statement in an algorithm is repeated is a function of the problem size n, denoted T(n). If there is some auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a constant not equal to zero, then f(n) is called a function of the same order of magnitude as T(n), denoted T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.

2) Different T(n) may have the same time complexity. For example, T(n) = n²+7n+6 and T(n) = 3n²+2n+2 are different, but their time complexity is the same: both are O(n²).

3) Method for calculating time complexity:

  • Replace all additive constants in the running time with the constant 1: T(n) = n²+7n+6 => T(n) = n²+7n+1
  • In the modified running-time function, keep only the highest-order term: T(n) = n²+7n+1 => T(n) = n²
  • Remove the coefficient of the highest-order term: T(n) = n² => T(n) = n² => O(n²)

Common time complexity

  1. Constant order O(1)
  2. Logarithmic order O(log2n)
  3. Linear order O(n)
  4. Linear-logarithmic order O(nlog2n)
  5. Square order O(n^2)
  6. Cubic order O(n^3)
  7. k-th power order O(n^k)
  8. Exponential order O(2^n)

  

Description:

The time complexity of common algorithms, from smallest to largest: O(1) < O(log2n) < O(n) < O(nlog2n) < O(n^2) < O(n^3) < O(n^k) < O(2^n). As the problem size n grows, the time complexity grows and the algorithm's execution efficiency falls; therefore, exponential-order algorithms should be avoided.

1) Constant order O(1)

No matter how many lines of code execute, as long as there are no loops or other structures whose repetitions grow with the input, the code's time complexity is O(1).

  
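The snippet shown here in the original was an image; a minimal Java sketch of the kind of code it described:

```java
static void constantTime() {
    // Each statement runs exactly once, regardless of any input size: O(1).
    int i = 1;
    int j = 2;
    ++i;
    j++;
    int m = i + j;
}
```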

When such code executes, the time it consumes does not grow with any variable, so no matter how long the code is, even tens of thousands of lines, its time complexity can be expressed as O(1).

2) Logarithmic order O(log2n)

  
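A minimal Java sketch of the loop this passage analyzes (the original code was an image; n is assumed to be the problem size):

```java
static void logarithmic(int n) {
    int i = 1;
    // i doubles each pass, so the loop runs about log2(n) times: O(log2 n).
    while (i < n) {
        i = i * 2;
    }
}
```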

Description:

In the while loop, i is multiplied by 2 on each iteration, so i gets closer and closer to n. Suppose the loop exits after x iterations; then 2^x = n, that is, x = log2(n). So after x = log2(n) iterations this code finishes, and its time complexity is O(log2n). The base 2 comes from the code itself: if the statement were i = i * 3, the time complexity would be O(log3n), as in the sketch that follows.

  
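A matching sketch for base 3 (assumed, by analogy with the doubling loop above):

```java
static void logarithmicBase3(int n) {
    int i = 1;
    // i triples each pass, so the loop runs about log3(n) times: O(log3 n).
    while (i < n) {
        i = i * 3;
    }
}
```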

3) Linear order O(n)

  
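A minimal Java sketch of the loop described below (the original code was an image):

```java
static void linear(int n) {
    int j = 0;
    // The loop body executes n times, so the running time grows linearly: O(n).
    for (int i = 1; i <= n; i++) {
        j = i;
        j++;
    }
}
```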

Description:

In this code, the body of the for loop executes n times, so the time it consumes varies with n; this kind of code can be expressed as O(n) time complexity.

4) Linear-logarithmic order O(nlog2n)

  
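A minimal Java sketch of the pattern described below (an assumed reconstruction; the original code was an image):

```java
static void linearithmic(int n) {
    // The O(log n) doubling loop is repeated n times in total: O(n log n).
    for (int m = 1; m <= n; m++) {
        int i = 1;
        while (i < n) {
            i = i * 2;
        }
    }
}
```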

Description:

Linear-logarithmic order O(nlogn) is easy to understand: if code of complexity O(logn) is looped n times, the time complexity is n × O(logn), that is, O(nlogn).

5) Square order O(n^2)

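A minimal Java sketch of the nested loops described below (an assumed reconstruction; the original code was an image):

```java
static void quadratic(int n) {
    int count = 0;
    // An O(n) loop nested inside another O(n) loop: n * n iterations, so O(n^2).
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            count++;
        }
    }
}
```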
Description:

Square order O(n^2) is also easy to understand: nesting one O(n) loop inside another gives O(n × n) = O(n^2). If the bound of one of the two loops is changed from n to m, the complexity becomes O(m × n).

6) Cubic order O(n^3) and k-th power order O(n^k)

Description:

By analogy with O(n^2) above, O(n^3) corresponds to three nested n-loops, and so on for higher powers; a sketch follows.
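A minimal sketch of the cubic case, by analogy with the quadratic one:

```java
static void cubic(int n) {
    int count = 0;
    // Three nested n-loops: the innermost body runs n * n * n times, so O(n^3).
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            for (int k = 0; k < n; k++) {
                count++;
            }
        }
    }
}
```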

5. Space complexity

Basic introduction

1. Similar to the discussion of time complexity, the space complexity of an algorithm is defined as the amount of storage the algorithm consumes as a function of the problem size n.

2. Space complexity measures the amount of storage space an algorithm temporarily occupies while running. For some algorithms, the number of temporary work units needed is related to the problem size n and grows as n grows; when n is large, they occupy more storage units, as with quicksort and merge sort (a sketch follows the list).

3. From the perspective of user experience, the speed of program execution matters more. Some caching products (Redis, memcache) and algorithms (radix sort) essentially trade space for time.
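A hypothetical Java sketch contrasting the two cases (the function names are illustrative, not from the original):

```java
// O(1) extra space: the swaps happen inside the input array (in-place).
static void reverseInPlace(int[] a) {
    for (int i = 0, j = a.length - 1; i < j; i++, j--) {
        int t = a[i];
        a[i] = a[j];
        a[j] = t;
    }
}

// O(n) extra space: an auxiliary array proportional to the input is allocated,
// the same kind of space-for-time trade-off merge sort makes.
static int[] reverseCopy(int[] a) {
    int[] out = new int[a.length];
    for (int i = 0; i < a.length; i++) {
        out[i] = a[a.length - 1 - i];
    }
    return out;
}
```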

6. Related terms

1) Stability: if a is originally in front of b and a == b, a is still in front of b after sorting.
2) Instability: if a is originally in front of b and a == b, a may end up behind b after sorting (see the demo after this list).
3) Internal sort: all sorting operations are completed in memory.
4) External sort: the data is too large to fit in memory, so sorting proceeds via data transfer between disk and memory.
5) Time complexity: the time an algorithm's execution takes, as a function of data size.
6) Space complexity: the amount of memory required to run a program.
7) n: data size.
8) k: the number of "buckets".
9) In-place: occupies no extra memory.
10) Out-place: occupies extra memory.
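To make 1) and 2) concrete, a small Java demo (hypothetical, not from the original). Java's sort for object arrays is a stable merge sort (TimSort), so records with equal keys keep their original relative order:

```java
import java.util.Arrays;

public class StabilityDemo {
    public static void main(String[] args) {
        // Each record is {key, originalPosition}; two records share the key 5.
        int[][] records = { {5, 0}, {3, 1}, {5, 2} };
        // Sorting by key with a stable sort keeps {5, 0} ahead of {5, 2}.
        Arrays.sort(records, (a, b) -> Integer.compare(a[0], b[0]));
        System.out.println(Arrays.deepToString(records)); // [[3, 1], [5, 0], [5, 2]]
    }
}
```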