Preface

Algorithms matter, but day-to-day mobile development rarely exercises them, so many developers have long since handed their algorithm knowledge back to their teachers, and many never learned algorithms properly in the first place. This series is meant to help readers who find algorithms painful to quickly grasp the basics. Over the Chinese New Year holiday I played the career mode of NBA 2K17. On days without matches there is still training, and the training is voluntary: nobody forces you, you grind from morning to night, and the attribute points barely move. But accumulated over a long stretch of time, a huge gap opens up between the player who trains and the one who does not. That suddenly made me see something about the real world: becoming a basketball star takes years of deliberate practice. So I put down the game and started writing articles.

1. Efficiency of the algorithm

Although a computer can process data quickly, in practice it still consumes processor time and memory in proportion to the size of the input and the efficiency of the algorithm. To write programs that run efficiently, we need to consider the efficiency of the algorithm, which is mainly evaluated with two kinds of complexity:

Time complexity: estimates the time required to execute the program, i.e., roughly how much processor time your program uses.

Space complexity: estimates the storage required to execute the program, i.e., roughly how much memory your program uses.

When designing an algorithm, you generally consider the target environment first, and then trade time complexity off against space complexity to find a balance. In practice, time complexity causes problems far more often than space complexity, so algorithm analysis focuses mainly on time complexity; unless otherwise noted, "complexity" below means time complexity.

2. Time complexity

Time frequency

The time an algorithm takes to execute cannot be calculated theoretically; it can only be measured by running tests on a computer. But we cannot, and do not need to, test every algorithm on a machine; we only need to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of statements it executes: whichever algorithm executes more statements takes more time. The number of statements executed by an algorithm is called its statement frequency or time frequency, denoted T(n).

Time complexity

In the time frequency T(n) above, n is called the problem size, and as n changes, T(n) changes with it. But sometimes we want to know how it changes, and for that we introduce the concept of time complexity. In general, the number of times the basic operations of an algorithm are repeated is a function of the problem size n, written T(n). If there is some auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n). We write T(n) = O(f(n)) and call O(f(n)) the asymptotic time complexity of the algorithm, or time complexity for short.
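In symbols, this definition reads (a compact LaTeX restatement of the sentence above, nothing new):

  \lim_{n \to \infty} \frac{T(n)}{f(n)} = c \neq 0 \quad \Rightarrow \quad T(n) = O(f(n))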

3. Big O notation

We used the notation O() above to describe the time complexity of an algorithm; this is called big O notation. An algorithm's complexity can be evaluated from three perspectives: the best case, the average case and the worst case. Since the average case is usually about the same as the worst case, and estimating the worst case gives a guarantee with no surprises, we generally estimate the worst-case complexity directly when designing an algorithm. The f(n) in O(f(n)) can be 1, n, logn, n² and so on, and we call O(1), O(n), O(logn) and O(n²) constant order, linear order, logarithmic order and square order respectively. So how do we derive f(n)? Read on for the method of deriving the big O order.

Deriving the big O order

To derive the big O order, we can follow the rules below; the result is the big O notation:

1. Replace all additive constants in the running time with the constant 1.

2. Keep only the highest-order term in the modified running-time function.

3. If the highest-order term exists and its coefficient is not 1, remove the coefficient.

Constant order

Constant order is illustrated by the example below.

  int sum = 0, n = 100;     // executes once
  sum = (1 + n) * n / 2;    // executes once
  System.out.println(sum);  // executes once

The operation-count function of the above algorithm is f(n) = 3. According to rule 1 for deriving the big O order, we replace the constant 3 with 1, so the time complexity of this algorithm is O(1). Even if sum = (1+n)*n/2 were executed 10 more times, the count would still not depend on the problem size n, so the time complexity would still be O(1), which we call constant order.
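To make that concrete, here is a minimal sketch (the fixed 10-fold repetition is my own illustration, not from the original): adding a constant number of repetitions changes the statement count, but the count still does not depend on n, so it still reduces to O(1).

  int sum = 0, n = 100;
  // Repeating the computation a fixed 10 times adds a constant amount of
  // work: the statement count still does not depend on n, so it is O(1).
  for (int k = 0; k < 10; k++) {
      sum = (1 + n) * n / 2;
  }
  System.out.println(sum);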

Linear order

Linear order mainly comes from analyzing loop structures, as shown below.

  for (int i = 0; i < n; i++) {
      // an O(1) algorithm
  }

The body of the loop above executes n times, so the time complexity is O(n).
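As a self-contained illustration (the array-summing task is my own example, not from the original): one pass over n elements does constant work per element, so the whole method is O(n).

  // Sums an array in a single pass: the loop body does O(1) work and
  // runs once per element, so for n elements the method is O(n).
  static int sum(int[] values) {
      int total = 0;
      for (int i = 0; i < values.length; i++) {
          total += values[i];
      }
      return total;
  }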

Logarithmic order

Now look at the following code:

  int number = 1;
  while (number < n) {
      number = number * 2;  // an O(1) algorithm
  }

As you can see, number gets closer to n each time it is multiplied by 2, and the loop exits once number is at least n. If the number of iterations is x, then from 2^x = n we get x = log₂n, so the time complexity of this algorithm is O(logn).
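To see the log₂n behaviour directly, here is a small sketch (the step counter and the sample values are my own illustration):

  // Counts how many doublings are needed to reach n;
  // the count grows like log2(n).
  static int doublings(int n) {
      int number = 1;
      int steps = 0;
      while (number < n) {
          number = number * 2;
          steps++;
      }
      return steps;
  }

  // doublings(8) returns 3, doublings(1024) returns 10,
  // doublings(1000000) returns 20 — the count tracks log2(n).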

Square order

The code below is a nested loop:

  for (int i = 0; i < n; i++) {
      for (int j = 0; j < n; j++) {
          // an O(1) algorithm
      }
  }

As we saw with linear order, the inner loop has time complexity O(n). Nested inside an outer loop that runs it n times, the algorithm's time complexity is O(n²). Now let's calculate the time complexity of the following algorithm:

  for (int i = 0; i < n; i++) {
      for (int j = i; j < n; j++) {
          // an O(1) algorithm
      }
  }

Note that the inner loop starts at int j = i, not int j = 0. When i = 0, the inner loop executes n times; when i = 1, it executes n − 1 times; and when i = n − 1, it executes once. We can calculate the total number of executions as follows:

n + (n − 1) + (n − 2) + (n − 3) + … + 1
= [n + 1] + [(n − 1) + 2] + [(n − 2) + 3] + [(n − 3) + 4] + …
= (n + 1) + (n + 1) + (n + 1) + (n + 1) + …  (n/2 terms in total)
= (n + 1) × n/2
= n²/2 + n/2

According to rule 2 for deriving the big O order, we keep only the highest-order term, n²/2. According to rule 3, we remove its coefficient 1/2, so the final time complexity of this code is O(n²).
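You can verify the n(n+1)/2 count empirically with a sketch like the following (the counting method is my own, not from the original):

  // Counts the executions of the inner body in the j = i nested loop;
  // the result equals n * (n + 1) / 2.
  static long triangularCount(int n) {
      long count = 0;
      for (int i = 0; i < n; i++) {
          for (int j = i; j < n; j++) {
              count++;  // stands in for the O(1) body
          }
      }
      return count;
  }

  // triangularCount(100) returns 5050, which is 100 * 101 / 2.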

Other common complexities

In addition to constant order, linear order, square order and logarithmic order, there are also the following time complexities: when f(n) = nlogn, the time complexity is O(nlogn), called nlogn order. When f(n) = n³, the time complexity is O(n³), called cubic order. When f(n) = 2ⁿ, the time complexity is O(2ⁿ), called exponential order. When f(n) = n!, the time complexity is O(n!), called factorial order. When f(n) = √n, the time complexity is O(√n), called square root order.
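As one concrete example of exponential order (the Fibonacci example is my own, not from the original): naive recursive Fibonacci branches into two calls at every step, so its call count grows exponentially with n (the exact base is about 1.618, but it is commonly bounded as O(2ⁿ)).

  // Naive recursive Fibonacci: each call spawns two more calls, so the
  // total number of calls grows exponentially with n.
  static long fib(int n) {
      if (n <= 1) {
          return n;
      }
      return fib(n - 1) + fib(n - 2);
  }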

4. Complexity comparison

The following table lists the values of f(n) for several typical orders of magnitude commonly seen in algorithms. From it you can see how sharply the complexities of different algorithms diverge.

  n     logn  √n   nlogn  n²        2ⁿ             n!
  5     2     2    10     25        32             120
  10    3     3    30     100       1024           3628800
  50    5     7    250    2500      about 10^15    about 3.0×10^64
  100   6     10   600    10000     about 10^30    about 9.3×10^157
  1000  9     31   9000   1000000   about 10^300   about 4.0×10^2567
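If you want to reproduce the table yourself (the class name and output format are my own sketch), BigInteger can handle the enormous 2ⁿ and n! values, reported here by their digit counts:

  import java.math.BigInteger;

  // Prints the growth columns from the table above for each n;
  // 2^n and n! are summarized by how many decimal digits they have.
  public class GrowthTable {
      public static void main(String[] args) {
          int[] ns = {5, 10, 50, 100, 1000};
          for (int n : ns) {
              int log = (int) (Math.log(n) / Math.log(2));
              BigInteger powerOfTwo = BigInteger.valueOf(2).pow(n);
              BigInteger factorial = factorial(n);
              System.out.printf(
                      "n=%4d  logn=%2d  sqrt=%2d  nlogn=%5d  n^2=%7d  2^n: %d digits  n!: %d digits%n",
                      n, log, (int) Math.sqrt(n), n * log, n * n,
                      powerOfTwo.toString().length(),
                      factorial.toString().length());
          }
      }

      static BigInteger factorial(int n) {
          BigInteger result = BigInteger.ONE;
          for (int i = 2; i <= n; i++) {
              result = result.multiply(BigInteger.valueOf(i));
          }
          return result;
      }
  }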

The complexities O(n), O(logn), O(√n) and O(nlogn) grow slowly as n increases, so algorithms with these complexities are efficient. By contrast, for O(2ⁿ) and O(n!), by the time n reaches just 50 the operation count has already exploded to astronomical figures. Such inefficient complexities are best avoided in a program, so estimate the worst-case complexity of the algorithms you write before you start coding. Here's a more intuitive picture:

The x axis represents the value of n and the y axis represents T(n) (the time complexity). T(n) changes with n, and you can see that for O(n!) and O(2ⁿ) the value of T(n) shoots up as n increases, while for O(logn), O(n) and O(nlogn) it rises only gently. The common time complexities, in ascending order of time consumed, are:

O(1) < O(logn) < O(n) < O(nlogn) < O(n²) < O(n³) < O(2ⁿ) < O(n!)

References: Big Talk Data Structure, Challenge Programming Contest 2, Algorithms.

