What is time complexity

  1. It is an index for evaluating an algorithm
  2. It is an indicator of the time consumed by executing the algorithm
  3. It describes how fast the number of executions grows as the input grows

What’s the use of time complexity

  1. When judging the merits of an algorithm, we generally look at whether the result is correct and whether the basic statements are executed as few times as possible. Time complexity serves as an evaluation indicator of an algorithm’s execution time; that is to say, the lower the complexity, the better the algorithm

How to calculate

Time complexity definition: n is called the scale of the problem. As n changes, the time frequency T(n), that is, the number of statements executed in the algorithm, also changes. If there is some auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be a function of the same order of magnitude as T(n). This is denoted T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short
In simple terms, the larger the input data size n is, the more times the basic statements of the program will be executed. For example, suppose an algorithm’s basic statement is executed n^2 + 2n + 9 times for input n. Keeping only the highest-order term in n, the time complexity of the algorithm is O(n^2)
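As a minimal sketch (using the hypothetical step count T(n) = n^2 + 2n + 9 for illustration), the ratio T(n)/n^2 tends to a nonzero constant as n grows, which is exactly the definition above and why only the highest-order term matters:

```javascript
// Hypothetical step count for illustration: T(n) = n^2 + 2n + 9.
function stepCount(n) {
  return n * n + 2 * n + 9;
}

// T(n) / n^2 tends to a nonzero constant (here, 1) as n grows,
// so T(n) = O(n^2) by the definition above.
function ratio(n) {
  return stepCount(n) / (n * n);
}
```

For n = 1,000,000 the ratio is already within 0.001 of 1, so the 2n + 9 part contributes nothing to the order of growth.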

Still not clear? Think of a tree: the number of executions of the basic statements counts every branch of the tree, while the time complexity keeps only the trunk, the dominant term that determines how the whole tree grows.

The order of time complexity

O(1) < O(log2n) < O(n) < O(nlog2n) < O(n^2) < O(n^3) < … < O(2^n) < O(n!)
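To make the ordering concrete, here is a small sketch that evaluates each order of growth at the same input size (the `orders` table and the choice of n = 16 are illustrative, not from the original text):

```javascript
// Rough value of each order of growth at a fixed input size,
// listed from slowest-growing to fastest-growing.
const orders = [
  ["O(1)",        n => 1],
  ["O(log2n)",    n => Math.log2(n)],
  ["O(n)",        n => n],
  ["O(nlog2n)",   n => n * Math.log2(n)],
  ["O(n^2)",      n => n * n],
  ["O(n^3)",      n => n * n * n],
  ["O(2^n)",      n => 2 ** n],
];

for (const [name, f] of orders) {
  console.log(name, f(16)); // 1, 4, 16, 64, 256, 4096, 65536
}
```

Even at n = 16 the gap between O(1) and O(2^n) is already four orders of magnitude, and it widens rapidly as n grows.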

What are the common time complexities

1. O(1)

temp = i; i = j; j = temp;

The frequency of each of the three statements above is 1, and the execution time of this program segment is a constant independent of the problem size n. The time complexity of the algorithm is of constant order, denoted T(n) = O(1).
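Wrapped in a function (a minimal sketch; the `swap` helper is hypothetical), the same three assignments run regardless of how large the surrounding data is:

```javascript
// Swapping two elements takes a fixed number of steps, O(1),
// no matter how long the array is.
function swap(arr, i, j) {
  const temp = arr[i];
  arr[i] = arr[j];
  arr[j] = temp;
}
```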

2. O(log2n), which is the same order as O(logn)

let i = 1;
while (i <= n)
    i = i * 2; // statement 1

Let the frequency of statement 1 be f(n). Then 2^f(n) <= n, so f(n) <= log2n. Taking the maximum, f(n) = log2n, and T(n) = O(log2n)
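The derivation above can be checked by counting the loop iterations directly (a sketch; the `doublingSteps` counter is added for illustration):

```javascript
// Counts how many times the body of the doubling loop runs for a given n.
// Since i doubles each pass, the body runs floor(log2(n)) + 1 times: O(log2n).
function doublingSteps(n) {
  let count = 0;
  let i = 1;
  while (i <= n) {
    i = i * 2; // statement 1
    count++;
  }
  return count;
}
```

For n = 8 the loop body runs 4 times (i = 1, 2, 4, 8), matching floor(log2(8)) + 1.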

3. O(n)

for (let i = 1; i <= n; i++) {
    console.log(i); // statement 1
}

T(n)=2+n+3(n-1)=4n-1=O(n)
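Counting the body executions directly confirms the linear growth (a sketch; the `linearSteps` counter is added for illustration):

```javascript
// Counts how many times the loop body runs: exactly n times, so O(n).
function linearSteps(n) {
  let count = 0;
  for (let i = 1; i <= n; i++) {
    count++; // stands in for the console.log(i) above
  }
  return count;
}
```

Doubling n doubles the count, which is the defining behaviour of a linear-time loop.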

4. O((1/2)n^2), which is the same thing as O(n^2)

for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= n / 2; j++) {
        console.log(j); // statement 1
    }
}

Because statement 1 executes n * (n/2) = (1/2)n^2 times, and Θ((1/2)n^2) = Θ(n^2) (that is: drop the lower-order terms, drop the constant term, and drop the constant coefficient of the highest-order term), T(n) = O(n^2);
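Counting the inner-body executions verifies the n * (n/2) figure (a sketch; the `quadraticSteps` counter is added for illustration):

```javascript
// Counts how many times the innermost body runs: n * floor-ish (n/2) times,
// i.e. (1/2)n^2 for even n, which is O(n^2) after dropping the constant factor.
function quadraticSteps(n) {
  let count = 0;
  for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= n / 2; j++) {
      count++; // stands in for the console.log(j) above
    }
  }
  return count;
}
```

Doubling n quadruples the count, which is how you recognise quadratic growth empirically.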

5. O(n^3)

for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= n; j++) {
        for (let k = 1; k <= n; k++) {
            console.log(k); // statement 1
        }
    }
}

Because statement 1 executes n * n * n = n^3 times, T(n) = O(n^3);

Little homework

Calculate the time complexity of common front-end algorithms, such as bubble sort and quick sort
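As a starting point for the homework, here is a minimal bubble sort sketch (one common way to write it, not the only one). Its nested loops perform roughly n * (n - 1) / 2 comparisons, so by the rules above its time complexity is O(n^2):

```javascript
// Bubble sort: repeatedly swaps adjacent out-of-order elements.
// The outer loop runs n - 1 times and the inner loop shrinks each pass,
// giving about n * (n - 1) / 2 comparisons in total: O(n^2).
function bubbleSort(arr) {
  const a = arr.slice(); // work on a copy; don't mutate the input
  for (let i = 0; i < a.length - 1; i++) {
    for (let j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        const temp = a[j];
        a[j] = a[j + 1];
        a[j + 1] = temp;
      }
    }
  }
  return a;
}
```

Try the same exercise on quick sort: on average its recursion splits the input in half, giving O(nlog2n), but a worst-case split degrades it to O(n^2).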