Time complexity

Time complexity is one of the most important criteria for evaluating an algorithm: it measures the algorithm along the time dimension. This article walks through the derivation of time complexity step by step, to help readers who are new to the topic quickly get comfortable with evaluating it.

How to derive the corresponding time complexity

Before the derivations, here are a few pieces of code together with their time complexities; afterwards I'll show you how each one is evaluated.

O(logn)

function test(n) {
  let i = 1
  while (i < n) {
    i *= 2
  }
}

O(nlogn)

function test(n) {
  for (let i = 1; i < n; i += i) {
    for (let j = 0; j < n; j++) {
      console.log('north song')
    }
  }
}

O(n^2)

function test(n) {
  let sum = 0
  let i = 1
  let j = 1

  for (; i <= n; ++i) {
    j = 1
    for (; j <= n; ++j) {
      sum = sum + i * j
    }
  }
}

Below we will derive the complexity of each of these step by step.

Time complexity in the algorithm

Time complexity determines the quality of the code (algorithm) at hand. Algorithms used in industry generally do not exceed O(n^2); anything above that is considered poor, because once the data scale is large enough such an algorithm becomes extremely time-consuming. The best is O(1); as for the worst — there is no worst, only worse.

Parts that can be ignored when evaluating the time complexity of code (an algorithm)

  • Coefficients
  • Bases
  • Low-order terms

It doesn't matter if these terms mean nothing to you yet; they are explained later. Just remember that when evaluating the time complexity of an algorithm, we ignore the parts that are not decisive factors in its growth.

Big O notation

T(n) = O(f(n))
  • T(n): the time the code takes to execute
  • n: the data size
  • f(n): the total number of times each line of code executes (i.e. the time complexity of the current algorithm)

This formula says that the code execution time T(n) is proportional to the total execution count f(n); the O expresses that proportional relationship.

Let's make this concrete by substituting values for n, the data size:

n = 1000
T(1000) = O(f(1000))

n = 100
T(100) = O(f(100))

When the data size is 1000, the execution time corresponds to the code executing 1000 times; the same goes for 100. So the relationship is proportional: the larger the data, the more times the code executes. If the time complexity is higher, the execution time must be higher too, growing or shrinking with the data size. That is why the full term for time complexity is "asymptotic time complexity".

Take a look at the following two pieces of code

function f1(n) {
  while (n--) {
    console.log('north song')
  }
}

function f2(n) {
  let i = 1
  while (i < n) {
    i *= 2
  }
}

f1(1000)
f2(1000)

Which is faster? Even without understanding the derivation of time complexity, most people can tell that f2 is faster — and the larger the data volume, the more obvious the gap between the two becomes.

O(1) “Constant order”

function f() {
  let i = 1 // 1(unit-time)
  let j = 1 // 1(unit-time)
  console.log('north song') // 1(unit-time)
}
f(100)
T(100) = O(f(100))

Does the size of the data here affect the time complexity of the current code?

The answer is no: whatever the size of the input, there is no loop in the code, so it always costs 3 (unit-time)

When evaluating time complexity, every assignment, output, or branch statement counts as 1 (unit-time). So is the time complexity of this code O(3)?

Obviously we never see such a time complexity written down. When evaluating time complexity we ignore the parts that have no decisive influence on growth, so O(3) is just O(1) — even if there were 1000 output statements.
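To make the counting concrete, here is a small sketch of my own (not from the original code) that tallies the unit-times by hand — the tally never depends on the input size:

```javascript
// Count the "unit-time" cost of the O(1) function by hand.
// Each statement costs 1, and none of them depend on n.
function countUnitTime(n) {
  let count = 0
  count++ // let i = 1        -> 1 unit-time
  count++ // let j = 1        -> 1 unit-time
  count++ // console.log(...) -> 1 unit-time
  return count
}

console.log(countUnitTime(100))      // 3
console.log(countUnitTime(10000000)) // 3 — same cost, so O(3) => O(1)
```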

Does having a loop mean it can no longer be O(1)?

function f() {
  for (let i = 0; i < 100; i++) {
    console.log('Front Self-learning Station')
  }
}
f(100)
f(10000000)
Loop statement:
  let i = 0  // 1
  i < 100    // 100
  i++        // 100
Loop body:
  console.log('Front Self-learning Station') // 100

There is a loop in this code, but it’s still O(1). Why?

The loop here runs a fixed 100 times regardless of the input, which means the data size has no effect on how the execution time grows.

O(logn)

A reminder from high-school math: this is the logarithm function.

function test(n) {
  let a = 1 // 1 unit-time
  while (a < n) {
    // How many times does the loop body execute?
    /* The loop exits once a is no longer less than n. Suppose that
       happens after x iterations:
       x = 1 => a = 1 * 2 => 2^1
       x = 2 => a = 2 * 2 => 2^2
       x = 3 => a = 4 * 2 => 2^3
       So 2^x = n, i.e. x = log2(n). As mentioned before, coefficients
       and low-order terms can be ignored, so the result is O(logn). */
    a *= 2 // a *= 2 => a = a * 2
  }
}

Time complexity derivation: 1 + log2(n)

In mathematics, writing log n usually implies base 10, and the base is omitted. In complexity analysis you can omit the base whatever it is, because changing the base only multiplies by a constant factor (log2(n) = log10(n) / log10(2)), which has no decisive effect on the growth trend.

After ignoring the base: 1 + logn

Ignore the low-order terms and keep only the highest-order term, which decides the growth trend

After ignoring low-order terms: logn

Substituting values to verify the impact of data size on time complexity

This is the formula we derived above: 1 + log2(n)

T(1000) = O(f(1 + log2(1000)))
T(1000000) = O(f(1 + log2(1000000)))

The larger the data size, the more time the algorithm takes. In this formula, the 1 is a low-order term that does not grow with the data size; and in log2(1000) what matters is the 1000, not the base 2 — which is why the base can be ignored.
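As a sanity check (my own sketch, not in the original), we can count the actual iterations of the doubling loop and compare them with log2(n):

```javascript
// Count how many times the doubling loop really runs.
function countDoublings(n) {
  let count = 0
  let a = 1
  while (a < n) {
    a *= 2
    count++
  }
  return count
}

console.log(countDoublings(1000))    // 10, and log2(1000) ≈ 9.97
console.log(countDoublings(1000000)) // 20, and log2(1000000) ≈ 19.93
```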

So what is the time complexity of this code?

function test(n) {
  let i = 1 // 1
  while (i < n) {
    // i += i => i = i + i => i = i * 2
    i += i
  }
}
i += i is the same as i = i + i, i.e. i = i * 2 — the same situation as above. Suppose that after x iterations i is no longer less than n; then 2^x = n and x = log2(n). This is also O(logn).
function test(n) {
  let i = 1 // 1
  while (i < n) {
    i *= 3
  }
}

i = i * 3: suppose that after x iterations i is no longer less than n; then 3^x = n and x = log3(n). This is also O(logn).

O(n) “linear order”

function test(n) {
  /* i = 0 => 1
     i < n => n
     i++   => n
     Outer loop header: 1 + n + n => 1 + 2n */
  for (let i = 0; i < n; i++) {

    /* The loop body runs until the loop exits — n times in total —
       and each iteration takes exactly one of the two branches,
       so the body contributes n */
    if (n % 2 === 0) { // n
      console.log('north song')
    } else { // n
      console.log('Front Self-learning Station')
    }
  }
}

1 + 2n (loop header) + n (loop body), result: 1 + 3n
After ignoring coefficients: 1 + n
After ignoring bases: no base here
After ignoring low-order terms: n
Final result: n
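The linear growth is easy to confirm empirically; this sketch of my own counts how many times the loop body runs:

```javascript
// Count the loop-body executions of the O(n) example.
function countLinear(n) {
  let count = 0
  for (let i = 0; i < n; i++) {
    count++ // exactly one branch runs per iteration
  }
  return count
}

console.log(countLinear(100))  // 100
console.log(countLinear(1000)) // 1000 — grows in lockstep with n
```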

Linear order is probably the easiest to derive — nothing more to say 😗.

O(nlogn) “linear logarithmic order”

function test(n) {
  /* i = 1  => 1
     i < n  => log2(n)
     i += i => log2(n)
     Outer loop header: 1 + 2 * log2(n) */
  // i += i => i = i + i => i = i * 2, so the outer loop runs log2(n) times
  for (let i = 1; i < n; i += i) {
    // The outer loop body executes log2(n) times

    /* j = 0 => 1
       j < n => n
       j++   => n
       Inner loop header: 1 + 2n */
    for (let j = 0; j < n; j++) {
      // The inner loop body executes n times
      console.log('north song')
    }
  }
}

It might be a little hard to follow the first time around, so I have broken the derivation into pieces to help you understand it

1 + 2 * log2(n) (outer loop header) + log2(n) (outer loop body) * [1 + 2n (inner loop header) + n (inner loop body)]
=> 1 + 2 * log2(n) + log2(n) * (1 + 3n)
Why multiply by (1 + 3n)? Because the outer loop executes log2(n) times, and the inner loop runs in full on each of those passes.
After ignoring coefficients: 1 + log2(n) + log2(n) * n
After ignoring bases: logn + logn * n
After ignoring low-order terms: logn * n
Final result: nlogn
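To check the log2(n) * n claim, here is a counting sketch of my own (note that i must start at 1, since starting at 0 would make i += i loop forever):

```javascript
// Count the inner-body executions of the O(nlogn) example.
function countNLogN(n) {
  let count = 0
  for (let i = 1; i < n; i += i) { // outer loop: i = 1, 2, 4, ... => log2(n) passes
    for (let j = 0; j < n; j++) {  // inner loop: n passes each time
      count++
    }
  }
  return count
}

console.log(countNLogN(1024)) // 10240 = 10 * 1024, i.e. log2(1024) * 1024
```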

O(n^2) square order

function test(n) {
  let sum = 0 // 1
  let i = 1 // 1
  let j = 1 // 1

  /* i <= n => n
     ++i    => n
     Outer loop header: 2n */
  for (; i <= n; ++i) {
    // The outer loop body executes n times

    j = 1 // n

    /* j <= n => n
       ++j    => n
       Inner loop header: 2n */
    for (; j <= n; ++j) {
      // The inner loop body executes n times
      sum = sum + i * j
    }
  }
}

Statements outside the loops: 3
Outer loop: 2n (header) + n (j = 1 assignment)
Inner loop, run n times: n * (2n (header) + n (body)) => 3n^2
Result: 3 + 3n + 3n^2
After ignoring coefficients: 3 + n + n^2
After ignoring bases: no base here
After ignoring low-order terms: n^2
Final result: n^2
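Counting confirms the square (a sketch of my own):

```javascript
// Count the inner-body executions of the O(n^2) example.
function countSquare(n) {
  let count = 0
  for (let i = 1; i <= n; ++i) {
    for (let j = 1; j <= n; ++j) {
      count++
    }
  }
  return count
}

console.log(countSquare(100)) // 10000 = 100^2
```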

O(n^3) cubic order

function test(n) {
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      for (let k = 0; k < n; k++) {
        console.log('north song')
      }
    }
  }
}

A three-level loop can show up in practice, but code that is O(n^3) is unlikely to survive in a production algorithm.

The derivation is O(n^2) * n => O(n^3)

O(2^n) exponential order

O(2^n) grows explosively: for small n it may look comparable to O(n^2), but once the data size gets large it far exceeds O(n^2) and even O(n^3).

function fib(n) {
  if (n <= 1) return n
  return fib(n - 1) + fib(n - 2)
}

Does this code look familiar? It is the recursive Fibonacci, and its time complexity is O(2^n). Because it involves recursive calls, its derivation is more involved.

The time complexity of a recursive function depends not only on the complexity inside the function itself, but also on how many times the function ends up being called

fib(5)

Call count ≈ 0.5 * 2^n - 1
After ignoring coefficients: 2^n - 1
After ignoring low-order terms: 2^n

So this is order 2 to the n.
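We can also count the actual number of calls the recursive Fibonacci makes — a sketch of my own, assuming the corrected recursion fib(n - 1) + fib(n - 2):

```javascript
// Count the calls made by the recursive Fibonacci to see the blow-up.
let calls = 0
function fib(n) {
  calls++
  if (n <= 1) return n
  return fib(n - 1) + fib(n - 2)
}

calls = 0
fib(10)
console.log(calls) // 177 calls for n = 10

calls = 0
fib(20)
console.log(calls) // 21891 calls for n = 20 — roughly exponential, ~2^n growth
```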

As you can see, O(2^n) must run far longer than O(n^2) once the data size is large enough, which is why our recursive Fibonacci starts to freeze somewhere around n = 50 (depending on CPU performance). You can imagine fib(50) expanding into fib(50 - 1) and fib(50 - 2), recursing all the way down until n <= 1 — very slow. In practice, algorithms avoid such naked recursion; for example, the common DFS (depth-first search) includes pruning checks, so that only the branches that meet the criteria recurse.

The different cases of time complexity

There are also categories for the time complexity of algorithms:

  • Average time complexity
  • Worst time complexity
  • Best time complexity

For example, suppose we write a program that makes a robot search a 10 km track for a lost phone (the phone is definitely on the track and nobody will pick it up). In the best case the phone is found within the first 100 meters; in the worst case it is found at the very end of the 10 km. These correspond to the best and worst cases of the algorithm.

That corresponds to this code right here

function test(n, x) {
  let sum = 0
  for (let i = 0; i <= n; i++) {
    if (i === x) break
    sum += i
  }
  return sum
}
test(100, 100) // O(n)
test(100, 1)   // degenerates to O(1)
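To see the two cases directly, here is an instrumented version (my own sketch) that returns the number of iterations instead of the sum:

```javascript
// Count iterations to expose best vs worst case.
function countUntilFound(n, x) {
  let iterations = 0
  for (let i = 0; i <= n; i++) {
    if (i === x) break
    iterations++
  }
  return iterations
}

console.log(countUntilFound(100, 100)) // 100 — worst case, O(n)
console.log(countUntilFound(100, 1))   // 1 — best case, degenerates to O(1)
```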

In practice, though, we usually characterize a method only by its average time complexity — which is what the derivations above produce.

Time complexity derivation of sorting algorithm

Before we begin, let me explain what is stable and what is unstable.

  • Stable: if a was ahead of b and a == b, a is still ahead of b after sorting.
  • Unstable: if a was ahead of b and a == b, a may end up behind b after sorting.

So what does an unstable sorting algorithm do to us?

let list = [
  {
    id: 'a',
    sort: 1,
    name: 'Project Settings',
    path: '~/xx/xx'
  },
  {
    id: 'b',
    sort: 1,
    name: 'Cell setup',
    path: '~/xx/xx'
  },
]

With a collection like the one above, 'Project Settings' will always come first if the sorting algorithm is stable, but 'Cell setup' may come first if the sorting algorithm is unstable
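Since ES2019 the specification requires Array.prototype.sort to be stable, so in modern engines equal keys keep their order — a quick check of my own (with a third item added for contrast):

```javascript
// Equal sort keys keep their original relative order with a stable sort.
let items = [
  { id: 'a', sort: 1, name: 'Project Settings' },
  { id: 'b', sort: 1, name: 'Cell setup' },
  { id: 'c', sort: 0, name: 'Other' },
]
items.sort((x, y) => x.sort - y.sort)
console.log(items.map(item => item.id)) // ['c', 'a', 'b'] — 'a' is still ahead of 'b'
```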

Bubble sort time complexity derivation

function bubbleSort(arr) {
  let l = arr.length
  for (let i = 0; i < l - 1; i++) {
    for (let j = 0; j < l - 1 - i; j++) {
      if (arr[j] > arr[j + 1]) {
        let temp = arr[j + 1]
        arr[j + 1] = arr[j]
        arr[j] = temp
      }
    }
  }
  return arr
}

With data size n (arr.length = n), the outer loop runs n - 1 times; each time i grows by 1, the inner loop runs one fewer iteration than before. The comparing-and-swapping work therefore totals (n - 1) + (n - 2) + (n - 3) + … + 2 + 1 => n * (n - 1) / 2

After ignoring coefficients and low-order terms: n * (n - 1) / 2 => n^2

That is the average case. To get the best case out of this algorithm, the sequence must already be sorted — and we need to tweak the algorithm a little: if the first pass performs no swap at all, the array must already be sorted, so there is nothing more to do.

function bubbleSort(arr) {
  let l = arr.length
  let isDidSwrap = false
  for (let i = 0; i < l - 1; i++) {
    isDidSwrap = false
    for (let j = 0; j < l - 1 - i; j++) {
      if (arr[j] > arr[j + 1]) {
        let temp = arr[j + 1]
        arr[j + 1] = arr[j]
        arr[j] = temp
        isDidSwrap = true
      }
    }
    if (!isDidSwrap) break
  }
  return arr
}
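To verify the early-exit optimization, here is a counting variant (my own sketch) that returns the number of comparisons: an already-sorted array costs only n - 1 comparisons, while a reversed array still costs n(n - 1)/2:

```javascript
// Bubble sort with early exit, instrumented to count comparisons.
function bubbleSortCount(arr) {
  let comparisons = 0
  let l = arr.length
  for (let i = 0; i < l - 1; i++) {
    let didSwap = false
    for (let j = 0; j < l - 1 - i; j++) {
      comparisons++
      if (arr[j] > arr[j + 1]) {
        let temp = arr[j + 1]
        arr[j + 1] = arr[j]
        arr[j] = temp
        didSwap = true
      }
    }
    if (!didSwap) break // best case: one pass, then stop
  }
  return comparisons
}

console.log(bubbleSortCount([1, 2, 3, 4, 5])) // 4  -> best case, n - 1
console.log(bubbleSortCount([5, 4, 3, 2, 1])) // 10 -> worst case, n(n - 1) / 2
```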

Final words

Mastery comes from diligence and is lost to idleness.

If anything here is poorly written or wrong, you are welcome to point it out, and I will keep revising it in future articles. I hope to grow together with you as I progress. If you like my articles, please follow me — I will be grateful to the first batch of followers. Today, young you and I travel light; later, prosperous you and I will arrive fully laden.