🌟🌟🌟🌟🌟

Taste: Spicy chicken

Cooking time: 15min

This article is included on GitHub at github.com/Geekhyt; a Star would be much appreciated.

After the brainwashing of the pilot article in this data structures and algorithms series, I wonder whether your appreciation of their importance has gone up a level (even though its read count is pitifully low). If you haven't read it yet, go see how the front end plays with data structures and algorithms.

But that's OK; at least some of you said in the comments that you were looking forward to the next article, so I'll keep writing this series. Today I'll talk with you about the pair of concepts that make up half of "data structures and algorithms":

Time complexity and space complexity

One way to tell whether an engineer has solid algorithm skills is to gauge their mastery of complexity analysis. That may sound a bit abstract, but for algorithm veterans, complexity analysis becomes a matter of intuition.

Ekko: "Sometimes, time has its benefits. Polarity reversal!"

  • Take League of Legends, for example. After IG won an S-series world championship, ranked queues filled up with players whose game sense was still there but whose mechanics could no longer keep up. (Yes, that's us.) Back in our school days we were Diamond and Master tier too, but work forced us to bid farewell to our youth. Still, the game sense remains!

Back to this article: time and space complexity are also what an interviewer probes when testing your algorithmic ability. It is not enough to come up with multiple solutions; you also need to pick out the optimal one from a complexity perspective to impress the interviewer. In engineering, choosing the optimal algorithm matters even more: an excellent algorithm can save enormous system and maintenance costs.

Enough talk; on to the good stuff!

(Concepts first.)

First understand time and space:

  • Time: how long the current algorithm takes to execute
  • Space: how much memory the current algorithm needs to execute

Plus complexity:

  • Time complexity: in full, asymptotic time complexity; it describes how an algorithm's execution time grows with the data size.
  • Space complexity: in full, asymptotic space complexity; it describes how an algorithm's storage requirements grow with the data size.

In other words, the execution efficiency of the algorithm is determined by execution time and storage space. Complexity analysis is used to analyze the relationship between algorithm execution efficiency and data size, including time complexity and space complexity.

Why introduce these two concepts? Aren't there enough concepts to learn already?

In fact, you could also measure performance after running the code, a method called post-hoc statistics, commonly known as "hindsight." But since it is an afterthought, it naturally has its disadvantages.

  • Post-hoc testing generally depends on the specific environment, such as the machine configurations of a company's DEV, SIT, and UAT environments, so measured results will differ. Put simply, run the same piece of code on different processors (i9, i5, i3) and the test results will differ.

  • In addition to the environment, test results are also greatly influenced by the size of the data. Those of you who are familiar with sorting algorithms must know that the efficiency of sorting algorithms varies with data sizes.

So we need complexity analysis as a form of prior analysis. It helps us write code with minimal complexity so that it executes as efficiently as possible across different environments. Moreover, it does not require tests on data of any specific size, letting us estimate execution efficiency roughly. This covers the downsides of the post-hoc statistical method.

Big O notation

The big O notation was first introduced by the German number theorist Paul Bachmann in his 1892 book Analytische Zahlentheorie, and was later popularized by another German number theorist, Edmund Landau.

T(n) = O(f(n))

The execution time T(n) of a piece of code is proportional to f(n), the total number of times each line of code executes.

  • T(n): the code's execution time
  • n: the data size
  • f(n): the total number of times each line of code executes
  • O: indicates that T(n) is proportional to f(n)

Note: beginners may assume big O represents the actual execution time of the code, but it actually describes the trend of the execution time as the data size grows.

Common time complexity

The order of magnitude increases as follows:

  • Constant order O(1)
  • Logarithmic order O(log n)
  • Linear order O(n)
  • Linear-logarithmic order O(n log n)
  • Quadratic order O(n^2)
  • Cubic order O(n^3)
  • Exponential order O(2^n)
  • Factorial order O(n!)

Among them, the exponential and factorial orders grow explosively as the data size n increases and are extremely inefficient, so we will skip them for now. Let's look at the remaining time complexities in code.

Constant order O(1)

```javascript
const a = 1;
let b = 2;
```

The execution time of the code above is not affected by the growth of any variable n, so its time complexity is O(1). In general, as long as there are no loops or recursive calls, the time complexity is O(1) no matter how many lines of code there are.

Logarithmic order O(log n)

```javascript
let i = 1;
const n = 6;
while (i < n) {
    i = i * 2;
}
```

Look at the code above: the loop exits after x iterations, once i has been doubled enough that 2^x ≥ n. Solving 2^x = n gives x = log2(n), so the loop body runs about log2(n) times before exiting, and the time complexity is O(log n). Binary search is a classic O(log n) algorithm.
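Since binary search just came up, here is a minimal sketch (my own example, not from the original article): each comparison halves the search range, which is exactly where the O(log n) comes from.

```javascript
// Binary search over a sorted array: the range [low, high] halves each
// iteration, so at most about log2(n) comparisons are made.
function binarySearch(sortedArr, target) {
    let low = 0;
    let high = sortedArr.length - 1;
    while (low <= high) {
        const mid = Math.floor((low + high) / 2);
        if (sortedArr[mid] === target) return mid;
        if (sortedArr[mid] < target) low = mid + 1; // target is in the right half
        else high = mid - 1;                        // target is in the left half
    }
    return -1; // not found
}
```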

Linear order O(n)

```javascript
const n = 996;
for (let i = 0; i <= n; i++) {
    console.log('Visited the front-end canteen ' + i + ' times');
}
```

The code in the for loop executes n times, so the time complexity of such code is O(n). Counting sort, radix sort, and bucket sort all have O(n) time complexity.

Linear-logarithmic order O(n log n)

```javascript
let j;
const n = 6;
for (let i = 0; i <= n; i++) {
    j = 1; // reset j on every pass so the inner loop runs about log(i) times
    while (j < i) {
        j = j * 2;
    }
}
```

If you already understand the logarithmic and linear orders, the linear-logarithmic order is easy: running an O(log n) piece of code n times gives O(n log n). Merge sort, quicksort, and heap sort all have O(n log n) time complexity.
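As a concrete illustration (my sketch, not the article's), merge sort shows where the two factors come from: the array is halved about log n times, and each level of merging touches all n elements.

```javascript
// Merge sort: ~log2(n) levels of splitting, O(n) merging work per level,
// giving O(n log n) overall.
function mergeSort(arr) {
    if (arr.length <= 1) return arr; // base case: already sorted
    const mid = Math.floor(arr.length / 2);
    const left = mergeSort(arr.slice(0, mid));
    const right = mergeSort(arr.slice(mid));
    // Merge the two sorted halves in a single O(n) pass.
    const merged = [];
    let i = 0, j = 0;
    while (i < left.length && j < right.length) {
        merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
    }
    return merged.concat(left.slice(i), right.slice(j));
}
```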

Quadratic order O(n^2)

```javascript
const n = 6;
for (let i = 0; i <= n; i++) {
    for (let j = 0; j <= n; j++) {
        console.log('The food in the front-end canteen smells good.');
    }
}
```

The quadratic order is just O(n) code nested inside another loop, so its time complexity is O(n^2). Bubble sort, insertion sort, and selection sort all have O(n^2) time complexity.

O(n^3) is just O(n^2) with one more nested loop. (Russian nesting dolls.)
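To make the nesting-dolls picture concrete, here is a tiny counting sketch (mine, not the article's): three loops over the same n do roughly n · n · n units of work.

```javascript
// Three nested loops over n: the innermost statement runs n * n * n times,
// i.e. the time complexity is O(n^3).
const n = 6;
let count = 0;
for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
        for (let k = 0; k < n; k++) {
            count++;
        }
    }
}
```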

When using big O notation for complexity analysis, we can ignore constant coefficients. In general, we only need to focus on the piece of code that loops the most.

In addition, there are best-case, worst-case, average-case, and amortized time complexities. They do not come up that often in practice, so I will not expand on them here.

In real life, code is often complex, so here are a few tips for judging time complexity:

  • A single block of code: look at the highest-frequency part, i.e. the loops
  • Multiple blocks of code: take the maximum; with both a single loop and a nested loop, the nested loop's complexity dominates
  • Nested code: multiply, e.g. recursion inside a loop
  • Multiple independent data sizes: add; when two parameters each control their own loop, the two complexities sum
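The last rule is the one that trips people up, so here is a hypothetical function (name and parameters are mine) where two independent sizes each drive their own loop, giving O(m + n) rather than O(m * n):

```javascript
// m and n are independent data sizes; the two loops run one after the other,
// so the total work is m + n iterations, i.e. O(m + n).
function sumTwoScales(m, n) {
    let total = 0;
    for (let i = 0; i < m; i++) {
        total += i; // O(m) part
    }
    for (let j = 0; j < n; j++) {
        total += j; // O(n) part
    }
    return total;
}
```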

Common spatial complexity

  • O(1)
  • O(n)
  • O(n^2)

Let’s go through the code again:

O(1)

```javascript
const a = 1;
let b = 2;
```

The space occupied by the variables a and b we defined does not change with any variable n, so the space complexity is O(1).

O(n)

```javascript
let arr = [];
const n = 996;
for (let i = 0; i < n; i++) {
    arr[i] = i;
}
```

The memory occupied by arr is determined by n and grows as n grows, so its space complexity is O(n). Likewise, if you initialize an n*n two-dimensional array, its space complexity is O(n^2).

In addition, logarithmic-order space complexities such as O(log n) and O(n log n) are rare in everyday work, so I will not expand on them here either.

In practice, space complexity mostly depends on the size of the arrays you initialize, and also on the depth of recursion.
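The recursion-depth point deserves a quick sketch (a hypothetical function of mine): every call that has not yet returned holds a stack frame, so a recursion n levels deep uses O(n) space even though it allocates no arrays.

```javascript
// Recursive sum 1 + 2 + ... + n. Before the deepest call returns, n stack
// frames are alive at once, so the space complexity is O(n).
function sumToN(n) {
    if (n <= 1) return n; // base case: deepest frame
    return n + sumToN(n - 1);
}
```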

Transformation of time and space

Time complexity and space complexity are often in tension; you usually cannot minimize both at once. In engineering and in algorithm-problem solving, the common practice, depending on the situation, is to trade space for time. Examples: memoized search, caching, and so on.
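A classic sketch of the space-for-time trade (my example, not the article's): memoizing Fibonacci spends an O(n) cache to cut the time from exponential O(2^n) for the naive recursion down to O(n).

```javascript
// Memoized Fibonacci: each value is computed once and cached, so the time
// drops from O(2^n) to O(n), at the cost of O(n) extra space for the cache.
const memo = new Map();
function fib(n) {
    if (n <= 1) return n;
    if (memo.has(n)) return memo.get(n); // space-for-time: reuse cached result
    const result = fib(n - 1) + fib(n - 2);
    memo.set(n, result);
    return result;
}
```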

Planned follow-ups in this algorithm series:

  • Tips for grinding LeetCode
  • Common problem-solving algorithm patterns

❤️ Love triple punch

1. If you have read this far, please give the article a like. Your likes are the motivation behind my writing.

2. Follow the official account Front-End Canteen (前端食堂): "Your front-end canteen, remember to eat on time!"

3. This article is included in the Front-End Canteen GitHub repo at github.com/Geekhyt; a small Star would be much appreciated.