“This is the second day of my participation in the November Gwen Challenge. See details of the event: The last Gwen Challenge 2021”.

I originally wanted to call this series “Introduction to Front-end Algorithms”, but I was afraid I wasn’t good enough to take you all the way to the bottom of the subject, so I renamed it Learn Front-end Algorithms with Me.

Suppose we take a problem and write an algorithm that solves it. At that moment we may well think to ourselves: I’m amazing 🐮. But here’s the question: how do we prove that the algorithm we wrote is good enough? Xiaoming next door also solved this problem, with brute force — is the algorithm we just wrote any better than his?

If you search Baidu at this point, you’ll be told that an algorithm is usually measured by how long it takes to run and how much memory it uses. The most important resources of a computer are CPU and memory, and an algorithm that is too greedy with either can affect other programs.

In theory, then, we need to compare the time and memory that different algorithms consume. But how do we actually do that?

Why not just run the algorithms and measure how much time and memory they take?

Leaving aside the effort that would take, even if we did it, the result wouldn’t mean much. CPU performance and memory size vary from computer to computer, and the magnitude and size of the input data can vary too. A ranking produced by one data set on one machine may be reversed under different conditions.

So is there a reasonable way to evaluate the merits of an algorithm?

The answer is yes: we generally use asymptotic time complexity and asymptotic space complexity, both expressed in big O notation.

Time complexity

To estimate time complexity, we usually count the number of operation units an algorithm performs, assuming each unit takes the same amount of time to run. The total running time then differs from the operation count by at most a constant factor.

What is an operation unit? We can loosely think of it as one CPU instruction, but different operations in our program map to different numbers of instructions, and we can’t keep track of exactly which instructions get executed. So we simply treat each elementary operation — an assignment, a comparison, an addition, subtraction, multiplication, or division — as one unit. In the example below, suppose the input contains n items. The first algorithm multiplies and then adds for each item, so its operation count can be taken as 2n; the second only subtracts once per item, so its count is n.

```javascript
const newArr1 = arr.map(n => n * 2 + 4); // 2n operations: one multiply + one add per item
const newArr2 = arr.map(n => n - 2);     // n operations: one subtract per item
```

However, there’s a catch. Many algorithms don’t simply loop over every item; they contain conditional branches, so different inputs of the same size (data volume) can lead to different running times.

```javascript
const newArr = arr.map(n => {
  if (n < 100) {
    return n;                   // cheap branch: just return the value
  } else {
    return n ** (n * 0.6 - 10); // expensive branch: exponentiation
  }
});
```

So we usually work with the algorithm’s worst-case complexity, written T(n) and defined as the maximum running time required over all inputs of size n.
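To make worst case concrete, here is a small sketch of my own (not from the article): a linear search runs one comparison in the best case, but all n comparisons when the target is absent or sits at the very end — and that worst case is what T(n) describes.

```javascript
// Linear search: returns the index of target in arr, or -1 if absent.
// Best case: target is arr[0] → 1 comparison.
// Worst case: target is absent → n comparisons, so T(n) is proportional to n.
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i;
  }
  return -1;
}
```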

So once we have T(n), can we compare the running times of algorithms directly? Not quite.

For example, suppose algorithm A has T(n) = 100n and algorithm B has T(n) = 5n^2. At n = 5 we get 100 × 5 = 500 > 5 × 25 = 125 — can we therefore say that A’s time complexity is higher than B’s? Obviously not: as n grows, B overtakes A (already at n = 100, A costs 10,000 while B costs 50,000).
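We can check this numerically with a throwaway sketch (my own illustration): for small n, A = 100n looks worse, but B = 5n^2 catches up at n = 20 and is worse from then on.

```javascript
const tA = n => 100 * n;   // algorithm A: T(n) = 100n
const tB = n => 5 * n * n; // algorithm B: T(n) = 5n^2

// At n = 5, A "loses": tA(5) = 500 > tB(5) = 125.
// The crossover is at n = 20, where 100n = 5n^2 = 2000.
// Beyond it, B is always slower: tA(100) = 10000 < tB(100) = 50000.
```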

Therefore, to make comparing running times easier still, we have asymptotic time complexity, expressed with the big O symbol.

Big O notation is a mathematical notation used to describe the asymptotic behavior of a function. More precisely, it gives an asymptotic upper bound on the order of magnitude of a function in terms of another (usually simpler) function. In mathematics it is commonly used to describe truncated infinite series, especially the remainder term of an asymptotic series.

If there exists a function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n). We write this as T(n) = O(f(n)), and call O(f(n)) the asymptotic time complexity of the algorithm, or time complexity for short.

Big O reflects the order of growth of a function: coefficients and lower-order terms can be ignored, and only the largest-order term matters.

Therefore, the general steps to derive asymptotic time complexity are: 1. find the term of the function with the highest order; 2. drop that term’s coefficient.

For example, given T(n) = 10n ^ 3 + 1000n ^ 2 + 10000n + 10000, we first keep only the highest-order term, 10n ^ 3, then drop its coefficient, giving O(n ^ 3).
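You can see why the lower-order terms and the coefficient get dropped by looking at the ratio T(n)/n^3 (a quick sketch of my own): as n grows, it settles toward the leading coefficient 10, meaning the n^3 term alone determines the order of growth.

```javascript
// T(n) = 10n^3 + 1000n^2 + 10000n + 10000
const T = n => 10 * n ** 3 + 1000 * n ** 2 + 10000 * n + 10000;

// T(n) / n^3 = 10 + 1000/n + 10000/n^2 + 10000/n^3:
// every term except the leading 10 vanishes as n grows.
const ratio = n => T(n) / n ** 3;
```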

This also gives us simple rules for combining algorithms: when two pieces run one after another, their complexities add, and the sum is simply the higher of the two; when one is nested inside the other, their complexities multiply.

Here’s an example:

  • O(1) + O(n) = O(n)
  • O(n) * O(log n) = O(n log n)
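The addition and multiplication rules can be made concrete with a sketch (my own example functions, not from the article): two sequential loops stay O(n), while nesting one loop inside another gives O(n^2).

```javascript
// Sequential loops: O(n) + O(n) = O(n) — still linear overall.
function sumAndMax(arr) {
  let sum = 0;
  for (const x of arr) sum += x;             // O(n)
  let max = -Infinity;
  for (const x of arr) if (x > max) max = x; // O(n)
  return { sum, max };
}

// Nested loops: O(n) * O(n) = O(n^2) — every pair of items is visited.
function countPairsWithSum(arr, target) {
  let count = 0;
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] + arr[j] === target) count++;
    }
  }
  return count;
}
```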

The following is a list of function classes commonly met when analyzing algorithms, considered as n goes to infinity, with the slower-growing ones listed first. c is an arbitrary constant.

| Notation | Name |
| --- | --- |
| O(1) | Constant (order, same below) |
| O(log* n) | Iterated logarithmic |
| O(log n) | Logarithmic |
| O((log n)^c) | Polylogarithmic |
| O(n) | Linear |
| O(n log n) | Linearithmic, also called log-linear, quasilinear, or superlinear |
| O(n^2) | Quadratic |
| O(n^c), integer c > 1 | Polynomial, sometimes called algebraic |
| O(c^n) | Exponential, sometimes called geometric |
| O(n!) | Factorial, sometimes called combinatorial |
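As a taste of one class from the table, the textbook O(log n) algorithm (a standard example, not specific to this article) is binary search on a sorted array: each comparison halves the remaining range, so about log₂ n comparisons suffice even for huge inputs.

```javascript
// Binary search on a sorted array: O(log n) comparisons,
// because each iteration halves the search range [lo, hi].
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1; // target not present
}
```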

Conclusion

Ahhhh, it’s half past eleven!!

That’s the end of this article; we’ll talk about space complexity tomorrow. This series will take you through front-end algorithms step by step. Subsequent articles will be collected in the column, so you can follow me or bookmark the column to keep up.

The next article will give a brief introduction to more of the terms you often hear in connection with algorithms, such as space complexity.