Foreword

When learning data structures and algorithms, you'll often see O(1), O(n), and so on used to describe time and space complexity. What do they actually mean? We often have several solutions to the same problem. Take sorting: there are ten classic sorting algorithms (quick sort, merge sort, etc.), and although they all produce the same sorted result, the time and resources they consume along the way differ.

The way to tell sorting algorithms apart, then, is to measure the time and space a program consumes as it executes.

High school math

Function

Let A and B be non-empty sets of numbers. If, according to some definite correspondence f, every number x in set A has a unique number f(x) corresponding to it in set B, then f: A→B is called a function from set A to set B, written y = f(x), x ∈ A. Here x is called the independent variable, and the set A of its allowed values is called the domain of the function; the value y corresponding to a given x is called the function value, and the set of all function values {f(x) | x ∈ A} is called the range of the function.

For example: given that the domain of f(x) is [3,5], find the domain of f(2x - 1).
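
Working this out: f(2x - 1) is defined exactly when its argument lies in the domain of f, that is,

    3 ≤ 2x - 1 ≤ 5  ⇒  4 ≤ 2x ≤ 6  ⇒  2 ≤ x ≤ 3

so the domain of f(2x - 1) is [2, 3].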

Power function


A function of the form y = x^k, where the exponent k is a constant, is called a power function.

Exponential function


The function y = a^x (a > 0 and a ≠ 1) is called an exponential function; the independent variable x is called the exponent, and a is called the base. For example, y = 2^x doubles every time x increases by 1.

Logarithmic function


If a (a > 0, a ≠ 1) raised to the power b equals N, that is, a^b = N, then b is called the logarithm of N to the base a, written log_a N = b. Here a is called the base of the logarithm and N is called the antilogarithm. For example, since 2^10 = 1024, we have log_2 1024 = 10.

Time complexity

If there exists a function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a constant not equal to zero, then f(n) is said to be a function of the same order of magnitude as T(n). We write T(n) = O(f(n)), and call O(f(n)) the asymptotic time complexity of the algorithm, or time complexity for short.

Put simply, this is the amount of time (or the total number of times the code is executed) that an algorithm or program takes to run.

In the following program:

    int sum(int n) {
        int value = 0;          // ① executes 1 time
        int i = 1;              // ② executes 1 time
        while (i <= n) {        // ③ executes n times
            value = value + i;  // ④ executes n times
            i++;                // ⑤ executes n times
        }
        return value;           // ⑥ executes 1 time
    }

With n = 100, statements ①, ② and ⑥ each execute once, while ③, ④ and ⑤ each execute 100 times, so the method executes 1 + 1 + 100 + 100 + 100 + 1 = 303 times in total.

If the count above is written as a function of n, we get f(n) = 3n + 3, which algorithm analysis expresses with the notation below.

Notation

Big O notation: the time complexity of an algorithm is usually expressed in big O, written T(n) = O(f(n)), where T(n) stands for the running time.

T(n) = O(3n+3)

The important point here is that time complexity cares only about the order of magnitude, and the simplification rules are:

  1. Omit constants: if the running time is a constant, represent it as the constant 1, i.e., O(1)
  2. Keep only the highest-order term
  3. Reduce the coefficient of the highest-order term to 1

For example, take 2n^3 + 3n^2 + 7: omitting the constant gives O(2n^3 + 3n^2), keeping only the highest-order term gives O(2n^3), and reducing its coefficient to 1 gives O(n^3). That is, O(n^3) is the time complexity of 2n^3 + 3n^2 + 7.

Similarly, for the program above, T(n) = O(3n + 3) simplifies to a time complexity of O(n).

Note: look only at the most heavily executed part of the code, which in the program above is the body of the loop.

Order of time complexity

Common time complexities fall into the following categories.

Constant order O(1)

    int n = 100;
    System.out.println("Constant order: " + n);

No matter what n is, this code executes only once, so T(n) = O(1).

Logarithmic order O(log n)

    for (int i = 1; i <= n; i = i * 2) {
        System.out.println("Logarithmic order: " + n);
    }

The value of i doubles on each pass, so the number of iterations is the logarithm of n to base 2, i.e., f(n) = log_2 n and T(n) = O(log_2 n), usually written O(log n).

Since logarithms of different bases differ only by a constant factor (log_a n = log_2 n / log_2 a), logarithmic order is generally written simply as O(log n), whatever the base.

Linear order O(n)

    for (int i = 1; i <= n; i++) {
        System.out.println("Linear order: " + n);
    }

The program runs exactly n times, growing in step with n like y = f(x) = n, so T(n) = O(n).

Linear logarithmic order O(n log n)

    for (int m = 1; m <= n; m++) {
        int i = 1;
        while (i < n) {
            i = i * 2;
            System.out.println("Linear logarithmic order: " + i);
        }
    }

Linear logarithmic order O(n log n) is easy to understand: run a logarithmic-order O(log n) piece of code n times and you get n × O(log n), i.e., O(n log n). Merge sort, for example, is O(n log n).

If n = 2 the inner statement executes 2 times; if n = 4, it executes 8 times; and so on.

Square order O(n^2)

    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j++) {
            System.out.println("Square order: " + n);
        }
    }

If n = 2 it prints 4 times; if n = 3, 9 times; that is, T(n) = O(n^2).

Similarly, cubic order is O(n^3), and replacing the 3 with a general k gives O(n^k) for k nested loops, which is more complex still.
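
For instance, a minimal sketch of cubic order: three nested loops, each running n times, execute the innermost statement n × n × n = n^3 times.

    // Cubic order O(n^3): three nested loops over n.
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j++) {
            for (int k = 1; k <= n; k++) {
                System.out.println("Cubic order: " + n);
            }
        }
    }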

Comparing the five time complexities above: as the value of n on the x-axis gets larger and larger, the time they take on the y-axis orders as follows.

O(1) < O(log n) < O(n) < O(n log n) < O(n^2)

And there are more than these five in real algorithms, for example O(n^3), O(2^n), O(n!), O(n^k), and so on.
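
As a minimal sketch of how exponential order arises (using the classic example of naive recursive Fibonacci, not code from this article): each call spawns two more calls, so the call tree roughly doubles with each level, giving roughly O(2^n) time.

    // Naive recursive Fibonacci: a classic roughly-O(2^n) example,
    // since each call branches into two more calls.
    int fib(int n) {
        if (n <= 1) {
            return n;
        }
        return fib(n - 1) + fib(n - 2);
    }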

To see how these results are derived mathematically, look up the master theorem if you're interested.
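
For reference, here is the simplest form of the master theorem, quoted without proof: for recurrences of the form T(n) = a·T(n/b) + Θ(n^d), with constants a ≥ 1, b > 1 and d ≥ 0,

    T(n) = \begin{cases}
        \Theta(n^d)          & \text{if } d > \log_b a \\
        \Theta(n^d \log n)   & \text{if } d = \log_b a \\
        \Theta(n^{\log_b a}) & \text{if } d < \log_b a
    \end{cases}

Merge sort, for instance, satisfies T(n) = 2T(n/2) + Θ(n), so a = b = 2 and d = 1 = log_2 2, giving T(n) = Θ(n log n), which matches the O(n log n) above.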

Note: the following data comes from the Big-O Cheat Sheet; it lists the common big O orders and compares their growth for inputs of different sizes.

Big O notation    10 elements    100 elements    1,000 elements
O(1)              1              1               1
O(log N)          3              6               9
O(N)              10             100             1000
O(N log N)        30             600             9000
O(N^2)            100            10000           1000000
O(2^N)            1024           1.26e+29        1.07e+301
O(N!)             3628800        9.3e+157        4.02e+2567
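
As a small sketch of where such numbers come from (my own illustration, assuming base-2 logarithms, the usual convention in complexity analysis), the following program evaluates the first few growth functions for the same input sizes; the log values print as 3.3, 6.6 and 10.0, close to the 3, 6 and 9 the cheat sheet reports:

    // A rough sketch: evaluate a few growth functions for n = 10, 100, 1000.
    // The fast-growing 2^N and N! rows are omitted because N! overflows
    // even a double for N = 1000.
    public class GrowthTable {
        public static void main(String[] args) {
            int[] sizes = {10, 100, 1000};
            for (int n : sizes) {
                double log2n = Math.log(n) / Math.log(2);  // log base 2 of n
                System.out.printf("n=%4d  log n=%5.1f  n log n=%8.1f  n^2=%9d%n",
                        n, log2n, n * log2n, n * n);
            }
        }
    }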

The complexity of common data structure operations

Data structure        Access    Search    Insertion    Deletion
Array                 1         n         n            n
Stack                 n         n         1            1
Queue                 n         n         1            1
Linked list           n         n         1            1
Hash table            -         n         n            n
Binary search tree    n         n         n            n
B-tree                log(n)    log(n)    log(n)       log(n)
Red-black tree        log(n)    log(n)    log(n)       log(n)
AVL tree              log(n)    log(n)    log(n)       log(n)

Complexity of array sorting algorithms

Name            Best        Average                    Worst           Memory    Stable
Bubble sort     n           n^2                        n^2             1         Yes
Insertion sort  n           n^2                        n^2             1         Yes
Selection sort  n^2         n^2                        n^2             1         No
Heap sort       n log(n)    n log(n)                   n log(n)        1         No
Merge sort      n log(n)    n log(n)                   n log(n)        n         Yes
Quick sort      n log(n)    n log(n)                   n^2             log(n)    No
Shell sort      n log(n)    depends on gap sequence    n (log(n))^2    1         No

Space complexity

Space complexity describes the relationship between the storage an algorithm uses and the size of its input, that is, the space the algorithm consumes while it runs.

Order of space complexity

Space complexity is much simpler than time complexity; we only need to understand the common cases O(1), O(n), and O(n^2).

Constant order O(1)

    int i;  // a single variable occupies constant space, regardless of input size

Linear order O(n)

    int[] arr = new int[n];  // an array of n elements occupies space proportional to n

Square order O(n^2)

    int[][] arr = new int[n][n];  // an n × n matrix occupies space proportional to n^2

Conclusion

When I updated the articles on the stack and the queue, I left a question: do pushing onto and popping off a stack have the same time complexity as inserting into and removing from a queue? Yes, they do: all of these operations are O(1). It then occurred to me that the later articles on data structures and algorithms would be very unfriendly to readers who are just starting out or who don't yet understand complexity, so I wrote this article on complexity first.
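
To make that concrete, here is a minimal sketch of an array-backed stack (a toy class of my own, not code from those articles): push and pop each touch only the top slot, independent of how many elements the stack holds, so both are O(1). Strictly speaking, push is amortized O(1), since it occasionally resizes the backing array.

    // A minimal array-backed stack: push and pop do a constant amount of
    // work regardless of the stack's size, i.e., O(1) (amortized for push,
    // because of the occasional resize).
    public class ArrayStack {
        private int[] data = new int[16];
        private int size = 0;

        public void push(int value) {
            if (size == data.length) {
                data = java.util.Arrays.copyOf(data, size * 2);  // rare resize
            }
            data[size++] = value;  // write the top slot: O(1)
        }

        public int pop() {
            if (size == 0) {
                throw new java.util.NoSuchElementException("stack is empty");
            }
            return data[--size];  // read the top slot: O(1)
        }
    }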

This article is only a brief introduction to time complexity and space complexity. If there are mistakes, corrections are welcome; just don't argue for the sake of arguing, because if you do, I lose.

References

  • Understanding time complexity with a set of diagrams
  • The time complexity and space complexity of algorithms

I wrote all of this myself; if you liked it, give me a follow 🤞