1. What are data structures? What is an algorithm?

  • Put simply, a data structure is a way of storing data. For example, an array stores data in a contiguous segment of memory, while a linked list stores data wherever memory is available and connects the pieces with pointers. A stack is LIFO (last in, first out) and a queue is FIFO (first in, first out); see the sketch after this list.

  • An algorithm is a method of operating on that data, such as inserting, searching, deleting, and sorting.

  • Data structures serve algorithms, and an algorithm operates on a specific data structure.
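
A minimal sketch of the LIFO/FIFO distinction (my own example, not from the original), using java.util.ArrayDeque, which can act as both a stack and a queue:

import java.util.ArrayDeque;

public class StackVsQueue 
{
    public static void main(String[] args) 
    {
        // As a stack: push and pop both work at the head, so the last element in is the first out
        ArrayDeque<Integer> stack = new ArrayDeque<>();
        stack.push(1);
        stack.push(2);
        stack.push(3);
        System.out.println(stack.pop()); // prints 3

        // As a queue: offer adds at the tail, poll removes from the head, so the first element in is the first out
        ArrayDeque<Integer> queue = new ArrayDeque<>();
        queue.offer(1);
        queue.offer(2);
        queue.offer(3);
        System.out.println(queue.poll()); // prints 1
    }
}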

2. Complexity analysis?

  • The purpose of learning data structures and algorithms is to make better use of memory and run programs more efficiently in practice; complexity analysis gives us a standard for judging the quality of code.

  • If we can qualitatively estimate the memory and time consumption of our code without running the program, we can assess the current program as a whole and plan future improvements.

  • Running the program directly does tell us the execution time and memory footprint of the algorithm, but the result is affected by the running environment and the size of the test data. We therefore need a way to estimate an algorithm's efficiency without concrete testing; this is complexity analysis.

3. Time complexity

3.1 Big O complexity notation

int cal(int n) 
{
   int sum = 0;            // executes once
   int i = 1;              // executes once
   for (; i <= n; ++i)     // executes n times
   {
      sum = sum + i;       // executes n times
   }
   return sum;
}
  • Assume each line of code takes time t to execute. The two initialization lines each run once, taking 2t in total; the loop condition and the loop body each run n times, taking 2n*t in total. The total running time is therefore (2n+2)*t, which is proportional to n.
  • In big O notation this is written O(2n+2). Big O does not represent the actual execution time of the code; it represents the trend of the execution time as the data size grows.
  • When n is large enough, low-order terms, constants, and coefficients can all be ignored, so this is written simply as O(n); for example, 2n^2 + 3n + 5 is written as O(n^2).

3.2 Common analysis methods

  • Focus on the piece of code that is executed the most times (usually the innermost loop); it dominates the complexity.
  • Serial (sequential) code: complexities add, so the total is the highest order among the parts (the addition rule).
  • Nested code: complexities multiply (the multiplication rule); see the sketch after this list.
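
A minimal sketch of the addition and multiplication rules (my own example, not from the original): the two top-level loops run one after the other, so their complexities add and the higher order wins; the nested pair multiplies.

int rules(int n) 
{
    int sum = 0;
    for (int i = 0; i < n; ++i)      // O(n)
    {
        sum += i;
    }
    for (int i = 0; i < n; ++i)      // O(n) iterations ...
    {
        for (int j = 0; j < n; ++j)  // ... each doing O(n) work: O(n) * O(n) = O(n^2)
        {
            sum += i * j;
        }
    }
    return sum;                      // addition rule: O(n) + O(n^2) = O(n^2)
}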

3.3 Several common complexities

Polynomial magnitude

  • Constant order: O(1)

  • Logarithmic order: O(log n) (see the loop sketch after this list)

  • Linear order: O(n)

  • Linear-logarithmic order: O(n log n)

  • Power orders: O(n^2) (square), O(n^3) (cubic), ..., O(n^k)
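
A minimal sketch of a logarithmic-order loop (my own example, not from the original): the loop variable doubles on every iteration, so the body executes about log2(n) times.

int logOrder(int n) 
{
    int count = 0;
    // i takes the values 1, 2, 4, 8, ..., so the body runs about log2(n) times: O(log n)
    for (int i = 1; i <= n; i *= 2) 
    {
        ++count;
    }
    return count;
}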

Non-polynomial magnitude

  • Exponential order: O(2^n)

  • Factorial order: O(n!)

  • The execution time of a non-polynomial algorithm increases sharply as the data size grows; such algorithms are very inefficient.

3.4 Best, worst, average, and amortized cases

  • Best Case Time Complexity
  • Worst Case Time Complexity
  • Average Case Time Complexity

For example, take a look at the following code

int find(int[] array, int n, int x) 
{
   int i = 0;
   int pos = -1;
   for (; i < n; ++i) 
   {
      if (array[i] == x) 
      {
         pos = i;
         break;
      }
   }
   return pos;
}

The best-case time complexity applies in the most favorable situation: the first element of the array is the one we are looking for, so only one lookup is needed, giving O(1). The worst-case time complexity applies in the least favorable situation: the element we are looking for is the last one in the array (or not present at all), so the entire array must be scanned, giving O(n).

In fact, the element we are looking for could be at any position in the array, or it might not be in the array at all. So we average the cost of every case, weighting each case by its probability; this gives the average-case time complexity.
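
A worked sketch of that average (following the usual analysis; the 1/2 probability is an assumption, not stated in the original): suppose x is in the array with probability 1/2 and, if present, is equally likely to be at any of the n positions, so each position has probability 1/(2n). The expected number of lookups is

1 * 1/(2n) + 2 * 1/(2n) + ... + n * 1/(2n) + n * 1/2 = (n+1)/4 + n/2 = (3n+1)/4

which is still O(n).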

  • Amortized Case Time Complexity
// array.length == n
int[] array = new int[n];
int count = 0;

void insert(int val) 
{
   if (count == array.length) 
   {
      int sum = 0;
      for (int i = 0; i < array.length; ++i) 
      {
         sum = sum + array[i];
      }
      array[0] = sum;
      count = 1;
   }
   array[count] = val;
   ++count;
}

This code inserts an element into the array. When the array is not full, the element is inserted directly, which takes O(1) time. When the array is full, the sum of all elements is computed first (and stored at position 0), and only then is the new element inserted, which takes O(n) time.

Moreover, the two operations of different complexity occur in a regular pattern: a run of O(1) inserts fills the array, one O(n) insert follows, and the cycle repeats.

In this case, we can spread the cost of the O(n) operation evenly over the preceding O(1) operations, so the overall time complexity is O(1). This is the amortized time complexity.

If most operations on a data structure have low time complexity, only a few have high time complexity, and the cheap and expensive operations occur in a regular sequence, then amortized analysis applies. In general, the amortized time complexity equals the best-case time complexity.
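
A quick check of that with the insert example above (my own arithmetic, not from the original): each cycle consists of n-1 inserts costing 1 operation each, plus one insert costing about n operations, or roughly 2n operations for n inserts. The amortized cost is therefore about 2n / n = 2 operations per insert, i.e. O(1).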

4. Calculation of time complexity

  • Set of functions of the same order

Θ(f(n)) = { g(n) : there exist positive constants c1, c2 and n0 such that c1*f(n) <= g(n) <= c2*f(n) for all n >= n0 } is called the set of functions of the same order as f(n).

  • Set of lower-order functions

O(f(n)) = { g(n) : there exist positive constants c and n0 such that g(n) <= c*f(n) for all n >= n0 } is called the set of functions of order no higher than f(n).

  • Set of higher-order functions

Ω(f(n)) = { g(n) : there exist positive constants c and n0 such that g(n) >= c*f(n) for all n >= n0 } is called the set of functions of order no lower than f(n).

  • Set of strictly lower-order functions

o(f(n)) = { g(n) : for every positive constant c there exists an n0 such that g(n) < c*f(n) for all n >= n0 } is called the set of functions of strictly lower order than f(n).

  • Set of strictly higher-order functions

ω(f(n)) = { g(n) : for every positive constant c there exists an n0 such that g(n) > c*f(n) for all n >= n0 } is called the set of functions of strictly higher order than f(n).

  • Iterative method for solving recurrence equations

  • Master theorem for solving recurrence equations (both are sketched below)
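
A sketch of both techniques (standard statements, not spelled out in the original):

  • Iterative (expansion) method: substitute the recurrence into itself repeatedly. For T(n) = 2T(n/2) + n:

    T(n) = 2T(n/2) + n = 4T(n/4) + 2n = ... = 2^k * T(n/2^k) + k*n

    Taking k = log2(n) gives n*T(1) + n*log2(n), so T(n) = O(n log n).

  • Master theorem: for T(n) = a*T(n/b) + f(n) with a >= 1 and b > 1, let p = log_b(a). Then:
    T(n) = Θ(n^p) if f(n) = O(n^(p-ε)) for some ε > 0;
    T(n) = Θ(n^p * log n) if f(n) = Θ(n^p);
    T(n) = Θ(f(n)) if f(n) = Ω(n^(p+ε)) for some ε > 0 and a*f(n/b) <= c*f(n) for some c < 1 and all large n.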

5. Space complexity

  • Space complexity represents the trend of the memory a program occupies as the data size grows. Only the memory allocated for data inside the program needs to be analyzed. The most common space complexities are O(1), O(n), and O(n^2).
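
A minimal sketch (my own example, not from the original): the loop counter takes constant extra space, while the allocated array grows with n, so the overall space complexity is O(n).

int[] copyRange(int n) 
{
    int[] result = new int[n];    // O(n) space: grows with the data size
    for (int i = 0; i < n; ++i)   // the counter i needs only O(1) space
    {
        result[i] = i;
    }
    return result;
}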

Resources – Geek Time: The Beauty of Data Structures and Algorithms