1. What is a data structure?

Data structures are the way computers store and organize data.

  • Linear structure: a linear list of n elements of the same type (n >= 0) (array, linked list, stack, queue, hash table)

  • Tree structure: binary tree, AVL tree, red-black tree, B-tree, heap, Trie, Huffman tree, union-find (disjoint set)

  • Graph structure: adjacency matrix, adjacency list

In practice, the most appropriate data structure is chosen according to the application scenario.
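As a concrete taste of a linear structure, a stack can be sketched on top of a Swift array (the type `Stack` below is illustrative, not a standard library type):

```swift
// A minimal stack sketch backed by a Swift array.
struct Stack<Element> {
    private var elements: [Element] = []

    var isEmpty: Bool { return elements.isEmpty }

    // Push a value onto the top of the stack.
    mutating func push(_ value: Element) {
        elements.append(value)
    }

    // Pop the top value, or nil when the stack is empty.
    mutating func pop() -> Element? {
        return elements.popLast()
    }
}

var stack = Stack<Int>()
stack.push(1)
stack.push(2)
print(stack.pop() ?? -1)   // last in, first out
```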

2. What is an algorithm?

An algorithm is a series of execution steps used to solve a particular problem

```swift
// Calculate a + b
func plus(_ a: Int, _ b: Int) -> Int {
    return a + b
}

// Calculate 1 + 2 + 3 + ... + n
func sum(_ n: Int) -> Int {
    var result = 0
    for i in 1...n {
        result += i
    }
    return result
}
```
  • Using different algorithms to solve the same problem can vary greatly in efficiency
  • Say we want to find the nth Fibonacci number

3. How to evaluate the quality of the algorithm?

  • If you evaluated algorithms purely by efficiency, you might think of something like this:
    • Run different algorithms on the same set of inputs and compare their execution times, also known as post-hoc statistics
    • But execution time depends heavily on hardware and various runtime environment factors
    • Corresponding measurement code has to be written
    • It is hard to guarantee that the chosen test data is fair
  • Therefore, the following dimensions are generally used to evaluate the merits and demerits of algorithms
    • Accuracy, readability, robustness (ability to respond to and deal with unreasonable input)
    • Time complexity: Estimating the number of times a program instruction is executed (execution time)
    • Space complexity: Estimate the required storage space
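A minimal sketch of such post-hoc timing in Swift, using Foundation's `Date` (the `measure` helper is a name chosen here for illustration):

```swift
import Foundation

// Measure the wall-clock time of a closure (post-hoc statistics).
// Results depend on hardware and the runtime environment, as noted above.
func measure(_ label: String, _ block: () -> Void) -> TimeInterval {
    let start = Date()
    block()
    let elapsed = Date().timeIntervalSince(start)
    print("\(label): \(elapsed) s")
    return elapsed
}

let elapsed = measure("sum 1...1_000_000") {
    var total = 0
    for i in 1...1_000_000 { total += i }
    _ = total
}
```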

3.1 Big O notation

Complexity is usually described in big O notation, which expresses how the cost grows as a function of the data size n.

  • Ignore constants, coefficients, and lower-order terms; for logarithms, the base is generally ignored as well
    • 9 >> O(1)
    • 2n+3 >> O(n)
    • n^2+2n+6 >> O(n^2)
    • 4n^3+3n^2+22n+100 >> O(n^3)
    • log2(n) = log2(9) ∗ log9(n), so log2(n) and log9(n) differ only by a constant factor; both are written O(log n)
  • Note that big O notation is only a rough analysis model, an estimate that helps us quickly understand an algorithm's efficiency
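The change-of-base relation can be checked numerically: the ratio log2(n) / log9(n) is the constant log2(9) for every n, which is why all logarithmic bases collapse into O(log n). A small sketch:

```swift
import Foundation

// For all n > 1, log2(n) / log9(n) equals the constant log2(9) ≈ 3.1699,
// so logarithms of any base differ only by a constant factor.
let constantFactor = log2(9.0)
for n in [10.0, 1_000.0, 1_000_000.0] {
    let log9n = log(n) / log(9.0)   // log base 9 via change of base
    print(log2(n) / log9n, constantFactor)
}
```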

3.2 Common complexity

| Number of executions | Complexity | Informal term |
| --- | --- | --- |
| 12 | O(1) | Constant order |
| 2n + 3 | O(n) | Linear order |
| 4n^2 + 2n + 6 | O(n^2) | Quadratic order |
| 4·log2(n) + 25 | O(log n) | Logarithmic order |
| 3n + 2n·log3(n) + 15 | O(n log n) | n log n order |
| 4n^3 + 3n^2 + 22n + 100 | O(n^3) | Cubic order |
| 2^n | O(2^n) | Exponential order |
  • O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n)
  • You can compare the growth of these complexity functions with an online graphing calculator  zh.numberempire.com/graphingcal…

(Plot: growth of the complexity functions when the data size is small)

(Plot: growth of the complexity functions when the data size is large)
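The two regimes can also be seen numerically; a small sketch tabulating several of the functions above (constant factors omitted):

```swift
import Foundation

// Evaluate representative complexity functions at a small and a large n.
// For small n the ranking can be misleading; for large n it matches
// O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n).
for n in [4.0, 64.0] {
    print("n = \(Int(n)):",
          "log n =", log2(n),
          "| n =", n,
          "| n log n =", n * log2(n),
          "| n^2 =", n * n,
          "| 2^n =", pow(2.0, n))
}
```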

3.3 Time complexity analysis of the Fibonacci functions

  • Recursive method, time complexity O(2^n)
```swift
func fib1(_ index: Int64) -> Int64 {
    if index <= 1 { return index }
    return fib1(index - 1) + fib1(index - 2)
}
```

  • Iterative method, time complexity O(n)
```swift
func fib2(_ n: Int64) -> Int64 {
    if n <= 1 { return n }
    var first: Int64 = 0
    var second: Int64 = 1
    for _ in 0...(n - 2) {
        let sum = first + second
        first = second
        second = sum
    }
    return second
}
```
  • Algebraic closed-form formula (not every algorithm can be solved directly by a formula), time complexity O(1)
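A sketch of that closed-form approach using Binet's formula (`fib3` is an illustrative name); because it relies on floating point, rounding error limits it to roughly the first 70 Fibonacci numbers:

```swift
import Foundation

// Closed-form Fibonacci via Binet's formula: a constant number of
// arithmetic operations (treating pow as O(1)). The discarded term
// psi^n / sqrt(5) is always below 0.5, so rounding recovers the integer.
func fib3(_ n: Int) -> Int {
    let sqrt5 = sqrt(5.0)
    let phi = (1.0 + sqrt5) / 2.0
    return Int((pow(phi, Double(n)) / sqrt5).rounded())
}

print(fib3(10))   // prints 55
```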
  • How big is the difference between recursion and iteration? Take n = 64
    • An ordinary 1 GHz computer performs about 10^9 operations per second
    • O(n) takes about 6.4 × 10^-8 seconds
    • O(2^n) takes about 584.84 years
    • Sometimes the gap between algorithms is bigger than the gap between hardware

4. Multiple data sizes

```swift
func test(_ n: Int, _ k: Int) {
    for _ in 0...n {
        print("test")
    }
    for _ in 0...k {
        print("test")
    }
}
```

The time complexity is O(n + k): when a function depends on several independent input sizes, each size appears in the complexity.
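For contrast, if the two loops were nested instead of sequential, the sizes would multiply rather than add; a sketch (`testNested` is an illustrative name):

```swift
// Sequential loops over n and k give O(n + k);
// nested loops over n and k give O(n · k).
func testNested(_ n: Int, _ k: Int) -> Int {
    var count = 0
    for _ in 0..<n {
        for _ in 0..<k {
            count += 1   // the inner body runs n · k times
        }
    }
    return count
}

print(testNested(3, 4))   // prints 12
```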

5. Optimization direction of the algorithm

  • Use as little storage space as possible
  • Use as few steps (time) as possible
  • Depending on the situation, you can trade time for space or space for time
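A classic instance of trading space for time is memoizing the recursive Fibonacci function: O(n) extra storage drops the O(2^n) recursion down to O(n). A sketch (the name `fibMemo` is illustrative):

```swift
// Memoized Fibonacci: O(n) time at the cost of O(n) extra space.
// Each subproblem is computed once and cached for later calls.
func fibMemo(_ n: Int, _ cache: inout [Int: Int]) -> Int {
    if n <= 1 { return n }
    if let cached = cache[n] { return cached }
    let result = fibMemo(n - 1, &cache) + fibMemo(n - 2, &cache)
    cache[n] = result
    return result
}

var cache: [Int: Int] = [:]
print(fibMemo(40, &cache))   // prints 102334155, instantly
```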

More kinds of complexity

  • Best-case and worst-case complexity
  • Amortized complexity
  • Complexity oscillation
  • Average complexity
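Amortized complexity is illustrated by the classic dynamic-array case: a single append is O(n) when the backing buffer must grow, but doubling the capacity keeps the total work for n appends below 2n copies, i.e. O(1) amortized per append. A sketch that counts element copies (the helper name is illustrative):

```swift
// Count how many element copies n appends perform under a
// capacity-doubling growth strategy. Total copies stay below 2n,
// so each append costs O(1) amortized.
func copiesForAppends(_ n: Int) -> Int {
    var capacity = 1
    var size = 0
    var copies = 0
    for _ in 0..<n {
        if size == capacity {   // buffer full: double it and copy everything over
            copies += size
            capacity *= 2
        }
        size += 1               // place the new element
    }
    return copies
}

print(copiesForAppends(8))      // prints 7, well below 2 × 8
```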
