Insertion sort

So, to borrow an example from Introduction to Algorithms, when we play cards, every time we pick up a new card, we insert it in order, which is essentially insertion sort.


Sister Qi's disclaimer: although we use playing cards as the example, we can't imitate Mr. Hu Shi.


What about arrays?

There is an important idea, called the baffle method, which uses a baffle to divide an array into two ranges:

  • Left side of baffle: sorted
  • Right side of baffle: not sorted

So there are three steps to sort:

  1. Initially, the baffle is at the far left of the array, so the sorted range is empty; it can also be placed after the first element, since a single element is trivially sorted.

  2. The core idea: iterate through the elements of the unsorted range, and for each one find the correct position to insert it into the sorted range;

  3. Repeat this process until the unsorted interval is empty.

For example: {5, 2, 1, 0}

In the first step, the baffle starts out here:


In the second step, insert 2 into the correct position of the sorted interval, becoming:


Repeat this step to place the 1:


Finally, put the 0 in order:
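Putting the whole process together in text form, with "|" marking the baffle:

[5 | 2 1 0]   the sorted range starts with just the 5
[2 5 | 1 0]   2 is inserted in front of 5
[1 2 5 | 0]   1 is inserted at the front
[0 1 2 5 |]   0 is inserted at the front; the unsorted range is empty, done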


The code is also very simple:

public void insertionSort(int[] input) {
    if (input == null || input.length <= 1) {
        return;
    }
    for (int i = 1; i < input.length; i++) {
        int tmp = input[i];   // the element to insert into the sorted range
        int j = i - 1;
        // Shift every element of the sorted range that is larger than tmp one position to the right.
        while (j >= 0 && input[j] > tmp) {
            input[j + 1] = input[j];
            j--;
        }
        // Drop tmp into the gap that opened up.
        input[j + 1] = tmp;
    }
}
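A quick way to try it out (assuming the method above lives in a class, here called Sorter purely for the demo):

int[] input = {5, 2, 1, 0};
new Sorter().insertionSort(input);
System.out.println(java.util.Arrays.toString(input));   // prints [0, 1, 2, 5]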

Let’s analyze the time and space complexity of this algorithm.

Time complexity

There are two important points about time complexity:

  • It describes how the running time grows as the size of the input (the independent variable) grows;
  • It is asymptotic complexity, which means:
    • ignore constant coefficients,
    • keep only the highest-order term.

So the worst case we care about is this: if the array is almost in reverse order, every new element is inserted at the first position, so all elements already in the sorted range move one position back. Each insertion then costs O(n) on average, and doing this n times gives O(n^2).
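Concretely, for a reverse-sorted array of length n, inserting the element at index i shifts all i elements already in the sorted range, so the total number of moves is:

1 + 2 + … + (n - 1) = n(n - 1) / 2 = O(n^2)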

Space complexity

The focus here is on peak memory usage, not the cumulative amount of space ever allocated.

For insertion sort this is O(1); there isn't much more to say.

In-place sorting

An in-place sort uses only O(1) extra space: it doesn't take up any additional memory; it simply rearranges the elements within the original array.

The idea of working in place is not unique to sorting algorithms, but sorting is one of its most well-known applications. It is essentially a space-saving idea.

But for a sorting algorithm, analyzing its time and space complexity is not enough; there is one more important metric:

Stability

Stability means whether elements that compare as equal keep their relative order after sorting.

For example: {5, 2, 2, 1, 0}

If, after the array is sorted, the two 2's keep their original relative order, then the sort is stable.

So some of you might be thinking, what does it matter if the order changes?

In real work, the things we sort are usually not bare numbers but objects. We may sort them first by one property and then by another, and we don't want the first ordering to be destroyed. That sounds abstract, so let's look at an example.

For example, in a stock trading system there are orders from both buyers and sellers; how are they matched?

  • Sort by price first;
  • For equal prices, sort by the time the orders were placed.

Generally, the system maintains a sequence of orders already sorted by time, so it only needs a stable sorting algorithm to sort them by price: a stable sort keeps two objects that compare as equal in their original chronological order.
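As a small sketch (the Order class and its fields here are made up for illustration; Java's List.sort on objects is guaranteed to be stable), sorting a time-ordered list by price alone keeps equal-price orders in their original time order:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class OrderBookDemo {
    // Hypothetical Order class, just for illustration.
    static class Order {
        long price;
        long timestamp;   // the list below is assumed to already be in timestamp order
        Order(long price, long timestamp) { this.price = price; this.timestamp = timestamp; }
    }

    public static void main(String[] args) {
        List<Order> orders = new ArrayList<>();
        orders.add(new Order(100, 1));
        orders.add(new Order(99, 2));
        orders.add(new Order(100, 3));

        // List.sort on objects is a stable sort in Java, so orders with equal
        // price keep their original (time) order.
        orders.sort(Comparator.comparingLong((Order o) -> o.price));

        for (Order o : orders) {
            System.out.println("price=" + o.price + " time=" + o.timestamp);
        }
        // price=99 time=2
        // price=100 time=1
        // price=100 time=3   <- equal prices stay in time order
    }
}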

So is insertion sort a stable sort?

The answer is yes. When we insert a new element, we scan the sorted range from back to front and only shift elements that are strictly greater than it, so equal elements keep their relative order; inserting at an arbitrary position, by contrast, could not guarantee this.

The following animation [1] shows this very clearly ~


Optimization

Insertion sort actually has a lot of room for optimization; if you're interested, look up "Shell sort."
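For reference, here is a minimal sketch of Shell sort, assuming the simplest gap sequence n/2, n/4, …, 1 (better gap sequences exist):

public void shellSort(int[] input) {
    if (input == null || input.length <= 1) {
        return;
    }
    // Start with a large gap and shrink it; the final pass (gap = 1)
    // is an ordinary insertion sort on an almost-sorted array.
    for (int gap = input.length / 2; gap > 0; gap /= 2) {
        for (int i = gap; i < input.length; i++) {
            int tmp = input[i];
            int j = i - gap;
            while (j >= 0 && input[j] > tmp) {
                input[j + gap] = input[j];
                j -= gap;
            }
            input[j + gap] = tmp;
        }
    }
}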

When you are just starting out, depth matters, but without breadth, chasing too much depth can be painful: every knowledge point extends endlessly, which is not an efficient way to learn.

Balance breadth and depth when time is limited:

  • Spend more time and energy on commonly used knowledge points and pursue depth there;
  • For knowledge points that are merely extensions, it is enough to know that they exist.

Keep an open mind, and there will be qualitative improvement in the later period.



Selection sort

Selection sort also uses the classic "baffle method" idea.

The left side of the baffle is sorted and the right side is unsorted. The "selection" in each round means finding the minimum value in the unsorted range on the right, swapping it with the first element behind the baffle, and then moving the baffle one position to the right, so that the sorted elements always stay on the left side of the baffle.

For example: {5, 2, 0, 1}

We use a baffle to separate the sorted range from the unsorted range, and a pointer j to find the minimum value in the unsorted range.


In the first round, j starts at 5, traverses the whole unsorted range, and ends up at 0, so 0 swaps with the first element behind the baffle, which is 5; the baffle then moves one position to the right, ending the first round.


In the second round, j starts at 2, the first element behind the baffle, and eventually points to 1. Then 1 swaps with 2, and the baffle moves one position to the right, ending the second round.


In the third round, j starts at 5, which is now the first element behind the baffle, and eventually points to 2, so 2 swaps with 5, and the baffle moves one position to the right, ending the third round.


Only one element is left, so we don't need to iterate over it; we're done.
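In text form, with "|" marking the baffle, the whole process looks like this:

[| 5 2 0 1]   nothing is sorted yet
[0 | 2 5 1]   round 1: 0 swaps with 5
[0 1 | 5 2]   round 2: 1 swaps with 2
[0 1 2 | 5]   round 3: 2 swaps with 5; only the largest element is left, done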

In contrast to the insertion sort, note two things:

  1. The baffle must start at index 0, not 1. Although in both algorithms the baffle separates the sorted range from the unsorted range, the elements allowed into the sorted range mean different things:
  • selection sort can only put the current minimum of the unsorted range into it, nothing else;
  • insertion sort's sorted range can start with any value.

So the selection sort baffle cannot start with any element to its left.

  2. In the outer loop:
  • the last round of selection sort can be skipped, because only the largest element is left and it is already in place;

  • the last round of insertion sort cannot be skipped, because that element's position has not been determined yet.

class Solution {
    public void selectionSort(int[] input) {
        if (input == null || input.length <= 1) {
            return;
        }
        // The last round can be skipped: when one element is left, it is already in place.
        for (int i = 0; i < input.length - 1; i++) {
            int minValueIndex = i;
            // Find the index of the minimum value in the unsorted range [i, length).
            for (int j = i + 1; j < input.length; j++) {
                if (input[j] < input[minValueIndex]) {
                    minValueIndex = j;
                }
            }
            // Swap the minimum with the first element behind the baffle.
            swap(input, minValueIndex, i);
        }
    }

    private void swap(int[] input, int x, int y) {
        int tmp = input[x];
        input[x] = input[y];
        input[y] = tmp;
    }
}
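A quick check, reusing the Solution class from the snippet above and the article's example:

int[] input = {5, 2, 0, 1};
new Solution().selectionSort(input);
System.out.println(java.util.Arrays.toString(input));   // prints [0, 1, 2, 5]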

Time complexity

The innermost if statement is O(1) every time it is executed. How many times should it be executed?

  • when i = 0: n - 1 times;
  • when i = 1: n - 2 times;
  • …
  • finally, 1 time;

So the total is: (n - 1) + (n - 2) + … + 1 = n(n - 1) / 2 = O(n^2)

That is how the analysis should be done, rather than just saying "two nested loops, so it's O(n^2)."

Space complexity

This is pretty simple: the deepest point is the call to swap(), and each frame on the call stack uses only a constant number of variables, so the space is O(1).

So selection sort is also an in-place sort.

Stability

The answer is no: selection sort is not stable.

For example, take {5, 5, 2, 1, 0}: the first swap exchanges 0 with the first 5, sending the first 5 to the last position of the array, behind the second 5, and it never moves back in front of it, so the relative order of the two 5's is broken.
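Here is a small self-contained sketch of this (the Card class and its labels are made up for illustration): running selection sort on objects compared by value shows the two 5's ending up in reversed order.

public class StabilityDemo {
    // Hypothetical Card class; the label only distinguishes equal values.
    static class Card {
        int value;
        String label;
        Card(int value, String label) { this.value = value; this.label = label; }
    }

    // Selection sort on Card objects, comparing by value only.
    static void selectionSort(Card[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j].value < a[minIndex].value) {
                    minIndex = j;
                }
            }
            Card tmp = a[minIndex];
            a[minIndex] = a[i];
            a[i] = tmp;
        }
    }

    public static void main(String[] args) {
        Card[] cards = {
            new Card(5, "first 5"), new Card(5, "second 5"),
            new Card(2, "2"), new Card(1, "1"), new Card(0, "0")
        };
        selectionSort(cards);
        for (Card c : cards) {
            System.out.println(c.value + " (" + c.label + ")");
        }
        // The output ends with 5 (second 5) and then 5 (first 5):
        // the original order of the two 5's has been reversed.
    }
}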

This exact question came up in Brother Stone's Google interview-experience article; if you haven't read it yet, reply "Google" in the "码农田小齐" public account and you'll find it.

Optimization

One step of selection sort is picking the minimum value in each round. With a heap (heapify), this step can be optimized from O(n) to O(log n), at which point the algorithm essentially becomes heap sort.
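As a rough illustration (not a full in-place heap sort; the class and method names are made up for the demo), Java's PriorityQueue is a min-heap, so offer() and poll() are O(log n) each, which replaces the O(n) scan for the minimum in every round:

import java.util.PriorityQueue;

class HeapSelectSort {
    public void sort(int[] input) {
        if (input == null || input.length <= 1) {
            return;
        }
        PriorityQueue<Integer> minHeap = new PriorityQueue<>();
        for (int x : input) {
            minHeap.offer(x);            // O(log n) per insertion
        }
        for (int i = 0; i < input.length; i++) {
            input[i] = minHeap.poll();   // O(log n) to extract the current minimum
        }
    }
}

Note that this sketch uses an O(n) auxiliary heap, so unlike selection sort it is not in place; a real heap sort builds the heap inside the array itself to keep the extra space at O(1).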


References

[1] Animation: https://visualgo.net/en/sorting