The previous article covered bubble sort and its optimization; interested readers can have a look:
How to optimize bubble sort?
1. Selection sort
 Algorithm idea: first find the smallest (or largest) element in the unsorted sequence and store it at the beginning of the sorted sequence; then keep finding the smallest (or largest) element among the remaining unsorted elements and appending it to the end of the sorted sequence, and so on until all elements are sorted.
 Sorting process (default ascending):
 Find the minimum of the original sequence and swap it with the first element of the array;
 Find the minimum of the remaining unsorted sequence (everything except the first element) and swap it with the second element of the array;
 There are n-1 passes in total; each pass finds the minimum of the unsorted part and appends it to the end of the sorted part.
As shown in the figure, each pass finds the minimum of the unsorted part and places it at the end of the ordered part (pass i fills position i of the array).
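The pass-by-pass process above can be sketched as follows. The class and method names are my own, and an int array is used for brevity; the state after each pass is printed so the "minimum moves to the front" step is visible:

```java
import java.util.Arrays;

// Sketch of the selection-sort passes (SelectionSortTrace is a hypothetical name):
// after each pass, one more element sits in its final position at the front.
public class SelectionSortTrace {
    static int[] sort(int[] nums) {
        for (int i = 0; i < nums.length - 1; i++) {
            int minIndex = i;                     // assume the current position holds the minimum
            for (int j = i + 1; j < nums.length; j++) {
                if (nums[j] < nums[minIndex]) {
                    minIndex = j;                 // remember a smaller element
                }
            }
            if (minIndex != i) {                  // swap only if a smaller one was found
                int tmp = nums[i];
                nums[i] = nums[minIndex];
                nums[minIndex] = tmp;
            }
            System.out.println("pass " + (i + 1) + ": " + Arrays.toString(nums));
        }
        return nums;
    }

    public static void main(String[] args) {
        sort(new int[]{5, 2, 4, 1, 3});
        // pass 1: [1, 2, 4, 5, 3]
        // pass 2: [1, 2, 4, 5, 3]
        // pass 3: [1, 2, 3, 5, 4]
        // pass 4: [1, 2, 3, 4, 5]
    }
}
```

Note that pass 2 changes nothing: 2 is already the minimum of the unsorted part, which is exactly the "swap only when needed" check in the implementation below.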
 Java implementation of selection sort:
private static <T extends Comparable<? super T>> void selectionSort(T[] nums) {
    if (null == nums || nums.length == 0) {
        throw new RuntimeException("Array null or length 0");
    }
    int length = nums.length;
    int minValueIndex = 0;
    T temp = null;
    for (int i = 0; i < length - 1; i++) {
        // assume the first unsorted element is the minimum
        minValueIndex = i;
        // scan the rest of the unsorted part for a smaller element
        for (int j = i + 1; j < length; j++) {
            if (nums[j].compareTo(nums[minValueIndex]) < 0) {
                minValueIndex = j;
            }
        }
        // swap only when a smaller element was found
        if (minValueIndex != i) {
            temp = nums[i];
            nums[i] = nums[minValueIndex];
            nums[minValueIndex] = temp;
        }
    }
}
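A minimal usage sketch of the generic method above, reproduced in a standalone class (the class name is my own). Because the method accepts any T extends Comparable, it works for Integer and String alike:

```java
import java.util.Arrays;

// Standalone wrapper around the selectionSort method above (SelectionSortDemo is a hypothetical name).
public class SelectionSortDemo {
    static <T extends Comparable<? super T>> void selectionSort(T[] nums) {
        if (null == nums || nums.length == 0) {
            throw new RuntimeException("Array null or length 0");
        }
        for (int i = 0; i < nums.length - 1; i++) {
            int minValueIndex = i;
            for (int j = i + 1; j < nums.length; j++) {
                if (nums[j].compareTo(nums[minValueIndex]) < 0) {
                    minValueIndex = j;
                }
            }
            if (minValueIndex != i) {
                T temp = nums[i];
                nums[i] = nums[minValueIndex];
                nums[minValueIndex] = temp;
            }
        }
    }

    public static void main(String[] args) {
        Integer[] ints = {4, 3, 8, 5, 2, 6, 1, 7};
        selectionSort(ints);
        System.out.println(Arrays.toString(ints)); // [1, 2, 3, 4, 5, 6, 7, 8]

        String[] words = {"pear", "apple", "fig"};
        selectionSort(words);
        System.out.println(Arrays.toString(words)); // [apple, fig, pear]
    }
}
```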
 Time and space complexity and stability analysis:
 Best time complexity: the best case is an input already in ascending order; n(n-1)/2 comparisons are still needed, but no elements need to be exchanged (0 swaps), so the best case is O(n^2).
 Worst time complexity: in the worst case a swap is needed on each pass: n(n-1)/2 comparisons and at most n-1 swaps, so the worst case is O(n^2).
 Average time complexity: O(n^2).
 Space complexity: only one temporary variable is used, so the space complexity is O(1).
 Stability: unstable sort. Take the sequence 3, 5, 3, 1 for example: the first pass swaps the leading 3 with 1, giving 1, 5, 3, 3, so the first 3 of the original sequence now comes after the second 3 — the relative order of equal elements has changed.
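The instability on the 3, 5, 3, 1 sequence can be made visible by tagging the two equal keys so they can be told apart. This is a sketch with hypothetical names; the tag array is swapped in lockstep with the keys:

```java
// Demonstrates why selection sort is unstable: the two 3s ("3a" first, "3b" second)
// end up in reversed order after sorting. Class and method names are hypothetical.
public class SelectionStability {
    static String demo() {
        int[] keys = {3, 5, 3, 1};
        String[] tags = {"3a", "5", "3b", "1"};   // labels distinguish the equal keys
        for (int i = 0; i < keys.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < keys.length; j++) {
                if (keys[j] < keys[min]) min = j;
            }
            if (min != i) {                        // swap keys and tags together
                int k = keys[i]; keys[i] = keys[min]; keys[min] = k;
                String t = tags[i]; tags[i] = tags[min]; tags[min] = t;
            }
        }
        return String.join(",", tags);
    }

    public static void main(String[] args) {
        System.out.println(demo()); // 1,3b,3a,5 — "3a" now comes after "3b"
    }
}
```

The very first swap (3a with 1) is what carries 3a past 3b, matching the walk-through above.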
2. Insertion sort

Algorithm idea: build an ordered sequence incrementally. For each unsorted element, scan the already-sorted part from back to front to find its position and insert it there. During this backward scan, sorted elements larger than the new element are shifted one position to the right to make room for it.

Sorting process (default ascending):
Insertion sort works the same way as picking up cards one by one from the table and sorting them in your hand as you go.
For example:
Input: {4, 3, 8, 5, 2, 6, 1, 7}.

Pick up the first card, 4; the hand is {4}.

Pick up the second card, 3, and insert it into the hand {4} to get {3, 4}.

Pick up the third card, 8, and insert it into the hand {3, 4} to get {3, 4, 8}.

And so on.
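The card-pickup steps above can be sketched as follows (names are my own): after each card is picked up, the current "hand" — the sorted prefix — is printed:

```java
import java.util.Arrays;

// Sketch of the card analogy (CardInsertion is a hypothetical name):
// after inserting card p, positions 0..p form the sorted "hand".
public class CardInsertion {
    static int[] sort(int[] cards) {
        for (int p = 1; p < cards.length; p++) {
            int card = cards[p];                  // the card just picked up
            int i = p;
            while (i > 0 && cards[i - 1] > card) {
                cards[i] = cards[i - 1];          // slide bigger cards one slot right
                i--;
            }
            cards[i] = card;                      // drop the card into the gap
            System.out.println("hand after card " + (p + 1) + ": "
                    + Arrays.toString(Arrays.copyOf(cards, p + 1)));
        }
        return cards;
    }

    public static void main(String[] args) {
        sort(new int[]{4, 3, 8, 5, 2, 6, 1, 7});
        // first lines printed: [3, 4] then [3, 4, 8], matching the walk-through
    }
}
```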
Insertion sort consists of n-1 passes. Pass p (for p = 1 to n-1) guarantees that the elements in positions 0 through p are sorted: it relies on positions 0 through p-1 already being ordered, and inserts the element at position p forward into its proper place.
As shown: on pass p, the element at position p is moved left until its correct position among the first p+1 elements (including its current position) is found.

Java implementation of insertion sort:
private static <T extends Comparable<? super T>> void insertSort(T[] nums) {
    if (null == nums || nums.length == 0) {
        throw new RuntimeException("Array null or length 0");
    }
    int length = nums.length;
    T temp = null;
    int i = 0;
    for (int p = 1; p < length; p++) {
        // remember the element to insert
        temp = nums[p];
        // shift larger sorted elements one position to the right
        for (i = p; i > 0 && (temp.compareTo(nums[i - 1]) < 0); i--) {
            nums[i] = nums[i - 1];
        }
        // drop the element into the gap
        nums[i] = temp;
    }
}
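A minimal usage sketch of the generic method above, reproduced in a standalone class (the class name is my own):

```java
import java.util.Arrays;

// Standalone wrapper around the insertSort method above (InsertSortDemo is a hypothetical name).
public class InsertSortDemo {
    static <T extends Comparable<? super T>> void insertSort(T[] nums) {
        if (null == nums || nums.length == 0) {
            throw new RuntimeException("Array null or length 0");
        }
        for (int p = 1; p < nums.length; p++) {
            T temp = nums[p];
            int i = p;
            // shift larger sorted elements right, then drop temp into the gap
            for (; i > 0 && temp.compareTo(nums[i - 1]) < 0; i--) {
                nums[i] = nums[i - 1];
            }
            nums[i] = temp;
        }
    }

    public static void main(String[] args) {
        Integer[] ints = {4, 3, 8, 5, 2, 6, 1, 7};
        insertSort(ints);
        System.out.println(Arrays.toString(ints)); // [1, 2, 3, 4, 5, 6, 7, 8]

        String[] words = {"delta", "alpha", "charlie", "bravo"};
        insertSort(words);
        System.out.println(Arrays.toString(words)); // [alpha, bravo, charlie, delta]
    }
}
```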
Time and space complexity and stability analysis:
 Best time complexity: in the best case the sequence is already in ascending order, and only n-1 comparisons are needed (one per pass, with no shifts), so the best case is O(n).
 Worst time complexity: in the worst case the sequence is in descending order, requiring n(n-1)/2 comparisons; since the inner loop then always runs until i reaches 0, every comparison is accompanied by a shift, giving n(n-1)/2 element moves as well. So the worst case is O(n^2).
 Average time complexity: O(n^2).
 Space complexity: only one temporary variable is used, so the space complexity is O(1).
 Stability: stable sort — the strict less-than comparison never moves an element past an equal one, so equal elements keep their original order.
3. Summary
The main advantage of selection sort has to do with data movement: an element already in its correct final position is never moved, and every swap puts at least one element into its final position, so sorting a list of n elements performs at most n-1 swaps. Among sorting methods that rely entirely on swaps to move elements, selection sort is one of the best in this respect. Its best and worst time complexity are both O(n^2), its space complexity is O(1), and it is an unstable sort.
Insertion sort is not suitable for sorting large amounts of data. However, if the amount of data is small (say, under a few thousand elements), or the input is known to be roughly in order, insertion sort is a good choice. Its best time complexity is O(n), its worst is O(n^2), its space complexity is O(1), and it is a stable sort.
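The "roughly in order" advantage can be checked by counting comparisons. This sketch (names are my own) instruments insertion sort and runs it on an ascending and a descending input of n = 1000 elements, illustrating the O(n) best case against the O(n^2) worst case:

```java
// Counts insertion-sort comparisons on sorted vs reversed input.
// InsertionCost is a hypothetical class name for this illustration.
public class InsertionCost {
    static long comparisons(int[] nums) {
        long count = 0;
        for (int p = 1; p < nums.length; p++) {
            int temp = nums[p];
            int i = p;
            while (i > 0) {
                count++;                          // one comparison per inner-loop test
                if (nums[i - 1] <= temp) break;   // found the insertion point
                nums[i] = nums[i - 1];            // shift right
                i--;
            }
            nums[i] = temp;
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] asc = new int[n], desc = new int[n];
        for (int k = 0; k < n; k++) { asc[k] = k; desc[k] = n - k; }
        System.out.println("ascending:  " + comparisons(asc));  // n - 1 = 999
        System.out.println("descending: " + comparisons(desc)); // n(n-1)/2 = 499500
    }
}
```

On the ascending input each pass stops after a single comparison; on the descending one every pass scans all the way to the front.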