Nine Sorting Algorithms in Java, in Detail

One, sorting:

Introduction: A sorting algorithm reorders one or more groups of data according to a specified pattern. The resulting sequence follows certain rules and reflects certain regularities, so the processed data is easier to search and compute on, which greatly improves efficiency.

1. Sorting definition:

For sorting, we often also care about stability: if two equal elements appear in the sequence, a stable sorting algorithm keeps their relative order unchanged after sorting. In other words, equal elements must not be mixed up with each other by the sort.

Sorting is an important operation in computer programming. It rearranges an arbitrary sequence of data elements (or records) into a sequence ordered by key.
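As a small illustration of the stability requirement described above (the record type, class name and values here are made up for the example), a stable sort by key must keep the two records with key 7 in their original relative order:

import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    static class Item {
        final int key;
        final String label;
        Item(int key, String label) { this.key = key; this.label = label; }
        public String toString() { return key + ":" + label; }
    }

    public static void main(String[] args) {
        Item[] items = {
            new Item(9, "a"), new Item(7, "first"), new Item(3, "b"), new Item(7, "second")
        };
        // Arrays.sort on objects is a stable merge sort, so the two records with key 7
        // keep their original relative order after sorting by key.
        Arrays.sort(items, Comparator.comparingInt((Item it) -> it.key));
        System.out.println(Arrays.toString(items));  // [3:b, 7:first, 7:second, 9:a]
    }
}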

2. Sorting classification:

Sorting arranges the elements of a collection in a specified order. Sorting can be divided into two categories: internal sorting and external sorting. If all records are kept in memory during the sorting process, it is called internal sorting; if external storage has to be used during sorting, it is called external sorting. All of the sorts discussed here are internal sorts.

Internal sort can be divided into the following categories:

  • Insertion sort: direct insertion sort, binary insertion sort, Hill sort (also known as Shell sort)
  • Selection sort: simple selection sort, heap sort
  • Exchange sort: bubble sort, quick sort
  • Merge sort
  • Radix sort

Next, each of these sorts is described in detail.

Two, direct insertion sort:

1. Basic Ideas:

At each step, one record to be sorted is inserted into its proper position in the already-sorted subsequence according to its key (the position is found by scanning from back to front), until all records have been inserted.

2. Implementation process:

  • The original array value is:

    32, 43, 23, 13, 5

  • The first round of comparison: sort the first and second numbers, then form an ordered sequence, and get the result:

    32, 43, 23, 13, 5

  • The second round of comparison: the third number is inserted to form a new ordered sequence, and the result is as follows:

    23, 32, 43, 13, 5

  • The third round of comparison: the fourth number is inserted to form a new ordered sequence, and the result is:

    13, 23, 32, 43, 5

  • The fourth round of comparison: insert the fifth number to form a new ordered sequence, and the result is:

    5, 13, 23, 32, 43

The realization process is shown in the figure:
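To reproduce the intermediate results listed above, here is a small sketch (not part of the original demo code below; the class name is illustrative) that prints the array after each pass:

import java.util.Arrays;

public class InsertSortTrace {
    public static void main(String[] args) {
        int[] a = {32, 43, 23, 13, 5};
        for (int i = 1; i < a.length; i++) {
            int insertNum = a[i];                  // the number being inserted in this pass
            int j = i - 1;
            while (j >= 0 && a[j] > insertNum) {
                a[j + 1] = a[j];                   // shift larger elements one place to the right
                j--;
            }
            a[j + 1] = insertNum;                  // drop the number into the vacated slot
            System.out.println("After pass " + i + ": " + Arrays.toString(a));
        }
    }
}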

3. Implementation code:

package cn.hz;

/**
 * @author hz
 * @version 1.0
 * Direct insertion sort
 * Requirement analysis:
 * The outer loop for (int i = 1; i < length; i++) sets the number of insertions; the first element needs no insertion.
 * Record the number to insert (insertNum) and the index of the last element of the sorted part (j = i - 1).
 * Scan the sorted part from back to front, moving every number greater than insertNum back one position.
 * Place insertNum in the vacated position, i.e. j + 1.
 */
public class Demo1 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before direct insertion sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
        int length = a.length;  // The length of the array is extracted for speed.
        int insertNum;          // The number to insert
        for (int i = 1; i < length; i++) {       // The number of insertions
            insertNum = a[i];                    // The number to insert
            int j = i - 1;                       // Index of the last element of the sorted part
            while (j >= 0 && a[j] > insertNum) { // Scan from back to front, moving numbers greater than insertNum back one position
                a[j + 1] = a[j];                 // Move the element back one position
                j--;
            }
            a[j + 1] = insertNum;                // Place the number to be inserted in the vacated position
        }
        System.out.println("\nAfter direct insertion sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
    }
}

Effect:

4. Analysis:

The time consumed by direct insertion sort varies greatly with the initial state of the file. If the file is already in ascending order, each record to be inserted needs only one comparison to find its position, so the time complexity is O(n); this is the best case. If the file is initially in reverse order, inserting the i-th record requires about i comparisons to find its place, so the time complexity is O(n²); this is the worst case.

The average time complexity of direct insertion sort is O(n²).

5. Stability Summary:

Insertion sort inserts one element at a time into an already ordered subsequence. This ordered subsequence initially contains only one element, the first one (a single element is ordered by default).

The comparison starts at the end of the ordered subsequence: the element to be inserted is compared with the largest sorted element, and if it is larger it is placed directly after it.

Otherwise the scan continues until the correct insertion position is found. If an equal element is encountered, the element to be inserted is placed after it.

Therefore the relative order of equal elements does not change: the order they had in the original unordered sequence is still their order after sorting, so insertion sort is stable.

Three, binary insertion sort:

1. Basic Ideas:

The idea of binary (dichotomy) insertion sort is the same as direct insertion sort; only the way the insertion position is found differs. Here binary search is used to find the position, which reduces the number of comparisons.

2. Implementation process:

Binary insertion sort is an improved insertion sort; it also sorts by finding the insertion position, so it is similar to direct insertion sort.

The idea is: when inserting the i-th element, binary-search the first i elements (indices 0 to i-1): compare the element to insert with the middle element; if it is smaller, continue searching in the first half, otherwise in the second half.

This continues until left > right; then all elements from the target position up to the (i-1)-th element are moved back one place, and the i-th element is placed at the target position.

The binary search itself does not sort, it only looks up the position. So once the insertion position is found, the records must still be shifted: starting from the last record of the sorted part, each record is moved back one place, down to the record at the insertion position. As shown in the figure:

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Binary insertion sort
 * Requirement analysis:
 * The insertion itself is the same as in direct insertion sort, but the insertion position is found by binary search.
 */
public class Demo2 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before binary insertion sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
        for (int i = 0; i < a.length; i++) {
            int temp = a[i];    // The element to insert
            int left = 0;
            int right = i - 1;
            int mid;
            while (left <= right) {                  // Binary search for the insertion position among a[0..i-1]
                mid = (left + right) / 2;
                if (temp < a[mid]) {
                    right = mid - 1;
                } else {
                    left = mid + 1;
                }
            }
            for (int j = i - 1; j >= left; j--) {    // Move the elements from left to i-1 back one position
                a[j + 1] = a[j];
            }
            if (left != i) {                         // Place the element at the found position
                a[left] = temp;
            }
        }
        System.out.println("\nAfter binary insertion sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
    }
}

Effect:

4. Analysis:

Of course, binary insertion sort is also stable.

The number of comparisons in binary insertion sort is independent of the initial order of the records and depends only on the number of records. When n is large, it is much less than the maximum number of comparisons of direct insertion sort, but more than its minimum. The number of moves is the same as in direct insertion sort: n²/2 in the worst case, n in the best case, and O(n²) on average.

5. Stability Summary:

Same as direct insertion sort.

Four, Hill sort:

1. Basic Ideas:

Take an integer d1 less than n as the first increment and divide all records of the file into d1 groups: all records whose indices differ by a multiple of d1 are placed in the same group, and direct insertion sort is performed within each group. Then take a second increment d2 < d1 and repeat the grouping and sorting, and so on, until the increment reaches 1, at which point all records form a single group and one final direct insertion sort finishes the job.

2. Implementation process:

Hill sort mainly addresses the inefficiency of direct insertion sort when the amount of data is large.

  • Original array values:

    32, 43, 23, 13, 5

  • First, let the number of elements be n and take k = n/2 (when n is odd, use (n + 1)/2). The numbers whose indices differ by k form one group, and each group is insertion-sorted into an ordered sequence.

    For the 5 numbers above, k = (5 + 1)/2 = 3, so 32 and 13 form one group, 43 and 5 form another, and 23 is a group by itself. Sorting each group gives:

    13, 5, 23, 32, 43

  • Then take k = k/2 and again group the numbers whose indices differ by k, sorting each group into an ordered sequence.

  • Repeat step 2 until k=1 and then perform simple insert sort.

    Since k is already equal to 1, simple insertion sort is performed.

    The realization process is shown in the figure:
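Here is a small sketch (class name illustrative, not part of the demo below) that prints the array after each gap pass; the gap sequence {3, 1} follows the walkthrough above rather than the length/2 rule used in the demo code:

import java.util.Arrays;

public class ShellSortTrace {
    public static void main(String[] args) {
        int[] a = {32, 43, 23, 13, 5};
        int[] gaps = {3, 1};                       // k = (5 + 1) / 2 = 3, then k = k / 2 = 1
        for (int k : gaps) {
            for (int i = k; i < a.length; i++) {   // gapped insertion sort within each group
                int temp = a[i];
                int j = i - k;
                while (j >= 0 && a[j] > temp) {
                    a[j + k] = a[j];
                    j -= k;
                }
                a[j + k] = temp;
            }
            System.out.println("After gap " + k + ": " + Arrays.toString(a));
        }
    }
}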

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Hill sort
 * Requirement analysis:
 * First determine the gap d, which also fixes the number of groups.
 * Insertion-sort the elements within each group.
 * Then halve the gap and repeat the two steps above until the gap reaches 0.
 */
public class Demo3 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before Hill sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
        int d = a.length;
        while (d != 0) {
            d = d / 2;
            for (int x = 0; x < d; x++) {                    // One pass per group
                for (int i = x + d; i < a.length; i += d) {  // Insertion sort within the group, starting with its second element
                    int j = i - d;                           // j is the last index of the sorted part of the group
                    int temp = a[i];                         // The element to insert
                    for (; j >= 0 && temp < a[j]; j -= d) {  // Scan the group from back to front
                        a[j + d] = a[j];                     // Move the element back d positions
                    }
                    a[j + d] = temp;
                }
            }
        }
        System.out.println("\nAfter Hill sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
    }
}

Effect:

4. Analysis:

We know that a single insertion sort is stable, but in Hill sort the same elements may be moved in different groups' insertion sorts, so their relative order can eventually be disturbed; Hill sort is therefore unstable.

The time performance of Hill sort is better than that of direct insert sort for the following reasons:

  • When the file is already basically ordered, insertion sort needs few comparisons and moves.
  • When n is small, the difference between n and n² is also small, i.e. the best-case time O(n) and worst-case time O(n²) of direct insertion sort are not far apart.
  • At the beginning of Hill sort the increment is large, so there are many groups with few records in each, and the direct insertion sort inside each group is fast. As the increment di gradually shrinks, the number of groups decreases and the number of records in each group grows, but because the records have already been sorted at distance di-1, the file is close to ordered, so each new pass is also fast.

Therefore, hill sort is more efficient than direct insertion sort.

The average time complexity of Hill sort is usually quoted as about O(nlogn); it depends on the increment sequence, but it is clearly better than O(n²).

5. Stability Summary:

Hill sort is insertion sort performed at several different step lengths (gaps). At the beginning, when the elements are largely disordered, the gap is largest, so each insertion sort handles very few elements and runs quickly.

When the elements are basically ordered, the gap is small, and insertion sort is very efficient on nearly ordered sequences. Overall, Hill sort therefore takes somewhat less time than O(n²).

Multiple insertion sorts are performed, and we know a single insertion sort is stable and does not change the relative order of equal elements.

However, in the different insertion sorts, equal elements may move within their own groups, so their relative order can eventually be disturbed.

So Shell (Hill) sort is an unstable sorting algorithm.

Five, simple selection sort:

1. Basic Ideas:

In the set of numbers to be sorted, select the smallest number and swap it with the number in the first position; then find the smallest of the remaining numbers and swap it with the number in the second position, and so on, until the second-to-last number is compared with the last one.

2. Implementation process:

  • Original array values:

    32, 43, 23, 13, 5

  • First round of comparison: Iterate through the sequence, putting the smallest number first, and get the result:

    5, 32, 43, 23, 13

  • Second round of comparison: Iterate over the rest of the sequence, putting the smallest number first, and get the result:

    5, 13, 32, 43, 23

  • Third round of comparison: Iterate through the rest of the sequence, putting the smallest number first, and get the result:

    5, 13, 23, 32, 43

  • Fourth round of comparison: Iterate through the rest of the sequence, putting the smallest number first, and get the result

    5, 13, 23, 32, 43

  • Round 5: Iterate through the rest of the sequence, putting the smallest number first, and get the result

    5, 13, 23, 32, 43

The realization process is shown in the figure:

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Simple selection sort
 * Requirement analysis:
 * Traverse the whole sequence and put the smallest number first.
 * Traverse the remaining sequence and put its smallest number first.
 * Repeat the second step until only one number remains.
 */
public class Demo4 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before simple selection sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
        int length = a.length;
        for (int i = 0; i < length; i++) {          // Number of passes
            int key = a[i];
            int position = i;
            for (int j = i + 1; j < length; j++) {  // Find the smallest value and its position in the rest of the array
                if (a[j] < key) {
                    key = a[j];
                    position = j;
                }
            }
            a[position] = a[i];                     // Swap the smallest value into position i
            a[i] = key;
        }
        System.out.println("\nAfter simple selection sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
    }
}

Effect:

4. Analysis:

Simple selection sort is an unstable sort.

Time complexity: T(n) = O(n²).

5. Stability Summary:

Selection sort picks, for each position, the smallest of the elements that are still unsorted: the smallest for the first position, the next smallest of the remaining elements for the second position,

and so on, up to the (n-1)-th element; the n-th element does not need to be selected, because only the largest element is left.

So in one selection pass, if the chosen smaller element lies behind another element that is equal to the element currently in the target position, then after the swap the relative order of those equal elements has obviously changed.

For example, in the sequence 5, 8, 5, 2, 9, the first pass swaps the first 5 with 2, so the relative order of the two 5's in the original sequence is broken.

So selection sort is not a stable sorting algorithm.

Six, heap sort:

1. Basic Ideas:

Heap sort is a tree selection sort, which is an effective improvement over direct selection sort.

Definition of a heap: a sequence of n elements (h1, h2, …, hn) is called a heap if and only if (hi >= h2i and hi >= h2i+1) or (hi <= h2i and hi <= h2i+1) for i = 1, 2, …, n/2. Only heaps satisfying the former condition are discussed here. From the definition, the top element (that is, the first element) must be the largest item; this is the big top heap. A complete binary tree represents the structure of a heap intuitively: the top of the heap is the root, and the rest form the left and right subtrees.
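The heap property in this definition can be checked mechanically. Below is a minimal sketch (class and method names are illustrative) using 0-based array indexing, where the children of index i are 2i + 1 and 2i + 2, which is also how the implementation further down indexes the array:

public class HeapCheck {
    // Returns true if a[0..n-1] satisfies the big top (max) heap property.
    static boolean isMaxHeap(int[] a, int n) {
        for (int i = 0; i <= (n - 2) / 2; i++) {                     // only internal nodes have children
            if (2 * i + 1 < n && a[i] < a[2 * i + 1]) return false;  // left child is larger
            if (2 * i + 2 < n && a[i] < a[2 * i + 2]) return false;  // right child is larger
        }
        return true;
    }

    public static void main(String[] args) {
        // 43, 32, 23, 13, 5 is one possible big top heap built from the walkthrough values below
        System.out.println(isMaxHeap(new int[]{43, 32, 23, 13, 5}, 5));  // true
        System.out.println(isMaxHeap(new int[]{32, 43, 23, 13, 5}, 5));  // false: the child 43 is larger than its parent 32
    }
}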

Idea: first treat the sequence to be sorted as a complete binary tree stored sequentially, and adjust the storage order so that it becomes a heap, whose root node is the largest element. Then swap the root node with the last node of the heap and rebuild the heap over the remaining n-1 numbers. Repeat this until only two nodes remain, swap them, and the result is an ordered sequence of n nodes. As this description shows, heap sort needs two processes: building the heap, and swapping the top element with the last element of the heap. Heap sort therefore consists of two functions: a sift-down function that builds the heap, and a function that calls it repeatedly to perform the sort.

2. Implementation process:

  • Original array values:

    32, 43, 23, 13, 5

  • Build the big top heap with the array values (the top element (i.e., the first element) must be the largest item) and get the following result:

  • Swap the root node with the last node, and then disconnect the last node.

  • Repeat steps 1 and 2 until all nodes are disconnected.

    Rebuild the Big Top heap:

    The root node swaps with the last node, and then disconnects the last node

  • Repeat steps 1 and 2 until all nodes are disconnected.

    We get 5, 13, 23, 32, 43

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Heap sort
 * Requirement analysis:
 * Build the sequence into a big top heap.
 * Swap the root node with the last node, then disconnect the last node.
 * Repeat steps 1 and 2 until all nodes are disconnected.
 */
public class Demo6 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before heap sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
        int arrayLength = a.length;
        // In each pass, rebuild the heap over the unsorted part and then swap
        for (int i = 0; i < arrayLength - 1; i++) {
            // Build the heap over a[0..arrayLength-1-i]
            buildMaxHeap(a, arrayLength - 1 - i);
            // Swap the top of the heap with the last element of the heap
            swap(a, 0, arrayLength - 1 - i);
        }

        System.out.println("\nAfter heap sort:");
        for (int i : a) {
            System.out.print(i + "\t");
        }
    }

    /** Swap two elements */
    private static void swap(int[] data, int i, int j) {
        int tmp = data[i];
        data[i] = data[j];
        data[j] = tmp;
    }

    /** Build a big top heap from data[0] to data[lastIndex] */
    private static void buildMaxHeap(int[] data, int lastIndex) {
        // Start from the parent of the node at lastIndex
        for (int i = (lastIndex - 1) / 2; i >= 0; i--) {
            // k is the node currently being sifted down
            int k = i;
            // While the current node k has a child
            while (k * 2 + 1 <= lastIndex) {
                // Index of the left child of node k
                int biggerIndex = 2 * k + 1;
                // If biggerIndex is less than lastIndex, the right child (biggerIndex + 1) also exists
                if (biggerIndex < lastIndex) {
                    // If the right child is larger, biggerIndex records the index of the larger child
                    if (data[biggerIndex] < data[biggerIndex + 1]) {
                        biggerIndex++;
                    }
                }
                // If the value of node k is less than the value of its larger child
                if (data[k] < data[biggerIndex]) {
                    // Swap them
                    swap(data, k, biggerIndex);
                    // Continue sifting down from biggerIndex so that k stays greater than both of its children
                    k = biggerIndex;
                } else {
                    break;
                }
            }
        }
    }
}

Effect:

4. Analysis:

Heap sort is also an unstable sorting algorithm.

Heap sort is better than simple selection sort for the following reasons:

In direct selection sort, selecting the record with the smallest key from R[1..n] takes n-1 comparisons, and selecting the smallest from R[2..n] takes another n-2 comparisons. Many of those later n-2 comparisons repeat comparisons already made in the previous pass, but because the earlier results were not kept, the work is done again in the next pass.

Heap sort keeps part of the comparison results in the tree structure, which reduces the total number of comparisons.

The worst-case time complexity of heap sort is O(nlogn), and its average performance is close to the worst case. Because building the initial heap itself requires a fair number of comparisons, heap sort is not suitable for files with few records.

5. Stability Summary:

We know that in the heap structure the children of node i are nodes 2i and 2i+1. A big top heap requires the parent to be greater than or equal to both of its children, and a small top heap requires the parent to be less than or equal to both of its children.

In a sequence of length n, heap sort starts from the node at position n/2 and selects the largest (big top heap) or smallest (small top heap) among it and its children. Choosing among these three elements does not, of course, break stability.

But when the parents at positions n/2-1, n/2-2, …, 1 select elements, stability can be broken.

It is possible that the parent at position n/2 swaps a later element forward, while the parent at position n/2-1 does not swap an identical later element; then the relative order of those two identical elements is broken.

So heap sort is not a stable sort algorithm.

Seven, bubble sort:

1. Basic Ideas:

In the set of numbers to be sorted, adjacent numbers are compared and adjusted in order over the part that is not yet sorted, so that larger numbers sink toward the end and smaller ones rise toward the front. Whenever two adjacent numbers are found to be in the opposite of the required order, they are swapped.

2. Implementation process:

  • Original array values:

    32, 43, 23, 13, 5

  • First round of comparison: Compare all elements in the sequence in pairs, putting the largest one at the end, and get the result:

    32, 23, 13, 5, 43

    As shown in the figure:

  • The second round of comparison: Compare all elements in the remaining sequence in pairs, and put the largest one at the end to get the result:

    23, 13, 5, 32, 43

    As shown in the figure:

  • The third round of comparison: Compare all elements in the remaining sequence in pairs, and put the largest one at the end to get the result:

    13, 5, 23, 32, 43

As shown in the figure:

  • The fourth round of comparison: Compare all elements in the remaining sequence in pairs, and put the largest one at the end to get the result:

    5, 13, 23, 32, 43

    As shown in the figure:

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Bubble sort
 * Requirement analysis:
 * Set the number of passes.
 * Set the positions at which each pass starts and stops comparing.
 * Compare adjacent numbers in pairs, putting the smaller one in front.
 * Repeat until all passes are done.
 */
public class Demo7 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before bubble sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
        int temp;
        for (int i = 0; i < a.length; i++) {             // Number of passes
            for (int j = 0; j < a.length - i - 1; j++) { // Compare adjacent elements in the unsorted part
                if (a[j] > a[j + 1]) {                   // Swap if they are out of order
                    temp = a[j];
                    a[j] = a[j + 1];
                    a[j + 1] = temp;
                }
            }
        }
        System.out.println("\nAfter bubble sort");
        for (int i : a) {
            System.out.print(i + "\t");
        }
    }
}

Effect:

4. Analysis:

Bubble sort is a stable sorting method.

  • If the file is in ascending order to begin with, one pass completes the sort; that pass makes n-1 key comparisons and moves no records, so the time complexity is O(n).
  • If the file is initially in reverse order, n-1 passes are needed; the i-th pass makes n-i key comparisons, and every comparison that leads to a swap moves records three times, so the maximum numbers of comparisons and moves are reached: O(n²).
  • The average time complexity of bubble sort is O(n²).

5. Stability Summary:

Bubble sort moves small elements forward (or large elements backward). The comparison is always between two adjacent elements, and a swap also happens only between those two.

So if two adjacent elements are equal, there is no reason to swap them, and they are not swapped.

If two equal elements are not adjacent, then even if they become adjacent through earlier swaps of other pairs, they will still not be swapped with each other, so the relative order of equal elements does not change.

So bubble sort is a stable sort algorithm.

Eight, quick sort:

1. Basic Ideas:

Select a base (pivot) element, usually the first or the last element. In one scanning pass, split the sequence into two parts: one with elements smaller than the pivot and one with elements greater than or equal to the pivot. The pivot is then in its correct sorted position, and the two parts are sorted recursively in the same way.

2. Implementation process:

  • Original array values:

    32, 43, 23, 13, 5

  • The first round of comparison: the first number is chosen as the pivot P; numbers less than P are placed on its left and numbers greater than P on its right. The result is:

    23, 13, 5, 32, 43

As shown in the figure:

  • The second round of comparison: recursively, the numbers to the left and right of P are carried out according to the first step until they cannot be recursed, and the results are as follows:

    13, 5, 23, 32, 43

    As shown in the figure:

  • The third round of comparison: recursively, the numbers to the left and right of P are carried out according to the first step until they cannot be recursed, and the results are as follows:

    5, 13, 23, 32, 43

    As shown in the figure:

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Quicksort
 * Requirement analysis:
 * Select the first number as the pivot P, put the numbers less than P on its left and the numbers greater than P on its right.
 * Recursively apply the first step to the numbers on the left and right of P until no further recursion is possible.
 */
public class Demo8 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1};  // Define the array to be sorted
        System.out.println("Before quicksort");
        for (int ia : a) {
            System.out.print(ia + "\t");
        }
        // Quicksort
        quickSort(a, 0, a.length - 1);
        System.out.println("\nAfter quicksort");
        for (int ia : a) {
            System.out.print(ia + "\t");
        }
    }

    public static void quickSort(int[] numbers, int start, int end) {
        if (start < end) {
            int base = numbers[start]; // The chosen pivot value (the first value is the pivot)
            int temp;                  // Temporary value used for swapping
            int i = start, j = end;
            do {
                while ((numbers[i] < base) && (i < end))
                    i++;
                while ((numbers[j] > base) && (j > start))
                    j--;
                if (i <= j) {
                    temp = numbers[i];
                    numbers[i] = numbers[j];
                    numbers[j] = temp;
                    i++;
                    j--;
                }
            } while (i <= j);
            if (start < j)
                quickSort(numbers, start, j);
            if (end > i)
                quickSort(numbers, i, end);
        }
    }
}

Effect:

4. Analysis:

Quicksort is an unstable sort.

The time complexity of quicksort is O(nlogn).

Quicksort performs well when n is large, but poorly when the sequence is already basically ordered (with this pivot choice it degrades toward O(n²)).

5. Stability Summary:

Quicksort scans in two directions: the left index i moves to the right while a[i] <= a[center_index], where center_index is the array index of the pivot element, usually the 0th element of the array.

The right index j moves to the left while a[j] > a[center_index].

When i and j both stop and i <= j, a[i] and a[j] are swapped, and the process repeats until i > j. Then a[j] and a[center_index] are swapped, which completes one partition.

When the pivot element is swapped with a[j], the relative order of earlier equal elements can easily be disturbed. For example, in the sequence 5, 3, 3, 4, 3, 8, 9, 10, 11,

swapping the pivot 5 with the 3 in the fifth position (counting positions from 1) disturbs the relative order of the elements equal to 3.

So quicksort is an unstable sorting algorithm; the instability is introduced at the moment the pivot is swapped with a[j].

Nine, merge sort:

1. Basic Ideas:

Merge sort merges two (or more) ordered lists into one new ordered list: the sequence to be sorted is divided into several subsequences, each of which is ordered, and the ordered subsequences are then merged into one fully ordered sequence.

2. Implementation process:

  • Original array values:

    32, 43, 23, 13, 5

  • The first round of comparison: two adjacent numbers are selected to form an ordered sequence, and the results are as follows:

    32, 43, 13, 23, 5

    As shown in the figure:

  • The second round of comparison: Two adjacent ordered sequences are selected to form an ordered sequence, and the results are as follows:

    13, 23, 32, 43, 5

    As shown in the figure:

  • The third round of comparison: Two adjacent ordered sequences are selected to form an ordered sequence, and the results are as follows:

    5, 13, 23, 32, 43

    As shown in the figure:

3. Implementation code:

package cn.hz.demo2;

/**
 * @author hz
 * @version 1.0
 * Merge sort
 * Requirement analysis:
 * Select two adjacent numbers to form an ordered sequence.
 * Select two adjacent ordered sequences and merge them into one ordered sequence.
 * Repeat the second step until everything forms one ordered sequence.
 */
public class Demo9 {
    public static void main(String[] args) {
        int[] a = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1, 8};
        System.out.println("Before merge sort:");
        for (int i = 0; i < a.length; i++) {
            System.out.print(a[i] + " ");
        }
        // Merge sort
        mergeSort(a, 0, a.length - 1);
        System.out.println();
        System.out.println("After sorting:");
        for (int i = 0; i < a.length; i++) {
            System.out.print(a[i] + " ");
        }
    }

    private static void mergeSort(int[] a, int left, int right) {
        if (left < right) {
            int middle = (left + right) / 2;
            // Recurse on the left half
            mergeSort(a, left, middle);
            // Recurse on the right half
            mergeSort(a, middle + 1, right);
            // Merge the two halves
            merge(a, left, middle, right);
        }
    }

    private static void merge(int[] a, int left, int middle, int right) {
        int[] tmpArr = new int[a.length];
        int mid = middle + 1;   // Start position of the right half
        int tmp = left;
        int third = left;
        while (left <= middle && mid <= right) {
            // Take the smaller value from the two halves and put it into the temporary array
            if (a[left] <= a[mid]) {
                tmpArr[third++] = a[left++];
            } else {
                tmpArr[third++] = a[mid++];
            }
        }
        // Put whatever is left over into the temporary array
        while (left <= middle) {
            tmpArr[third++] = a[left++];
        }
        while (mid <= right) {
            tmpArr[third++] = a[mid++];
        }
        // Copy the temporary array back into the original array
        while (tmp <= right) {
            a[tmp] = tmpArr[tmp++];
        }
    }
}

Effect:

4. Analysis:

Merge sort is a stable sort method.

Merge sort is O(nlogn).

Merge sort is second in speed only to quicksort. It is a stable sorting algorithm and is generally used for sequences that are disordered overall but whose sub-sections are relatively ordered.

5. Stability Summary:

Merge sort recursively splits the sequence into short sequences; the recursion bottoms out at sequences of one element (directly ordered) or two elements (one comparison and, if necessary, one swap).

Then, the sequence of ordered segments is combined into an ordered long sequence, and the combination continues until the original sequence is all sorted.

You can see that with 1 or 2 elements, 1 element will not be swapped, and if 2 elements are equal in size, no one will swap them on purpose, and this will not break the stability.

So, in the process of merging short ordered sequences, is stability destroyed?

No, during the merge process we can ensure that if two current elements are equal, we store the elements in the preceding sequence in front of the resulting sequence, thus guaranteeing stability.

So merge sort is also a stable sort algorithm.

Ten, radix sort:

1. Basic Ideas:

All values to be compared (positive integers) are treated as having the same number of digits, with shorter numbers padded with leading zeros. The numbers are then sorted digit by digit, starting from the lowest digit. After sorting from the lowest digit up to the highest, the sequence is ordered.
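The core operation is extracting the digit currently being sorted on. Below is a small sketch of that arithmetic (class and method names are illustrative; the demo further down expresses the same thing with Math.pow):

public class DigitOfNumber {
    // d-th decimal digit of x, counting from 0 at the units place:
    // digit(135, 0) = 5, digit(135, 1) = 3, digit(135, 2) = 1, digit(135, 3) = 0
    static int digit(int x, int d) {
        for (int i = 0; i < d; i++) {
            x /= 10;
        }
        return x % 10;
    }

    public static void main(String[] args) {
        System.out.println(digit(135, 0) + " " + digit(135, 1) + " " + digit(135, 2));  // 5 3 1
    }
}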

2. Implementation process:

This sorting method is a bit more involved, so a different set of array values is used to explain it:

  • Original array values:

    135, 242, 192, 93, 345, 11, 24, 19

  • The first round of comparison: take out the units digit of all the numbers and sort them according to the units digit to form a sequence, and the results are as follows:

    11, 242, 192, 93, 24, 135, 345, 19

    As shown in the figure:

  • The second round of comparison: take out the ten digits of all the newly formed numbers and sort them according to the ten digits to form a sequence, and the results are as follows:

    11, 19, 24, 135, 242, 345, 192, 93

    As shown in the figure:

  • The third round of comparison: take out the hundreds of all the newly formed numbers and sort them according to the hundreds to form a sequence, and the results are as follows:

    11, 19, 24, 93, 135, 192, 242, 345

    As shown in the figure:

3. Implementation code:

package cn.hz.demo2;

import java.util.ArrayList;
import java.util.List;

/**
 * @author hz
 * @version 1.0
 * Radix sort
 * Requirement analysis:
 * Take the units digit of every number and sort the numbers by that digit to form a sequence.
 * Take the tens digit of every number in the new sequence and sort by that digit, and so on for the higher digits.
 */
public class Demo10 {
    public static void main(String[] args) {
        int[] array = {49, 38, 65, 97, 76, 13, 27, 49, 78, 34, 12, 64, 1, 8};
        System.out.println("Before radix sort:");
        for (int i = 0; i < array.length; i++) {
            System.out.print(array[i] + " ");
        }
        // Radix sort
        // First determine the number of passes: find the maximum value;
        int max = array[0];
        for (int i = 1; i < array.length; i++) {
            if (array[i] > max) {
                max = array[i];
            }
        }
        int time = 0;
        // The number of digits of the maximum value is the number of passes;
        while (max > 0) {
            max /= 10;
            time++;
        }
        // Create 10 queues, one per digit value;
        List<ArrayList<Integer>> queue = new ArrayList<ArrayList<Integer>>();
        for (int i = 0; i < 10; i++) {
            queue.add(new ArrayList<Integer>());
        }
        // Perform 'time' rounds of distribution and collection;
        for (int i = 0; i < time; i++) {
            // Distribute the array elements into the queues;
            for (int j = 0; j < array.length; j++) {
                // Get the (i + 1)-th digit of the number;
                int x = array[j] % (int) Math.pow(10, i + 1) / (int) Math.pow(10, i);
                queue.get(x).add(array[j]);
            }
            int count = 0; // Element counter;
            // Collect the queue elements back into the array;
            for (int k = 0; k < 10; k++) {
                ArrayList<Integer> queueK = queue.get(k);
                while (queueK.size() > 0) {
                    array[count] = queueK.get(0);
                    queueK.remove(0);
                    count++;
                }
            }
        }

        System.out.println("\nAfter radix sort:");
        for (int i = 0; i < array.length; i++) {
            System.out.print(array[i] + " ");
        }
    }
}

Effect:

4. Analysis:

Radix sort is a stable sorting algorithm.

The time complexity of radix sort is O(d(n + r)), where d is the number of digits and r is the radix.
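For example, for the demo array used above (n = 14 values whose largest key, 97, has d = 2 digits, with decimal radix r = 10), this is on the order of 2 × (14 + 10) = 48 distribute-and-collect steps.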

5. Stability Summary:

Radix sort first sorts by the lowest digit and collects the results, then sorts by the next higher digit and collects again, and so on, up to the highest digit.

Sometimes the attributes to sort by have a priority order: sort first by the low-priority attribute, then by the high-priority one. The final order is then: higher priority first, and records with equal high priority are ordered by the lower priority.

Radix sort distributes and collects digit by digit without reordering equal keys, so it is a stable sorting algorithm.

Conclusion:

1. Stability:

  • Stable: bubble sort, insert sort, merge sort and radix sort
  • Unstable: select sort, quicksort, Hill sort, heap sort

2. Average time complexity:

O(n^2): direct insertion sort, simple selection sort, bubble sort.

At small data sizes (under roughly 90,000 elements), direct insertion sort and simple selection sort perform about the same. When the data grows large, bubble sort has the highest time cost. The O(n^2) algorithms mostly work by comparing neighbouring elements and are mostly stable (simple selection sort is the exception).
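Below is a rough micro-benchmark sketch of the three O(n^2) sorts (the class name, array size, seed and single-run timing are arbitrary choices, so treat the numbers only as an order-of-magnitude comparison, not a rigorous measurement):

import java.util.Arrays;
import java.util.Random;
import java.util.function.Consumer;

public class QuadraticSortBenchmark {
    public static void main(String[] args) {
        int n = 20_000;
        int[] base = new Random(42).ints(n, 0, 1_000_000).toArray();
        time("insertion", base, QuadraticSortBenchmark::insertionSort);
        time("selection", base, QuadraticSortBenchmark::selectionSort);
        time("bubble   ", base, QuadraticSortBenchmark::bubbleSort);
    }

    static void time(String name, int[] base, Consumer<int[]> sort) {
        int[] a = Arrays.copyOf(base, base.length);   // each sort gets its own copy of the same data
        long start = System.nanoTime();
        sort.accept(a);
        System.out.printf("%s sort: %d ms%n", name, (System.nanoTime() - start) / 1_000_000);
    }

    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int v = a[i], j = i - 1;
            while (j >= 0 && a[j] > v) { a[j + 1] = a[j]; j--; }
            a[j + 1] = v;
        }
    }

    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) if (a[j] < a[min]) min = j;
            int t = a[i]; a[i] = a[min]; a[min] = t;
        }
    }

    static void bubbleSort(int[] a) {
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a.length - i - 1; j++)
                if (a[j] > a[j + 1]) { int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t; }
    }
}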

O(nlogn): Quicksort, merge sort, Hill sort, heap sort.

Among these, quicksort is the fastest in practice, followed by merge sort and Hill sort; heap sort is effective when the amount of data is large.

3. Selection of sorting algorithm:

Small scale:

  • If the sequence to be sorted is already basically ordered, direct insertion sort is a good choice.
  • If stability is not required, use simple selection sort; if stability is required, use insertion sort or bubble sort.

Data size is not very large:

  • If extra memory can be used, the sequence is disordered and stability is not required, use quicksort; it needs O(log n) extra space.
  • If the sequence itself may be partly ordered, stability is required and space permits, use merge sort.

The data scale is huge:

  • If stability is desired, merge sort can be considered.
  • There is no requirement for stability, heap sort is preferred

If the sequence is initially ordered (or nearly so), direct insertion sort and bubble sort are suitable.