Eight Sorting Algorithms in Java

One: Definition

Sorting: arranging a sequence of objects in order according to a keyword (key)

Two: Bubble sort

Bubble Sort was the first sorting algorithm I touched on

Basic idea:

Bubble sort is a simple sorting algorithm. It repeatedly steps through the sequence to be sorted, compares two adjacent elements at a time, and swaps them if they are in the wrong order. The passes over the sequence are repeated until no more swaps are needed, which means the sequence is sorted.

Algorithm description

1. Compare adjacent elements and swap them if the first one is larger than the second

2. Do the same for each pair of adjacent elements, starting with the first pair and ending with the last pair. And when we do that, the last element is going to be the largest number.

3. Repeat for all elements except the last one.

4. Keep repeating the above steps for fewer and fewer elements each time until there are no pairs of numbers to compare.

Code implementation

public static void BubbleSort(int[] numbers) {
    for (int i = 0; i < numbers.length - 1; i++) {
        for (int j = 0; j < numbers.length - 1 - i; j++) {
            if (numbers[j] > numbers[j + 1]) {
                int temp = numbers[j];
                numbers[j] = numbers[j + 1];
                numbers[j + 1] = temp;
            }
        }
    }
}

// Usage:
int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
BubbleSort(n);
System.out.println(Arrays.toString(n));
// Output:
// [60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Bubble sort can be optimized. The data may already be sorted before the final passes, yet the algorithm keeps scanning. We can set a flag wherever a swap happens: if a whole pass performs no swap, the data is already ordered and we can stop early.

public static void BubbleSort(int[] numbers) {
    for (int i = 0; i < numbers.length - 1; i++) {
        int flag = 0;
        for (int j = 0; j < numbers.length - 1 - i; j++) {
            if (numbers[j] > numbers[j + 1]) {
                int temp = numbers[j];
                numbers[j] = numbers[j + 1];
                numbers[j + 1] = temp;
                flag = 1;
            }
        }
        if (flag == 0) {
            // no elements were swapped in this pass: already sorted
            return;
        }
    }
}

Complexity analysis

Bubble sort is the easiest sort to implement. In the worst case every comparison requires a swap, taking nearly n²/2 comparisons and swaps, so the time complexity is O(n²). In the best case a single pass of the inner loop finds the data already in order, the flag check exits the loop, and the time complexity is O(n). On average, the time complexity is O(n²). Since bubble sort needs extra memory only for the temp variable used in swapping, the space complexity is a constant O(1).

Three: quicksort

Quicksort is a sorting algorithm developed by Tony Hoare. On average, sorting n items takes O(n log n) comparisons. In the worst case O(n²) comparisons are required, but this is uncommon. In practice, quicksort is usually significantly faster than other O(n log n) algorithms because its inner loop can be implemented very efficiently on most architectures.

The basic idea

The basic idea of quicksort: "dig a pit, fill the pit" (pit-filling partition) plus divide and conquer.


Quicksort is another typical application of divide and conquer among the sorting algorithms. Essentially, quicksort can be seen as a recursive divide-and-conquer refinement of bubble sort.

The name quicksort is blunt: the moment you hear it, you know what it is for. It is fast and efficient, one of the fastest sorting algorithms for large data sets. Even though its worst case is O(n²), it performs well in practice and in most cases beats other sorting algorithms that average O(n log n).

Algorithm description

Quicksort uses a divide-and-conquer strategy to divide a list into two sub-lists. Steps as follows:

1. Pick one element from the sequence and call it the "pivot".

2. Rearrange the sequence so that all elements smaller than the pivot are placed in front of it and all elements larger than the pivot are placed behind it. When this partition ends, the pivot sits between the two parts of the sequence. This is called the partition operation.

3. Recursively sort the subsequence of elements smaller than the pivot and the subsequence of elements larger than the pivot.

At the bottom of the recursion the sequence has size zero or one, which means it is sorted. The algorithm is guaranteed to terminate, because each iteration places at least one element in its final position.

Code implementation

public static void quickSort(int[] a, int low, int high) {
    if (low >= high) {
        return;
    }
    int left = low;      // start position; this index moves
    int right = high;
    int pivot = a[left]; // dig the first pit at a[left] (e.g. 60 below)
    while (left < right) {
        while (left < right && a[right] >= pivot) right--;
        a[left] = a[right];  // fill the left pit; a new pit opens at right
        while (left < right && a[left] <= pivot) left++;
        a[right] = a[left];  // fill the right pit; a new pit opens at left
    }
    a[left] = pivot; // put the pivot into the last pit
    quickSort(a, low, left - 1);
    quickSort(a, left + 1, high);
}

// Usage:
int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
quickSort(n, 0, n.length - 1);
System.out.println(Arrays.toString(n));
// Output:
// [60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Above is the recursive version of quicksort: place the pivot in its proper position, then recursively continue to quicksort the two partitions. So how does a non-recursive version of quicksort work? Since recursion is essentially a stack, we can use an explicit stack to save the intermediate variables and achieve a non-recursive implementation. Here the intermediate variables are the low and high bounds of the left and right parts produced by the partition function; we only need to push those two pairs of bounds.

public static void QuickSortByStack(int[] a) {
    Stack<Integer> stack = new Stack<Integer>();
    // push the bounds of the whole array
    stack.push(0);
    stack.push(a.length - 1);
    while (!stack.isEmpty()) {
        int high = stack.pop();
        int low = stack.pop();
        int pivotIndex = partition(a, low, high);
        // save the bounds of the two partitions
        if (pivotIndex > low) {
            stack.push(low);
            stack.push(pivotIndex - 1);
        }
        if (pivotIndex >= 0 && pivotIndex < high) {
            stack.push(pivotIndex + 1);
            stack.push(high);
        }
    }
}

private static int partition(int[] a, int low, int high) {
    if (low >= high) return -1;
    int left = low;
    int right = high;
    int pivot = a[left];
    while (left < right) {
        while (left < right && a[right] >= pivot) right--;
        a[left] = a[right];
        while (left < right && a[left] <= pivot) left++;
        a[right] = a[left];
    }
    a[left] = pivot;
    return left;
}

// Usage:
int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
QuickSortByStack(n);
System.out.println(Arrays.toString(n));
// Output:
// [60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Bubble sort and quicksort above both belong to the family of exchange sorts.

Four: direct insertion sort

The basic idea

The way people usually sort a hand of playing cards is to take the cards one by one, inserting each card into its proper place among the cards that are already in order. In a computer implementation, to make room for the inserted element, we must shift all the larger elements one position to the right before inserting.

Algorithm description

In general, insertion sort is implemented in place on the array. The specific algorithm is described as follows:

1. Start with the first element, which can be considered already sorted

2. Take the next element and scan backwards (from right to left) through the already-sorted part of the sequence (i.e., start the process from the second element of the array)

3. If the examined (sorted) element is larger than the new element, move it one position to the right

4. Repeat step 3 until you find the position where the sorted element is less than or equal to the new element

5. Insert the new element at that position

6. Repeat steps 2 to 5

Code implementation

public static void InsertSort(int[] numbers) {
    for (int i = 0; i < numbers.length - 1; i++) {
        for (int j = i + 1; j > 0; j--) {
            if (numbers[j] < numbers[j - 1]) {
                // swap backwards until the new element reaches its place
                int temp = numbers[j];
                numbers[j] = numbers[j - 1];
                numbers[j - 1] = temp;
            } else {
                break; // everything to the left is already sorted
            }
        }
    }
}

A variant of insertion sort is binary insertion sort, which uses binary search to locate the insertion point:

public static void binaryInsertSort(int[] numbers) {
    if (numbers != null && numbers.length > 1) {
        for (int i = 1; i < numbers.length; i++) {
            int left = 0;
            int right = i - 1;
            int mid;
            int temp = numbers[i];
            if (temp < numbers[right]) {
                // binary search for the insertion point in numbers[0..i-1]
                while (left <= right) {
                    mid = (left + right) / 2;
                    if (numbers[mid] < temp) {
                        left = mid + 1;
                    } else if (numbers[mid] > temp) {
                        right = mid - 1;
                    } else {
                        left = mid + 1; // equal keys: insert behind, keeping the sort stable
                    }
                }
                // shift elements right to open the slot, then insert
                for (int j = i; j > left; j--) {
                    numbers[j] = numbers[j - 1];
                }
                numbers[left] = temp;
            }
        }
    }
}

// Usage:
int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
binaryInsertSort(n);
System.out.println(Arrays.toString(n));
// Output:
// [60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Five: Shell sort

Shell sort, also known as the diminishing increment sort algorithm, is a more efficient, improved version of insertion sort. Shell sort is an unstable sorting algorithm. It was proposed as an improvement based on the following two properties of insertion sort:

Insertion sort is highly efficient on data that is almost already sorted, approaching linear time. But insertion sort is generally inefficient, because it can move data only one position at a time. Shell sort therefore divides the whole sequence of records into several subsequences and applies direct insertion sort to each of them separately; once the records of the whole sequence are "basically ordered", a direct insertion sort over all the records finishes the job. The basic idea is to group the array to be sorted according to a step size (gap), sort the elements within each group by direct insertion sort, then halve the gap and repeat the operation; when gap = 1, a final direct insertion sort completes the ordering.

You can see that the choice of step size is an important part of Shell sort. Any gap sequence will work as long as the final gap is 1. In general, the simplest choice is to take half the array length as the initial increment and halve it each time until the increment is 1.

Algorithm description

1. Choose an increment sequence t1, t2, …, tk, where ti > tj for i < j, and tk = 1;

2. Sort the sequence k times, once for each of the k increments;

3. In each pass, divide the sequence to be sorted into several subsequences of length m according to the current increment ti, and perform direct insertion sort on each subsequence. Only when the increment factor is 1 is the entire sequence treated as one table, whose length is the length of the whole sequence.

Code implementation

public static void shellSort(int[] arr) {
    for (int gap = arr.length / 2; gap > 0; gap /= 2) {
        for (int i = gap; i < arr.length; i++) {
            int j = i;
            while (j - gap >= 0 && arr[j] < arr[j - gap]) {
                // insertion sort within the gap-separated group
                swap(arr, j, j - gap);
                j -= gap;
            }
        }
    }
}

public static void swap(int[] arr, int a, int b) {
    // arithmetic swap without a temp variable (safe here because a != b)
    arr[a] = arr[a] + arr[b];
    arr[b] = arr[a] - arr[b];
    arr[a] = arr[a] - arr[b];
}

The reason Shell sort is more efficient is that it makes a tradeoff between the size and the order of the subarrays. At the start of sorting, each subarray is very short; towards the end, the subarrays are all partially ordered. Both situations are exactly what insertion sort handles well.

Six: simple selection sort

The basic idea

Selection Sort is a simple and intuitive sorting algorithm. Here’s how it works. First, find the smallest (largest) element in the unsorted sequence and store it at the beginning of the sorted sequence. Then, continue to find the smallest (largest) element in the remaining unsorted elements and place it at the end of the sorted sequence. And so on until all the elements are sorted.

Algorithm description

1. Find the element with the smallest keyword in the unsorted sequence

2. If the smallest element is not the first element in an unsorted sequence, swap it with the first element in an unsorted sequence

3. Repeat steps 1 and 2 until the sorting is finished.

Code implementation

public static void SelectSort(int[] numbers) {
    for (int i = 0; i < numbers.length; i++) {
        int min = i;
        // find the index of the smallest element in the unsorted part
        for (int j = i + 1; j < numbers.length; j++) {
            if (numbers[j] < numbers[min]) {
                min = j;
            }
        }
        // swap only if a smaller element was found
        if (min != i) {
            int temp = numbers[i];
            numbers[i] = numbers[min];
            numbers[min] = temp;
        }
    }
}

// Usage:
int[] n = {14, 60, 38, 5, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
SelectSort(n);
System.out.println(Arrays.toString(n));
// Output:
// [14, 60, 38, 5, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Selection sort lives up to its reputation for simplicity and intuitiveness, and that is also what makes it notoriously slow. In every case, even if the original array is already sorted, it still takes nearly n²/2 comparisons to confirm it. Despite all that work, it is still an unstable sort. The only good news is that it costs no extra memory space.

Seven: Heap sort

A heap is defined as follows: a sequence of n elements {k1, k2, …, kn} is called a heap if and only if it satisfies one of the following relations for i = 1, 2, …, ⌊n/2⌋:

ki ≤ k2i and ki ≤ k2i+1 (a small-top heap), or ki ≥ k2i and ki ≥ k2i+1 (a large-top heap).

Consider the corresponding one-dimensional array as a complete binary tree. The heap property then means that the value of any non-leaf node in the complete binary tree is no less than (or no greater than) the values of its left and right children. From this property we know that the top keyword of a large-top heap must be the largest of all the keywords, and the top keyword of a small-top heap must be the smallest. So we can use a large-top heap for ascending sort and a small-top heap for descending sort.
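To make the index arithmetic concrete, here is a minimal sketch (the class and method names are my own, not from the original): in the 0-based array view of a complete binary tree, the children of node i sit at 2i+1 and 2i+2, which is all we need to check the large-top heap property.

```java
public class HeapCheck {
    // returns true if a[] satisfies the large-top (max-heap) property
    public static boolean isMaxHeap(int[] a) {
        // only non-leaf nodes (indices 0 .. (n-2)/2) have children to check
        for (int i = 0; i <= (a.length - 2) / 2; i++) {
            int l = 2 * i + 1; // left child
            int r = 2 * i + 2; // right child
            if (l < a.length && a[i] < a[l]) return false;
            if (r < a.length && a[i] < a[r]) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isMaxHeap(new int[]{99, 88, 89, 77, 35, 23, 60})); // true
        System.out.println(isMaxHeap(new int[]{60, 38, 89, 14, 7, 23, 5}));  // false: 89 > 60
    }
}
```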

The basic idea

Taking the large-top heap as an example, heap sorting works as follows: build the sequence to be sorted into a heap, take the largest element from the heap top and move it away, adjust the remaining elements back into a heap, take out the new maximum, and repeat until the whole sequence is ordered.

Algorithm description

1. The initial sequence K[1..n] is first built into a large-top heap, then the first element K1 is the largest at this time, and this heap is the initial disorder region.

2. Then exchange the largest record K1 (the heap top, the first element) with the last record Kn of the unordered region, thereby obtaining a new unordered region K[1..n−1] and an ordered region K[n], which satisfy K[1..n−1] ≤ K[n].

3. After exchanging K1 and Kn, the heap top may violate the heap nature, so K[1..n−1] needs to be adjusted to the heap. Step 2 is then repeated until there is only one element in the unordered region.

Code implementation

From the algorithm description, heap sort needs two processes: one builds the heap, the other swaps the heap top with the last element of the heap. So heap sort consists of two functions: a heapify function, and a function that repeatedly calls it to select the largest of the remaining unsorted elements until the array is sorted.

  • Maximum heap adjustment (max_heapify): sift nodes down so that every child is always no larger than its parent
  • Build the maximum heap (Build_Max_Heap): reorder all the data in the array into a heap
  • Heap sort (HeapSort): repeatedly swap the root (the current maximum) to the end of the heap, shrink the heap by one, and redo the maximum heap adjustment
public static void HeapSort(int[] numbers) {
    for (int i = numbers.length - 1; i > 0; i--) {
        max_heapify(numbers, i);
        // swap the root (current maximum) with the last element of the heap
        int temp = numbers[0];
        numbers[0] = numbers[i];
        numbers[i] = temp;
    }
}

/**
 * Heapify the array a[0..n]. Start from the last non-leaf node; there is
 * no need to start from the leaves, because a leaf node by itself already
 * satisfies the heap property.
 *
 * @param a the array
 * @param n the index of the last element in the heap
 */
public static void max_heapify(int[] a, int n) {
    int child;
    for (int i = (n - 1) / 2; i >= 0; i--) {
        child = 2 * i + 1; // left child
        // if the right child exists and is larger than the left, use it
        if (child != n && a[child] < a[child + 1]) {
            child++;
        }
        if (a[i] < a[child]) {
            int temp = a[i];
            a[i] = a[child];
            a[child] = temp;
        }
    }
}

// Usage:
int[] n = {14, 60, 38, 5, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
HeapSort(n);
System.out.println(Arrays.toString(n));
// Output:
// [14, 60, 38, 5, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Heap sort is not suitable for small sequences, because initializing the heap requires many comparisons. At the same time, the repeated swaps between arbitrary indices break the relative order of equal elements, so heap sort is an unstable sort.

Eight: merge sort

Merge Sort is an efficient sorting algorithm based on merge operation, which was first proposed by John von Neumann in 1945. This algorithm is a very typical application of Divide and Conquer, and the recursion of Divide and Conquer at all levels can be done simultaneously.

The basic idea

The merge sort algorithm merges two (or more) ordered lists into one new ordered list: split the sequence to be sorted into several subsequences, each of which is ordered, then merge the ordered subsequences into a globally ordered sequence.

Algorithm description

Merge sort can be implemented in two ways:

  • Top-down recursion
  • Bottom-up iteration

Recursive method (assuming the sequence has n elements):

1. Merge adjacent elements of the sequence in pairs, forming ⌈n/2⌉ sequences; after this pass each sequence contains at most two sorted elements;

2. Merge the above sequences again, forming ⌈n/4⌉ sequences of at most four elements each;

3. Repeat step 2 until all elements are sorted.

Iterative method

1. Allocate a space as large as the two sorted sequences combined, to hold the merged sequence.
2. Set two pointers, one at the start of each of the two sorted sequences.
3. Compare the elements the two pointers point to, copy the smaller one into the merge space, and advance that pointer.
4. Repeat step 3 until one pointer reaches the end of its sequence.
5. Copy the remaining elements of the other sequence directly to the end of the merged sequence.

  • Decomposition: Split a sequence in half at a time
  • Merge: Combine the split sequence segments in pairs. Hence, merge sort is essentially two operations, split + merge
private static int[] aux; // auxiliary array for merging

public static void MergeSort(int[] numbers) {
    aux = new int[numbers.length];
    sort(numbers, 0, numbers.length - 1);
}

public static void sort(int[] a, int low, int high) {
    if (low >= high) {
        return;
    }
    int mid = (low + high) / 2;
    sort(a, low, mid);
    sort(a, mid + 1, high);
    merge(a, low, mid, high);
}

/**
 * First copies all elements of a[low..high] into aux[], then merges them
 * back into a[]. The merge loop (the second for loop) distinguishes four cases:
 * - the left half is exhausted (take from the right)
 * - the right half is exhausted (take from the left)
 * - the current right element is smaller (take from the right)
 * - otherwise (take from the left)
 *
 * @param a
 * @param low
 * @param mid
 * @param high
 */
public static void merge(int[] a, int low, int mid, int high) {
    int i = low, j = mid + 1;
    for (int k = low; k <= high; k++) {
        aux[k] = a[k];
    }
    for (int k = low; k <= high; k++) {
        if (i > mid) {
            a[k] = aux[j++];
        } else if (j > high) {
            a[k] = aux[i++];
        } else if (aux[j] < aux[i]) {
            a[k] = aux[j++];
        } else {
            a[k] = aux[i++];
        }
    }
}


// Usage:
int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
MergeSort(n);
System.out.println(Arrays.toString(n));
// Output:
// [60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

From the perspective of efficiency, merge sort is an outstanding sorting algorithm. If the length of the array is n, splitting the array goes about log n levels deep, and each level performs an ordinary merge of subarrays in O(n) time, so the overall time complexity is O(n log n). On the other hand, the auxiliary storage used during merging must be kept in memory, giving a space complexity of O(n).
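The code above is the top-down recursive version. As a rough sketch of the bottom-up iterative variant mentioned earlier (the class and method names are my own, not from the original), one can merge runs of width 1, 2, 4, … until a single run covers the whole array, with no recursion at all:

```java
import java.util.Arrays;

public class BottomUpMergeSort {
    public static void mergeSortBU(int[] a) {
        int n = a.length;
        int[] aux = new int[n];
        // merge runs of width 1, 2, 4, ... until one run spans the array
        for (int width = 1; width < n; width *= 2) {
            for (int low = 0; low < n - width; low += 2 * width) {
                int mid = low + width - 1;
                int high = Math.min(low + 2 * width - 1, n - 1);
                merge(a, aux, low, mid, high);
            }
        }
    }

    // same four-case merge as the recursive version
    private static void merge(int[] a, int[] aux, int low, int mid, int high) {
        System.arraycopy(a, low, aux, low, high - low + 1);
        int i = low, j = mid + 1;
        for (int k = low; k <= high; k++) {
            if (i > mid)              a[k] = aux[j++];
            else if (j > high)        a[k] = aux[i++];
            else if (aux[j] < aux[i]) a[k] = aux[j++];
            else                      a[k] = aux[i++];
        }
    }

    public static void main(String[] args) {
        int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
        mergeSortBU(n);
        System.out.println(Arrays.toString(n));
        // prints [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]
    }
}
```

The bottom-up form trades the call stack for two nested loops, which can be convenient in environments where deep recursion is undesirable.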

Nine: Radix sort

Radix sort is a non-comparative integer sorting algorithm. Its principle is to cut integers into digits by place value and then process the digits position by position. Since strings (such as names or dates) and floating-point numbers in specific formats can also be mapped to integers, radix sort is not restricted to integers.

The basic idea

It works by padding all the values to be compared (positive integers) to the same digit length, filling the shorter numbers with leading zeros. Then, starting from the lowest digit, it sorts pass by pass. After the passes from the lowest digit up to the highest digit, the sequence becomes an ordered sequence.

There are two ways to sort radix sort by priority from high or low:

  • MSD (Most Significant Digit) sorts starting from the highest, left-most digit. Sort and group by k1: records in the same group share an equal key k1. Then sort each group into subgroups by k2, and keep grouping on the following keys until each subgroup has been sorted on the least significant key kd. Finally join the groups together to produce an ordered sequence. MSD suits sequences with many digits.
  • LSD (Least Significant Digit) sorts starting from the right-most, lowest digit. Sort on kd first, then on kd−1, and repeat until sorting on k1 yields an ordered sequence. LSD suits sequences with few digits.
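The code in this article implements LSD. For contrast, here is a rough MSD sketch for non-negative integers (the class, method names, and structure are my own illustration, not from the original): bucket by the current digit starting at the highest place value, then recurse into each bucket with the next lower place value.

```java
import java.util.Arrays;

public class MsdRadixSort {
    public static void msdSort(int[] arr) {
        if (arr.length <= 1) return;
        int max = Arrays.stream(arr).max().getAsInt();
        int exp = 1;                        // place value of the highest digit
        while (max / exp >= 10) exp *= 10;
        sort(arr, 0, arr.length - 1, exp);
    }

    private static void sort(int[] arr, int low, int high, int exp) {
        if (low >= high || exp == 0) return;
        // allocate: distribute arr[low..high] into ten buckets by current digit
        int[][] buckets = new int[10][high - low + 1];
        int[] len = new int[10];
        for (int i = low; i <= high; i++) {
            int d = (arr[i] / exp) % 10;
            buckets[d][len[d]++] = arr[i];
        }
        // collect: write the buckets back in digit order, recursing into each group
        int k = low;
        for (int d = 0; d < 10; d++) {
            int start = k;
            for (int p = 0; p < len[d]; p++) arr[k++] = buckets[d][p];
            sort(arr, start, k - 1, exp / 10);
        }
    }

    public static void main(String[] args) {
        int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
        msdSort(n);
        System.out.println(Arrays.toString(n));
        // prints [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]
    }
}
```

Note how MSD finishes each group completely before moving on, whereas LSD must run every pass over the whole array.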

Algorithm description

We take LSD as an example, starting from the lowest digit. The specific algorithm is described as follows:

1. Find the largest number in the array to obtain the number of digits;
2. Starting from the lowest digit, take each digit of arr (the original array) to form a radix array;
3. Perform counting sort on the radix array (exploiting the fact that counting sort suits a small range of values).

The code implements radix sort as follows: using the value of each digit, the n elements are "allocated" and "collected" several times until the sequence is sorted.

Allocation: take each element L[i], determine the digit in its ones place, and place it into the bucket whose number matches that digit

Collection: once all the elements of the sequence have been allocated to their buckets, collect the elements from the buckets in order to form a new sequence L[]. Repeat the allocate-and-collect process on the newly formed L[] for the tens digit, the hundreds digit, and so on; the sort ends after the pass on the highest digit of the sequence

public static void RadixSort(int[] arr) {
    if (arr.length <= 1) return;
    // find the maximum value to determine the number of digits
    int max = 0;
    for (int i = 0; i < arr.length; i++) {
        if (max < arr[i]) {
            max = arr[i];
        }
    }
    int maxDigit = 1;
    while (max / 10 > 0) {
        maxDigit++;
        max = max / 10;
    }
    int[][] buckets = new int[10][arr.length];
    int base = 10;
    for (int i = 0; i < maxDigit; i++) {
        int[] bktLen = new int[10];
        // allocate: drop each element into the bucket of its current digit
        for (int j = 0; j < arr.length; j++) {
            int whichBucket = (arr[j] % base) / (base / 10);
            buckets[whichBucket][bktLen[whichBucket]] = arr[j];
            bktLen[whichBucket]++;
        }
        // collect: read the buckets back in order, ready for the next round
        int k = 0;
        for (int b = 0; b < buckets.length; b++) {
            for (int p = 0; p < bktLen[b]; p++) {
                arr[k++] = buckets[b][p];
            }
        }
        System.out.println("Sorting: " + Arrays.toString(arr));
        base *= 10;
    }
}


// Usage:
int[] n = {60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87};
System.out.println(Arrays.toString(n));
RadixSort(n);
System.out.println(Arrays.toString(n));
// Output:
// [60, 38, 5, 14, 7, 23, 89, 77, 88, 4, 35, 45, 67, 99, 87]
// Sorting: [60, 23, 14, 4, 5, 35, 45, 7, 77, 67, 87, 38, 88, 89, 99]
// Sorting: [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]
// [4, 5, 7, 14, 23, 35, 38, 45, 60, 67, 77, 87, 88, 89, 99]

Here d is the number of digits, r is the radix, and n is the number of elements in the original array. Because radix sort involves no comparison operations, its best-case and worst-case time complexity are the same: O(d × (n + r)).

Conclusion: Human knowledge is limited