Hello everyone, I'm a programmer who grinds algorithm problems. Although LeetCode doesn't have that many sorting-related problems, sorting algorithms are often asked as handwritten code in interviews. So let's take a look at the top 10 basic sorting algorithms.

Sorting basics

Stability of sorting algorithm

What is the stability of sorting algorithms?

If the keys of the records to be sorted are all distinct, the sorted result is unique; otherwise, it is not [1].

If the file to be sorted contains multiple records with the same key, and the relative order of those records is unchanged after sorting, the sorting method is stable.

If the relative order of records with the same key can change, the sorting method is unstable.

Stability must hold for all input instances: if there is even one input instance for which the algorithm violates the stability requirement, the algorithm is unstable.
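A quick illustration of stability (a minimal sketch using the JDK's stable object sort; the records here are made up for the example):

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    // two records share the key 5; a stable sort must keep tag "a" before tag "b"
    public static String demo() {
        String[][] records = { {"5", "a"}, {"3", "c"}, {"5", "b"} };
        // Arrays.sort on an Object[] is stable (it uses TimSort)
        Arrays.sort(records, Comparator.comparingInt((String[] r) -> Integer.parseInt(r[0])));
        StringBuilder order = new StringBuilder();
        for (String[] r : records) {
            order.append(r[1]);
        }
        return order.toString(); // tags in sorted order: "cab"
    }
}
```

Because the sort is stable, the two records with key 5 keep their original relative order.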

Classification of sorting algorithms

By whether data is exchanged between internal and external memory during the sorting process, sorting can be roughly divided into two categories: internal sorting and external sorting.

Sorting can be divided into comparison sort and non-comparison sort according to whether the relative order of elements is determined by comparison.

Bubble sort

Bubble sort principle

Let's pick the low-hanging fruit and start with the simplest one.

Bubble sort has a nice name, but it’s also one of the best understood ideas.

The basic idea of bubble sort is to walk through the sequence from one end to the other, compare adjacent elements in pairs, and swap them if they are in the wrong order.

The GIF is as follows (source reference [4]) :

Simple code implementation

First, a very simple implementation: two nested loops, comparing and swapping adjacent elements:

    public void sort(int[] nums) {
        for (int i = 0; i < nums.length - 1; i++) {
            for (int j = 0; j < nums.length - 1 - i; j++) {
                // ascending
                if (nums[j] > nums[j + 1]) {
                    // swap adjacent elements
                    int temp = nums[j];
                    nums[j] = nums[j + 1];
                    nums[j + 1] = temp;
                }
            }
        }
    }

Bubble sort optimization

There is a problem with the above code implementation. What is it?

Even if the array is ordered, it’s going to keep iterating.

So we can use a flag to record whether a pass made any swap; if a pass makes no swaps, the array is already ordered and we can break out of the loop.

    public void sort(int[] nums) {
        for (int i = 0; i < nums.length - 1; i++) {
            // flag: whether any swap happened in this pass
            boolean swapped = false;
            for (int j = 0; j < nums.length - 1 - i; j++) {
                // ascending
                if (nums[j] > nums[j + 1]) {
                    // swap
                    int temp = nums[j];
                    nums[j] = nums[j + 1];
                    nums[j + 1] = temp;
                    swapped = true;
                }
            }
            // if no swap happened, the array is ordered; end the loop
            if (!swapped) {
                break;
            }
        }
    }

Bubble sort performance analysis

Equal elements are never swapped past each other, so bubble sort is stable.

| Algorithm | Best | Worst | Average | Space | Stable? |
| --- | --- | --- | --- | --- | --- |
| Bubble sort | O(n) | O(n²) | O(n²) | O(1) | stable |

Selection sort

Principle of selection sort

Why is selection sort called selection sort? How does it work?

The idea: find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence, then keep finding the smallest (or largest) element among the remaining unsorted elements and append it to the end of the sorted sequence. Repeat until all elements are sorted.

The GIF is as follows (source reference [4]) :

Consider sorting the array [2,5,4,1,3] with selection sort.

  • First, find the smallest element in the array, 1, and swap it with the first element in the array.

  • On the second trip, find the smallest element in the unsorted array, 2, and swap places with the second element in the array.

  • On the third trip, find the smallest element in the unsorted array, 3, and swap places with the third element in the array.

  • On the fourth trip, find the smallest element in the unsorted array, 4, and swap places with the fourth element in the array.

So at this point, our array is sorted.

Selection sort code implementation

The idea of selection sort is very simple, and it's not hard to implement.

    public void sort(int[] nums) {
        for (int i = 0; i < nums.length - 1; i++) {
            // index of the smallest element in the unsorted part
            int min = i;
            for (int j = i + 1; j < nums.length; j++) {
                // find the smallest number
                if (nums[j] < nums[min]) {
                    min = j;
                }
            }
            // swap
            int temp = nums[i];
            nums[i] = nums[min];
            nums[min] = temp;
        }
    }

Selection sort performance analysis

Is selection sort stable?

The answer is unstable: the minimum found in the unsorted part is swapped with the element at the current position, and that swap can jump over equal elements. In [5, 5, 2], for example, the first 5 is swapped behind the second.

| Algorithm | Best | Worst | Average | Space | Stable? |
| --- | --- | --- | --- | --- | --- |
| Selection sort | O(n²) | O(n²) | O(n²) | O(1) | unstable |

Insertion sort

Principle of insertion sort

There's a very vivid metaphor for insertion sort: picking up cards in a card game. The cards arrive in no particular order, and we insert each new card into its proper place in the hand.

The idea is to create a new ordered sequence by inserting an element into an already ordered sequence.

The GIF is as follows (source reference [3]) :

Again, take the array [2,5,4,1,3] :

  • First run: insert element 5 from the unsorted sequence into the proper position of the sorted sequence

  • Second run: insert element 4 into the sorted sequence, traversing the ordered part to find the right place

  • Third run: go ahead and insert the 1 into place

  • Fourth run: go ahead and insert the 3 into place

OK, end of sort.

Insert sort code implementation

Find where to insert the element and move the other elements.

    public void sort(int[] nums) {
        // the unsorted part starts at index 1
        for (int i = 1; i < nums.length; i++) {
            // the element to insert into the sorted part
            int value = nums[i];
            int j;
            for (j = i - 1; j >= 0; j--) {
                // shift larger elements one position to the right
                if (nums[j] > value) {
                    nums[j + 1] = nums[j];
                } else {
                    break;
                }
            }
            // insert the element
            nums[j + 1] = value;
        }
    }

Insert sort performance analysis

Insertion sort only shifts elements that are larger than the element being inserted, so the relative order of equal elements is unchanged, and it is stable.

As we can see from the code, once the right place is found we stop comparing, so the best case (an already sorted array) is O(n).

| Algorithm | Best | Worst | Average | Space | Stable? |
| --- | --- | --- | --- | --- | --- |
| Insertion sort | O(n) | O(n²) | O(n²) | O(1) | stable |

Shell sort

Shell sort principle

Shell sort takes its name from its inventor, Donald Shell. It's an improved version of direct insertion sort.

The idea of Shell sort is to divide the whole record sequence to be sorted into several subsequences and insertion sort each subsequence separately.

We know that direct insertion sort performs poorly on large amounts of very unordered data, so Shell sort first uses large-gap "jumping" comparisons to get the elements roughly in order, and then finishes with a direct insertion sort.

The Shell sort GIF is as follows (source reference [1]) :

Using the array [2,5,6,1,7,9,3,8,4] as an example, let's look at the Shell sort process:

  • The array has 9 elements, so take 9/2=4 as the gap and group the elements whose indices differ by 4

  • Insertion sort within each group produces ordered subsequences

  • Then take 4/2=2 as the gap and group the elements whose indices differ by 2

  • Insertion sort within each group again

  • Gap = 2/2 = 1: finish with an ordinary insertion sort over the remaining, roughly ordered elements

Shell sort code implementation

If you look closely at insertion sort, Shell sort is just direct insertion sort with a step size (gap).

    public void sort(int[] nums) {
        // the gap (index difference)
        int step = nums.length;
        while (step > 0) {
            // halving the gap is one common choice; dividing by 3 also works
            step = step / 2;
            // insertion sort within each group
            for (int i = 0; i < step; i++) {
                // start from the second element of the group
                for (int j = i + step; j < nums.length; j += step) {
                    // the element to insert
                    int value = nums[j];
                    int k;
                    for (k = j - step; k >= 0; k -= step) {
                        if (nums[k] > value) {
                            // shift group elements
                            nums[k + step] = nums[k];
                        } else {
                            break;
                        }
                    }
                    // insert the element
                    nums[k + step] = value;
                }
            }
        }
    }

Shell sort performance analysis

  • Stability analysis

Shell sort is a variation of direct insertion sort, but unlike direct insertion sort it works on groups, so the relative positions of equal elements in different groups may change, which makes it unstable.

  • Time complexity analysis

The time complexity of Shell sort depends on the choice of gap sequence, roughly ranging from O(n^1.3) to O(n²). The sorting algorithms before this were basically O(n²); Shell sort was one of the first algorithms to break through that barrier.

| Algorithm | Time | Space | Stable? |
| --- | --- | --- | --- |
| Shell sort | O(n^1.3) ~ O(n²) | O(1) | unstable |
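Since the complexity depends on the gap sequence, it's worth seeing one alternative. The halving gap used above is only one choice; a sketch using Knuth's 3x+1 sequence (1, 4, 13, 40, ...), which is not from the original text but is a commonly used option:

```java
public class ShellSortKnuth {
    // Shell sort using Knuth's gap sequence instead of halving;
    // one of the gap sequences behind the sub-O(n²) behavior mentioned above
    public static void sort(int[] nums) {
        // grow the gap: 1, 4, 13, 40, ...
        int gap = 1;
        while (gap < nums.length / 3) {
            gap = gap * 3 + 1;
        }
        // shrink the gap back down, gapped-insertion-sorting at each step
        for (; gap > 0; gap /= 3) {
            for (int i = gap; i < nums.length; i++) {
                int value = nums[i];
                int j = i - gap;
                // shift larger group elements to the right
                while (j >= 0 && nums[j] > value) {
                    nums[j + gap] = nums[j];
                    j -= gap;
                }
                nums[j + gap] = value;
            }
        }
    }
}
```

The structure is identical to the version above; only the gap schedule changes.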

Merge sort

Merge sort principle

Merge sort is an efficient sorting algorithm built on the merge operation: combining several ordered sequences into a single new ordered sequence.

Merge sort is a classic use of divide and conquer, so what does divide and conquer mean? A big problem is broken down into smaller problems to be solved.

So merge sort cuts the array in half, recursively keeps cutting until it reaches single elements, and then merges the small ordered arrays back into larger ordered ones.

The GIF is as follows (source reference [4]) :

Let’s look at the merge sort process using the array [2,5,6,1,7,3,8,4] as an example:

The splitting needs no explanation; let's see how the merging works.

Merge sort code implementation

Here we use recursion to implement merge sort:

  • Recursive termination condition

Recursion ends when the left index is no longer less than the right index

  • Recursive return value

The array is sorted in place, so there is no return value

  • Recursive logic

Split the current range into two halves, recursively sort each half, then merge them

The code is as follows:

public class MergeSort {

    public void sortArray(int[] nums) {
        sort(nums, 0, nums.length - 1);
    }

    /**
     * merge sort
     *
     * @param nums
     * @param left
     * @param right
     */
    public void sort(int[] nums, int left, int right) {
        // recursion end condition
        if (left >= right) {
            return;
        }
        int mid = left + (right - left) / 2;
        // recurse into the left half of the current range
        sort(nums, left, mid);
        // recurse into the right half of the current range
        sort(nums, mid + 1, right);
        // merge the results
        merge(nums, left, mid, right);
    }

    /**
     * merge
     *
     * @param arr
     * @param left
     * @param mid
     * @param right
     */
    public void merge(int[] arr, int left, int mid, int right) {
        int[] tempArr = new int[right - left + 1];
        // running indexes into the left and right halves
        int l = left, r = mid + 1;
        int index = 0;
        // move the smaller number into the new array first
        while (l <= mid && r <= right) {
            if (arr[l] <= arr[r]) {
                tempArr[index++] = arr[l++];
            } else {
                tempArr[index++] = arr[r++];
            }
        }
        // move the rest of the left half into the array
        while (l <= mid) {
            tempArr[index++] = arr[l++];
        }
        // move the rest of the right half into the array
        while (r <= right) {
            tempArr[index++] = arr[r++];
        }
        // copy the new array back into the original array
        for (int i = 0; i < tempArr.length; i++) {
            arr[i + left] = tempArr[i];
        }
    }
}

Merge sort performance analysis

  • Time complexity

Each merge pass iterates over the whole range being sorted, which is O(n).

And merge sort keeps halving the array, which gives O(logn) levels.

So merge sort is O(nlogn) overall.

  • Spatial complexity

A temporary array is used to store the merged elements, space complexity O(n).

  • The stability of

Merge sort is a stable sorting method.

The algorithm name Best time complexity Worst time complexity Average time complexity Spatial complexity Is stable
Merge sort O(nlogn) O(nlogn) O(nlogn) O(n) stable

Quick sort

Quicksort principle

Quicksort is the sorting algorithm most frequently asked about in interviews.

Quicksort, like merge sort, is divide-and-conquer:

  • Pick a pivot, usually the leftmost element of the sequence
  • Reorder the sequence so that elements smaller than the pivot end up on its left and elements larger than the pivot on its right; this is called partitioning
  • Recursively apply the same steps to the left and right parts

The quicksort GIF is as follows:

Let’s look at a complete quicksort diagram:

Quick sort code implementation

Single scan quicksort

Pick a number as the pivot, and keep a mark pointing at the right-most index of the "smaller than pivot" region. Then loop over the range: if an element is greater than the pivot, do nothing; if it is smaller, advance mark by one and swap the current element with the element at mark, so everything up to mark holds values smaller than the pivot. When the traversal ends, swap the pivot with the element at mark.

public class QuickSort0 {

    public void sort(int[] nums) {
        quickSort(nums, 0, nums.length - 1);
    }

    public void quickSort(int[] nums, int left, int right) {
        // end condition
        if (left >= right) {
            return;
        }
        // partition
        int partitionIndex = partition(nums, left, right);
        // recurse into the left partition
        quickSort(nums, left, partitionIndex - 1);
        // recurse into the right partition
        quickSort(nums, partitionIndex + 1, right);
    }

    // partition
    public int partition(int[] nums, int left, int right) {
        // pivot value
        int pivot = nums[left];
        // mark starts at the pivot's index
        int mark = left;
        for (int i = left + 1; i <= right; i++) {
            if (nums[i] < pivot) {
                // smaller than the pivot: advance mark and swap
                mark++;
                int temp = nums[mark];
                nums[mark] = nums[i];
                nums[i] = temp;
            }
        }
        // put the pivot into its final place at mark
        nums[left] = nums[mark];
        nums[mark] = pivot;
        return mark;
    }
}

Bilateral scan quicksort

There’s another way of doing bilateral scanning.

Pick a number as the pivot, then scan the array from both ends: first scan from right to left to find an element smaller than the pivot and write it into the slot at the left pointer, then scan from left to right to find an element larger than the pivot and write it into the slot at the right pointer. Repeat until the two pointers meet.

public class QuickSort1 {

    public int[] sort(int[] nums) {
        quickSort(nums, 0, nums.length - 1);
        return nums;
    }

    public void quickSort(int[] nums, int low, int high) {
        if (low < high) {
            int index = partition(nums, low, high);
            quickSort(nums, low, index - 1);
            quickSort(nums, index + 1, high);
        }
    }

    public int partition(int[] nums, int left, int right) {
        // pivot value
        int pivot = nums[left];
        while (left < right) {
            // scan from right to left
            while (left < right && nums[right] >= pivot) {
                right--;
            }
            // found the first element smaller than the pivot
            if (left < right) nums[left] = nums[right];
            // scan from left to right
            while (left < right && nums[left] <= pivot) {
                left++;
            }
            // found the first element larger than the pivot
            if (left < right) nums[right] = nums[left];
        }
        // put the pivot into its final position
        nums[left] = pivot;
        return left;
    }
}

Quicksort performance analysis

  • Time complexity

Quicksort takes the same time as merge sort, O(nlogn), but that's the best case, where each partition splits the array into two halves of roughly equal size.

In the extreme case, for example the ordered sequence [5,4,3,2,1] with pivot 5, n-1 partitions are needed to finish the whole quicksort, and the time complexity degrades to O(n²).

  • Spatial complexity

Quicksort sorts in place and allocates no extra arrays, but its recursion consumes stack space: O(logn) on average and O(n) in the worst case.

  • The stability of

Quicksort is an unstable sorting algorithm because quicksort is compared and swapped by leaps.

| Algorithm | Best | Worst | Average | Space | Stable? |
| --- | --- | --- | --- | --- | --- |
| Quicksort | O(nlogn) | O(n²) | O(nlogn) | O(logn) | unstable |
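Given the worst case on ordered input described above, one common mitigation (not covered in the original text, so treat this as a sketch) is to pick a random pivot and swap it to the left before running the same single-scan partition:

```java
import java.util.Random;

public class RandomPivot {
    private static final Random RANDOM = new Random();

    // single-scan partition, but with a randomly chosen pivot so that
    // already-ordered input no longer triggers the O(n²) worst case
    public static int partition(int[] nums, int left, int right) {
        // choose a random index in [left, right] and move it to the front
        int p = left + RANDOM.nextInt(right - left + 1);
        int tmp = nums[left]; nums[left] = nums[p]; nums[p] = tmp;
        int pivot = nums[left];
        int mark = left;
        for (int i = left + 1; i <= right; i++) {
            if (nums[i] < pivot) {
                // smaller than the pivot: advance mark and swap
                mark++;
                int t = nums[mark]; nums[mark] = nums[i]; nums[i] = t;
            }
        }
        // put the pivot into its final place at mark
        nums[left] = nums[mark];
        nums[mark] = pivot;
        return mark;
    }
}
```

Everything after the initial random swap is the familiar mark-based partition; only the pivot choice changes.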

Heap sort

Principle of heap sort

Remember simple selection sort from earlier? Heap sort is an upgraded version of it.

Before we look at heap sorting, what is a heap?

A heap is classically implemented as a complete binary tree.

So let’s first understand what a complete binary tree is.

Simply put: every level is completely filled except possibly the last, and the missing nodes of the last level can only be on its right side.

Let’s look at a few examples.

I believe that after looking at these examples, it is clear what is a complete binary tree and what is not a complete binary tree.

So what is a heap?

  • It must be a complete binary tree
  • The value of any node must be the maximum (or minimum) of the subtree rooted at it
  • If every node holds the maximum of its subtree, it is a "max heap", also known as a big-top heap
  • If every node holds the minimum, it is a "min heap", also known as a small-top heap

Because the heap is a complete binary tree, it can be stored in an array.

Store the elements level by level into the array; if storage starts at index 1, the children of the node at index i sit at 2i and 2i+1, which saves some index arithmetic.
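The index arithmetic is worth writing down. A minimal sketch for 0-based storage (the convention the heap sort code in this section uses; with 1-based storage the children would simply be 2i and 2i+1):

```java
public class HeapIndex {
    // 0-based array storage of a complete binary tree:
    // node i has children at 2i+1 and 2i+2, and its parent at (i-1)/2
    public static int left(int i)   { return 2 * i + 1; }
    public static int right(int i)  { return 2 * i + 2; }
    public static int parent(int i) { return (i - 1) / 2; }
}
```

For example, the root (index 0) has children at indexes 1 and 2, and node 4's parent is node 1.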

Ok, so now that we know a little bit about heaps, what does heap sort look like? [2]

  • Build a big top heap
  • Inserts the top of the heap element (maximum value) at the end of the array
  • Sink the new top element so that the largest remaining element floats up to the top of the heap
  • Repeat the process until the sorting is complete

The GIF is as follows (source Reference [1]) :

There are two ways to build a heap, one is to float up, the other is to sink down.

What is floating up? Child nodes are floated upward, one level at a time, into place.

What is sinking? Top elements are sunk downward, level by level, into place.

In the GIF above, the sinking method is used.

Heap sort code implementation

public class HeapSort {

    public void sort(int[] nums) {
        int len = nums.length;
        // build the max heap
        buildHeap(nums, len);
        for (int i = len - 1; i > 0; i--) {
            // swap the heap top (the maximum) with the current last element
            swap(nums, 0, i);
            // shrink the heap by one to exclude the sorted tail
            len--;
            // sink the new top so the next largest element floats to the heap top
            sink(nums, 0, len);
        }
    }

    /**
     * build the heap: sink every non-leaf node, bottom up
     * (0-based storage, so the last non-leaf node is at len/2 - 1)
     *
     * @param nums
     * @param len
     */
    public void buildHeap(int[] nums, int len) {
        for (int i = len / 2 - 1; i >= 0; i--) {
            // sink
            sink(nums, i, len);
        }
    }

    /**
     * sink operation: move nums[index] down until the heap property holds
     *
     * @param nums
     * @param index
     * @param end
     */
    public void sink(int[] nums, int index, int end) {
        // left child index
        int leftChild = 2 * index + 1;
        // right child index
        int rightChild = 2 * index + 2;
        // index of the largest among the node and its children
        int current = index;
        // compare with the left subtree
        if (leftChild < end && nums[leftChild] > nums[current]) {
            current = leftChild;
        }
        // compare with the right subtree
        if (rightChild < end && nums[rightChild] > nums[current]) {
            current = rightChild;
        }
        // if a child is larger, swap and keep sinking
        if (current != index) {
            // swap values
            swap(nums, index, current);
            // continue sinking
            sink(nums, current, end);
        }
    }

    public void swap(int[] nums, int i, int j) {
        int temp = nums[i];
        nums[i] = nums[j];
        nums[j] = temp;
    }
}

Heap sort performance analysis

  • Time complexity

Heap sort takes O(nlogn) time, the same as quicksort.

  • Spatial complexity

Heap sort introduces no new arrays and sorts in place; the space complexity is O(1).

  • The stability of

The top element keeps sinking and swapping, which can change the relative positions of equal elements, so heap sort is unstable.

| Algorithm | Time | Space | Stable? |
| --- | --- | --- | --- |
| Heap sort | O(nlogn) | O(1) | unstable |

Counting sort

At the beginning of this article, we said that sorts divide into comparison and non-comparison classes; counting sort is a non-comparison sorting method.

Counting sort is a linear time sort, using space in exchange for time, let’s see how counting sort is implemented.

Counting sort principle

General process of counting sort [4]:

  • Find the largest and smallest elements in the array to be sorted;
  • Count the number of occurrences of each value i and store it at index i of a count array;
  • Accumulate the counts (starting from the first entry, add each entry to the previous one);
  • Fill the target array in reverse: place each element at position count[value] - 1 in the new array and decrement count[value].

Let’s take a look at the GIF demo (from Reference [4]) :

Let’s take an array to see the whole process: [6,8,5,1,2,2,3]

  • First, find the largest number in the array, which is 8, and create an empty count array whose largest index is 8

  • Iterate over the data and record each value's number of occurrences at the corresponding index of the count array

  • Then output each index as many times as its count

Counting sort code implementation

public class CountSort {

    public void sort(int[] nums) {
        // find the maximum value
        int max = findMax(nums);
        // create the count array
        int[] countNums = new int[max + 1];
        // record how often each value occurs at the corresponding index
        for (int i = 0; i < nums.length; i++) {
            countNums[nums[i]]++;
        }
        // write the values back in order
        int index = 0;
        for (int i = 0; i < countNums.length; i++) {
            while (countNums[i] > 0) {
                nums[index++] = i;
                countNums[i]--;
            }
        }
    }

    public int findMax(int[] nums) {
        int max = nums[0];
        for (int i = 0; i < nums.length; i++) {
            if (nums[i] > max) {
                max = nums[i];
            }
        }
        return max;
    }
}

OK, at first glance, there is nothing wrong with it, but if you think about it carefully, there is something wrong with it. Where is it?

  • What if the array contains negative numbers? For example [-1, 0, 2]: a count array can't be indexed by -1.

    One workaround is to add a fixed offset (say 10) to every element and subtract it when writing back, but if an element like -11 shows up, you're out of luck.

  • What if the range of the elements is large? For example [9992,9998,9993,9999]: do we really allocate an array of 10,000 entries?

    This can be solved with an offset: find the largest and smallest elements and compute the difference. For [9992,9998,9993,9999], the offset is 9999-9992=7, so an array of capacity 8 is enough.

The min-offset version below addresses both problems at once:

public class CountSort1 {

    public void sort(int[] nums) {
        // find the maximum value
        int max = findMax(nums);
        // find the minimum value
        int min = findMin(nums);
        // the offset
        int gap = max - min;
        // create the count array
        int[] countNums = new int[gap + 1];
        // record how often each value occurs at the corresponding index
        for (int i = 0; i < nums.length; i++) {
            countNums[nums[i] - min]++;
        }
        // write the values back in order
        int index = 0;
        for (int i = 0; i < countNums.length; i++) {
            while (countNums[i] > 0) {
                nums[index++] = min + i;
                countNums[i]--;
            }
        }
    }

    public int findMax(int[] nums) {
        int max = nums[0];
        for (int i = 0; i < nums.length; i++) {
            if (nums[i] > max) {
                max = nums[i];
            }
        }
        return max;
    }

    public int findMin(int[] nums) {
        int min = nums[0];
        for (int i = 0; i < nums.length; i++) {
            if (nums[i] < min) {
                min = nums[i];
            }
        }
        return min;
    }
}

Counting sort performance analysis

  • Time complexity

The total number of operations is n+n+n+k = 3n+k, so the time complexity is O(n+k), where k is the size of the value range.

  • Spatial complexity

The auxiliary count array grows with the value range, so the space complexity is O(k).

  • The stability of

Our implementation here is actually unstable, but counting sort does have a stable implementation; see reference [1].

We also saw that plain counting sort is not well suited to arrays with negative numbers or large value ranges.
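For completeness, here is a sketch of the stable variant, reconstructed from the four prefix-sum steps listed earlier (my own reconstruction, not the reference implementation):

```java
public class StableCountSort {
    // stable counting sort using prefix sums: equal elements keep their relative order
    public static int[] sort(int[] nums) {
        int max = nums[0], min = nums[0];
        for (int num : nums) {
            if (num > max) max = num;
            if (num < min) min = num;
        }
        int[] count = new int[max - min + 1];
        for (int num : nums) {
            count[num - min]++;
        }
        // prefix sums: count[i] now holds the number of elements <= i + min
        for (int i = 1; i < count.length; i++) {
            count[i] += count[i - 1];
        }
        // fill the output from right to left so equal elements stay in order
        int[] result = new int[nums.length];
        for (int i = nums.length - 1; i >= 0; i--) {
            result[--count[nums[i] - min]] = nums[i];
        }
        return result;
    }
}
```

The right-to-left fill is what makes it stable: among equal values, the later one in the input claims the later slot in the output.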

Bucket sort

Bucket sort can be seen as an upgraded version of counting sort: distribute the elements into buckets, then sort the elements in each bucket individually.

Bucket sorting principle

Bucket sorting process:

  • Set a quantitative array as an empty bucket;
  • Iterate over the input data and place elements one by one into their respective buckets;
  • Sort each bucket that is not empty;
  • Splice the sorted data from the non-empty buckets back together.

The bucket sorting GIF is as follows (source reference [1]) :

As mentioned above, counting sort is not a good fit for arrays with a large value range. Let's take such an array, [2066, 566, 1666, 666, 2566, 1566], and look at bucket sorting.

  • Create 6 buckets to store elements 0-500,500-1000,1000-1500,1500-2000,2000-2500,2500-3000 respectively

  • Iterate over the array and distribute the elements into the corresponding buckets

  • Sort inside each bucket; here only the bucket holding 566 and 666 has more than one element

  • Take the elements out of the bucket one by one, and the elements are in order

Bucket sort code implementation

To implement bucket sorting we need to consider several issues:

  • What is a bucket?
  • How to determine the number of barrels?
  • What method does bucket sort use?

Let’s look at the code:

public class BucketSort {

    public void sort(int[] nums) {
        int len = nums.length;
        if (len < 2) {
            return;
        }
        int max = nums[0];
        int min = nums[0];
        // get the maximum and minimum values
        for (int i = 1; i < len; i++) {
            if (nums[i] > max) {
                max = nums[i];
            }
            if (nums[i] < min) {
                min = nums[i];
            }
        }
        // the value range
        int gap = max - min;
        // range covered by each bucket (at least 1, with one bucket per element)
        int section = gap / len + 1;
        // use lists as buckets
        List<List<Integer>> buckets = new ArrayList<>();
        // initialize the buckets
        for (int i = 0; i < len; i++) {
            buckets.add(new ArrayList<>());
        }
        // add the elements to the buckets
        for (int i = 0; i < len; i++) {
            // determine which bucket the element belongs in
            int index = (nums[i] - min) / section;
            buckets.get(index).add(nums[i]);
        }
        // sort the elements in each bucket
        for (int i = 0; i < buckets.size(); i++) {
            // Collections.sort on a list of objects delegates to a stable merge-sort variant (TimSort), see [5]
            Collections.sort(buckets.get(i));
        }
        // write the bucket elements back into the original array
        int index = 0;
        for (List<Integer> bucket : buckets) {
            for (Integer num : bucket) {
                nums[index] = num;
                index++;
            }
        }
    }
}

Bucket sorting performance analysis

  • Time complexity

In the best case the elements are evenly distributed across the buckets and bucket sort is O(n); in the worst case all elements land in one bucket and it degrades toward O(n²). The average time complexity is O(n+k), the same as counting sort.

  • Spatial complexity

Bucket sort needs extra buckets plus the elements stored in them, so the space complexity is O(n+k).

  • The stability of

Stability depends on the sorting algorithm used inside the buckets: with a stable in-bucket sort, bucket sort is stable; with an unstable one, it is unstable.

Radix sort

Principle of radix sort

Radix sort is a non-comparison sorting method.

Its basic principle is to split the elements into digits by position and compare them one digit position at a time, from the lowest digit to the highest.

General process:

  • Get the largest number in the array to determine the number of digits;
  • Starting from the lowest digit, distribute the elements of the original array into buckets by the current digit (the radix);
  • Collect the buckets back in order and repeat for the next digit (each per-digit pass exploits counting/bucket sort's strength on small value ranges).

The GIF is shown below (source Reference [1]) :

Radix sort can be considered an evolution of bucket sort. Let's take [892, 846, 821, 199, 810, 700] and walk through the radix sort process:

  • Create ten buckets to store elements

  • Assign elements to buckets based on the units digit

  • Then take the elements out of the bucket one by one

  • Next come the tens: distribute into buckets by the tens digit, then take them out in turn

  • And then the hundreds

Radix sort code implementation

public class RadixSort {

    public void sort(int[] nums) {
        int len = nums.length;
        // the maximum value
        int max = nums[0];
        for (int i = 0; i < len; i++) {
            if (nums[i] > max) {
                max = nums[i];
            }
        }
        // the digit position currently being sorted on
        int location = 1;
        // use lists as buckets
        List<List<Integer>> buckets = new ArrayList<>();
        // initialize 10 buckets, one per digit value
        for (int i = 0; i < 10; i++) {
            buckets.add(new ArrayList<>());
        }
        while (true) {
            // the weight of the current digit position
            int d = (int) Math.pow(10, (location - 1));
            // check whether all digits have been processed
            if (max < d) {
                break;
            }
            // add the data to the buckets
            for (int i = 0; i < len; i++) {
                // compute the current digit and put the element into the matching bucket
                int number = ((nums[i] / d) % 10);
                buckets.get(number).add(nums[i]);
            }
            // write back to the array
            int nn = 0;
            for (int i = 0; i < 10; i++) {
                int size = buckets.get(i).size();
                for (int ii = 0; ii < size; ii++) {
                    nums[nn++] = buckets.get(i).get(ii);
                }
                buckets.get(i).clear();
            }
            location++;
        }
    }
}

Radix sort performance analysis

  • Time complexity

The time complexity is O(n×k), where n is the number of array elements and k is the number of digits of the largest element.

  • Spatial complexity

As with bucket sort, buckets are introduced for storage, so the space complexity is O(n+k).

  • The stability of

In each pass of radix sort, elements with the same current digit are placed into the buckets in their existing order and never change relative positions, so radix sort is stable.

Conclusion

In this article, we've walked through the top 10 basic sorts; let's briefly summarize.

First, the simplest, bubble sort: two nested loops, swapping adjacent elements;

Selection sort: split the sequence into sorted and unsorted parts, find the smallest element in the unsorted part, and append it to the end of the sorted part;

Insertion sort: the card-picking idea; insert each element into its proper position in the ordered sequence;

Shell sort: insertion sort upgraded; first split the sequence into gapped groups, insertion sort each group, then finish with a direct insertion sort;

Merge sort: divide and conquer, part one; split the sequence, then sort it during the merge step;

Quicksort: divide and conquer, part two; partition the sequence around a pivot, smaller elements to the left, larger to the right;

Heap sort: selection sort upgraded; build a big-top heap, move the top element (the maximum) to the end of the sequence, and re-heapify so the next maximum rises to the top;

Counting sort: space for time, part one; count each value's occurrences in a new array, then output the indexes in order to rebuild the sorted array;

Bucket sort: space for time, part two; distribute the elements of the original array into several buckets, sort each bucket separately, then concatenate the buckets;

Radix sort: space for time, part three; bucket sort upgraded: distribute the elements into buckets digit by digit, comparing one digit position at a time.

Top 10 Basic sorting performance summary:

| Sorting method | Time (average) | Time (worst) | Time (best) | Space | Stability |
| --- | --- | --- | --- | --- | --- |
| Bubble sort | O(n²) | O(n²) | O(n) | O(1) | stable |
| Selection sort | O(n²) | O(n²) | O(n²) | O(1) | unstable |
| Insertion sort | O(n²) | O(n²) | O(n) | O(1) | stable |
| Shell sort | O(n^1.3) ~ O(n²) | O(n²) | O(n) | O(1) | unstable |
| Merge sort | O(nlogn) | O(nlogn) | O(nlogn) | O(n) | stable |
| Quicksort | O(nlogn) | O(n²) | O(nlogn) | O(logn) | unstable |
| Heap sort | O(nlogn) | O(nlogn) | O(nlogn) | O(1) | unstable |
| Counting sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | stable |
| Bucket sort | O(n+k) | O(n²) | O(n) | O(n+k) | stable |
| Radix sort | O(n×k) | O(n×k) | O(n×k) | O(n+k) | stable |
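If you want to sanity-check any of the implementations above, a minimal harness (my own sketch, not part of the original article) can compare a sorter against the JDK's Arrays.sort on random input:

```java
import java.util.Arrays;
import java.util.Random;
import java.util.function.Consumer;

public class SortCheck {
    // feed random arrays to any in-place sort(int[]) and compare the result
    // against the JDK's Arrays.sort; returns true if all rounds agree
    public static boolean check(Consumer<int[]> sorter) {
        Random random = new Random(42);
        for (int round = 0; round < 100; round++) {
            int[] nums = random.ints(50, 0, 1000).toArray();
            int[] expected = nums.clone();
            Arrays.sort(expected);
            sorter.accept(nums);
            if (!Arrays.equals(nums, expected)) {
                return false;
            }
        }
        return true;
    }
}
```

Usage: pass any of the sorts from this article as a method reference, e.g. `SortCheck.check(new HeapSort()::sort)`.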

Do simple things repeatedly, do repetitive things carefully, and do serious things creatively.

I am "three points evil", a versatile full-stack developer.

Like, pay attention to not get lost, let’s see you next time!



Reference:

[1]. This is perhaps the best article on the Analysis of ten sorting algorithms in the Eastern Hemisphere

[2]. Github.com/chefyuan/al…

[2]. Data Structure and Algorithm Analysis

[3]. Interview frequency: Java commonly used eight sorting algorithms in one!

[4]. Ten Classic Sorting algorithms (GIF Demo)

[5]. An analysis of the underlying principle of Arrays.sort in JDK8 and its sorting algorithm selection