Sorting

Bubble sort (stable sort)

**Idea:** Bubble sort compares each element with the next one and moves the larger element back, so that after each pass the largest remaining element is placed at the end of the array. Repeating this process over the unsorted part completes the sort.

After the first pass shown above, the largest number 5 is already in place at the end; we then repeat the same operation on the remaining part (2, 3, 1, 4), leaving the elements that are already in position untouched.

public int[] MaopaoSort(int[] arr) {
        if (arr.length < 2)
            return arr;
        for (int i = 0; i < arr.length - 1; i++) {
            for (int j = 0; j < arr.length - i - 1; j++) {
                if (arr[j] > arr[j + 1])
                    swap(arr, j, j + 1);
            }
        }
        return arr;
    }
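The snippets in this post call a swap helper that is not shown in the original; a minimal sketch of it (the name and its placement inside the same class are assumptions) could be:

    private void swap(int[] arr, int i, int j) {
        // exchange the elements at positions i and j
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }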

Time complexity

If the array is already in ascending order, a single pass completes the sort. The number of key comparisons C and the number of record moves M both reach their minimum:

Cmin = n-1, Mmin = 0

So the best-case time complexity of bubble sort is O(n).

If the array is initially in reverse order, n-1 passes are required. The i-th pass performs n-i key comparisons (1 ≤ i ≤ n-1), and every comparison that triggers a swap moves records three times. In this case both the comparison count and the move count reach their maximum:

Cmax = n(n-1)/2 = O(n²), Mmax = 3n(n-1)/2 = O(n²)

So the worst-case time complexity of bubble sort is O(n²).

And the average time complexity of bubble sort is O(n²).
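The O(n) best case relies on stopping as soon as a pass makes no swaps. That early exit is not in the code above; a sketch of the variant with a swapped flag (an assumed addition, reusing the swap helper from earlier) looks like this:

    public int[] bubbleSortWithEarlyExit(int[] arr) {
        if (arr.length < 2)
            return arr;
        for (int i = 0; i < arr.length - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < arr.length - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    swap(arr, j, j + 1);
                    swapped = true;
                }
            }
            // no swaps in this pass means the array is already sorted
            if (!swapped)
                break;
        }
        return arr;
    }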

Quick sort

Quick Sort is a divide-and-conquer algorithm, just like merge sort. At first glance the two look similar: both make the problem smaller by sorting sub-ranges first and combining the results at the end. Quick sort can also be viewed as an improvement on bubble sort. In bubble sort, comparisons and swaps happen only between adjacent elements, so each swap moves an element by a single position; the number of comparisons and moves is large and the efficiency is relatively low. In quick sort, comparisons and swaps are made from both ends toward the middle, so in one round a larger element can be swapped toward the back and a smaller element toward the front; elements travel farther per move, the number of comparisons and moves is smaller, and the sort is faster, hence the name "quick sort". The basic idea of quick sort is:

  1. Pick any element of the range to be sorted as the pivot (usually the first element, though choosing a random element from the range is the better strategy), called the pivot element;
  2. Partition the range so that elements larger than the pivot are placed to its right and elements smaller than the pivot are placed to its left;
  3. Repeat the same steps on the left and right partitions until all elements are in order.

    /**
     * Quick sort
     * @param arr
     * @param left
     * @param right
     * @return
     */
    public int[] quickSort(int[] arr, int left, int right){

        if (left < right){
            int point = partition(arr,left,right);
            quickSort(arr,left,point-1);
            quickSort(arr,point+1,right);
        }

        return arr;
    }

    /**
     * Quick sort: find the partition point
     * @param arr
     * @param left
     * @param right
     * @return
     */
    private int partition(int[] arr, int left, int right) {
        // use the first number as the base comparison value
        int base = arr[left];
        while(left < right){
            while (left < right && arr[right] >= base)
                right--;
            swap(arr,left,right);

            while(left < right && arr[left] <= base)
                left++;
            swap(arr,left,right);
        }
        return left;
    }
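A minimal way to call the method above (the sample array is just for illustration) is to pass the full index range:

    int[] arr = {4, 2, 5, 3, 1};
    quickSort(arr, 0, arr.length - 1);   // sorts the whole array in place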

As the diagram above shows, this is the worst case: each partition produces one sub-range that is only one element smaller than the range being partitioned and another that is (nearly) empty, for example when the input is already sorted and the first element is chosen as the pivot. If this happens at every level, quick sort degrades to O(n²).

In the best case, each partition splits the range into two halves of equal length, and the running time is O(nlogn).

Optimization (random pivot)

    /**
     * Quick sort: find the partition point (random pivot)
     * @param arr
     * @param left
     * @param right
     * @return
     */
    private int partition(int[] arr, int left, int right) {

        /* Pick a random index in [left, right] as the pivot and move it to the front (requires java.util.Random) */
        int random_index = new Random().nextInt(right - left + 1) + left;
        System.out.println("Random pivot index: " + random_index);
        swap(arr,left,random_index);

        // use the first number as the base comparison value
        int base = arr[left];
        while(left < right){
            while (left < right && arr[right] >= base)
                right--;
            swap(arr,left,right);

            while(left < right && arr[left] <= base)
                left++;
            swap(arr,left,right);
        }
        return left;
    }

Merge sort

Merge Sort is an efficient, stable sorting algorithm built on the merge operation and a typical application of Divide and Conquer. Ordered subsequences are combined to obtain a fully ordered sequence: each subsequence is made ordered first, and then adjacent ordered segments are merged. Merging two ordered lists into one ordered list is called a two-way merge.

Merge sort is a stable sort, and it is also very efficient: because the recursion follows the shape of a full binary tree, its performance is never too bad. In Java, Arrays.sort() uses TimSort, an optimized version of merge sort, for object arrays. As the figure above suggests, each level of merging costs O(n) and the depth of the full binary tree is ⌈log₂n⌉, so the overall average time complexity is O(nlogn). In addition, merge sort's best, worst, and average time complexities are all O(nlogn).

    /**
     * Merge sort
     * @param arr
     * @param left
     * @param right
     * @param tmp
     */
    public void mergeSort(int[] arr, int left, int right, int[] tmp){
        if (left == right)
            return;

        int mid = left+((right-left)>>1);
        mergeSort(arr,left,mid, tmp);
        mergeSort(arr,mid+1,right, tmp);
        merge(arr,left,mid,right,tmp);
    }

    /**
     * Merge two sorted halves
     * @param arr
     * @param left
     * @param mid
     * @param right
     * @param tmp
     */
    private void merge(int[] arr, int left, int mid, int right, int[] tmp) {
        int i = left;
        int l = left;  // The index of the left half array
        int m = mid+1;   // The index of the right half of the array

        // merge into auxiliary arrays
        while (l <= mid && m <= right)
            tmp[i++] = arr[l] <= arr[m] ? arr[l++] : arr[m++]; // take from the left half on ties to keep the sort stable

        while (l <= mid)
            tmp[i++] = arr[l++];

        while (m <= right)
            tmp[i++] = arr[m++];

        // Copy the elements of the auxiliary array back into the original array
        for (int j = left; j <= right; j++) {
            arr[j] = tmp[j];
        }
    }
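Since mergeSort expects the caller to supply the auxiliary array, a minimal call (the sample array is only for illustration) would look like:

    int[] arr = {4, 2, 5, 3, 1};
    int[] tmp = new int[arr.length];      // auxiliary buffer reused by every merge
    mergeSort(arr, 0, arr.length - 1, tmp);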

Merge sort uses more memory (an O(n) auxiliary array), but it is an efficient and stable algorithm.

During merging we can first compare the maximum of the left segment with the minimum of the right segment before doing any copying. If the left segment's maximum (arr[mid]) is less than or equal to the right segment's minimum (arr[mid+1]), the two segments already form an ordered sequence and the merge can be skipped; otherwise the normal merge runs. With this check, the running time drops to O(n) when the sequence is already ordered, as sketched below.
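A minimal sketch of that check, assuming it is added to the mergeSort method shown earlier:

    public void mergeSort(int[] arr, int left, int right, int[] tmp){
        if (left == right)
            return;

        int mid = left + ((right - left) >> 1);
        mergeSort(arr, left, mid, tmp);
        mergeSort(arr, mid + 1, right, tmp);
        // skip the merge when the two halves already form an ordered sequence
        if (arr[mid] > arr[mid + 1])
            merge(arr, left, mid, right, tmp);
    }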

TimSort can be regarded as a heavily optimized version of merge sort. Its main idea is to detect naturally ordered runs in the sequence (when a strictly descending run is found, it is reversed into an ascending run) and then merge those runs. In the best case, for input that is already in ascending or descending order, the time complexity drops to O(n), so it adapts very well to partially ordered data.
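A simplified sketch of run detection in the spirit of TimSort (an illustration only, not the actual JDK implementation):

    // returns the length of the natural run starting at lo, reversing it first if it is strictly descending
    static int countRunAndMakeAscending(int[] a, int lo, int hi) {
        int runEnd = lo + 1;
        if (runEnd == hi)
            return 1;
        if (a[runEnd++] < a[lo]) {
            // strictly descending run: extend it, then reverse it into an ascending run
            while (runEnd < hi && a[runEnd] < a[runEnd - 1])
                runEnd++;
            for (int i = lo, j = runEnd - 1; i < j; i++, j--) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
            }
        } else {
            // ascending run: just extend it
            while (runEnd < hi && a[runEnd] >= a[runEnd - 1])
                runEnd++;
        }
        return runEnd - lo;
    }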

| Algorithm | Best time complexity | Worst time complexity | Average time complexity | Space complexity | Stability |
| --- | --- | --- | --- | --- | --- |
| Traditional merge sort | O(nlogn) | O(nlogn) | O(nlogn) | O(n) | stable |
| Improved merge sort [1] | O(n) | O(nlogn) | O(nlogn) | O(n) | stable |
| TimSort | O(n) | O(nlogn) | O(nlogn) | O(n) | stable |

Heap sort (HeapSort)

Implementation based on a referenced post.

public int[] heapSort(int[] arr){
        // build a max-heap by sifting each element up as it is "inserted"
        for (int i = 0; i < arr.length; i++) {
            creatSort(arr, i);
        }

        // repeatedly swap the heap top (maximum) with the last element, then re-heapify the shrunken heap
        int size = arr.length;
        swap(arr, 0, --size);
        while (size > 0){
            heapify(arr, 0, size);
            swap(arr, 0, --size);
        }
        return arr;
    }

    /**
     * Build a max-heap (large root heap) by sifting the new element up
     * @param arr
     * @param index
     */
    public void creatSort(int[] arr, int index){
        // sift up while the element is larger than its parent at (index - 1) / 2
        while (arr[index] > arr[(index - 1) / 2]){
            swap(arr, index, (index - 1) / 2);
            index = (index - 1) / 2;
        }
    }

    /**
     * The heap top has become smaller: sift it down to re-adjust a heap of the given size
     * @param arr
     * @param index
     * @param size
     */
    public void heapify(int[] arr, int index, int size){
        int left = index * 2 + 1;                  // left child of index
        while (left < size){
            // pick the larger of the two children of index
            int large = left + 1 < size && arr[left] < arr[left + 1] ? left + 1 : left;
            large = arr[large] > arr[index] ? large : index;
            // if the parent is already the largest, the adjustment is finished
            if (large == index)
                break;
            swap(arr, index, large);
            index = large;
            left = index * 2 + 1;
        }
    }

Priority queue (PriorityQueue)

Priority queues no longer follow a first-in, first-out principle, but are divided into two cases:

  • Maximum priority queue: regardless of insertion order, the current largest element is dequeued first;
  • Minimum priority queue: regardless of insertion order, the current smallest element is dequeued first.
    /**
     * Sort using a priority queue
     * @param arr
     * @return
     */
    public int[] PriorityQueueSort(int[] arr){

        // java.util.PriorityQueue is a min-heap by default; the comparator here just makes the natural ordering explicit
        PriorityQueue<Integer> queue = new PriorityQueue<>(new Comparator<Integer>() {
            @Override
            public int compare(Integer o1, Integer o2) {
                return o1.compareTo(o2);
            }
        });

        for (int i = 0; i < arr.length; i++) {
            queue.add(arr[i]);
        }
        int[] newarr = new int[arr.length];

        for (int i = 0; i < arr.length; i++) {
            newarr[i] = queue.poll();
        }
        return newarr;
    }
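For the maximum-priority case listed above, one option (not shown in the original code) is to construct the queue with a reversed comparator so that poll() returns the current largest element first:

    // descending order via a max-priority queue (requires java.util.PriorityQueue and java.util.Collections)
    int[] arr = {4, 2, 5, 3, 1};                     // sample input, for illustration only
    PriorityQueue<Integer> maxQueue = new PriorityQueue<>(Collections.reverseOrder());
    for (int value : arr) {
        maxQueue.add(value);
    }
    int[] descending = new int[arr.length];
    for (int i = 0; i < descending.length; i++) {
        descending[i] = maxQueue.poll();             // largest remaining element each time
    }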

Sorting algorithms used by Java's internal sort (Arrays.sort on primitive arrays)

  • Number of elements < 47: insertion sort

  • 47 <= Number of elements < 286: dual-pivot quicksort

  • Number of elements >= 286: merge sort (when the array largely consists of ordered runs; otherwise dual-pivot quicksort is still used)
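These thresholds come from the dual-pivot quicksort used for primitive arrays; object arrays go through TimSort instead. A small illustration of the two entry points (the sample data is arbitrary):

    import java.util.Arrays;

    public class SortEntryPoints {
        public static void main(String[] args) {
            int[] primitives = {5, 3, 8, 1};
            Arrays.sort(primitives);        // primitive array: dual-pivot quicksort path with the thresholds above

            Integer[] boxed = {5, 3, 8, 1};
            Arrays.sort(boxed);             // object array: TimSort

            System.out.println(Arrays.toString(primitives) + " " + Arrays.toString(boxed));
        }
    }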