Summary of merge sort and quicksort

This article takes part in Project Diggin, competing for the creative gift package and the creative incentive prize.

Recap of the last article

  1. Heap sort and insertion sort code walkthrough
  2. Complexity analysis and calculation methods

Review of Basic Knowledge

  1. Recursion: a simple example — when translating a passage of English, we usually work clause by clause. Suppose the passage has five or six sentences, and one sentence contains a few unknown words; we look up those words, return to translate that sentence, and finally put all the translated sentences together to translate the whole passage. This is a recursive process.
  2. Recursion needs to satisfy three conditions
    • The solution of a problem can be decomposed into several subproblems
    • The subproblems have exactly the same solution idea as the original problem; only the data scale differs
    • There are recursive termination conditions
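As a minimal sketch of the three conditions (a summation example of my own, chosen just for illustration):

```javascript
// Recursively sum an array: a minimal illustration of the three conditions.
const sum = arr => {
  // Condition 3: a termination condition — an empty array sums to 0.
  if (arr.length === 0) return 0;
  // Conditions 1 & 2: the problem splits into a smaller subproblem
  // (the rest of the array) solved with exactly the same idea.
  return arr[0] + sum(arr.slice(1));
};

console.log(sum([1, 2, 3, 4])); // 10
```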

Merge Sort

  • The basic idea

    If we want to sort an array, we first split it from the middle into two halves, sort each half, and then merge the two sorted parts back together so that the whole array is ordered. This is the divide-and-conquer idea: decompose a big problem into small subproblems; once the small problems are solved, the big problem is solved.

  • Dynamic graph display

As can be seen from the GIF above, two elements are compared first, then four, then eight. This looks a lot like the recursion we studied earlier; in fact, divide-and-conquer algorithms are generally implemented with recursion.

Divide-and-conquer is a problem-solving idea, while recursion is a programming technique.

  • Code implementation

    /**
     * Merge sort: recursion (divide and conquer)
     * @param {Array} arr the array to sort
     * @returns {Array} a new sorted array
     */
    const mergeSort = arr => {
      const len = arr.length;
      if (len <= 1) return arr;
      const middle = Math.floor(len / 2);
      const left = arr.slice(0, middle);
      const right = arr.slice(middle);
      return merge(mergeSort(left), mergeSort(right));
    };
    
    // Merge two sorted arrays by repeatedly taking the smaller head element.
    function merge(left, right) {
      let result = [];
      while (left.length && right.length) {
        // <= takes from the left array on ties, which keeps the sort stable
        if (left[0] <= right[0]) {
          result.push(left.shift());
        } else {
          result.push(right.shift());
        }
      }
      // One side is empty; append whatever remains on the other side.
      if (left.length) {
        result = [...result, ...left];
      }
      if (right.length) {
        result = [...result, ...right];
      }
    
      return result;
    }
    
    // mergeSort([11, 8, 3, 9, 7, 1, 2, 5]) → [1, 2, 3, 5, 7, 8, 9, 11]
  • Code reading

    We first record the length of the array. When the length is less than 2, no sorting is needed and the array can be returned directly. Otherwise we split the array into a left half and a right half and pass both to the merge operation; note that inside `merge`'s arguments, each half is itself passed through the mergeSort method first, which forms the recursion.

    Let’s look at the picture below to help explain

The recursion terminates at `if (len <= 1) return arr`, i.e. when a subarray is down to a single element. Following this decomposition rule for the array [11, 8, 3, 9, 7, 1, 2, 5], we get the following derivation:

Step 1: merge(mergeSort([11, 8, 3, 9]), mergeSort([7, 1, 2, 5]))

Step 2: merge(merge(mergeSort([11, 8]), mergeSort([3, 9])), merge(mergeSort([7, 1]), mergeSort([2, 5])))

Step 3: merge(merge(merge(mergeSort([11]), mergeSort([8])), merge(mergeSort([3]), mergeSort([9]))), merge(merge(mergeSort([7]), mergeSort([1])), merge(mergeSort([2]), mergeSort([5]))))

It may look visually tiring, but that is the essence of recursion. When mergeSort([11]) is passed a single element, it returns that array directly to the merge method. Each merge call then merges the arrays returned by the calls below it, level by level, until the top-level merge combines the sorted left and right halves into the fully sorted array.

So after we analyze this, do you know what the merge method needs to do?

That’s right — merging two arrays! And it is always a merge of two already sorted arrays. The main idea of the merge function is to compare the heads of the two arrays and move the smaller one into a temporary array; once one array is empty, append the remaining elements of the other array to the tail of the temporary array. This merges two ordered arrays into one.
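That idea can be sketched on its own with an index-based variant (my own restatement, using head pointers instead of `shift` so the inputs are not mutated):

```javascript
// Merge two already-sorted arrays with two head pointers.
function mergeSorted(a, b) {
  const result = [];
  let i = 0, j = 0;
  // Compare the heads and take the smaller one until one array runs out.
  while (i < a.length && j < b.length) {
    if (a[i] <= b[j]) result.push(a[i++]);
    else result.push(b[j++]);
  }
  // One side is exhausted; append whatever remains on the other side.
  while (i < a.length) result.push(a[i++]);
  while (j < b.length) result.push(b[j++]);
  return result;
}

console.log(mergeSorted([8, 11], [3, 9])); // [3, 8, 9, 11]
```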

  • The code analysis

    • Spatial complexity analysis

      Each merge allocates a temporary `result` array, and across any one level of the recursion these temporaries total at most n elements, so the auxiliary storage grows linearly with the data size. The recursion also uses O(logn) stack space while it runs, so the total is O(n + logn); since the n term dominates, the final space complexity is O(n).

    • Is it a stable sorting algorithm

      Merge sort can be stable; it depends on whether equal elements ever change relative order during merging. Looking at the merge method, equal elements keep their order as long as, when the two heads are equal, we take from the left array first — i.e. the comparison should be `left[0] <= right[0]` (with a strict `<`, an equal element from the right array would be taken first, breaking stability). The ES6 spread operator used for the leftover elements is harmless here: it appends the remaining elements in their existing order rather than overwriting anything, so no extra while loop is needed.
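To see the stability point concretely, here is a small sketch (my own example, not from the original) that merge sorts records by key, taking from the left array on ties so that equal keys keep their input order:

```javascript
// Merge sort over records, comparing by .key; ties take the left record first.
const mergeSortBy = items => {
  if (items.length <= 1) return items;
  const middle = Math.floor(items.length / 2);
  return mergeBy(mergeSortBy(items.slice(0, middle)), mergeSortBy(items.slice(middle)));
};

function mergeBy(left, right) {
  const result = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    // <= takes the left record first on equal keys, preserving input order.
    if (left[i].key <= right[j].key) result.push(left[i++]);
    else result.push(right[j++]);
  }
  return result.concat(left.slice(i), right.slice(j));
}

const records = [{ key: 2, tag: 'a' }, { key: 1, tag: 'b' }, { key: 2, tag: 'c' }];
console.log(mergeSortBy(records).map(r => r.tag).join('')); // "bac" — 'a' still before 'c'
```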

    • Time complexity analysis

      Suppose it takes T(n) to merge sort n elements. Splitting gives two subproblems that each take T(n/2), and the merge function itself costs O(n), so the running time of merge sort satisfies the recurrence:


      T(1) = C (C is a constant)
      T(n) = 2 * T(n/2) + n, for n > 1

      Sorting n elements decomposes into sorting 2 arrays of n/2 elements, each of which decomposes again in the same way, so we can expand the recurrence:


      T(n) = 2 * T(n/2) + n
           = 2 * (2 * T(n/4) + n/2) + n = 4 * T(n/4) + 2 * n
           = 4 * (2 * T(n/8) + n/4) + 2 * n = 8 * T(n/8) + 3 * n
           ……
           = 2^k * T(n/2^k) + k * n
           ……

      When T(n/2^k) = T(1), we have n/2^k = 1, i.e. k = log2(n). Substituting k back in gives T(n) = C * n + n * log2(n), which in big-O notation is O(nlogn).

      So the final time complexity is O(nlogn).
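The closed form can be checked numerically; the sketch below (my own, with C = 1 assumed) evaluates the recurrence directly for powers of two and compares it with C * n + n * log2(n):

```javascript
// T(1) = C; T(n) = 2 * T(n/2) + n, evaluated for powers of two.
const C = 1;
const T = n => (n === 1 ? C : 2 * T(n / 2) + n);

// Closed form derived above: T(n) = C*n + n*log2(n).
const closed = n => C * n + n * Math.log2(n);

for (const n of [2, 4, 8, 16, 1024]) {
  console.log(n, T(n), closed(n)); // the two values agree
}
```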

Quick Sort

    • The basic idea

      Suppose we need to sort the data between subscripts p and r of an array. We pick any element between p and r as the pivot (the partition point), then traverse the data between p and r, putting everything smaller than the pivot on its left and everything larger on its right, with the pivot in the middle. We then process the left and right parts recursively in the same way, down to intervals of a single element, at which point everything is sorted.

    • Dynamic graph demonstration

      Here the first element is selected as the base element, i.e. the partition point described above. Two pointers start at the first element of the current array: one iterates over the remaining elements, and the other marks the position the base element will finally be swapped into. After the iteration, everything to the left of the base element is smaller than it and everything to the right is larger, and we recurse on each side in the same way.

    • Code implementation

      function partition(arr, start, end) {
        // Use the last element as the pivot
        const pivotValue = arr[end];
        let pivotIndex = start;
        for (let i = start; i < end; i++) {
          if (arr[i] < pivotValue) {
            // Swap the smaller element into the left region
            [arr[i], arr[pivotIndex]] = [arr[pivotIndex], arr[i]];
            // Move the boundary of the left region one step right
            pivotIndex++;
          }
        }
        // Put the pivot between the two regions
        [arr[pivotIndex], arr[end]] = [arr[end], arr[pivotIndex]];
        return pivotIndex;
      }
      
      const quickSort = (arr, start, end) => {
        if (end <= start) {
          return;
        }
        const index = partition(arr, start, end);
        quickSort(arr, start, index - 1);
        quickSort(arr, index + 1, end);
        return arr;
      };
      
      // quickSort([6, 11, 3, 9, 8], 0, 4) → [3, 6, 8, 9, 11]
    • Code reading

      Select the last element (obtained via end) as the base element, then iterate from start to end; whenever an element is smaller than the base element, swap it to the pivotIndex position and advance pivotIndex. When the loop ends, the base element and the element at pivotIndex are swapped, so everything to the left of pivotIndex is smaller than the base element and everything to its right is larger. Partitioning recursively like this, until each partition holds only one element, completes the sort.

      For example

      Here we choose the last element, 8, as the base element, with i and j both pointing at the first element (follow along with the code above). Since 6 < 8, a swap is needed, but i and j point to the same element, so the swap does nothing except move the i pointer one step right. 11 > 8, so nothing happens. When j reaches 3, 3 < 8, so we swap the elements at i and j and move i one step right. j keeps moving; 9 > 8, so no operation. When j reaches 8, the loop ends and we swap the base element 8 with the element i points to, completing one partition: everything to the left of 8 is now less than 8 and everything to the right is greater. (Here i corresponds to pivotIndex, and j to the loop variable i in the code above.)

      partition finds the base element's correct place in the array, puts it there, and returns its index. All that remains is to recurse on the subarrays to the left and right of the base element, so quickSort calls itself on each side until a subarray has at most one element (end <= start).
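The walkthrough above can be verified by running a single partition pass on [6, 11, 3, 9, 8] (the same partition logic, restated here so the snippet runs on its own):

```javascript
// One partition pass with the last element (8) as pivot.
function partition(arr, start, end) {
  const pivotValue = arr[end];
  let pivotIndex = start;
  for (let i = start; i < end; i++) {
    if (arr[i] < pivotValue) {
      [arr[i], arr[pivotIndex]] = [arr[pivotIndex], arr[i]];
      pivotIndex++;
    }
  }
  [arr[pivotIndex], arr[end]] = [arr[end], arr[pivotIndex]];
  return pivotIndex;
}

const arr = [6, 11, 3, 9, 8];
const p = partition(arr, 0, arr.length - 1);
console.log(arr, p); // [ 6, 3, 8, 9, 11 ] 2 — 8 is in place, smaller on the left, larger on the right
```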

    • The code analysis

      • Spatial complexity analysis

        As we can see from the code above, partitioning happens in place and needs no auxiliary array, so the extra storage is O(1), apart from the recursion stack, which averages O(logn).

      • Is it a stable sorting algorithm

        Unstable. In the example above, if we instead use the numbers 6, 8, 7, 6, 3, 5, 9, 4, the relative order of the two 6s changes after the first partition.
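That counterexample can be made concrete by tagging the two 6s with their input positions (a sketch of my own; the partition logic matches the code above):

```javascript
// Partition records by .key and watch the two key-6 records swap order.
function partitionBy(arr, start, end) {
  const pivotValue = arr[end].key;
  let pivotIndex = start;
  for (let i = start; i < end; i++) {
    if (arr[i].key < pivotValue) {
      [arr[i], arr[pivotIndex]] = [arr[pivotIndex], arr[i]];
      pivotIndex++;
    }
  }
  [arr[pivotIndex], arr[end]] = [arr[end], arr[pivotIndex]];
  return pivotIndex;
}

const items = [6, 8, 7, 6, 3, 5, 9, 4].map((key, i) => ({ key, tag: i }));
partitionBy(items, 0, items.length - 1); // pivot is the 4 at the end
// The 6 that was originally first (tag 0) now comes after the other 6 (tag 3).
console.log(items.map(r => `${r.key}#${r.tag}`).join(' '));
```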

      • Time complexity analysis

        If each partition divides the array into two intervals of roughly equal size, the recurrence for quicksort's running time is the same as the one computed for merge sort above, so quicksort's time complexity is also O(nlogn).

        But consider an already sorted array: if we always select the last element as the base element, the two partitions are maximally unbalanced, and about n partitioning passes are needed to finish the sort. In that case quicksort degrades from O(nlogn) to O(n^2); the average time complexity, however, is still O(nlogn).
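The degradation is easy to measure by counting comparisons (a sketch of my own; the counter instrumentation is added for illustration). On an already sorted array of n elements, this last-element-pivot scheme performs n * (n - 1) / 2 comparisons:

```javascript
// Count comparisons made by last-element-pivot quicksort.
let comparisons = 0;

function partition(arr, start, end) {
  const pivotValue = arr[end];
  let pivotIndex = start;
  for (let i = start; i < end; i++) {
    comparisons++; // every loop iteration compares against the pivot
    if (arr[i] < pivotValue) {
      [arr[i], arr[pivotIndex]] = [arr[pivotIndex], arr[i]];
      pivotIndex++;
    }
  }
  [arr[pivotIndex], arr[end]] = [arr[end], arr[pivotIndex]];
  return pivotIndex;
}

const quickSort = (arr, start, end) => {
  if (end <= start) return;
  const index = partition(arr, start, end);
  quickSort(arr, start, index - 1);
  quickSort(arr, index + 1, end);
};

// Already sorted input: every partition is maximally unbalanced.
const n = 64;
const sorted = Array.from({ length: n }, (_, i) => i);
quickSort(sorted, 0, n - 1);
console.log(comparisons); // 2016, i.e. n * (n - 1) / 2 — the O(n^2) worst case
```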

