• Author: Chen Da Yu Tou

  • GitHub: github.com/KRISACHAN

What is an algorithm?

“Algorithm” is a word that gets thrown around a lot; the concept originally comes from the field of mathematics.

An algorithm is a systematic strategy for solving a problem: given a well-defined input, it produces the required output in a finite amount of time.
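As a concrete illustration (an addition, not from the original text), Euclid's algorithm for the greatest common divisor fits this definition exactly: a standard input (two non-negative integers), a finite number of steps, and a guaranteed output.

```javascript
// Euclid's algorithm: a classic example of an algorithm — a well-defined
// input, a finite number of steps, and a definite output.
const gcd = (a, b) => {
    while (b !== 0) {
        [a, b] = [b, a % b]; // replace (a, b) with (b, a mod b)
    }
    return a
}

gcd(48, 18) // 6
```

Every call terminates because the second argument strictly decreases toward zero.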

The algorithm is shown below:

Good or bad algorithms

The quality of an algorithm is measured by time complexity and space complexity.

An example 🌰:

Fish head and Fang Qin go to the same company for an interview. The interviewer asks them to implement the same feature, and after half a day of clacking away, the two finally deliver their code.

As soon as the interviewer runs the code, he finds:

Fang Qin’s code takes 100ms to run and occupies 5MB of memory.

Fish head’s code takes 100s to run and occupies 500MB of memory.

Okay, fish head has failed the interview yet again!

Time and memory consumption are precisely how we measure an algorithm.

Simply put, time complexity is the time cost of executing an algorithm, and space complexity is the space cost of executing an algorithm.
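A rough sketch (illustrative only, an addition to the original): counting the basic operations performed by a single loop versus a nested loop makes the O(n) vs O(n²) gap tangible.

```javascript
// Count basic operations: a single loop does n of them (O(n)),
// a nested loop does n * n of them (O(n^2)).
const countLinear = n => {
    let ops = 0
    for (let i = 0; i < n; ++i) ops++ // one operation per element
    return ops
}
const countQuadratic = n => {
    let ops = 0
    for (let i = 0; i < n; ++i)
        for (let j = 0; j < n; ++j) ops++ // one operation per pair
    return ops
}

countLinear(1000)   // 1000
countQuadratic(100) // 10000
```

At n = 1000 the quadratic loop already does a million operations, which is why Fang Qin’s code finishes in milliseconds and fish head’s does not.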

Complexity

Both time complexity and space complexity are written in “big O” notation, as O(*). Note that when people talk about complexity without qualification, they usually mean time complexity.

Common “big O” representations of time complexity are as follows:

Time complexity    Informal name
O(1)               Constant order
O(n)               Linear order
O(n²)              Quadratic order
O(log n)           Logarithmic order
O(n log n)         Linearithmic order
O(n³)              Cubic order
O(2ⁿ)              Exponential order

The time consumed by an algorithm on input of size n grows from least to greatest as follows:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ)

What does the n inside the parentheses mean? Don’t ask. If you ask, it’s back to the math textbook for you.

The following is a graph of resource consumption growth in different time complexity:

Common concepts:

  1. Best time complexity: The time complexity of executing code in the best case, which takes the least time;
  2. Worst time complexity: The worst-case time complexity of executing code, which takes the most time;
  3. Average time complexity: The average amount of time it takes to execute code. This value is the weighted average in probability theory, also known as the expected value.
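Linear search makes a handy illustration of all three cases (a sketch added here, not from the original): the target at the front is the best case, the target at the back (or absent) is the worst, and on average about half the elements are examined.

```javascript
// Linear search, instrumented to count comparisons:
// best case O(1): target is first; worst case O(n): target is last or absent.
const linearSearch = (arr, target) => {
    let comparisons = 0
    for (let i = 0; i < arr.length; ++i) {
        comparisons++
        if (arr[i] === target) return { index: i, comparisons }
    }
    return { index: -1, comparisons }
}

linearSearch([7, 2, 9, 4], 7) // best case: 1 comparison
linearSearch([7, 2, 9, 4], 4) // worst case: 4 comparisons
```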

Common sorting algorithms

We can’t live without sorting. In gym class, people line up by height; after an exam, scores get ranked.

The same is true in programming. When developing a student management system, you may need to rank students by grade; when developing an e-commerce platform, you need to rank similar products by price from highest to lowest. (Of course, the front end is generally not the one handling that business logic.)

As you can see, sorting is everywhere.

Sorting seems simple, but there are a variety of algorithms and ideas hidden behind it.

Overview

According to their time complexity, common sorting algorithms fall into three categories.

  1. O(n²) sorting algorithms
    • Bubble sort
    • Selection sort
    • Insertion sort
    • Shell sort
  2. O(n log n) sorting algorithms
    • Merge sort
    • Quick sort
    • Heap sort
  3. Linear-time sorting algorithms
    • Counting sort
    • Bucket sort
    • Radix sort

Specific information about various sorts

Legend for the pictures:

  • The sorts arrange data from smallest to largest
  • k represents the number of “digits” in a number
  • n indicates the data size
  • m represents the maximum value of the data minus the minimum value

Bubble Sort

Bubble Sort is a basic swap-based sort.

It is called bubble sort because each element, like a bubble, gradually floats toward one end of the array according to its size.

The algorithm steps are as follows:

  1. Compare adjacent elements. If the first is greater than the second, swap them both;
  2. Do the same for each pair of adjacent elements, from the first pair at the start to the last pair at the end. After this pass, the last element is the largest;
  3. Repeat for all elements except the last one;
  4. Keep repeating the above steps for fewer and fewer elements at a time until there are no more pairs to compare.

The illustration is as follows:

The concrete implementation is as follows:

const bubbleSort = arr => {
    const len = arr.length - 1
    for (let i = 0; i < len; ++i) { /* The outer loop runs len = length - 1 passes */
        for (let j = 0; j < len - i; ++j) { /* The i-th pass compares len - i adjacent pairs */
            if (arr[j] > arr[j + 1]) { /* Compare adjacent elements; swap if out of order */
                [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]]
            }
        }
    }
    return arr
}
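A common refinement, not shown above: if a complete pass makes no swaps, the array is already sorted and we can stop early. This gives bubble sort a best case of O(n) on nearly sorted input. A sketch:

```javascript
// Bubble sort with an early-exit flag: stop as soon as a full pass
// performs no swaps, since the array must then be sorted.
const bubbleSortOptimized = arr => {
    const len = arr.length - 1
    for (let i = 0; i < len; ++i) {
        let swapped = false
        for (let j = 0; j < len - i; ++j) {
            if (arr[j] > arr[j + 1]) {
                [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
                swapped = true
            }
        }
        if (!swapped) break // no swaps in this pass: already sorted
    }
    return arr
}
```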

Selection Sort

Selection sort is a simple and intuitive sorting algorithm.

The main advantage of selection sort has to do with data movement.

If an element is in the correct final position, it will not be moved.

Each time a pair of elements is swapped, at least one of them moves to its final position, so sorting a list of n elements takes at most n - 1 swaps. Of all the sorting methods that move elements entirely by swapping, selection sort is one of the best.

The algorithm steps for selecting sort are as follows:

  1. Find the smallest (largest) element in the unsorted sequence and store it at the beginning of the sorted sequence.
  2. Then, from the remaining unsorted elements, continue to find the smallest (largest) element and place it at the end of the sorted sequence;
  3. And so on until all the elements are sorted.

The illustration is as follows:

The concrete implementation is as follows:

const selectionSort = arr => {
    const len = arr.length
    let min
    for (let i = 0; i < len - 1; ++i) {
        min = i /* Initialize min as the index of the first unsorted element */
        for (let j = i + 1; j < len; ++j) { /* Walk the unsorted elements */
            if (arr[j] < arr[min]) { /* Found a new current minimum */
                min = j /* Record the index of the minimum */
            }
        }
        [arr[i], arr[min]] = [arr[min], arr[i]] /* Swap it into place */
    }
    return arr
}

Insertion Sort

Insertion sort is a simple and intuitive sorting algorithm.

It works by building a sorted sequence: for each unsorted element, scan backward through the already-sorted part, find the right position, and insert it.

The algorithm steps of insertion sort are as follows:

  1. Starting with the first element, the element can be considered sorted;
  2. Fetch the next element and scan back to front in the sorted element sequence;
  3. If the element (sorted) is larger than the new element, move the element to the next position;
  4. Repeat step 3 until you find a place where the sorted element is less than or equal to the new element;
  5. Insert the new element at that position;
  6. Repeat steps 2 to 5.

The illustration is as follows:

The concrete implementation is as follows:

const insertionSort = arr => {
    const len = arr.length
    let j, temp
    for (let i = 1; i < len; ++i) { /* arr[0] is trivially sorted; start at 1 */
        j = i - 1
        temp = arr[i] /* the element to insert */
        while (j >= 0 && arr[j] > temp) { /* shift larger sorted elements right */
          arr[j + 1] = arr[j]
          j--
        }
        arr[j + 1] = temp /* insert into the gap */
    }
    return arr
}

Shell Sort

Shell sort, also known as the diminishing increment sort, is a more efficient, improved version of insertion sort. Shell sort is not a stable sorting algorithm.

Shell sort improves on insertion sort by exploiting two of its properties:

  1. Insertion sort has high efficiency when it operates on almost ordered data, that is, it can achieve the efficiency of linear sort.
  2. But insert sort is generally inefficient because it can only move data one bit at a time.

Gap (step size) selection is an important part of Shell sort.

Any gap sequence will work as long as the final gap is 1.

The algorithm starts by sorting elements that are a certain gap apart.

It then repeats with a smaller gap, and eventually sorts with a gap of 1.

With a gap of 1, the algorithm becomes an ordinary insertion sort, which guarantees that the data ends up fully sorted.

The algorithm steps of Shell sort are as follows:

  1. Choose a sequence of gaps;
  2. For each gap k, insertion-sort the elements that are k apart (the k interleaved subsequences);
  3. Repeat with smaller and smaller gaps until the gap is 1.

The illustration is as follows:

The concrete implementation is as follows:

const shellSort = arr => {
    const gaps = [5, 3, 1] // A hard-coded gap sequence, ending in 1
    const len = arr.length
    for (let g = 0, gLen = gaps.length; g < gLen; ++g) {
        for (let i = gaps[g]; i < len; ++i) {
            let temp = arr[i], j
            for (j = i; j >= gaps[g] && arr[j - gaps[g]] > temp; j -= gaps[g]) {
                arr[j] = arr[j - gaps[g]] /* shift larger elements one gap right */
            }
            arr[j] = temp
        }
    }
    return arr
}
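The hard-coded gap sequence [5, 3, 1] above only suits small arrays. One popular choice (an assumption here, not something the original specifies) is Knuth's sequence 1, 4, 13, 40, …, generated by h = 3h + 1. A sketch:

```javascript
// Shell sort with Knuth's gap sequence (h = 3h + 1), generated to fit
// the array length instead of being hard-coded.
const shellSortKnuth = arr => {
    const len = arr.length
    // Build gaps 1, 4, 13, 40, ... below len, then use them largest-first.
    let gaps = []
    for (let h = 1; h < len; h = 3 * h + 1) gaps.unshift(h)
    if (gaps.length === 0) gaps = [1] // trivial arrays (length <= 1)
    for (const gap of gaps) {
        for (let i = gap; i < len; ++i) {
            const temp = arr[i]
            let j
            for (j = i; j >= gap && arr[j - gap] > temp; j -= gap) {
                arr[j] = arr[j - gap] // shift larger elements one gap right
            }
            arr[j] = temp
        }
    }
    return arr
}
```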

Quick Sort

Quicksort is also called partition-exchange sort.

On average, sorting n items takes O(n log n) comparisons. In the worst case O(n²) comparisons are required, but this is uncommon. In practice, quicksort is usually noticeably faster than other O(n log n) algorithms because its inner loop can be implemented efficiently on most architectures.

Quicksort uses a divide-and-conquer strategy: it divides a sequence into a “smaller” and a “larger” subsequence, then recursively sorts the two subsequences.

The algorithm steps of quicksort are as follows:

  1. Pick a base: Pick an element from a sequence, called pivot;
  2. Split: Reorder the sequence so that all elements smaller than the base value are placed in front of the base value and all elements larger than the base value are placed behind the base value (numbers equal to the base value can go to either side). At the end of the split, the sorting of the reference values is complete;
  3. Recursive sort subsequence: Recursively sorts subsequences of elements less than a reference value and subsequences of elements greater than a reference value.

The criterion for recursion to the bottom is that the size of the sequence is zero or one, at which point the sequence is clearly ordered.

There are several specific methods to select the reference value, which has a decisive influence on the time performance of sorting.

The illustration is as follows:

The concrete implementation is as follows:

const quickSort = arr => {
    const len = arr.length
    if (len < 2) {
        return arr
    }
    const pivot = arr[0]
    const left = []
    const right = []
    for (let i = 1; i < len; ++i) {
        if (arr[i] >= pivot) {
            right.push(arr[i])
        }
        if (arr[i] < pivot) {
            left.push(arr[i])
        }
    }
    return [...quickSort(left), pivot, ...quickSort(right)]
}

In addition to regular quicksort, there is an optimized version called three-way quicksort.

Faced with a sequence containing many duplicate values, single-pivot quicksort can degenerate into an O(n²) algorithm.

For this case there is three-way quick sort.

Three-way quicksort divides the sequence into three parts: less than the pivot, equal to the pivot, and greater than the pivot. It then recursively sorts the “less than” and “greater than” parts.

The concrete implementation is as follows:

const quickSort = arr => {
    const len = arr.length
    if (len < 2) {
        return arr
    }
    let left = []
    let center = []
    let right = []
    let pivot = arr[0]
    for (let i = 0; i < len; ++i) {      
        if (arr[i] < pivot) {
            left.push(arr[i])
        } else if (arr[i] === pivot) {
            center.push(arr[i])
        } else {
            right.push(arr[i])
        }
    }
    return [...quickSort(left), ...center, ...quickSort(right)]
}
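Both versions above allocate new arrays on every recursive call. Quicksort is normally done in place; here is a sketch using the Lomuto partition scheme (one of several possible schemes, and an addition to the original):

```javascript
// In-place quicksort with the Lomuto partition scheme: the last element
// is the pivot, and partitioning returns the pivot's final index.
const quickSortInPlace = (arr, lo = 0, hi = arr.length - 1) => {
    if (lo >= hi) return arr
    const pivot = arr[hi]
    let i = lo // boundary of the "less than pivot" region
    for (let j = lo; j < hi; ++j) {
        if (arr[j] < pivot) {
            [arr[i], arr[j]] = [arr[j], arr[i]]; // grow the "less than" region
            i++
        }
    }
    [arr[i], arr[hi]] = [arr[hi], arr[i]]; // put the pivot into its final place
    quickSortInPlace(arr, lo, i - 1)
    quickSortInPlace(arr, i + 1, hi)
    return arr
}
```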

If you’re tired, you could save the rest for next time, but “next time” never really comes, right? So keep reading! Keep going ~

Merge Sort

Merge sort is an efficient sorting algorithm based on the merge operation, with O(n log n) time complexity. It was first proposed by John von Neumann in 1945. It is a classic application of Divide and Conquer, and the merges at each level of the recursion can even run in parallel.

At its core is the operation of merging two already-sorted sequences into one.

Merge sort can be implemented in two ways.

The first is top-down recursion, and the algorithm steps are as follows:

  1. Allocate a space as large as the two sorted sequences combined, to hold the merged sequence;
  2. Set two pointers, one at the start of each sorted sequence;
  3. Compare the elements the two pointers point to, move the smaller one into the merge space, and advance that pointer;
  4. Repeat step 3 until one pointer reaches the end of its sequence;
  5. Copy all remaining elements of the other sequence directly to the end of the merged sequence.

The concrete implementation is as follows:

const merge = (left, right) => {
    let resArr = []
    while (left.length && right.length) {
        if (left[0] < right[0]) {
            resArr.push(left.shift())
        } else {
            resArr.push(right.shift())
        }
    }
    return resArr.concat(left, right)
}

const mergeSort = arr => {
    if (arr.length <= 1) {
        return arr
    }
    let middle = Math.floor(arr.length / 2)
    let left = arr.slice(0, middle)
    let right = arr.slice(middle)
    return merge(mergeSort(left), mergeSort(right))
}

The second is bottom-up iteration; since a divide-and-conquer algorithm can generally be written either recursively or iteratively, this version is also available. The main steps are as follows:

  1. Merge each pair of adjacent elements, forming ceil(n / 2) sorted runs of two (or one) elements;
  2. If more than one run remains, merge adjacent runs again, forming ceil(n / 4) runs of four (or fewer) elements;
  3. Repeat step 2 until only one run remains, i.e. all elements are sorted.

The concrete implementation is as follows:

const merge = (arr, startLeft, stopLeft, startRight, stopRight) => {
    /* Create the left and right sub-runs (one extra slot each for a sentinel) */
    let rightArr = new Array(stopRight - startRight + 1)
    let leftArr = new Array(stopLeft - startLeft + 1)
    /* Copy the right run */
    let k = startRight
    for (let i = 0, len = rightArr.length; i < len - 1; ++i) {
        rightArr[i] = arr[k]
        ++k
    }
    /* Copy the left run */
    k = startLeft
    for (let i = 0, len = leftArr.length; i < len - 1; ++i) {
        leftArr[i] = arr[k]
        ++k
    }
    // Set sentinel values: once a run is exhausted, its Infinity slot makes
    // every remaining element of the other run win the comparison, so it is
    // copied straight into the array.
    rightArr[rightArr.length - 1] = Infinity
    leftArr[leftArr.length - 1] = Infinity
    let m = 0
    let n = 0
    // Repeatedly take the smaller of the two run heads
    for (let c = startLeft; c < stopRight; ++c) {
        if (leftArr[m] <= rightArr[n]) {
            arr[c] = leftArr[m]
            m++
        } else {
            arr[c] = rightArr[n]
            n++
        }
    }
}
const mergeSort = arr => {
    if (arr.length <= 1) {
        return arr
    }
    // Set the size of the subsequence
    let step = 1
    let left
    let right
    while (step < arr.length) {
        left = 0
        right = step
        while (right + step <= arr.length) {
            merge(arr, left, left + step, right, right + step)
            left = right + step
            right = left + step
        }
        if (right < arr.length) {
            merge(arr, left, left + step, right, arr.length)
        }
        step *= 2
    }
    return arr
}

Fish head note: Iteration is much safer than recursion; too much recursion can cause stack overflows.

The illustration is as follows:

Heap Sort

Heapsort is a sorting algorithm based on the binary heap data structure. A heap is a nearly complete binary tree that satisfies the heap property: every node’s key is always smaller than (or always greater than) its parent’s.

What is a binary heap?

There are two types of binary heap:

Fish head note: the following pictures are from Comics: What is a binary heap? (Revised version)

  1. Max heap: the value of any parent node is greater than or equal to the values of its left and right children.

    • The illustration is as follows:

    • The array is represented as follows:

      [10, 8, 9, 7, 5, 4, 6, 3, 2]

  2. Min heap: the value of any parent node is less than or equal to the values of its left and right children.

    • The illustration is as follows:

    • The array is represented as follows:

      [1, 3, 2, 6, 5, 7, 8, 9, 10]
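The array form can be checked directly: in a 0-indexed array, the children of the node at index i sit at 2i + 1 and 2i + 2. A small sketch (an addition to the original) that verifies the two example arrays above:

```javascript
// Check the max-heap property on an array: every parent at index i must
// be >= its children at indices 2i + 1 and 2i + 2.
const isMaxHeap = list => {
    for (let i = 0; i < list.length; ++i) {
        const left = 2 * i + 1
        const right = 2 * i + 2
        if (left < list.length && list[i] < list[left]) return false
        if (right < list.length && list[i] < list[right]) return false
    }
    return true
}

isMaxHeap([10, 8, 9, 7, 5, 4, 6, 3, 2]) // true
isMaxHeap([1, 3, 2, 6, 5, 7, 8, 9, 10]) // false (it is a min heap)
```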

The algorithm steps of heap sort are as follows:

  1. Build the unordered sequence into a binary heap;
  2. Repeatedly swap the top element to the end of the heap, shrink the heap by one, and re-adjust the heap to produce a new top.

The concrete implementation is as follows:

// Swap two positions
const Swap = (list, a, b) => {
    [list[a], list[b]] = [list[b], list[a]];
}

/* Sift-down adjustment (non-recursive variant, kept for reference; not used below) */
const adjustHeap = (list, parentIndex, length) => {
    let temp = list[parentIndex] /* temp holds the parent value for the final assignment */
    let childIndex = 2 * parentIndex + 1 /* position of the left child */
    while (childIndex < length) {
        /* If a right child exists and is greater than the left child, point at the right child */
        if (childIndex + 1 < length && list[childIndex + 1] > list[childIndex]) {
            childIndex++
        }
        /* If the parent is already >= the larger child, the heap property holds: exit */
        if (temp >= list[childIndex]) {
            break
        }
        /* A one-way assignment is enough here; no full swap is needed */
        list[parentIndex] = list[childIndex]
        parentIndex = childIndex
        childIndex = 2 * childIndex + 1
    }
    list[parentIndex] = temp
}

// Recursive sift-down: restore the max-heap property at `index`
const heapify = (list, index, heapSize) => {
    let largest = index
    const left = (2 * index) + 1
    const right = (2 * index) + 2

    if (left < heapSize && list[left] > list[largest]) {
        largest = left
    }

    if (right < heapSize && list[right] > list[largest]) {
        largest = right
    }

    if (largest !== index) {
        Swap(list, index, largest)
        heapify(list, largest, heapSize)
    }
}

// Build the max heap
const buildMaxHeap = list => {
    for (let i = Math.floor(list.length / 2); i >= 0; i -= 1) {
        heapify(list, i, list.length)
    }
    return list
}

// Heap sort is also a very efficient algorithm, so named because it treats
// the array as a binary tree, based on the following layout:
// 1. index 0 is the root of the tree;
// 2. the parent of any node i (except the root) is floor((i - 1) / 2);
// 3. the left child of node i is 2 * i + 1;
// 4. the right child of node i is 2 * i + 2.
const HeapSort = list => {
    let heapSize = list.length
    buildMaxHeap(list) // Build the max heap
    while (heapSize > 1) {
        // Swap the first element (the current maximum) with the last element in
        // the heap, so the maximum lands in its final sorted position.
        Swap(list, 0, --heapSize)
        heapify(list, 0, heapSize) // Restore the heap property on the shrunken heap
    }
    return list
}

The illustration is as follows:

Counting Sort

Counting sort is a stable, linear-time sorting algorithm, proposed by Harold H. Seward in 1954. It uses an extra array C whose i-th entry counts the occurrences of the value i in the input, which is why it requires the input to be integers within a known range.

When the input consists of n integers between 0 and k, it runs in O(n + k) time. Counting sort is not a comparison sort, so it can be faster than any comparison-based sorting algorithm.

Counting sort algorithm steps are as follows:

  1. Find the largest and smallest elements in the array to be sorted;
  2. Count the number of occurrences of each value i in the array and store it in entry i of array C;
  3. Accumulate the counts (starting from the first element of C, add each entry to the previous one);
  4. Fill the target array in reverse: place each element i at position C[i] of the new array, decrementing C[i] by one each time.

The concrete implementation is as follows:

const countSort = arr => {
    const C = []
    for (let i = 0, iLen = arr.length; i < iLen; ++i) {
        const j = arr[i]
        if (C[j] >= 1) {
            C[j]++ /* seen before: bump the count */
        } else {
            C[j] = 1 /* first occurrence */
        }
    }
    const D = []
    for (let j = 0, jLen = C.length; j < jLen; ++j) {
        if (C[j]) {
            while (C[j] > 0) { /* emit the value j, count times */
                D.push(j)
                C[j]--
            }
        }
    }
    return D
}
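The version above assumes small non-negative integers. A common variant (an addition here) subtracts the minimum first, so the count array spans only max - min + 1 slots and values far from zero waste no space:

```javascript
// Counting sort with an offset: counts are indexed by (value - min),
// so the count array spans only max - min + 1 slots.
const countSortOffset = arr => {
    if (arr.length === 0) return arr
    const max = Math.max(...arr)
    const min = Math.min(...arr)
    const counts = new Array(max - min + 1).fill(0)
    for (const v of arr) counts[v - min]++
    const out = []
    for (let i = 0; i < counts.length; ++i) {
        while (counts[i]-- > 0) out.push(i + min) // shift back to the real value
    }
    return out
}

countSortOffset([105, 102, 108, 102]) // [102, 102, 105, 108]
```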

The illustration is as follows:

Bucket Sort

Like counting sort, bucket sort is a stable linear-time sorting algorithm, but this time it needs buckets rather than counts.

It works by dividing a sequence into a finite number of buckets. Each bucket is sorted individually. Bucket sorting uses linear time O(n) when the values in the array to be sorted are evenly distributed.

The algorithm steps of bucket sorting are as follows:

  1. Set up a fixed-size array of empty buckets;
  2. Walk the sequence, putting each item into its corresponding bucket;
  3. Sort each bucket that is not empty;
  4. Concatenate the non-empty buckets, in order, back into a sequence.

The concrete implementation is as follows:

const bucketSort = arr => {
    const bucketsCount = 10 /* default number of buckets */
    const max = Math.max(...arr) /* largest value in the sequence */
    const min = Math.min(...arr) /* smallest value in the sequence */
    const bucketsSize = Math.floor((max - min) / bucketsCount) + 1 /* bucket width */
    const __buckets = [] /* the empty buckets */
    for (let i = 0, len = arr.length; i < len; ++i) {
        const index = ~~((arr[i] - min) / bucketsSize) /* which bucket this value belongs to */
        if (!__buckets[index]) {
            __buckets[index] = [] /* create the sub-bucket on demand */
        }
        __buckets[index].push(arr[i])
        let bLen = __buckets[index].length - 1
        while (bLen > 0) { /* bubble the new value into place inside its bucket */
            if (__buckets[index][bLen] < __buckets[index][bLen - 1]) {
                [__buckets[index][bLen], __buckets[index][bLen - 1]] = [__buckets[index][bLen - 1], __buckets[index][bLen]]
            }
            bLen--
        }
    }
    const buckets = [] /* the final sorted sequence */
    for (let i = 0, len = __buckets.length; i < len; ++i) {
        if (__buckets[i]) {
            buckets.push(...__buckets[i])
        }
    }
    return buckets
}
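The in-bucket insertion pass above is easy to get wrong; a simpler sketch (an addition, sorting each bucket with the built-in sort and a numeric comparator) reads:

```javascript
// Bucket sort: distribute by (value - min) / bucketSize, sort each
// non-empty bucket individually, then concatenate the buckets in order.
const bucketSortSimple = (arr, bucketsCount = 10) => {
    if (arr.length === 0) return arr
    const max = Math.max(...arr)
    const min = Math.min(...arr)
    const bucketSize = Math.floor((max - min) / bucketsCount) + 1
    const buckets = []
    for (const v of arr) {
        const index = Math.floor((v - min) / bucketSize)
        if (!buckets[index]) buckets[index] = []
        buckets[index].push(v)
    }
    const result = []
    for (const bucket of buckets) {
        // sparse array: holes come through as undefined, so skip them
        if (bucket) result.push(...bucket.sort((a, b) => a - b))
    }
    return result
}
```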

The illustration is as follows:

Radix Sort

Radix sort is a non-comparative integer sorting algorithm, its principle is to cut the integers into different numbers according to the number of digits, and then compare them according to each number. Since integers can also represent strings (such as names or dates) and floating-point numbers in a particular format, radix sort is not restricted to integers.

It works by padding all the values to be compared (positive integers) to the same number of digits, filling shorter numbers with leading zeros. Then it sorts digit by digit, starting from the lowest digit. After going from the lowest digit up to the highest, the sequence becomes ordered.

Radix sort can run LSD (Least Significant Digit first) or MSD (Most Significant Digit first).

LSD starts sorting from the right-most (lowest) digit of the key, while MSD does the opposite, starting from the left-most (highest) digit.

MSD mode suits sequences whose keys have many digits.

LSD suits sequences whose keys have few digits.

Radix sort, bucket sort, and counting sort are similar in principle: all rely on the notion of a “bucket”. But they use the buckets in clearly different ways:

  • Radix sort: buckets are allocated according to each digit of the key;
  • Bucket sort: each bucket stores a range of values;
  • Counting sort: each bucket stores a single key value.

The LSD diagram is as follows:

LSD implementation is as follows:

const LSDRadixSort = arr => {
    const max = Math.max(...arr) /* the largest value */
    let digit = `${max}`.length /* number of digits of the largest value */
    let divisor = 1 /* 10^(current digit position) */
    let buckets = [] /* the empty buckets */
    while (digit > 0) {
        /* distribute into the buckets by the current digit */
        for (let i = 0, len = arr.length; i < len; ++i) {
            const index = ~~(arr[i] / divisor) % 10 /* the current digit, 0-9 */
            if (!buckets[index]) {
                buckets[index] = []
            }
            buckets[index].push(arr[i]) /* append the value to its digit bucket */
        }
        arr = []
        /* collect out of the buckets in order */
        for (let i = 0; i < buckets.length; i++) {
            if (buckets[i]) {
                arr = arr.concat(buckets[i])
            }
        }
        buckets = []
        divisor *= 10
        digit--
    }
    return arr
}

Special bubble Sort — Cocktail Sort

Cocktail sort is a variation of bubble sort. It differs from bubble sort in that it passes through the sequence alternately from low to high and then from high to low. It performs slightly better than bubble sort, which passes in only one direction (low to high) and moves each item at most one position per cycle.

The algorithm steps are as follows:

  1. The procedure is similar to bubble sort, except that each start-to-end pass is followed by an end-to-start pass.

The illustration is as follows:

The concrete implementation is as follows:

const cocktailSort = arr => {
    let i
    let left = 0
    let right = arr.length - 1
    while (left < right) {
        for (i = left; i < right; ++i)
            if (arr[i] > arr[i + 1]) {
                [arr[i], arr[i + 1]] = [arr[i + 1], arr[i]]
            }
        right--
        for (i = right; i > left; --i)
            if (arr[i - 1] > arr[i]) {
                [arr[i], arr[i - 1]] = [arr[i - 1], arr[i]]
            }
        left++
    }
    return arr
}

Sleep Sort

This sorting algorithm is implemented with timers. Time complexity: mmmmmmm. Space complexity: mmmmmmm. Let’s not go into it. The concrete implementation is as follows:

const list = [3, 4, 5, 8, 9, 7, 1, 3, 4, 3, 6]
const newList = []
list.forEach(item => {
    setTimeout(function () {
        newList.push(item) /* each value is pushed after value * 100 ms */
    }, item * 100)
})

Practical value: zero. Pure entertainment, hahaha ~

Afterword

In fact, these are not all the sorting algorithms there are; listing every one would take more than a single article. The rest you will need to study and explore yourself.

The importance of algorithms is self-evident. We may not use very deep algorithms in daily work, but we all come into contact with them to some degree, and for complex business problems, algorithmic thinking can often give us fresh ideas. More to the point: if you go out to interview nowadays, algorithms are a hurdle you can’t get around.

If you like discussing technology, or have any comments or suggestions on this article, you’re welcome to add Yu Tou as a WeChat friend to discuss. Of course, Yu Tou is also happy to chat about life and hobbies with you. His WeChat ID is krisChans95