Preface

This series started out as my own interview preparation notes. As I organized more and more material on sorting, almost 120,000 characters in total, I finally decided to share it with everyone.

Organizing these notes for sharing took a lot of time, at least three times as long as keeping them only for myself. If you like them, please bookmark and follow me. Thank you very much!

Article links

  • Front-end interview review (1): Debounce and throttling
  • (2) Garbage collection
  • (3) Cross-origin requests and common solutions
  • (4) Front-end local storage
  • (5) Rendering mechanism, repaint, and reflow
  • (6) Browser caching
  • (7) XSS and CSRF attacks
  • (8) Front-end encryption
  • (9) HTTP and HTTPS
  • (10) Front-end authentication
  • (11) Front-end architecture patterns: MVC/MVP/MVVM
  • (12) From entering a URL to seeing the page (including the three-way handshake and four-way teardown in detail)
  • (13) Memory leaks
  • (14) Algorithms and sorting
  • (15) Event Loop

Collection of articles:

The index (a 120,000-character collection) contains the dozen or so articles written in this series so far. Newly added articles will not be linked individually in every post, so I strongly suggest you like, follow, and bookmark the collection. Thank you!

Follow-up Update Plan

Design patterns, front-end engineering, project workflow, deployment, closing the loop, frequently tested Vue topics, and other content will be added in the future. If you think the content is good, please bookmark and follow me. Thank you very much!

Looking for a referral

I am currently preparing to change jobs, and I would appreciate it if you or an HR contact could recommend a reliable front-end position in Wuhan. Email: [email protected]. Thanks!

Algorithm terms

  • Stable: if a was originally before b and a = b, then a is still before b after sorting;
  • Unstable: if a was originally before b and a = b, then a may end up after b after sorting;
  • Internal sort: all sorting operations are done in memory;
  • External sort: the data is too large to fit in memory, so it is kept on disk and sorting proceeds through data transfer between disk and memory;
  • Time complexity: how the running time of an algorithm grows with the input size.
  • Space complexity: how much memory a program needs to run.

For more on time and space complexity, check out this article: Time and Space Complexity in Detail.
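As a small illustration of stability (the names and ages below are made up for the example): a stable sort keeps the original relative order of items that compare equal.

// Hypothetical data: two people share the same age.
const people = [
  { name: 'Ann', age: 30 },
  { name: 'Bob', age: 25 },
  { name: 'Cai', age: 30 }
];

// Array.prototype.sort is required to be stable in modern engines (ES2019+),
// so Ann stays before Cai after sorting by age.
const byAge = [...people].sort((a, b) => a.age - b.age);
console.log(byAge.map(p => p.name)); // ['Bob', 'Ann', 'Cai']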

Data structures

  • Stack: an ordered collection that follows the last-in, first-out (LIFO) principle. Elements are added to and removed from the same end, called the top of the stack; the other end is called the bottom. In a stack, newer elements are near the top and older elements are near the bottom.
  • Queue: an ordered collection of items that follows the first-in, first-out (FIFO) principle, the opposite of a stack. New elements are added at the tail of the queue and removed from the head, so the most recently added element is always at the tail.
  • Linked list: stores an ordered collection of elements, but unlike an array, the elements of a linked list are not placed contiguously in memory. Each element is a node that stores the element itself plus a reference (pointer/link) to the next element.
  • Set: a collection of unordered and unique (that is, non-repeatable) items. This data structure uses the same mathematical concept as a finite set, but applied to computer science data structures.
  • Dictionary: a data structure that stores [key, value] pairs, where the key name is used to look up a specific element; similar to a JavaScript Object.
  • Hash table: a data structure accessed directly by key value. It maps a key to a position in a table in order to speed up lookups. The mapping function is called a hash function, and the array that holds the records is called a hash table.
  • Tree: a set of n (n >= 1) finite nodes with a hierarchical relationship. It is called a tree because it looks like an upside-down tree, with the root at the top and the leaves at the bottom. The relationships are essentially one-to-many, and a tree can also be seen as a special form of a graph.
  • Graph: an abstract model of a network structure; a set of nodes (vertices) connected by edges. Any binary relationship can be expressed by a graph; common examples such as road maps and social networks show many-to-many relationships.

A more detailed read can be found here: this article and this article
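As a quick sketch of the first two structures above (assuming a plain JS array is acceptable as the backing store):

// Minimal stack: push and pop both work on the top (the end of the array).
class Stack {
  constructor() { this.items = []; }
  push(item) { this.items.push(item); }      // add to the top
  pop() { return this.items.pop(); }         // remove from the top (LIFO)
  peek() { return this.items[this.items.length - 1]; }
}

// Minimal queue: enqueue at the tail, dequeue from the head.
class Queue {
  constructor() { this.items = []; }
  enqueue(item) { this.items.push(item); }   // add at the tail
  dequeue() { return this.items.shift(); }   // remove from the head (FIFO)
}

const s = new Stack(); s.push(1); s.push(2); console.log(s.pop());            // 2
const q = new Queue(); q.enqueue(1); q.enqueue(2); console.log(q.dequeue());  // 1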

Sorting comparison:

n: data size; k: number of buckets; In-place: occupies only constant extra memory; Out-place: uses additional memory.

Sorting classification:

A note on sorting algorithms

  • Never memorize the implementation code!
  • Instead, remember the algorithm animations
  • Use the animations to understand the idea behind each algorithm and how to implement it

Only then will you be able to confidently hand-write the code in front of the interviewer and explain your ideas as you write!

1. Bubble Sort

Algorithm description

Bubble sort is a simple sorting algorithm. It repeatedly goes through the sequence to be sorted, comparing two elements at a time and swapping them if they are in the wrong order. The work of visiting the sequence is repeated until no more exchanges are needed, that is, the sequence has been sorted. The algorithm gets its name from the fact that the smaller elements will slowly “float” to the top of the sequence by swapping. Hence the name bubble sort.

Algorithm steps and implementation code

Bubble sort is a basic sorting algorithm. The general idea is two nested loops. Looking at the diagram below: the outer loop runs once per element and determines how many passes are made (the last pass can actually be omitted); the inner loop determines how many pairwise comparisons are made within a single pass. Note that on each iteration of the outer loop, the number of comparisons done by the inner loop decreases by one, because the largest remaining element (the yellow block in the diagram) has already settled into its final position.

Steps:

  • Compare adjacent elements. If the first is bigger than the second, swap them.
  • Do the same for each pair of adjacent elements, from the first pair to the last pair. After this pass, the last element is the largest.
  • Repeat these steps for all the elements except the last one.
  • Keep repeating with fewer and fewer elements each time, until there are no pairs left to compare.

JS code implementation:

function bubbleSort(arr) {
    var len = arr.length;
    for (var i = 0; i < len; i++) {
        for (var j = 0; j < len - 1 - i; j++) {
            if (arr[j] > arr[j+1]) {        // Compare adjacent elements in pairs
                [arr[j], arr[j+1]] = [arr[j+1], arr[j]];  // Swap the elements via destructuring
            }
        }
    }
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(bubbleSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

For other ways to implement bubble sort (reverse order, the bidirectional variant, and so on) you can see this article; they are not elaborated here. A rough sketch of the bidirectional variant follows.
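This sketch of the bidirectional ("cocktail") variant mentioned above is my own illustration, not taken from the linked article: each pass bubbles the largest remaining element to the right end, then the smallest remaining element to the left end.

function cocktailSort(arr) {
  let left = 0, right = arr.length - 1;
  while (left < right) {
    // Forward pass: push the largest remaining element to the right end.
    for (let i = left; i < right; i++) {
      if (arr[i] > arr[i + 1]) [arr[i], arr[i + 1]] = [arr[i + 1], arr[i]];
    }
    right--;
    // Backward pass: push the smallest remaining element to the left end.
    for (let i = right; i > left; i--) {
      if (arr[i] < arr[i - 1]) [arr[i], arr[i - 1]] = [arr[i - 1], arr[i]];
    }
    left++;
  }
  return arr;
}
console.log(cocktailSort([3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48]));
// [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]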

2. Selection Sort

Algorithm description

Selection sort is a simple and intuitive sorting algorithm. It works by first finding the smallest (or largest) element in the unsorted sequence and placing it at the start of the sorted sequence. It then keeps finding the smallest (or largest) element among the remaining unsorted elements and appends it to the end of the sorted sequence, and so on until all the elements are sorted.

Selection sort is also the sorting algorithm with the most consistent performance (not to be confused with stability as defined above; the usual swap-based version is not a stable sort), because it takes O(n²) time no matter what data you feed it. So when you use it, the smaller the data set, the better. Its only real advantage is that it uses no extra memory. In theory, selection sort is probably the sort most people would think of first.

Algorithm steps and implementation code

Selection sort is also a basic sorting algorithm, and its general idea is again two nested loops. Consider the GIF below: each iteration of the outer loop fixes one value into its final position (from the left in the GIF). How many times does the outer loop run? The answer is the length of the sequence minus 1. Then comes the inner loop: it scans the remaining unsorted elements one by one to find the minimum.

Steps: a direct selection sort of n records reaches an ordered result after n-1 passes. The algorithm is described as follows:

  • 1. Initial state: the unordered region is R[1..n] and the ordered region is empty;
  • 2. At the start of the i-th pass (i = 1, 2, ..., n-1), the ordered region is R[1..i-1] and the unordered region is R[i..n]. This pass selects the record R[k] with the smallest key from the current unordered region and swaps it with R[i], the first record of the unordered region, so that R[1..i] becomes the new ordered region (one record longer) and R[i+1..n] becomes the new unordered region (one record shorter).
  • 3. After n-1 passes, the array is ordered.

JS code implementation:

function selectionSort(arr) {
    var len = arr.length;
    var minIndex;
    for (var i = 0; i < len - 1; i++) {
        minIndex = i;                         // Index of the current minimum
        for (var j = i + 1; j < len; j++) {
            if (arr[j] < arr[minIndex]) {     // Look for the smallest number
                minIndex = j;                 // Save the index of the smallest number
            }
        }
        [arr[minIndex], arr[i]] = [arr[i], arr[minIndex]];  // Swap the elements via destructuring
    }
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(selectionSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

3. Insertion Sort

Algorithm description

Insertion sort is a simple and intuitive sorting algorithm. It works by building an ordered sequence: for each unsorted element, it scans the already-sorted sequence from back to front, finds the appropriate position, and inserts the element there. Insertion sort is usually implemented in place (that is, needing only O(1) extra space), so during the back-to-front scan the sorted elements are repeatedly shifted back one position to make room for the new element.

The core of insertion sort is the card-playing analogy: imagine yourself picking up playing cards. Where the first card goes doesn't matter; when you pick up the next card and it is smaller than the first, you place it to the left; you keep picking up cards, and one that falls in the middle gets inserted in the middle. Each card picked up later is compared from back to front against the cards already in hand and inserted in place.

Algorithm steps and implementation code

Insertion sort is also a basic sorting algorithm. Following the card-playing idea, the sequence to be sorted is split into two parts: on the left is the ordered sequence (the cards in your hand), which starts out empty; on the right is the part still waiting to be sorted (the unordered cards).

With this general idea in mind, set up the loops. The outer loop decides how many cards need to be picked up; that is essentially the length of the sequence, but for convenience we can treat the first number as an already-ordered sequence by default, which saves one iteration. So the outer loop runs (length - 1) times. The inner loop then walks through the ordered sequence from right to left, comparing sizes and moving the smaller number toward the front (see the GIF).

Steps:

  • 1. Start with the first element, which can be considered sorted;
  • 2. Take the next element and scan the sorted sequence from back to front;
  • 3. If the (sorted) element is larger than the new element, move it back one position;
  • 4. Repeat step 3 until you find a sorted element that is less than or equal to the new element;
  • 5. Insert the new element at that position;
  • 6. Repeat steps 2 to 5.

JS code implementation:

function insertSort(arr) {
    for (let i = 1; i < arr.length; i++) {  // The outer loop starts at 1; arr[0] is treated as the initial ordered segment
        for (let j = i; j > 0; j--) {       // The inner loop sinks arr[j] into the ordered segment on its left
            if (arr[j] < arr[j - 1]) {
                // Keep swapping while the current number is smaller than the one before it;
                // once it isn't, break out. This differs slightly from the GIF, but the end result is the same.
                [arr[j], arr[j - 1]] = [arr[j - 1], arr[j]];
            } else {
                break;
            }
        }
    }
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(insertSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

4. Quick Sort

Algorithm description

The name quicksort says it all: as soon as you hear it, you know its reason for existing is to be fast and efficient. It is one of the fastest sorting algorithms for large data sets.

It is a recursive divide-and-conquer algorithm, often described as an improvement on bubble sort. The data is recursively partitioned into subsequences of smaller and larger elements, and the algorithm repeats this step until all the data is in order.

Note: quicksort is also one of the most common algorithm questions in interviews, and you are often asked to write it by hand.

Algorithm steps and implementation code

Quicksort is a more advanced sorting algorithm, so its structure is not plain loop nesting. The general idea is to pick a number as a reference (the pivot), put everything smaller than it on one side and everything larger on the other, and then apply the same treatment to the left and right subsequences (recursion).

Note: whenever recursion is involved, always remember to define an exit condition to break out of the recursion!

Steps:

  • 1. Pick an element from the sequence, called the "pivot";
  • 2. Reorder the sequence so that all elements smaller than the pivot are placed before it and all elements larger than the pivot are placed after it (elements equal to the pivot can go on either side). After this partition, the pivot sits in its final position. This is called the partition operation;
  • 3. Recursively apply the same steps to the subsequence of elements smaller than the pivot and to the subsequence of elements larger than the pivot.

JS code implementation:

function quickSort(arr) {
    if (arr.length <= 1) {
        return arr;  // Recursion exit
    }
    let left = [],
        right = [],
        // Here we simply take the first element of the array as the pivot.
        // PS: this is a bad choice if the array is already sorted; it is likely to hit the worst-case time complexity.
        // pivotIndex = Math.floor(arr.length / 2),
        pivot = arr[0];  // Alternative: pivot = arr.splice(pivotIndex, 1)[0];
                         // splice consumes a lot of memory on large data sets, but the approach itself still works correctly.
    for (var i = 1; i < arr.length; i++) {
        if (arr[i] < pivot) {
            left.push(arr[i]);
        } else {
            right.push(arr[i]);
        }
    }
    // concat is also not ideal for sorting large amounts of data; it consumes a lot of memory
    return quickSort(left).concat(pivot, quickSort(right));
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(quickSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

// Update: an in-place version
function partition2(arr, low, high) {
  let pivot = arr[low];
  while (low < high) {
    while (low < high && arr[high] > pivot) {
      --high;
    }
    arr[low] = arr[high];
    while (low < high && arr[low] <= pivot) {
      ++low;
    }
    arr[high] = arr[low];
  }
  arr[low] = pivot;
  return low;
}

function quickSort2(arr, low, high) {
  if (low < high) {
    let pivot = partition2(arr, low, high);
    quickSort2(arr, low, pivot - 1);
    quickSort2(arr, pivot + 1, high);
  }
  return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(quickSort2(arr, 0, arr.length - 1)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

5. Shell Sort

Algorithm description

Shell sort was invented by Donald Shell in 1959. It was the first sorting algorithm to break through O(n²), and it is an improved version of simple insertion sort. It differs from insertion sort in that it compares elements that are far apart first. Shell sort is also called diminishing-increment sort, and it is not a stable sort.

Shell sort improves on insertion sort based on the following two properties:

  • Insertion sort is highly efficient on data that is almost sorted; it can approach the efficiency of a linear-time sort.
  • But insertion sort is generally inefficient, because it moves data only one position at a time;

The basic idea of Shell sort is to divide the whole sequence of records into several subsequences and run direct insertion sort on each of them; once the records in the whole sequence are "basically ordered", one final direct insertion sort is run over all the records.

Algorithm steps and implementation code

The idea behind Shell sort is to split an array of length len into subarrays of elements spaced t1 apart and insertion-sort each of them; then split the array into subarrays with a smaller gap t2 and insertion-sort again; and so on, gradually shrinking the gap, until the gap is 1 and the final insertion sort completes the job.

You can see the following figure for easy understanding:

Steps:

  • 1. Choose an increment sequence t1, t2, ..., tk, where ti > tj for i < j, and tk = 1;
  • 2. Sort the sequence k times, once for each increment in the sequence;
  • 3. In each pass, split the sequence to be sorted into subsequences according to the current increment ti and run direct insertion sort on each subsequence. When the increment factor reaches 1, the whole sequence is treated as one table, and that final pass completes the sort.

JS code implementation:

function shellSort(arr) {
    var len = arr.length,
        temp,
        gap = 1;
    while (gap < len / 5) {          // Dynamically define the gap sequence
        gap = gap * 5 + 1;
    }
    for (gap; gap > 0; gap = Math.floor(gap / 5)) {
        for (var i = gap; i < len; i++) {
            temp = arr[i];
            for (var j = i - gap; j >= 0 && arr[j] > temp; j -= gap) {
                arr[j + gap] = arr[j];
            }
            arr[j + gap] = temp;
        }
    }
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(shellSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]


6. Merge Sort

Algorithm description

Merge sort is an efficient sorting algorithm based on Merge operations. This algorithm is a typical application of Divide and Conquer.

Merge sort is a stable sorting method. It merges already-ordered subsequences to obtain a completely ordered sequence: first make each subsequence ordered, then make the sequence of subsequence segments ordered. Merging two ordered lists into one is called a 2-way merge.

Algorithm steps and implementation code

Split the array into a left half and a right half, and keep splitting each half (recursively) until each piece contains a single element. Then merge the two smallest pieces into one ordered array by comparing their elements, and keep comparing and merging recursively until everything is finally merged into one sorted array.

Steps:

  • 1. Divide the input sequence of length n into two subsequences of length n/2;
  • 2. Sort each of the two subsequences with merge sort, recursively;
  • 3. Merge the two sorted subsequences into the final sorted sequence.

JS code implementation:

function mergeSort(arr) {  // A top-down recursive approach
    var len = arr.length;
    if (len < 2) {
        return arr;
    }
    var middle = Math.floor(len / 2),
        left = arr.slice(0, middle),
        right = arr.slice(middle);
    return merge(mergeSort(left), mergeSort(right));
}

function merge(left, right) {
    var result = [];
    while (left.length && right.length) {
        if (left[0] <= right[0]) {
            result.push(left.shift());
        } else {
            result.push(right.shift());
        }
    }
    while (left.length)
        result.push(left.shift());

    while (right.length)
        result.push(right.shift());
    return result;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(mergeSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

7. Heap Sort

Algorithm description

Heap sort is a sorting algorithm designed around the heap data structure. A heap is an almost-complete binary tree that satisfies the heap property: the key of each child node is always less than (or greater than) that of its parent. Heap sort is a kind of selection sort that uses the heap to pick elements. There are two variants:

  • Max heap (large top heap): the value of each node is greater than or equal to the values of its children; used for ascending sorts in the heap sort algorithm;
  • Min heap (small top heap): the value of each node is less than or equal to the values of its children; used for descending sorts in the heap sort algorithm;

Steps:

  • 1. Build the initial key sequence to be sorted (R1, R2, ..., Rn) into a max heap; this is the initial unordered region;
  • 2. Swap the top element R[1] with the last element R[n] to get a new unordered region (R1, R2, ..., Rn-1) and a new ordered region (Rn), where R[1..n-1] <= R[n];
  • 3. Since the new top R[1] may violate the heap property after the swap, the current unordered region (R1, R2, ..., Rn-1) has to be adjusted back into a heap. Then swap R[1] with the last element of the unordered region again, producing a new unordered region (R1, R2, ..., Rn-2) and a new ordered region (Rn-1, Rn). Repeat this process until the ordered region contains n-1 elements, at which point the sort is complete.

JS code implementation:


function buildMaxHeap(arr, len) {   // Build the max heap
    for (var i = Math.floor(len / 2); i >= 0; i--) {
        heapify(arr, i, len);
    }
}

function heapify(arr, i, len) {     // Adjust the heap
    var left = 2 * i + 1,
        right = 2 * i + 2,
        largest = i;

    if (left < len && arr[left] > arr[largest]) {
        largest = left;
    }

    if (right < len && arr[right] > arr[largest]) {
        largest = right;
    }

    if (largest != i) {
        swap(arr, i, largest);
        heapify(arr, largest, len);
    }
}

function swap(arr, i, j) {
    var temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}

function heapSort(arr) {
    var len = arr.length;
    buildMaxHeap(arr, len);

    for (var i = arr.length - 1; i > 0; i--) {
        swap(arr, 0, i);
        len--;
        heapify(arr, 0, len);
    }
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(heapSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

8. Counting Sort

Algorithm description

Counting sort is almost the only common sorting algorithm that is not based on comparisons. It was proposed by Harold H. Seward in 1954. Sorting integers within a known range takes O(n + k) time, where k is the size of the integer range, which makes it faster than any comparison-based sorting algorithm in almost all cases (except when O(k) > O(n log n), where it becomes less efficient than comparison-based sorts such as merge sort and heap sort).

The core of counting sort is to convert input data values into keys that are stored in an extra array space.

Algorithm steps and implementation code

Counting sort takes advantage of the following fact: once you know how many elements (say, m) are smaller than a given element of the array, you know that element's correct position (the (m+1)-th slot).

Steps:

  • 1. Find the maximum and minimum values of the array A to be sorted;
  • 2. Create a count array B whose length is (maximum - minimum + 1), and store in it how many times each value occurs;
  • 3. Accumulate the counts in B so that each slot stores the starting index for its value;
  • 4. Take the values from A, place them into a new array C at the corresponding indices, and return C.

Note: if array A contains objects that need to be sorted by one of their properties, the algorithm first maps A to a simple array of that property's values (call it simpleA), and then counts and accumulates based on simpleA, as above.

JS code implementation:

// The following implementation supports sorting not only of numeric sequences, but also by the value of a property of an object.
function countSort(array, keyName){
  var length = array.length,
      output = new Array(length),
      max,
      min,
      simpleArray = keyName ? array.map(function(v){
        return v[keyName];
      }) : array; // If keyName exists, then create a simple array of keyvalues

  // Get the maximum and minimum values
  max = min = simpleArray[0];
  simpleArray.forEach(function(v){
    v > max && (max = v);
    v < min && (min = v);
  });
  // Get the length of the count array
  var k = max - min + 1;
  // Create and initialize the count array
  var countArray = new Array(k);
  simpleArray.forEach(function(v){
    countArray[v - min]= (countArray[v - min] || 0) + 1;
  });
  // Add up the count to store the initial subscripts of different values
  countArray.reduce(function(prev, current, i, arr){
    arr[i] = prev;
    return prev + current;
  }, 0);
  // Select a value from the original array. (This can only be done by traversing the original array.)
  simpleArray.forEach(function(v, i){
    var j = countArray[v - min]++;
    output[j] = array[i];
  });
  return output;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(countSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

9. Bucket Sort

Algorithm description

Bucket sort is an upgraded version of counting sort. It makes use of a mapping function, and the key to its efficiency lies in choosing that mapping function well. To make bucket sort efficient, we need to do two things:

  • Use as many buckets as the available extra space allows
  • Use a mapping function that distributes the n input values evenly across the k buckets

At the same time, the choice of comparison sort used for the elements inside each bucket matters a lot for performance. Obviously, the smaller each bucket, the less data it holds and the less time sorting it takes, but the corresponding space consumption increases.

Algorithm steps and implementation code

Bucket sort works as follows: assuming the input data is uniformly distributed, divide the data into a finite number of buckets and sort each bucket separately (possibly with a different sorting algorithm, or by applying bucket sort recursively).

Steps:

  • 1. Set up an array of empty buckets;
  • 2. Iterate over the input data, placing each item into its corresponding bucket;
  • 3. Sort each bucket that is not empty;
  • 4. Concatenate the sorted data from the non-empty buckets.

Note: as with counting sort, if array A contains objects that need to be sorted by one of their properties, first map A to a simple array of that property's values and assign buckets based on those values.

JS code implementation:

function bucketSort(arr, bucketSize) {
    if (arr.length === 0) {
      return arr;
    }

    var i;
    var minValue = arr[0];
    var maxValue = arr[0];
    for (i = 1; i < arr.length; i++) {
      if (arr[i] < minValue) {
          minValue = arr[i];                // The minimum of the input data
      } else if (arr[i] > maxValue) {
          maxValue = arr[i];                // The maximum of the input data
      }
    }

    // Initialize the buckets
    var DEFAULT_BUCKET_SIZE = 5;            // Default bucket size (the range of values each bucket covers)
    bucketSize = bucketSize || DEFAULT_BUCKET_SIZE;
    var bucketCount = Math.floor((maxValue - minValue) / bucketSize) + 1;
    var buckets = new Array(bucketCount);
    for (i = 0; i < buckets.length; i++) {
        buckets[i] = [];
    }

    // Use the mapping function to distribute the data into the buckets
    for (i = 0; i < arr.length; i++) {
        buckets[Math.floor((arr[i] - minValue) / bucketSize)].push(arr[i]);
    }

    arr.length = 0;
    for (i = 0; i < buckets.length; i++) {
        insertSort(buckets[i]);             // Sort each bucket with insertion sort (the insertSort function from section 3)
        for (var j = 0; j < buckets[i].length; j++) {
            arr.push(buckets[i][j]);
        }
    }
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(bucketSort(arr)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

10. Radix Sort

Algorithm description

Radix sort is a non-comparative integer sorting algorithm. It works by splitting the integers into digits by place value and comparing the digits place by place. Because strings (such as names or dates) and floating-point numbers in particular formats can also be represented as integers, radix sort is not restricted to integers.

There are two implementations, depending on whether the highest or the lowest digit is given priority:

  • MSD (most significant digit first): sort and group by the highest key K1 first, so that within each group the key K1 is equal; then sort each group into subgroups by K2, and continue sorting and grouping on the remaining keys in the same way until the subgroups have been sorted on the least significant key kd. Finally join the groups together to obtain an ordered sequence. MSD suits sequences with many digits.
  • LSD (least significant digit first): start sorting from the lowest key kd, then sort by kd-1, and repeat until the sort by K1 yields an ordered sequence. LSD suits sequences with few digits.

Radix sort, counting sort, and bucket sort all make use of the concept of buckets, but they use the buckets in significantly different ways:

  • Radix sort: buckets are allocated according to each digit of the key value;
  • Counting sort: stores only a single key value per bucket;
  • Bucket sort: Each bucket stores a range of values;

Algorithm steps and implementation code

Steps:

  • 1. Find the largest number in the array and its number of digits;
  • 2. arr is the original array; take each digit, starting from the lowest, to form the radix array;
  • 3. Run a counting-style sort on the radix array (taking advantage of counting sort being well suited to small ranges);

JS code implementation:

/**
 * Radix sort is suitable when:
 * (1) the range of the data is small (suggested: less than 1000)
 * (2) every value is greater than or equal to 0
 * @author xiazdong
 * @param arr
 * @param maxDigit
 */
// LSD radix sort

function radixSort(arr, maxDigit) {
    var mod = 10;
    var dev = 1;
    var counter = [];
    console.time('Radix sort time');
    for (var i = 0; i < maxDigit; i++, dev *= 10, mod *= 10) {
        for (var j = 0; j < arr.length; j++) {
            var bucket = parseInt((arr[j] % mod) / dev);
            if (counter[bucket] == null) {
                counter[bucket] = [];
            }
            counter[bucket].push(arr[j]);
        }
        var pos = 0;
        for (var j = 0; j < counter.length; j++) {
            var value = null;
            if (counter[j] != null) {
                while ((value = counter[j].shift()) != null) {
                    arr[pos++] = value;
                }
            }
        }
    }
    console.timeEnd('Radix sort time');
    return arr;
}
var arr = [3, 44, 38, 5, 47, 15, 36, 26, 27, 2, 46, 4, 19, 50, 48];
console.log(radixSort(arr, 2)); // [2, 3, 4, 5, 15, 19, 26, 27, 36, 38, 44, 46, 47, 48, 50]

Binary trees and binary search trees

Trees

A tree is a non-sequential data structure, an abstract model of hierarchical data that is useful for storing data that needs to be found quickly.

The most common real-life example of a tree is a family tree, or a company’s organizational chart.

A tree structure consists of a series of nodes in which a parent-child relationship exists. Each node has a parent (except for the first node at the top) and zero or more children.

Common tree structures/attributes:

  • node
    • The root node
    • Internal node: A node that is not a root node and has child nodes
    • External node / leaf node: a node with no children
  • Subtree: a tree formed by a node together with its descendants
  • Depth: The number of nodes from node to root node
  • Height: The height of the tree is determined by the maximum of all node depths
  • Level: nodes can also be organized by level

Binary tree

A binary tree is a special kind of tree with a maximum of two child nodes, which makes it possible to write efficient inserts, deletes, and lookups. In a binary tree, the child nodes are called the left node and the right node.

Binary search tree

A binary search tree is a special type of binary tree in which smaller values are stored in left nodes and larger (or equal) values are stored in right nodes. This property makes lookups very efficient, for both numeric and non-numeric data such as letters and strings. Let's now implement a binary search tree in JS.

Nodes:

The smallest element of a binary tree is a node, so define a node first

function Node(data, left, right) {
    this.left = left;
    this.right = right;
    this.data = data;
    this.show = () => { return this.data; };
}

This is the smallest structure of a binary tree

Binary tree

function BST() {
    this.root = null // Initialize with root as null
}


When BST is initialized, there is only one root node and no data. Next, we define an insert method using the binary tree lookup rule. The basic idea of this method is:

  1. If BST.root === null, the new node becomes the root node.
  2. If BST.root !== null, compare the new value with the current node: if it is smaller, move to the left child; otherwise move to the right child; then compare again, recursively, until an empty spot is found.

This is where the recursion comes in, because you always have to put the smaller one on the left branch. In other words

The leftmost leaf node holds the smallest number, and the rightmost leaf node holds the largest number.

function insert(data) {
    var node = new Node(data, null, null);
    if (this.root === null) {
        this.root = node;
    } else {
        var current = this.root;
        var parent;
        while (true) {
            parent = current;
            if (data < current.data) {
                current = current.left;     // Go to the left subtree
                if (current === null) {     // If the left child is empty, the node can be inserted there
                    parent.left = node;
                    break;                  // Break out of the while loop
                }
            } else {
                current = current.right;
                if (current === null) {
                    parent.right = node;
                    break;
                }
            }
        }
    }
}

Here I used a loop to keep looking for the right position in the subtree. The core of both loops and recursion is finding the exit; the exit in this case is when current is null, which means we have reached an empty position where the node can be inserted.

Next, write this method to BST:

function BST() {
    this.root = null;
    this.insert = insert;
}

In this way, we can use the binary tree as our own data structure:

var bst = new BST();
bst.insert(10);
bst.insert(8);
bst.insert(2);
bst.insert(7);
bst.insert(5);

But at this point, if you want to look at the data in the tree, it's not very clear, so the next thing to do is traverse it.

Tree traversal:

According to the order of accessing the root node, tree traversal can be divided into the following three types:

  • Preorder traversal (root node -> left subtree -> right subtree)
  • In order traversal (left subtree -> root -> right subtree)
  • Postorder traversal (left subtree -> right subtree -> root node)

Preorder traversal:

Pre-order traversal visits each node before visiting its descendants. One application of pre-order traversal is printing a structured document.

function preOrder(node) {
    if (node !== null) {
        // Root -> left subtree -> right subtree
        console.log(node.show());
        preOrder(node.left);
        preOrder(node.right);
    }
}

In-order traversal:

In-order traversal visits all nodes in order from smallest to largest. One application of in-order traversal is reading a tree out in sorted order.

function inOrder(node) {
    if (node !== null) {
        // Left subtree -> root -> right subtree
        // Keep recursing down the left side while it is not null
        inOrder(node.left);
        // Print the current value once the left recursion has unwound
        console.log(node.show());
        // After the left side and the current node are done, handle the right side
        inOrder(node.right);
    }
}

Post-order traversal:

Post-order traversal visits a node's descendants first and then the node itself. One application of post-order traversal is computing the total size of all files in a directory and its subdirectories.

function postOrder(node) {
    if (node !== null) {
        // Left subtree -> right subtree -> root
        postOrder(node.left);
        postOrder(node.right);
        console.log(node.show());
    }
}

Binary tree lookup

A binary search tree is a very convenient data structure for lookups:

  • Minimum: the leaf node of the leftmost subtree (keep following left children)
  • Maximum: the leaf node of the rightmost subtree (keep following right children)
  • A specific value: compare the target with current; if it is larger than current, continue the search in current.right, and vice versa.

After you have clear ideas, start writing:

// Minimum
function getMin(bst) {
    var current = bst.root;
    while (current.left !== null) {
        current = current.left;
    }
    return current.data;
}

// Maximum
function getMax(bst) {
    var current = bst.root;
    while (current.right !== null) {
        current = current.right;
    }
    return current.data;
}

Finding the maximum and minimum is very simple; next let's look mainly at how to find a specific value:

function find(target, bst) {
    var current = bst.root;
    while (current !== null) {
        if (target === current.data) {
            return true;
        } else if (target > current.data) {
            current = current.right;
        } else if (target < current.data) {
            current = current.left;
        }
    }
    return false;  // Not found
}

But the core, again, is to keep narrowing down the search with a loop and a comparison; the idea is actually quite similar to binary search on a sorted array, sketched below.
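For comparison, here is a minimal binary search over a sorted array (my own sketch, not part of the BST code above): each comparison discards half of the remaining range, just as each comparison in find() discards one whole subtree.

function binarySearch(sortedArr, target) {
  let low = 0, high = sortedArr.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArr[mid] === target) return true;
    if (sortedArr[mid] < target) low = mid + 1;   // discard the left half
    else high = mid - 1;                          // discard the right half
  }
  return false;  // Not found
}
console.log(binarySearch([2, 3, 4, 5, 15, 19, 26], 19)); // true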

AVL tree:

An AVL tree is a self-balancing binary search tree; an AVL tree with this balancing behavior is still essentially a binary search tree (also called a binary sort tree). For any node in an AVL tree, the heights of its two subtrees differ by at most one, which means the tree tries to stay as close to a complete tree as possible when nodes are added or removed; for this reason it is also known as a height-balanced tree. Lookups, insertions, and deletions are all O(log n) in both the average and the worst case, and insertions and deletions may require one or more tree rotations to rebalance the tree.
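As a small sketch of the balance check only (my own illustration using the Node shape defined earlier; a real AVL insertion with rotations is longer): the balance factor of a node is the height difference between its left and right subtrees, and an AVL tree keeps it at -1, 0, or 1 for every node.

// Height of the subtree rooted at node (an empty subtree has height 0 here).
function height(node) {
  if (node === null) return 0;
  return 1 + Math.max(height(node.left), height(node.right));
}

// Balance factor: left height minus right height.
function balanceFactor(node) {
  return height(node.left) - height(node.right);
}

// A tree is AVL-balanced if every node's balance factor is -1, 0, or 1.
function isBalanced(node) {
  if (node === null) return true;
  return Math.abs(balanceFactor(node)) <= 1
      && isBalanced(node.left)
      && isBalanced(node.right);
}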

Red-black tree:

Red-black trees are similar to AVL trees in that they maintain the balance of a binary search tree through specific operations during insertion and deletion, in order to obtain good lookup performance. The structure is complex, but its worst-case running time is very good, and in practice it is efficient: it can do lookups, insertions, and deletions in O(log n) time, where n is the number of elements in the tree.

A red-black tree is a binary search tree in which each node has a color attribute, either red or black. In addition to the general requirements for a binary search tree, any valid red-black tree must satisfy the following additional requirements:

  • Nodes are red or black
  • The root node is black
  • Each leaf node (NIL node) is black
  • The two children of each red node are black. (There cannot be two consecutive red nodes on all paths from each leaf to root)
  • All paths from any node to each of its leaves contain the same number of black nodes

These constraints enforce a key property of red-black trees: the longest possible path from root to leaf is no more than twice as long as the shortest possible path. It turns out that the tree is roughly balanced. Because the worst-case time of operations such as insert, delete, and find a value is required to be proportional to the height of the tree, this theoretical upper limit on the height allows red-black trees to be efficient in the worst case, unlike normal binary search trees.
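As a tiny illustration of the node shape and one of the rules above (a hypothetical sketch, not a full red-black tree implementation):

// A hypothetical red-black node: a BST node plus a color flag.
function RBNode(data, left, right, color) {
  this.data = data;
  this.left = left;      // null children are treated as black NIL leaves
  this.right = right;
  this.color = color;    // 'red' or 'black'
}

// Checks just one of the rules: a red node must not have a red child.
function noConsecutiveReds(node) {
  if (node === null) return true;             // NIL leaves count as black
  if (node.color === 'red') {
    if ((node.left && node.left.color === 'red') ||
        (node.right && node.right.color === 'red')) {
      return false;
    }
  }
  return noConsecutiveReds(node.left) && noConsecutiveReds(node.right);
}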

Both red-black trees and AVL trees provide the best possible worst-case guarantees for insertion, deletion, and lookup time. This makes them valuable not only in time-sensitive applications such as real-time applications, but also as building blocks in other data structures that provide worst-case guarantees; for example, many data structures used in computational geometry can be based on red-black trees. Red-black trees are also particularly useful in functional programming, where they are one of the most commonly used persistent data structures: they are used to build associative arrays and sets that keep their previous version after mutation. Besides O(log n) time, the persistent version of a red-black tree requires O(log n) space for each insertion or deletion.

Thanks and Reference

  • JS for all the sorting algorithms you can possibly use
  • Top 10 classic sorting algorithms
  • Sort animation, you can manually control the flow
  • Front-end written test & interview pitfalls series: algorithms
  • Learn data structures and algorithms in JavaScript
  • JavaScript data structures and algorithms
  • Summary of top 10 Sorting Algorithms (JavaScript description)
  • Just a little bit of a javascript algorithm