
Preface

In the last part, we learned some advanced uses of arrays, such as nulling out arrays, generating random data and sequences, emptying arrays, and making shallow copies. Today we continue our wonderful journey through arrays.

Everyone, let's dive in.

Array deduplication

Using a Set for array deduplication is very popular:

var arr = ["apple", "pear", 1, 1, 3, 3, undefined, {a: 1}];

var arr2 = Array.from(new Set(arr));  // ["apple", "pear", 1, 3, undefined, {...}]

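Incidentally, the spread operator gives an equivalent one-liner:

var arr2 = [...new Set(arr)]; // ["apple", "pear", 1, 3, undefined, {...}]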

At first glance, no problem at all. Nice.

But for reference types there's a big problem: what exactly counts as a duplicate?


function uniqueArray(arr){
    return Array.from(new Set(arr))
}


var arr = [{a:1}, {a:1}];
console.log(uniqueArray(arr)); // [{a:1}, {a:1}]


var obj1 = {a:1}
var arr2 = [obj1, obj1];
console.log(uniqueArray(arr2));  // [{a:1}]

As you can see from the example, Array.from(new Set(arr)) deduplicates objects that share the same reference, but not objects that merely have the same keys and values, such as data returned from an API.
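One blunt workaround, before we get to the cleaner key-based approach below, is to compare items by their serialized form. This is just a sketch (uniqueByValue is my own name for it), and note the caveats: JSON.stringify is key-order-sensitive and drops undefined values and functions, so it only suits simple, consistently shaped data.

// Sketch: deduplicate by serialized value.
// Caveat: JSON.stringify is order-sensitive ({a:1, b:2} vs {b:2, a:1})
// and skips undefined values and functions.
function uniqueByValue(arr){
    const seen = new Set();
    return arr.filter(item => {
        const serialized = JSON.stringify(item);
        if(seen.has(serialized)){
            return false;
        }
        seen.add(serialized);
        return true;
    });
}

uniqueByValue([{a: 1}, {a: 1}, {a: 2}]); // [{a: 1}, {a: 2}]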

At this point some front-end developers will say this is the back end's job. I just smile slightly. Charming, isn't it? Let's handle it ourselves.

Here's a version that filters by a unique key with Array.prototype.filter:

function uniqueArray(arr = [], key){
    const keyValues = new Set();
    let val;
    return arr.filter(obj => {
        val = obj[key];
        if(keyValues.has(val)){
            return false;
        }
        keyValues.add(val);
        return true;
    });
}

var arr = [{a: 1}, {a: 1}];

console.log(uniqueArray(arr, "a"));  // [{a: 1}]


The idea above is to filter on a single unique key. Good, looks good. But what if you need several values together to identify a unique item?

Dear reader, based on the code above, you can. See the sketch below.
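A minimal sketch that joins several field values into a compound key; uniqueArrayByKeys and the "|" separator are my own choices, so pick a separator that cannot occur in your data:

function uniqueArrayByKeys(arr = [], keys = []){
    const seen = new Set();
    return arr.filter(obj => {
        // Join the values of all identifying keys into one compound key
        const compound = keys.map(key => obj[key]).join("|");
        if(seen.has(compound)){
            return false;
        }
        seen.add(compound);
        return true;
    });
}

var arr = [{a: 1, b: 2}, {a: 1, b: 2}, {a: 1, b: 3}];
console.log(uniqueArrayByKeys(arr, ["a", "b"])); // [{a: 1, b: 2}, {a: 1, b: 3}]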

Array intersection

Here’s what’s common on the web:

const arr1 = [0, 1, 2];
const arr2 = [3, 2, 0];
const values = [...new Set(arr1)].filter(item => arr2.includes(item));
console.log(values);  // [0, 2]

Now, let's abstract it into a function:

function intersectSet(arr1, arr2){
    return [...new Set(arr1)].filter(item => arr2.includes(item));
}

intersectSet(arr1, arr2); // [0, 2]

At first glance, there are two problems:

  1. Reference types: items with the same keys and values are not treated as equal.
  2. Performance: every includes call scans arr2 from the start, so the whole thing costs O(n * m).

Let's improve it, assuming the items are all valid objects:


// Reference types
function intersect(arr1, arr2, key){
    const map = new Map();
    arr1.forEach(val => map.set(val[key]));

    return arr2.filter(val => {
        return map.has(val[key]);
    });
}

// Primitive types
function intersectBase(arr1, arr2){
    const map = new Map();
    arr1.forEach(val => map.set(val));

    return arr2.filter(val => {
        return map.has(val);
    });
}

var arr1 = [{p: 0}, {p: 1}, {p: 2}];
var arr2 = [{p: 3}, {p: 2}, {p: 1}];
intersect(arr1, arr2, "p"); // [{p: 2}, {p: 1}]

var arr1 = [0, 1, 2];
var arr2 = [3, 2, 0];
intersectBase(arr1, arr2); // [2, 0]

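A small design note: intersectBase stores keys in the Map but never any values, so a Set expresses the same intent with the same O(1) has() lookup. A minimal variant (intersectBaseSet is my own naming):

// Equivalent variant using Set instead of Map
function intersectBaseSet(arr1, arr2){
    const set = new Set(arr1);
    return arr2.filter(val => set.has(val));
}

intersectBaseSet([0, 1, 2], [3, 2, 0]); // [2, 0]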

Now, some of you might ask: does this really perform better? Let's put it to the test.

function createData(length){
    return Array.from({length}, (val, i) => {
        // ~~ (double bitwise NOT) truncates the float to an integer
        return ~~(Math.random() * length);
    });
}

var data1 = createData(10000);
var data2 = createData(10000);

console.time("intersectSet");
intersectSet(data1, data2);
console.timeEnd("intersectSet");


console.time("intersectBase");
intersectBase(data1, data2);
console.timeEnd("intersectBase");
Array length    intersectSet (ms)    intersectBase (ms)
1000            0.6                  0.2
10000           51.2                 2.7
100000          4973.8               22.1

As you can see, the larger the data set, the worse the Set + Array.prototype.includes approach performs compared with the Map version: includes rescans arr2 for every item, making the intersection O(n * m), while the Map lookup keeps it at O(n + m).

Summary

Today we learned how to deduplicate and intersect arrays. Did you pick up anything new today?