We know that JavaScript objects are passed by sharing, so when dealing with complex objects we often run into side effects from mutating them: not knowing who still holds a reference to the data, and not knowing who a change will affect. So we often make a copy of an object before handing it to a handler. The most common way is to create a new copy with Object.assign() or ES6 object spread, but these are only shallow copies.

Deep copy

Deep copy implementations such as jQuery's $.extend(true, ...) actually have quite a few logical branches to deal with; cloneDeep in Lodash runs to hundreds of lines. Is there a simpler way to do this?
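To get a feel for those branches, here is a hand-rolled sketch of a recursive deep clone (a simplified illustration, not Lodash's actual code) covering plain objects, arrays, Date, RegExp, and circular references via a WeakMap:

```javascript
function cloneDeep(value, seen = new WeakMap()) {
    // primitives (and functions) are returned as-is
    if (value === null || typeof value !== 'object') return value;
    // circular reference: return the copy we already started building
    if (seen.has(value)) return seen.get(value);
    // each built-in type needs its own branch
    if (value instanceof Date) return new Date(value.getTime());
    if (value instanceof RegExp) return new RegExp(value.source, value.flags);
    const copy = Array.isArray(value) ? [] : {};
    seen.set(value, copy); // register before recursing, so cycles terminate
    for (const key of Object.keys(value)) {
        copy[key] = cloneDeep(value[key], seen);
    }
    return copy;
}

const a = { n: 1, when: new Date(0), list: [1, 2] };
a.self = a;
const b = cloneDeep(a);
console.log(b.self === b);       // true: the cycle is preserved in the copy
console.log(b.list === a.list);  // false: nested structures are new objects
```

Real implementations also have to deal with Map, Set, typed arrays, prototypes, symbol keys, and more, which is where the hundreds of lines come from.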

JSON.parse

JSON.stringify converts the object to its JSON string representation, and JSON.parse then parses that string back into a new object:

    const deepClone = (obj) => JSON.parse(JSON.stringify(obj));

This is a practical approach for most scenarios, apart from the slight performance cost of serializing and re-parsing strings (which is usually negligible). Embarrassingly, it cannot handle circular references (parent and child nodes referring to each other), and it drops functions, regexes, and so on.
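A quick demonstration of those limitations (the variable names below are just for illustration):

```javascript
const deepClone = (obj) => JSON.parse(JSON.stringify(obj));

const source = {
    when: new Date(0),          // a Date survives only as an ISO string
    greet() { return 'hi'; },   // functions are dropped entirely
    re: /abc/g,                 // a RegExp becomes an empty object
    missing: undefined,         // undefined-valued keys disappear
};
const copy = deepClone(source);
console.log(typeof copy.when);  // 'string'
console.log('greet' in copy);   // false
console.log(copy.re);           // {}

// circular references throw outright
const node = {};
node.self = node;
try {
    deepClone(node);
} catch (e) {
    console.log(e instanceof TypeError); // true
}
```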

MessageChannel

The MessageChannel interface is part of the Channel Messaging API. It lets you create a new channel and pass data through the channel's two MessagePort properties.

Using this feature, you can create a MessageChannel that sends data to one port and the other port receives it.

    function structuralClone(obj) {
        return new Promise(resolve => {
            const {port1, port2} = new MessageChannel();
            port2.onmessage = ev => resolve(ev.data);
            port1.postMessage(obj);
        });
    }
    const obj = /* ... */;
    const clone = await structuralClone(obj);

Other than being asynchronous, there's no big problem with it: it supports circular references and built-in types (Date, RegExp), and browser compatibility is fine. But it still doesn't handle functions in objects.

Similar APIs, such as the History API and Notification API, use the structured clone algorithm to transfer values.

Immutable

If you need to manipulate a complex object frequently, making a full deep copy each time is inefficient. In most cases, only one or two fields of the object are updated, leaving the rest unchanged, and copying those invariant fields is obviously unnecessary. Look at what Dan Abramov said:

The key idea behind these libraries is persistent data structures: when manipulating an object, only the changed nodes and their ancestors are cloned, and everything else is kept as-is, achieving structural sharing. For example, after the red node in the figure below is changed, only the three green nodes are created anew, and the remaining nodes stay reusable (similar to a symbolic link). This reduces the number of new nodes from the 8 a deep copy would produce down to 3.
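The idea of structural sharing can be seen even with plain object spread: copy only the nodes along the changed path, and every untouched subtree keeps its identity (a small sketch with made-up data):

```javascript
const zoo = {
    frog: '🐸',
    dog: { dog1: '🐶', dog2: '🐕' },
};

// change only `frog`: copy the root node, reuse everything else
const zoo2 = { ...zoo, frog: '👽' };

console.log(zoo2.frog);             // '👽'
console.log(zoo.frog);              // '🐸' (original untouched)
console.log(zoo2.dog === zoo.dog);  // true: the subtree is shared, not copied
```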

Immutable.js

Inside Immutable.js, an object's keys are not stored directly on a single node; instead a Trie (prefix tree) data structure is used. Immutable.js hashes each key of the object, converts the resulting hash value to binary, and then splits it into 5-bit chunks from back to front to form the key's path in the Trie.

For example, suppose we have an object called zoo:

    zoo = {
        'frog': '🐸',
        'panda': '🐼',
        'monkey': '🐒',
        'rabbit': '🐰',
        'tiger': '🐯',
        'dog': {
            'dog1': '🐶',
            'dog2': '🐕',
            // ... there are also 1 million dogs
        },
        // ... there are still 1 million fields left
    }

'frog' hashes to 3151780, which is 11 00000 00101 11101 00100 in binary; dog's hash is 11 00001 01001 11100. By these hashes, frog and dog sit in the Immutable.js Trie as follows:

Of course, the actual Trie is pruned to fit the actual object: branches without values are cut off, and nodes are not padded out to 32 children each.
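The back-to-front 5-bit split can be sketched like this (`triePath` is a made-up helper for illustration; real Immutable.js additionally handles hash collisions and node compression):

```javascript
// Split a hash into 5-bit chunks from the lowest bits up,
// then reverse so the path reads from the root of the Trie down.
function triePath(hash) {
    const path = [];
    while (hash > 0) {
        path.push(hash & 0b11111); // lowest 5 bits: index within one level (0..31)
        hash >>>= 5;
    }
    return path.reverse();
}

// 3151780 is 11 00000 00101 11101 00100 in binary
console.log(triePath(3151780)); // [3, 0, 5, 29, 4]
```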

For example, suppose one day zoo.frog needs to change from 🐸 to 👽. Only the green nodes in the figure are created anew, and the other nodes are reused directly, which is far more efficient than a deep copy producing 1 million nodes.

In general, Immutable.js is more efficient than a direct deep copy when handling large amounts of data, though for small objects the difference is minor. However, if you need to change a deeply nested field, Immutable.js is much simpler to write than Object.assign or object spread.

For example, to modify zoo.dog.dog1.name.firstName = 'haha', the two styles look like:

    // Object spread
    const zoo2 = {
        ...zoo,
        dog: {
            ...zoo.dog,
            dog1: {
                ...zoo.dog.dog1,
                name: {
                    ...zoo.dog.dog1.name,
                    firstName: 'haha'
                }
            }
        }
    }
    // Immutable.js (zoo is an Immutable Map)
    const zoo2 = zoo.updateIn(['dog', 'dog1', 'name', 'firstName'], (oldValue) => 'haha')

seamless-immutable

If you don't have a lot of data but still want the convenient updateIn syntax, you can use seamless-immutable. This library has no Trie inside; it is simply a plain object extended with 9 methods such as updateIn and merge, using Object.freeze to prevent changes to the object itself and returning a copy for each change. It feels like a stripped-down version: not as powerful as Immutable.js, but good enough in some situations.
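A minimal sketch of this freeze-and-copy style (an illustration of the approach, not seamless-immutable's actual code):

```javascript
// Recursively copy only the nodes along `path`, freeze each new node,
// and reuse every untouched subtree as-is.
function updateIn(obj, path, updater) {
    if (path.length === 0) return updater(obj);
    const [key, ...rest] = path;
    return Object.freeze({
        ...obj,
        [key]: updateIn(obj[key], rest, updater),
    });
}

const zoo = { dog: { dog1: { name: { firstName: 'x' } } }, panda: { food: '🎋' } };
const zoo2 = updateIn(zoo, ['dog', 'dog1', 'name', 'firstName'], () => 'haha');

console.log(zoo2.dog.dog1.name.firstName); // 'haha'
console.log(zoo.dog.dog1.name.firstName);  // 'x' (original untouched)
console.log(zoo2.panda === zoo.panda);     // true: untouched subtree shared
console.log(Object.isFrozen(zoo2));        // true
```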

Similar libraries include Dan Abramov's immutability-helper and updeep, which are similar in usage and implementation: methods such as updateIn are implemented via Object.assign and object spread, respectively.

Immer.js

Immer.js is a breath of fresh air:

    import produce from "immer"
    const zoo2 = produce(zoo, draft => {
        draft.dog.dog1.name.firstName = 'haha'
    })

Although it doesn't look very functional at first glance, it is simple to write: any change logic can be put inside the function passed as produce's second argument (called the producer function) without affecting the original object. Multiple fields can be changed at once inside the producer function, which is very convenient.

This dot-access style of mutation obviously hijacks data access and does something extra underneath. Many frameworks do this too, using Object.defineProperty. Immer.js is implemented with Proxy: it creates a proxy for each node accessed in the original data, redirects modifications to a copy instead of operating on the original data, and finally returns an object composed of the unmodified parts and the modified copies.

The structure of each proxy object in immer.js is as follows:

    function createState(parent, base) {
        return {
            modified: false,    // whether this node has been modified
            assigned: {},       // records which keys were changed or deleted
            finalized: false,   // whether finalization is complete
            base,               // the original data
            parent,             // the parent node
            copy: undefined,    // shallow copy of base plus proxied properties
            proxies: {},        // records which keys have been proxied
        }
    }

Calling the getter for a key of the original object returns the value of the corresponding key in the copy if that key has been modified; otherwise it creates a proxy for the child node and returns the original value. Calling the setter for a key just changes the value in the copy. On the first modification, the properties of base and proxies are shallow-copied into copy, and this recurses up through the parent attribute, making shallow copies all the way to the root node.
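The whole mechanism can be sketched in miniature (a toy `produce` for plain objects only, showing the copy-on-write idea; Immer's real implementation is considerably more involved):

```javascript
function produce(base, recipe) {
    const copies = new Map(); // original node -> its shallow copy (made on first write)

    const handler = {
        get(target, prop) {
            // reads go to the copy if this node was already written to
            const source = copies.get(target) || target;
            const value = source[prop];
            // wrap nested objects so deeper reads/writes are also intercepted
            if (typeof value === 'object' && value !== null) {
                return new Proxy(value, handler);
            }
            return value;
        },
        set(target, prop, value) {
            // first write to this node: make its shallow copy
            if (!copies.has(target)) copies.set(target, { ...target });
            copies.get(target)[prop] = value;
            return true;
        },
    };

    recipe(new Proxy(base, handler));

    // rebuild the result: replace written nodes (and their ancestors)
    // with new objects, reuse every untouched subtree as-is
    function finalize(node) {
        const copy = copies.get(node);
        let changed = Boolean(copy);
        const result = { ...(copy || node) };
        for (const key of Object.keys(result)) {
            const child = result[key];
            if (typeof child === 'object' && child !== null) {
                const finalized = finalize(child);
                if (finalized !== child) {
                    result[key] = finalized;
                    changed = true;
                }
            }
        }
        return changed ? result : node;
    }
    return finalize(base);
}

const zoo = { dog: { dog1: { name: { firstName: 'x' } }, dog2: '🐕' }, frog: '🐸' };
const zoo2 = produce(zoo, draft => {
    draft.dog.dog1.name.firstName = 'haha';
});
console.log(zoo2.dog.dog1.name.firstName); // 'haha'
console.log(zoo.dog.dog1.name.firstName);  // 'x' (original untouched)
console.log(zoo2.frog);                    // '🐸' (unchanged fields carried over)
```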


Conclusion

Immutable.js is a good choice for handling large amounts of data. For small amounts of data with deep nesting, Immer or seamless-immutable is fine, depending on which style you prefer to write. And if you want a "perfect" deep copy, use Lodash 😂.

Further reading

  1. Deep-copying in JavaScript
  2. Introducing Immer: Immutability the easy way