Preface

This is my first article, a new start after a year of work. I graduated in 2019 from a humble background, watched the truly talented battle it out in campus recruiting from the sidelines (I wasn't in their league), and now work at a finance company as an outsourced developer.

Scenario

The front-end project uses the Vue stack. The business has a scenario with an input box into which the user can type or paste a large block of text in a fixed format. The text is split on format symbols (for example, ';' separates objects and ',' separates properties), processed into a collection of objects, and a preview is generated. The code looks roughly like this:

// index.vue
import { sectionSplice, contentSplice } from '@/utils/handleInput';

onInput() {
  this.loading = true;
  const temp = sectionSplice(this.text);
  this.cardList = contentSplice(temp).data;
  this.loading = false;
},
// @/utils/handleInput
export function sectionSplice(val) {
  const breakSymbol = '\n';
  const cards = val.split(breakSymbol);
  return cards.filter((item) => item !== '');
}

export function contentSplice(dataArr, cardId) {
  const splitSymbol = ', ';
  const length = dataArr.length;
  const result = {
    data: [],
    cardId,
  };
  let item = null;
  const time = new Date().getTime();
  // Truncate overly long fields to 1000 characters.
  function maxLength(text) {
    if (text && text.length > 1000) return text.substring(0, 1000);
    return text;
  }
  for (let i = 0; i < length; i++) {
    item = dataArr[i].split(splitSymbol);
    if (dataArr[i] !== '') {
      result.data.push({
        title: maxLength(item[0]),
        desc: maxLength(item.slice(1).join(splitSymbol)),
        key: time + i,
        keydef: time + i + 'keydef',
      });
    }
  }
  return result;
}

Performance bottleneck

But as the input grows and operations become more frequent, performance problems appear quickly and the page freezes. After pasting a block of 2,082,080 characters, the entire input callback takes a long time to execute, and the frequent Vue updates it triggers make the already poor performance even worse.
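
To see roughly where the time goes, one simple check is to time the synchronous work inside the input handler. This is only a sketch, assuming the onInput handler and utility functions shown above; it is not part of the original project's code.

// Rough timing sketch (illustrative): wrap the synchronous handling in
// console.time to see how long a single input event costs on a large paste.
onInput() {
  console.time('input handling');
  const temp = sectionSplice(this.text);
  this.cardList = contentSplice(temp).data;
  console.timeEnd('input handling');
},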

How to optimize

Introducing Web Workers

Since the input callback is time-consuming and blocks the execution of subsequent events, we can use a Web Worker to move this heavy work onto a new thread. Loading workers in a webpack project is tricky, though: I tried worker-loader and others, but there were too many pitfalls, and I finally settled on vue-worker. I use the this.$worker.run() method because the worker destroys itself after the method completes. The code is attached below.

// main.js
import VueWorker from 'vue-worker';
Vue.use(VueWorker);

// index.vue
onInput() {
  this.loading = true;
  const option = [this.text];
  this.workerInput = this.$worker
    .run(sectionSplice, option)
    .then((res) => {
      this.handleCards(res);
    })
    .catch((e) => console.log(e));
},
handleCards(data) {
  this.workerCards = this.$worker
    .run(contentSplice, [data])
    .then((res) => {
      this.cardList = res.data;
      this.loading = false;
    })
    .catch((e) => console.log(e));
},

One thread is not enough

In reality, though, the work was still heavy after starting a new thread; the blocking simply moved from the page's rendering thread to the new one. Inspired by React Fiber, I decided to shard the work, so the original logic became two steps:

  1. Spawn a thread to split the whole text into an array
  2. Split that array into 50 chunks, spawn a worker run for each chunk, and merge the returned results (a sketch of this step follows the list)
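
Here is a minimal sketch of step 2, assuming the contentSplice function from earlier and vue-worker's $worker.run; the chunkArray and runShards names are illustrative, not taken from the project.

// Illustrative only: split the parsed sections into ~50 chunks and process
// each chunk with contentSplice in its own worker run.
function chunkArray(arr, pieces = 50) {
  const size = Math.ceil(arr.length / pieces) || 1;
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

// Inside the component's methods:
runShards(sections) {
  const runs = chunkArray(sections).map((chunk) =>
    this.$worker.run(contentSplice, [chunk])
  );
  // Merge the partial results once every shard has finished.
  return Promise.all(runs).then((parts) =>
    parts.flatMap((part) => part.data)
  );
},

Note that Promise.all only settles once every shard has finished, which is exactly where the stale-result problem described next comes from.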

Once this is done, a new problem appears. The sharded tasks are asynchronous and cannot be cancelled (vue-worker has no terminate method), so by the time a result comes back it may belong to a stale input.

Using a proxy

After thinking about it for a while, the proxy pattern came to mind, so I designed a Cards class with four properties:

  1. SL: the number of fragments (shards) in this task
  2. count: the number of completed fragments
  3. CardId: the ID of the current operation
  4. list: the merged result

Each time the input changes, a new Cards is instantiated with an incremented operation ID. When a shard task completes, addCards is called to compare the shard's ID with the CardId of the current Cards instance; if they match, the arrays are merged and count is incremented. Once all shards have completed, the final list is returned. This solves the stale-result problem.

export default class Cards {
  constructor(id, length) {
    this.SL = length;
    this.count = 0;
    this.CardId = id;
  }
  list = [];
  // Merge one shard's result; returns the full list once every shard of the
  // current operation has reported back, otherwise an empty array.
  addCards(sid, section) {
    if (this.CardId == sid) {
      this.count++;
      this.list = this.list.concat(section);
    }
    if (this.count == this.SL) {
      return this.list;
    } else {
      return [];
    }
  }
  empty() {
    this.list = [];
  }
  get() {
    return this.list;
  }
}
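
A hypothetical usage sketch follows; the currentId counter and the chunks variable are illustrative (carried over from the sharding sketch above), not the project's exact code. Each input creates a new Cards keyed by an incremented operation ID, and every finished shard reports back through addCards.

// Hypothetical wiring of Cards with the sharded worker runs.
this.currentId += 1;
const cards = new Cards(this.currentId, chunks.length);

chunks.forEach((chunk) => {
  this.$worker.run(contentSplice, [chunk, this.currentId]).then((res) => {
    // Results from a stale operation carry a different cardId and are ignored.
    const merged = cards.addCards(res.cardId, res.data);
    if (merged.length) {
      this.cardList = merged; // all shards of the latest input are done
    }
  });
});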

Web Workers are great, but can we spawn new threads without limit?

This question matters, but I'm no computer-science graduate and searching Baidu turned up no article that explains it, so I can only try to explain it myself. It comes down to computer fundamentals. The earliest CPUs had a single core and a single thread and could only do one thing at a time. Today's consumer CPUs go up to 16 cores and 32 threads and can handle many things at once, but only as many threads as the hardware provides can truly run in parallel. Take this year's best-selling Intel i5-10400: it has 6 cores and 12 threads, so at most 12 threads execute in parallel. The CPU has a scheduling mechanism, somewhat like the JS event loop, that switches tasks onto idle threads; this switching consumes physical resources, and with too many threads the constant context switching becomes very costly. Therefore, the number of worker threads we open should not exceed the CPU's thread count.

And why does vue-worker sidestep so many of the pitfalls of using Web Workers in a Vue environment? I went to look at the vue-worker source code.

// https://github.com/israelss/vue-worker/blob/master/index.js
import SimpleWebWorker from 'simple-web-worker'
export default {
  install: function(Vue, name) {
    name = name || '$worker'
    Object.defineProperty(Vue.prototype, name, { value: SimpleWebWorker })
  }
}

It turns out vue-worker simply registers SimpleWebWorker as a Vue plugin, so I don't really need vue-worker at all. So I wrote a worker queue on top of SimpleWebWorker that limits the number of concurrently running workers to the CPU's thread count, read from window.navigator.hardwareConcurrency. If that value is unavailable, it falls back to 4, since even older machines in 2020 usually have at least 2 cores and 4 threads. The limit is not strict, because plenty of other applications also occupy threads, but it is far more reasonable than spawning new threads without bound.

import SimpleWebWorker from "simple-web-worker";

export default class WorkerQueue {
  constructor() {
    try {
      // Limit concurrency to the number of logical CPU threads.
      this.concurrency = window.navigator.hardwareConcurrency || 4;
    } catch (error) {
      console.log(
        "Set 4 Concurrency, because can't get your hardwareConcurrency."
      );
      this.concurrency = 4;
    }
    this._worker = SimpleWebWorker;
    this.workerCount = 0;
    this.queue = [];
  }
  push(fn, callback, ...args) {
    this.queue.push({ fn, callback, args });
    this.run();
  }
  run() {
    // Start queued tasks as long as there are free worker slots.
    while (this.queue.length && this.concurrency > this.workerCount) {
      this.workerCount++;
      const { fn, callback, args } = this.queue.shift();
      this._worker
        .run(fn, args)
        .then((res) => {
          callback(res);
          this.workerCount--;
          this.run();
        })
        .catch((e) => {
          throw e;
        });
    }
  }
}
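
A hypothetical usage sketch, showing how the queue replaces the direct $worker.run calls; the queue instance and the chunks, cards, and currentId variables are illustrative, carried over from the earlier sketches.

// Hypothetical wiring: shards go through the queue, so at most
// `concurrency` workers run at the same time.
const queue = new WorkerQueue();

chunks.forEach((chunk) => {
  queue.push(
    contentSplice,
    (res) => {
      const merged = cards.addCards(res.cardId, res.data);
      if (merged.length) this.cardList = merged;
    },
    chunk,
    this.currentId
  );
});

The callback style matches WorkerQueue's push(fn, callback, ...args) signature above.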

Debounce

Although moving the work onto worker threads alleviates the time consumption and blocking to some extent, frequently firing the input callback and frequently updating Vue still hurt performance, so I also limit how often the callback runs, giving the CPU a little breathing room so it can keep up.

onInput() {
  // If processing ran within the last 2 s, reset the cool-down timer
  // and skip this round, giving the CPU some breathing room.
  if (this.timer) {
    clearTimeout(this.timer);
    this.timer = setTimeout(() => { this.timer = null; }, 2000);
    return;
  }
  // Otherwise process the input and start the 2 s cool-down.
  this.timer = setTimeout(() => { this.timer = null; }, 2000);
  // ...the worker-based processing shown above runs here
},

The final result

In an extreme test, 1,278,531 characters were pasted in at once. At that size, even the browser's own textInput event becomes the most time-consuming step, while our processing does not lag. In other words, as long as the browser itself can cope with the content, our event handling will no longer be the cause of a freeze, which is good enough.

For a more ordinary large input, the same 2,082,080-character test is used to compare execution before and after optimization.

Before optimization

After optimization

Finally

The demo is available at github.com/liubon/vue-…

This is my first attempt at writing an article; please forgive its shortcomings, and corrections are welcome.