A concurrency interview question

Some time ago, while reading interview-question articles on Juejin (Nuggets), I came across a very interesting problem. It goes as follows:

Design a task queue that takes a number as the maximum number of concurrently running tasks (asynchronous tasks, Promises, etc.). Tasks can be enqueued at any time; when the number of running tasks reaches the limit, new tasks wait until a current task completes, then execute following the same rules.

This looked like a very practical programming problem. We genuinely need a concurrency controller to limit the number of HTTP requests we send, especially in large applications that fire tens or hundreds of requests at a time: too many established TCP connections can block other resources. With concurrency control in place, resources can be scheduled sensibly. In fact, the browser itself applies exactly this kind of control, limiting the number of requests each page can have in flight so the page does not freeze.

So, no sooner said than done: I started working it out myself.

Deriving it from test cases

First, I'll try to derive it from a use case. Imagine we have the following asynchronous functions:

let [a, b, c, d, e] = (function () {
    const result = []
    for (let i = 0; i < 5; i++) {
        let timeLimit = 1000
        let fn = function () {
            return new Promise((resolve) => {
                setTimeout(() => {
                    console.log(i)
                    resolve(i)
                }, timeLimit)
            })
        }
        result.push(fn)
    }
    return result
}())

Assume these functions are to be run as asynchronous tasks, and that there is a taskControll function taking two arguments. The first is a queue, to which we can pass the array of functions [a, b, c, d, e]; the second is a number specifying the maximum number of tasks allowed to run at once.

Version 1: A simple taskControll

To start, let's roll out a simple version of taskControll. It receives a list of tasks and executes at most the first max of them; each time a task completes, its then handler restores one slot toward the maximum and calls run again to continue with the remaining tasks.

function taskControll(list, max) {
    const run = () => {
        while (max) {
            max--
            const task = list.shift()
            task && task().then(() => {
                max++
                run()
            })
        }
    }
    run()
}

taskControll([a,b,c,d], 2)

Running results: the tasks execute two at a time. This is a simple concurrency control that satisfies the interview question: at most max tasks run at once, and as each finishes a new one starts.
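To double-check this behavior, here is a small instrumented demo (my own addition, not part of the original question): each task records how many tasks are in flight, so we can confirm the peak never exceeds the limit.

```javascript
// Instrumented demo (hypothetical): verify taskControll never runs
// more than `max` tasks at once.
function taskControll(list, max) {
    const run = () => {
        while (max) {
            max--
            const task = list.shift()
            task && task().then(() => {
                max++
                run()
            })
        }
    }
    run()
}

let active = 0   // tasks currently running
let peak = 0     // highest concurrency observed
const order = [] // completion order

const makeTask = (id) => () => new Promise((resolve) => {
    active++
    peak = Math.max(peak, active)
    setTimeout(() => {
        active--
        order.push(id)
        resolve(id)
    }, 10)
})

taskControll([0, 1, 2, 3].map(makeTask), 2)
setTimeout(() => console.log('peak concurrency:', peak), 100) // prints "peak concurrency: 2"
```

With a limit of 2, tasks 0 and 1 start immediately, and 2 and 3 only start after a slot frees up, so the peak stays at 2.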

Version 2: A TaskPool concurrency pool

taskControll has a drawback: all tasks must be passed in at call time, so there is no way for other code to add a task to it later.

Thinking a bit further: shouldn't we build a class that can be created once and then used anywhere? Since this concurrency control is clearly a pool-like (or queue-like) structure, I wrote the following:

class TaskPool {
    constructor (list, max) {
        this.list = list
        this.max = max
        this.run()
    }
    add (task) {
        this.list.push(task)
        this.run()
    }
    run () {
        while (this.max && this.list.length > 0) {
            const task = this.list.shift()
            if (task) {
                this.max--
                task().then(() => {
                    this.max++
                    this.run()
                })
            }
        }
    }
}
let taskPool = new TaskPool([
    a,b,c
], 2)
taskPool.run()
taskPool.add(d)
taskPool.add(e)

Running results: TaskPool adds an add method, which can be called to append a task to the current list. Whenever a task joins, run is called again; this covers the case where all earlier tasks have already completed and a later task would otherwise never be started.

Optimization: adding a callback

Thinking further: how do I get notified when the tasks I put into TaskPool are all complete? We could certainly attach a callback to each individual task in advance, but what if we want to know when the whole current queue is done? Here's my solution:

class TaskPool {
    constructor (list, max, callback) {
        this.list = list
        this.max = max
        this.len = this.list.length // total number of tasks, updated on add
        this.callback = callback
        this.runIdx = 0             // index of the next task to start
        this.run()
    }
    add (task) {
        this.list.push(task)
        this.len = this.list.length
        this.run()
    }
    run () {
        while (this.max && this.runIdx < this.len) {
            const task = this.list[this.runIdx]
            this.runIdx++
            if (task) {
                this.max--
                task().then(() => {
                    this.max++
                    // if the finished task is the last one in the queue, notify
                    if (this.list.indexOf(task) === this.len - 1) {
                        this.callback()
                    } else {
                        this.run()
                    }
                })
            }
        }
    }
}
function callback() {
    console.log('callback')
}
let taskPool = new TaskPool([
    a,b,c
], 2, callback)
taskPool.run()
taskPool.add(d)
taskPool.add(e)

Running results:

The added runIdx lets tasks execute in order without removing them from the list, this.len is updated in real time on every add, and the callback is invoked when the last function in the queue finishes executing.
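One caveat: the indexOf check fires the callback when the task sitting at position len - 1 resolves, which is not necessarily the last task to *finish* (an earlier, slower task may still be running). A common alternative, sketched below as my own variation rather than the article's code, is to count pending tasks and fire the callback only when both the queue and the pending count are drained:

```javascript
// Variant sketch (not the article's code): detect "all done" by
// counting in-flight tasks instead of comparing list positions.
class TaskPool {
    constructor (list, max, callback) {
        this.list = list.slice()
        this.max = max
        this.pending = 0 // tasks currently running
        this.callback = callback
        this.run()
    }
    add (task) {
        this.list.push(task)
        this.run()
    }
    run () {
        while (this.max > 0 && this.list.length > 0) {
            const task = this.list.shift()
            this.max--
            this.pending++
            task().then(() => {
                this.max++
                this.pending--
                // the pool is idle only when nothing is queued or running
                if (this.list.length === 0 && this.pending === 0) {
                    this.callback && this.callback()
                } else {
                    this.run()
                }
            })
        }
    }
}

// Usage: task 0 is deliberately the slowest, so it finishes last
// even though it sits first in the queue.
const order = []
let done = false
const mk = (i, ms) => () => new Promise((resolve) =>
    setTimeout(() => { order.push(i); resolve(i) }, ms))
new TaskPool([mk(0, 30), mk(1, 10), mk(2, 10)], 2, () => { done = true })
```

Here the callback fires after task 0 resolves, because it is the last to complete, regardless of queue position.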

Summary

Concurrency control is part of front-end infrastructure; we usually rely on the browser's defaults without noticing it. In this article we implemented a simple concurrency pool together, using the completion of asynchronous tasks to drive the scheduling. If you find any problems in the code, corrections are welcome.