Recently we received a request for a server that simply calls one of the provided interfaces when a request comes in and returns the result. Because of the limited performance of this interface, and because no more than a certain number of requests may be in flight at once, the service needs to limit traffic.

The flow-limiting requirement is to cap the number of simultaneous executions and queue any requests beyond that number.

What happens when Koa middleware doesn't call next

The original idea was to keep a count in Koa middleware and, when more than six requests were in flight, cache the next function. When an ongoing task ended, we would call next to let a waiting request continue.

It turns out, however, that in Koa middleware, not executing the next function does not pause the request; it simply skips the remaining middleware and returns the response directly.

const Koa = require('koa');
const app = new Koa();
app.use((ctx, next) => {
    console.log('middleware 1');
    setTimeout(() => {
        next();
    }, 3000);
    ctx.body = 'hello';
});
app.use((ctx, next) => {
    console.log('middleware 2');
});
app.listen(8989);

The above code first logs 'middleware 1' to the console => the browser receives 'hello' => the console logs 'middleware 2'.

Another thing to note: after a request has finished, its next method can still be called, and the remaining middleware will still run (but the changes to ctx won't take effect, because the response has already been sent). A closed (aborted) request behaves the same way.

Use await to make the request wait

Delaying the execution of the next function therefore does not achieve our goal. The natural next step is to use await to make the current request wait: we await a function that returns a Promise, queue the Promise's resolve function, and call it later.

const Koa = require('koa');
const app = new Koa();
const queue = [];
app.use(async (ctx, next) => {
    setTimeout(() => {
        queue.shift()();
    }, 3000);
    await delay();
    ctx.body = 'hello';
});
function delay() {
    return new Promise((resolve, reject) => {
        queue.push(resolve);
    });
}
app.listen(8989);

In this code, the delay function returns a Promise whose resolve function is pushed onto a queue. After a 3-second timer, the resolve function at the head of the queue is called, letting that request continue.
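The resolve-queue trick can be seen in isolation, without Koa. In this sketch (the names waiter and demo are illustrative, not from the service), three callers block on delay() and are released in FIFO order:

```javascript
// Minimal standalone sketch of the resolve-queue trick: each waiter awaits
// a Promise whose resolve function is pushed onto a queue; calling
// queue.shift()() releases waiters in first-in-first-out order.
const queue = [];

function delay() {
    return new Promise((resolve) => {
        queue.push(resolve);
    });
}

async function waiter(name, log) {
    await delay();
    log.push(name);
}

async function demo() {
    const log = [];
    // All three waiters park themselves in the queue.
    const done = Promise.all([waiter('a', log), waiter('b', log), waiter('c', log)]);
    // Release them one by one, oldest first.
    while (queue.length) queue.shift()();
    await done;
    return log; // FIFO order: ['a', 'b', 'c']
}
```

Because resolvers are shifted from the front of the queue, requests are served in arrival order rather than at random.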

Traffic limiting for routes or requests?

Once the basic principle of limiting traffic is implemented, the next question is where to put the limiting code. Broadly, there are two candidate positions:

Traffic limiting for interfaces

In our requirements, traffic limiting is needed because of the limited performance of the interface being requested, so we can limit the flow for this request alone:

async function requestSomeApi() {
    // Too many requests in flight: wait in the queue
    if (counter >= maxAllowedRequest) {
        await delay();
    }
    counter++;
    const result = await request('http://some.api');
    counter--;
    if (queue.length) {
        queue.shift()();
    }
    return result;
}

Here is a reusable version:

function limitWrapper(func, maxAllowedRequest) {
    const queue = [];
    let counter = 0; // must be let, since it is incremented and decremented
    return async function () {
        if (counter >= maxAllowedRequest) {
            await new Promise((resolve) => {
                queue.push(resolve);
            });
        }
        counter++;
        const result = await func();
        counter--;
        if (queue.length) {
            queue.shift()();
        }
        return result;
    };
}
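To check that the wrapper really caps concurrency, here is a runnable sketch. The wrapper is reproduced self-contained (with counter declared as let so it can be incremented, and a guard before queue.shift()); fakeApi, inFlight, and peak are made-up names for the demo, not part of the real service:

```javascript
// Reusable limiter: at most maxAllowedRequest calls to func run at once;
// extra callers wait in a FIFO queue of Promise resolvers.
function limitWrapper(func, maxAllowedRequest) {
    const queue = [];
    let counter = 0;
    return async function (...args) {
        if (counter >= maxAllowedRequest) {
            await new Promise((resolve) => queue.push(resolve));
        }
        counter++;
        try {
            return await func(...args);
        } finally {
            counter--;
            if (queue.length) queue.shift()(); // wake the oldest waiter
        }
    };
}

// Fake "interface" that records how many calls are in flight.
let inFlight = 0;
let peak = 0;
function fakeApi() {
    inFlight++;
    peak = Math.max(peak, inFlight);
    return new Promise((resolve) =>
        setTimeout(() => {
            inFlight--;
            resolve('ok');
        }, 10)
    );
}

const limited = limitWrapper(fakeApi, 2);

async function main() {
    await Promise.all(Array.from({ length: 6 }, () => limited()));
    return peak; // peak never exceeds the limit of 2
}
```

The try/finally ensures the slot is released and the next waiter is woken even if func throws, which the article's version would miss.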

Traffic limiting is implemented for routes

This is done by writing a Koa middleware and limiting traffic inside it:

async function limiter(ctx, next) {
    // Too many requests in flight: wait in the queue
    if (counter >= maxAllowedRequest) {
        await new Promise((resolve, reject) => {
            queue.push(resolve);
        });
    }
    counter++;
    await next();
    counter--;
    if (queue.length) {
        queue.shift()();
    }
}

Then use this middleware in the router for different routes:

router.use('/api', limiter);

Comparison

Limiting traffic at the interface level felt logically messy, so we switched to limiting at the route level, and everything ran perfectly.

Until another requirement arrived: make three calls to this interface and return an array of the three results. Now there is a problem: we can't call the interface function directly, because traffic limiting is implemented per route. So what do we do? We have to request that route, that is, the service has to send requests to itself…

Something to watch out for

  • Listen for the close event to remove the request from the queue

Requests already waiting in the queue may be cancelled by the user. As mentioned earlier, even if a request is cancelled, Koa will still run the remaining middleware later, which means the rate-limited interface would still be called, wasting a slot.

We can handle this by listening for the close event, which requires giving each queued request a hash:

ctx.res.on('close', () => {
    const index = queue.findIndex(item => item.hash === hash);
    if (index > -1) {
        queue.splice(index, 1);
    }
});
  • Setting timeout

To prevent users from waiting too long, you need to set a timeout, which is easy to do in Koa:

const server = app.listen(config.port);
server.timeout = DEFAULT_TIMEOUT;
  • Handling an overly long queue

If the queue is already too long, a newly queued request will time out before it is ever served. So we also need to reject requests when the queue is too long:

if (queue.length > maxAllowedRequest) {
    ctx.body = 'error message';
    return;
}