Original article: softwareontheroad.com/ideal-nodej… More articles: the “Nodejs Technology Stack” public account and the open source project “www.nodejs.red/”

Express.js is an excellent framework for developing a Node.js REST API, but it gives you no clue about how to organize your Node.js project.

As silly as it sounds, this is a problem.

Properly organizing the Node.js project structure will avoid duplication of code and improve service stability and scalability.

This article is based on my years of experience dealing with bad Node.js project structures, bad design patterns, and countless hours of code refactoring.

If you need help with your Node.js project architecture, just drop me a line at [email protected].

Table of contents

  • Directory structure 🏢
  • Three-tier architecture 🥪
  • The service layer 💼
  • The Pub/Sub layer 🎙️
  • Dependency injection 💉
  • Unit tests 🕵🏻
  • Cron jobs and repetitive tasks ⚡
  • Configuration and secrets 🤫
  • Loaders 🏗️

Directory structure 🏢

This is the Node.js project structure I’m talking about.

I’ve used the following structure in every Node.js REST API service I’ve built; let’s look at what each component does.

src
│   app.js          # App entry point
└───api             # Express route controllers for all the endpoints of the app
└───config          # Environment variables and configuration related stuff
└───jobs            # Task scheduling definitions for agenda.js
└───loaders         # Split the startup process into modules
└───models          # Database models
└───services        # All the business logic should be here
└───subscribers     # Event handlers for asynchronous tasks
└───types           # TypeScript type declarations (d.ts)

This is not just a way to organize JavaScript files…

Three-tier architecture 🥪

The idea is to use the separation of concerns principle to move the business logic away from the Node.js API routes.

Because one day you’ll want to use your business logic from a CLI tool, or from a recurring task, and making an API call from the Node.js server to itself is obviously not a good idea.

☠️ Do not put your business logic in the controllers!! ☠️

You might be tempted to use the Express.js controllers layer to store your application’s business logic, but your code quickly becomes unmaintainable, and whenever you need to write unit tests you end up writing complex mocks of the Express.js req and res objects.

Determining when a response should be sent and when processing should continue “in the background” (for example, after the response has been sent to the client) is a complicated matter.

route.post('/', async (req, res, next) => {

    // This should be a middleware or should be handled by a library like Joi
    // Joi is a data validation library: github.com/hapijs/joi
    const userDTO = req.body;
    const isUserValid = validators.user(userDTO);
    if (!isUserValid) {
      return res.status(400).end();
    }

    // Lots of business logic...
    const userRecord = await UserModel.create(userDTO);
    delete userRecord.password;
    delete userRecord.salt;
    const companyRecord = await CompanyModel.create(userRecord);
    const companyDashboard = await CompanyDashboard.create(userRecord, companyRecord);
    // ...whatever...

    // This is the "optimization" that messed everything up.
    // The response is sent to the client...
    res.json({ user: userRecord, company: companyRecord });

    // but this code block is still executed :(
    const salaryRecord = await SalaryModel.create(userRecord, companyRecord);
    eventTracker.track('user_signup', userRecord, companyRecord, salaryRecord);
    intercom.createUser(userRecord);
    gaAnalytics.event('user_signup', userRecord);
    await EmailService.startSignupSequence(userRecord);
  });

Use a service layer for your business logic 💼

This layer is where you put your business logic.

Following the SOLID principles that apply to Node.js, it is simply a collection of classes with a clear purpose.

No SQL queries of any kind should live in this layer; use the data access layer for that.

  • Move your code away from the Express.js router.
  • Do not pass the req or res objects to the service layer.
  • Do not return anything related to the HTTP transport layer, such as status codes or headers, from the service layer.

Example

route.post('/', 
    validators.userSignup, // this middleware takes care of data validation
    async (req, res, next) => {
      // the routing layer does just the bare minimum
      const userDTO = req.body;

      // call the service layer
      // abstractions for the data access and business logic layers
      const { user, company } = await UserService.Signup(userDTO);

      // return a response to the client
      return res.json({ user, company });
    });

And here is how your service works behind the scenes.

import UserModel from '../models/user';
import CompanyModel from '../models/company';

export default class UserService {

    async Signup(user) {
        const userRecord = await UserModel.create(user);
        const companyRecord = await CompanyModel.create(userRecord); // needs userRecord to have the database id
        const salaryRecord = await SalaryModel.create(userRecord, companyRecord); // depends on user and company to be created

        // ...whatever
        await EmailService.startSignupSequence(userRecord);

        // ...do more stuff
        return { user: userRecord, company: companyRecord };
    }
}

Publish and subscribe 🎙️

The Pub/Sub pattern goes beyond the classic three-tier architecture presented here, but it is extremely useful.

Here the simple Node.js API endpoint creates a user, but perhaps it also needs to invoke a third-party service, maybe an analytics service, or maybe start an email sequence.

Before long, that simple “create” operation ends up doing several things, and you find yourself with 1,000 lines of code in a single function.

That violates the single responsibility principle.

Therefore, it is best to divide responsibilities from the start to keep your code maintainable.

import UserModel from '../models/user';
  import CompanyModel from '../models/company';
  import SalaryModel from '../models/salary';

  export default class UserService {

    async Signup(user) {
      const userRecord = await UserModel.create(user);
      const companyRecord = await CompanyModel.create(user);
      const salaryRecord = await SalaryModel.create(user, salary);

      eventTracker.track(
        'user_signup',
        userRecord,
        companyRecord,
        salaryRecord
      );

      intercom.createUser(
        userRecord
      );

      gaAnalytics.event(
        'user_signup',
        userRecord
      );

      await EmailService.startSignupSequence(userRecord);

      // ...more stuff

      return { user: userRecord, company: companyRecord };
    }
  }

Calling every dependent service imperatively like this is not the best approach.

A better way is to emit an event, for example “user_signup”. At that point the service’s job is done, and the rest is up to the event listeners.

import UserModel from '../models/user';
  import CompanyModel from '../models/company';
  import SalaryModel from '../models/salary';

  export default class UserService {

    async Signup(user) {
      const userRecord = await this.userModel.create(user);
      const companyRecord = await this.companyModel.create(user);
      this.eventEmitter.emit('user_signup', { user: userRecord, company: companyRecord });
      return userRecord;
    }

  }

Now you can split the event handler/listener into multiple files.

eventEmitter.on('user_signup', ({ user, company }) => {

    eventTracker.track(
        'user_signup',
        user,
        company,
    );

    intercom.createUser(
        user
    );

    gaAnalytics.event(
        'user_signup',
        user
    );
})
eventEmitter.on('user_signup', async ({ user, company }) => {
    const salaryRecord = await SalaryModel.create(user, company);
})
eventEmitter.on('user_signup', async ({ user, company }) => {
    await EmailService.startSignupSequence(user)
})

You can either wrap the await statements in a try-catch block, or let them fail and handle the unhandled rejections via process.on('unhandledRejection', cb).
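For instance, here is a minimal sketch of both options, reusing the SalaryModel from the snippets above and assuming a plain Node.js EventEmitter as the event bus:

import { EventEmitter } from 'events';
import SalaryModel from '../models/salary';

const eventEmitter = new EventEmitter();

// Option 1: handle the failure locally inside the subscriber
eventEmitter.on('user_signup', async ({ user, company }) => {
  try {
    await SalaryModel.create(user, company);
  } catch (err) {
    // log and decide whether to retry, enqueue, or drop the work
    console.error('Failed to create salary record', err);
  }
});

// Option 2: let the promise reject and handle everything in one place
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled promise rejection', reason);
});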

Dependency injection 💉

DI, or Inversion of Control (IoC), is a common pattern that helps organize code: a class or function receives its dependencies “injected”, typically through its constructor.

In this way, you have the flexibility to inject “compatible dependencies,” for example, when you write unit tests for a service, or when you use a service in another context.

Code without DI

import UserModel from '../models/user';
import CompanyModel from '../models/company';
import SalaryModel from '../models/salary';

class UserService {
    constructor(){}
    Signup(){
        // Calling UserModel, CompanyModel, etc.
    }
}

Code with manual dependency injection

export default class UserService {
    constructor(userModel, companyModel, salaryModel){
        this.userModel = userModel;
        this.companyModel = companyModel;
        this.salaryModel = salaryModel;
    }
    getMyUser(userId){
        // models available through 'this'
        const user = this.userModel.findById(userId);
        return user;
    }
}

Now you can inject custom dependencies.

import UserService from '../services/user';
import UserModel from '../models/user';
import CompanyModel from '../models/company';

const salaryModelMock = {
  calculateNetSalary(){
    return 42;
  }
}

const userServiceInstance = new UserService(UserModel, CompanyModel, salaryModelMock);
const user = await userServiceInstance.getMyUser('12346');

The number of dependencies a service can have is unbounded, and refactoring every instantiation of it whenever you add a new one is a tedious and error-prone task. That is why dependency injection frameworks were created.

The idea is to declare your dependencies in the class, and when you need an instance of that class you simply call the “Service Locator”.

Let’s see an example using TypeDI, an npm library that brings DI to Node.js.

More information about TypeDI can be found on its website:

www.github.com/typestack/t…

TypeScript example

import { Service } from 'typedi';

@Service()
export default class UserService {
    constructor(
        private userModel,
        private companyModel, 
        private salaryModel
    ){}

    getMyUser(userId){
        const user = this.userModel.findById(userId);
        return user;
    }
}

services/user.ts

TypeDI will now take care of resolving any dependencies required by the UserService.

import { Container } from 'typedi';
import UserService from '../services/user';

const userServiceInstance = Container.get(UserService);
const user = await userServiceInstance.getMyUser('12346');

Abusing service locator calls is an anti-pattern
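To make the difference concrete, here is a rough sketch (the ReportService classes are hypothetical, not part of the project above): reaching into the container from inside business logic hides the dependency, while resolving it once at the edge and injecting it keeps it explicit and testable.

import { Container } from 'typedi';
import UserService from '../services/user';

// Anti-pattern: the dependency is hidden inside the method and hard to mock in tests
class ReportService {
  generate(userId) {
    const userService = Container.get(UserService); // service located deep in the logic
    return userService.getMyUser(userId);
  }
}

// Preferred: resolve at the edge (route handler or loader) and pass the dependency in
class BetterReportService {
  constructor(userService) {
    this.userService = userService;
  }
  generate(userId) {
    return this.userService.getMyUser(userId);
  }
}

const reportService = new BetterReportService(Container.get(UserService));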

Using dependency injection with Express.js

Using DI in Express.js is the final piece of the Node.js project architecture puzzle.

The routing layer

route.post('/', async (req, res, next) => {
        const userDTO = req.body;

        const userServiceInstance = Container.get(UserService) // Service locator

        const { user, company } = await userServiceInstance.Signup(userDTO);

        return res.json({ user, company });
    });

Awesome, the project is looking great! It’s so organized that it makes me want to write some code right now.

Unit test example 🕵🏻

By using dependency injection and these organizational patterns, unit testing becomes very simple.

You don’t have to mock req/res objects or require(…) calls.

Example: unit tests for user registration methods

tests/unit/services/user.js

import UserService from '../../../src/services/user';

  describe('User service unit tests', () => {
    describe('Signup', () => {
      test('Should create user record and emit user_signup event', async () => {
        const eventEmitterService = {
          emit: jest.fn(),
        };

        const userModel = {
          create: (user) => {
            return {
              ...user,
              _id: 'mock-user-id',
            };
          },
        };

        const companyModel = {
          create: (user) => {
            return {
              owner: user._id,
              companyTaxId: '12345',
            };
          },
        };

        const userInput = {
          fullname: 'User Unit Test',
          email: '[email protected]',
        };

        const userService = new UserService(userModel, companyModel, eventEmitterService);
        const userRecord = await userService.Signup(userInput);

        expect(userRecord).toBeDefined();
        expect(userRecord._id).toBeDefined();
        expect(eventEmitterService.emit).toBeCalled();
      });
    })
  })

Cron Jobs and repetitive tasks ⚡

Now that the business logic is encapsulated in the service layer, it is much easier to use it from a cron job.

Instead of relying on Node.js setTimeout or other primitive ways of delaying code execution, rely on a framework that persists your jobs, and their executions, in a database.

That way you keep control over failed jobs and get feedback on the ones that succeed. I wrote about the best Node.js task managers here: softwareontheroad.com/nodejs-scal…
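For example, with agenda.js (which the jobs/ folder in the structure above is meant for), a job definition can be a thin wrapper that delegates to the service layer. This is only a rough sketch: the 'send-daily-digest' job name and the EmailService.sendDailyDigest method are made up for illustration, and the MongoDB address is taken from the config object shown in the next section.

import Agenda from 'agenda';
import config from '../config';
import EmailService from '../services/email'; // hypothetical mailing service

// Persist jobs in MongoDB so failed and successful runs can be inspected later
const agenda = new Agenda({ db: { address: config.databaseURL, collection: 'agendaJobs' } });

// The job definition stays thin; the real work lives in the service layer
agenda.define('send-daily-digest', async (job) => {
  await EmailService.sendDailyDigest(job.attrs.data);
});

export default async () => {
  await agenda.start();
  await agenda.every('24 hours', 'send-daily-digest');
};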

Configuration and secrets 🤫

Following the battle-tested concepts of the Twelve-Factor App (12factor.net/) for Node.js, the best practice for storing API keys and database connection strings is to use dotenv.

Put a .env file in place, which must never be committed (but has to exist in the repository with default values); the dotenv npm package then loads the .env file and writes its variables into Node.js’s process.env object.

That would be enough, but I like to add an extra step: have a config/index.ts file where the dotenv npm package loads the .env file, and then use an object to store the variables, so we get structure and code completion.

config/index.js

const dotenv = require('dotenv');

  // config() will read your .env file, parse its contents, and assign it to process.env
  dotenv.config();

  export default {
    port: process.env.PORT,
    databaseURL: process.env.DATABASE_URI,
    paypal: {
      publicKey: process.env.PAYPAL_PUBLIC_KEY,
      secretKey: process.env.PAYPAL_SECRET_KEY,
    },
    mailchimp: {
      apiKey: process.env.MAILCHIMP_API_KEY,
      sender: process.env.MAILCHIMP_SENDER,
    }
  }

This way you avoid flooding your code with process.env.MY_RANDOM_VAR statements, and thanks to autocompletion you don’t have to remember how the environment variables are named.
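As a small usage sketch, assuming the config file above lives at src/config, any other module can now import the object instead of reading process.env directly:

import config from './config';
import express from 'express';

const app = express();

// No process.env scattered around the codebase; everything comes from the config object
app.listen(config.port, () => {
  console.log(`Server listening on port ${config.port}`);
});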

Loaders 🏗️

I took this pattern from the W3Tech microframework, but without depending on their package.

The idea is to split the startup process of Node.js into testable modules.

Let’s look at a classic Express.js application initialization:

const mongoose = require('mongoose');
  const express = require('express');
  const bodyParser = require('body-parser');
  const session = require('express-session');
  const cors = require('cors');
  const errorhandler = require('errorhandler');
  const app = express();

  app.get('/status', (req, res) => { res.status(200).end(); });
  app.head('/status', (req, res) => { res.status(200).end(); });
  app.use(cors());
  app.use(require('morgan')('dev'));
  app.use(bodyParser.urlencoded({ extended: false }));
  app.use(bodyParser.json(setupForStripeWebhooks));
  app.use(require('method-override')());
  app.use(express.static(__dirname + '/public'));
  app.use(session({ secret: process.env.SECRET, cookie: { maxAge: 60000 }, resave: false, saveUninitialized: false }));
  mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true });

  require('./config/passport');
  require('./models/user');
  require('./models/company');
  app.use(require('./routes'));
  app.use((req, res, next) => {
    var err = new Error('Not Found');
    err.status = 404;
    next(err);
  });
  app.use((err, req, res) => {
    res.status(err.status || 500);
    res.json({'errors': {
      message: err.message,
      error: {}
    }});
  });

  // ... more stuff
  // ... maybe start up Redis
  // ... maybe add more middlewares

  async function startServer() {    
    app.listen(process.env.PORT, err => {
      if (err) {
        console.log(err);
        return;
      }
      console.log(`Your server is ready !`);
    });
  }

  // Run the async function to start our server
  startServer();

As you can see, this part of the application can be a real mess.

Here is an effective way to deal with it.

const loaders = require('./loaders');
const express = require('express');

async function startServer() {

  const app = express();

  await loaders.init({ expressApp: app });

  app.listen(process.env.PORT, err => {
    if (err) {
      console.log(err);
      return;
    }
    console.log(`Your server is ready ! `);
  });
}

startServer();

Now the loaders are just tiny files with a narrow, testable purpose.

loaders/index.js

import expressLoader from './express';
  import mongooseLoader from './mongoose';

  export default async ({ expressApp }) => {
    const mongoConnection = await mongooseLoader();
    console.log('MongoDB Initialized');
    await expressLoader({ app: expressApp });
    console.log('Express Initialized');

    // ... more loaders can be here

    // ... Initialize agenda
    // ... or Redis, or whatever you want
  }
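One of those extra loaders could, for instance, register shared dependencies into the typedi container used in the dependency injection section. A rough sketch, where the file name and the string tokens are just illustrative:

import { Container } from 'typedi';
import UserModel from '../models/user';
import CompanyModel from '../models/company';

// loaders/dependencyInjector.js (hypothetical file name)
export default () => {
  // Register the models under string tokens so services can ask the container for them
  Container.set('userModel', UserModel);
  Container.set('companyModel', CompanyModel);
};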

The express loader

loaders/express.js

import * as express from 'express';
import * as bodyParser from 'body-parser';
import * as cors from 'cors';

export default async ({ app }: { app: express.Application }) => {

    app.get('/status', (req, res) => { res.status(200).end(); });
    app.head('/status', (req, res) => { res.status(200).end(); });
    app.enable('trust proxy');

    app.use(cors());
    app.use(require('morgan')('dev'));
    app.use(bodyParser.urlencoded({ extended: false }));

    // ... more middlewares

    // Return the express app
    return app;
}

The mongo loader

loaders/mongoose.js

import * as mongoose from 'mongoose';

export default async (): Promise<any> => {
    const connection = await mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true });
    return connection.connection.db;
}

The code above is available in the example repository: github.com/santiq/bull…

Conclusion

We took a deep dive into the production-tested Node.js project structure, and here are some tips to summarize:

  • Use a three-tier architecture.
  • Do not put your business logic in the Express.js controllers.
  • Use the Pub/Sub pattern and emit events for background tasks.
  • Use dependency injection for peace of mind.
  • Never leak your passwords, secrets, and API keys; use a configuration manager.
  • Split your Node.js server configuration into small modules that can be loaded independently.