1. Background of the requirement

First, some context: our project uses WebSocket for communication between the front end and the back end, and uses the ProtoBuf protocol for data serialization, which keeps packets small and greatly improves performance when large amounts of data are transmitted. If you're interested, it's worth digging into on your own; I won't expand on it here.

When using protobuf, the front end and the back end must both hold the corresponding message definitions. When the front end sends data to the back end, the corresponding message body is used to encode it; when data is received, the corresponding message body is used to decode it. This back-and-forth means a lot of repetitive work on the front end (Ctrl+C, Ctrl+V the back end's message body file over to the front end, then add the matching encode/decode configuration, and so on…).
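To make that encode/decode round trip concrete, here is a minimal sketch using protobufjs (the library the tooling later uses to load proto files). The WebSocket URL and the payload values are placeholders for illustration only; the message shape follows the a.proto example shown later.

```ts
// Minimal sketch: encode a plain object with a proto message body before
// sending it over WebSocket, and decode incoming binary frames back into
// plain objects. URL and values are placeholders.
import * as protobuf from 'protobufjs';

const source = `
  syntax = "proto3";
  package Test;
  message CTest {
    string market = 1;
    string stkcode = 2;
    double price = 3;
    int32 qty = 4;
  }
`;

const { root } = protobuf.parse(source, { keepCase: true });
const CTest = root.lookupType('Test.CTest');

const ws = new WebSocket('wss://example.com/ws'); // placeholder URL
ws.binaryType = 'arraybuffer';

// Front end -> back end: encode before sending
ws.onopen = () => {
  const payload = { market: 'SZ', stkcode: '000001', price: 10.5, qty: 100 };
  const buffer = CTest.encode(CTest.create(payload)).finish(); // Uint8Array
  ws.send(buffer);
};

// Back end -> front end: decode on receipt
ws.onmessage = (event) => {
  const message = CTest.decode(new Uint8Array(event.data as ArrayBuffer));
  console.log(CTest.toObject(message));
};
```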

Early in an iteration, the back end usually only hands over a few interfaces for a feature, so the front end doesn't feel the pain: take what the back end gives you and copy it straight in. Then, in the last iteration, the back end gave me 17 interfaces at once. I had to Ctrl+C, Ctrl+V 17 times and configure all of it without missing a single thing. What the…?!

Oh well, making a living isn't easy; life goes on and the code still has to be written. But I can't keep doing it this way, it's just too dumb.

I looked at the config file, then at the message body, then back at the code…

Ding! I had an idea!!!

I've got it! Why not automate the code generation, so I can just lie back while it does the work? Well, damn.

2. Main content

OK, let's get into the main content.

First, an analysis of the existing code structure.

Let's take a look at the message body first:

```protobuf
// @filename a.proto
syntax = "proto3";
package Test;

message CTest {
  string market = 1;                // This is a remark test 1
  string stkcode = 2;
  double price = 3;                 // This is a remark test 3
  int32 qty = 4;                    // This is a remark test 4
  repeated Test.DTest account = 5;  // This is a remark test 5
  ...
}

message DTest {
  string id = 1;
  double hold_amt = 2;  // This is an inline remark test 2
  ...
}
...
```

Then there are the configuration files:

```js
// @filename a.js
const str = `
  message CTest {
    string market = 1;
    ...
  }
  message DTest {
    string id = 1;
    ...
  }
`;

module.exports = {
  type: 'CTest',
  str,
};

// @filename parse.ts
const a = require('./a');
...
class Parser {
  ...
  private method({ xx }) {
    ...
    return balabalabala;
  }

  private PROTO_MAPPING = {
    [APICONF.XXX]: this.method(a),
    [APICONF.YYY]: this.method(b),
    [APICONF.ZZZ]: this.method(c),
    ...
  };
  ...
}
```

OK, after looking at the files above, I realized that the code structure is always the same and only the content differs. Based on that, I can generate these files from a.proto. And since our project is written in TypeScript, and a proto message body already looks a lot like an interface definition, I can also generate the interface definition files.
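To illustrate what I mean, here is the kind of interface the generator should produce for the CTest message above. This one is hand-written for illustration, not tool output: proto number types collapse to `number`, `repeated` fields become arrays, and the field comments become JSDoc remarks.

```ts
// Target shape of the generated type file for CTest (illustrative only)
export default interface CTest {
  /** This is a remark test 1 */
  market: string;
  stkcode: string;
  /** This is a remark test 3 */
  price: number;
  /** This is a remark test 4 */
  qty: number;
  /** This is a remark test 5 */
  account: Array<DTest>;
}

interface DTest {
  id: string;
  /** This is an inline remark test 2 */
  hold_amt: number;
}
```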

Ding ding!!! Since the interface definition file can be generated, I can also generate the API request file; since API requests can be generated automatically, I can also generate the interface documentation automatically; since the interface documentation is generated automatically, I can… you get the idea.

Hahahaha!

So, how to do it?

  • First, from the a.js file we can see that each interface needs a specific message body: APICONF.XXX uses CTest for encoding. So we can add a special comment above the corresponding message body and then match it with a regex, which makes it easy to find the message body for each interface. In our project an interface is also called a function, and each function has its own function id (funcid); different functions may use the same function number.

  • Add the comments. Here I strongly recommend VSCode's custom code snippets feature: generating the comment boilerplate with a snippet is extremely convenient. Let's look at what the comment looks like (the comment is maintained by the back end):

```protobuf
// @filename a.proto
...
// ======= ∨∨∨ special comment ∨∨∨ =======
/**
 * @funcid 12345
 * @desc here is an interface
 */
// ======= ∧∧∧ special comment ∧∧∧ =======
message CTest {
  string market = 1;                // This is a remark test 1
  string stkcode = 2;
  double price = 3;                 // This is a remark test 3
  int32 qty = 4;                    // This is a remark test 4
  repeated Test.DTest account = 5;  // This is a remark test 5
  ...
}

message DTest {
  string id = 1;
  double hold_amt = 2;  // This is an inline remark test 2
  ...
}
...
```
  • Plan out the concrete implementation flow (flow chart)

  • Start writing the code

First, take a look at the general execution flow

```js
// @filename index.js
...
// First, create the output directories
createDir(['./proto', './type', './conf', './request']);
...
// Collect the paths of all proto files
getAllFilePath()
    // Batch-convert weak imports before parsing
    .then(async list => {
        await batchChangeWeakImport(list);
        return list;
    })
    // Batch-read all the files
    .then(list => {
        return batchReadFile(list);
    })
    // Get the function id cache, the message bodies used by each function id,
    // and the cache of all message bodies
    .then(async data => {
        const { funCache, funcMsgCache, allMsgCache } = data;
        const list = Object.values(funCache);
        for (let i = 0; i < list.length; i++) {
            const main = list[i];
            const mainMsg = getMessage(main.name, funcMsgCache);
            const msgChildList = recursionCache(mainMsg, funcMsgCache);
            if (!mainMsg) {
                console.log(`Message body not found: ${main}!`);
            } else {
                ...
                // Once the message body is read, write proto.js, type.ts and request.ts
                await writeProtoFile(mainMsg, msgChildList);
                await writeTypeFile(mainMsg, msgChildList, allMsgCache);
                await writeRequestFile(mainMsg);
            }
        }
        // Write type/index.ts as the unified export
        await writeTypeIndexFile(list);
        // Write the configuration file conf.ts
        await writeConfFile(Object.values(funCache));
        return data;
    })
    .then(() => {
        console.log('Task completed');
    })
    .then(() => {
        // Get the prettier executable path
        const prettierPath = path.resolve(__dirname, '../../../node_modules/prettier/bin-prettier.js');
        // protoFilePath: the path of the generated files (placeholder)
        exec(`node ${prettierPath} --write ${protoFilePath}`, (err, _code) => {
            if (err) {
                console.log(err);
                return;
            }
            console.log('Formatting succeeded');
        });
    });
```
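createDir and getAllFilePath are not shown in the script above; a minimal sketch of what they might look like, assuming all .proto sources sit in a single directory (here called ./source, which is an assumption), would be:

```ts
import * as fs from 'fs';
import * as path from 'path';

// Create the output directories if they don't exist yet
function createDir(dirs: string[]): void {
  for (const dir of dirs) {
    const full = path.resolve(__dirname, dir);
    if (!fs.existsSync(full)) fs.mkdirSync(full, { recursive: true });
  }
}

// Collect the absolute paths of all .proto files in the (assumed) source directory
function getAllFilePath(sourceDir = path.resolve(__dirname, './source')): Promise<string[]> {
  return new Promise((resolve, reject) => {
    fs.readdir(sourceDir, (err, files) => {
      if (err) return reject(err);
      resolve(
        files
          .filter(file => path.extname(file) === '.proto')
          .map(file => path.join(sourceDir, file))
      );
    });
  });
}
```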
The detailed implementation of each function:
```js
async function batchReadFile(list) {
    let funCache = Object.create({});
    let funcMsgCache = Object.create({});
    let allMsgCache = Object.create({});
    await list.reduce(async (last, next) => {
        const result = await last;
        if (result) {
            funCache = Object.assign(funCache, result.funcMsgCache);
            funcMsgCache = Object.assign(funcMsgCache, result.nested);
            allMsgCache = Object.assign(allMsgCache, result.allMsgCache);
        }
        return await readSignFile(next);
    }, readSignFile(list[0]));
    return {
        funcMsgCache,
        funCache,
        allMsgCache,
    };
}

function readSignFile(filePath) {
    return new Promise(resolve => {
        fs.readFile(filePath, (__err, originCode) => {
            const code = iconv.decode(originCode, 'utf-8');
            // Match the special comment plus the message body that follows it
            const GET_FUNC_MESSAGE_REG = /@funcid\s+(?<funcid>.+)??[\r|\n][\s|\S]+?@desc\s+(?<desc>.+)??[\r|\n][\s|\S]+?message\s+(?<name>(\w+))[\s|\S]+?{/g;
            let funcMsg = GET_FUNC_MESSAGE_REG.exec(code);
            // Match every message body
            const GET_ALL_MESSAGE_REG = /message (?<name>(\w+))[\s|\S]+?{(?<content>[\s|\S]+?)}/g;
            let allMsg = GET_ALL_MESSAGE_REG.exec(code);
            while (funcMsg !== null) {
                ...
                funcMsg = GET_FUNC_MESSAGE_REG.exec(code);
            }
            while (allMsg !== null) {
                ...
                allMsg = GET_ALL_MESSAGE_REG.exec(code);
            }
            // A proto file may import message bodies from other files that the
            // regex can't see, so also load it with protobufjs (root.load)
            root.load(filePath, { keepCase: true }, (err, bufCode) => {
                ...
                if (bufCode) {
                    resolve({
                        nested: bufCode.nested, // nested message bodies
                        funcMsgCache,
                        allMsgCache,
                    });
                } else {
                    resolve();
                }
            });
        });
    });
}
```
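The elided while-loop bodies essentially just copy the regex's named groups into the caches. The exact cache shape below is my guess, but it would look something like this:

```ts
// Sketch of the elided loop bodies inside readSignFile (cache shape assumed)
let funcMsg = GET_FUNC_MESSAGE_REG.exec(code);
while (funcMsg !== null) {
  const { funcid, desc, name } = funcMsg.groups!;
  // One entry per special comment: function id, description, and the
  // name of the message body that follows the comment
  funcMsgCache[name] = { funcid, desc, name };
  funcMsg = GET_FUNC_MESSAGE_REG.exec(code);
}

let allMsg = GET_ALL_MESSAGE_REG.exec(code);
while (allMsg !== null) {
  const { name, content } = allMsg.groups!;
  // Keep the raw body text so field comments can be recovered later
  allMsgCache[name] = content;
  allMsg = GET_ALL_MESSAGE_REG.exec(code);
}
```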
The other write functions are pretty much the same, so I'll only show writeProtoFile here:
```js
function writeProtoFile(main, childlist) {
    return new Promise(async resolve => {
        const fileName = `./proto/${main.name}.js`;
        const filePath = path.resolve(__dirname, fileName);
        await write(filePath, baseProtoModule(main, childlist));
        resolve();
    });
}
```
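The write() helper used by writeProtoFile isn't shown above either; it can be as simple as a promisified fs.writeFile, something like:

```ts
import * as fs from 'fs';

// Minimal write() helper: dump the generated content to disk and resolve
// once the file has been written
function write(filePath: string, content: string): Promise<void> {
  return new Promise((resolve, reject) => {
    fs.writeFile(filePath, content, 'utf-8', err => {
      if (err) return reject(err);
      resolve();
    });
  });
}
```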
Template files:
```js
// @filename commonModule.js
const { NEED_CHANGE_NUMBER } = require('../resource');

// int32, int64, double, etc. are all converted to `number`
function isNeedChangeNumber(str) {
    if (NEED_CHANGE_NUMBER.includes(str)) return 'number';
    return str;
}

// proto message body template
function baseProtoModule(main, childList) {
    const result = writeContent(main, childList, 'message', false);
    return `
        const str = \`
            ${result}
        \`;
        module.exports = {
            type: '${main.name}',
            str,
        }
    `;
}

// Basic type (interface) template
function baseTypeModule(main, childList, allMsgCache) {
    return `
        /** Function id: ${main.use || main.funcid} */
        ${writeContent(main, childList, 'interface', allMsgCache, true)}
    `;
}

function writeContent(main, childList, type, allMsgCache = {}, needExport = false) {
    ...
    return `
        ${messageContent(main, type, allMsgCache, parentExportStr ? 'export default' : '')}
        ${childStr}
    `;
}

function messageContent(message, type, allMsgCache, prefix = '') {
    ...
    const mainContent = (fields, split) => `
        ${prefix} ${type} ${message.name} {
            ${createMessageContent(fields, split, allMsgCache)}
        }
    `;
    const enumContent = (message, split) => `
        ${prefix} enum ${message.name} {
            ${createEnumContent(message, split)}
        }
    `;
    if (message.fieldsArray) {
        return mainContent(message.fieldsArray, splitComman);
    } else {
        // No fields: probably a special type, e.g. an enum
        return enumContent(message, splitComman);
    }
}

function createMessageContent(list, split, allMsgCache = {}) {
    let str = '';
    for (let i = 0; i < list.length; i++) {
        const item = list[i];
        const type = item.type.indexOf('.') >= 0 ? item.type.split('.')[1] : item.type;
        if (split == ',') {
            const msgContent = allMsgCache[item.parent.name];
            let note = '';
            if (msgContent) {
                const reg = new RegExp(`${item.name}[\\s|\\S]+?;(?<note>[\\s|\\S]*?\\n)`);
                const getNote = reg.exec(msgContent);
                note = `/** ${getNote && getNote.groups.note.replace(/\/\/|\n|\s/g, '')} */`;
            }
            str += ` ${note} ${item.name}: ${item.rule === 'repeated' ? `Array<${type}>` : isNeedChangeNumber(type)}; \n`;
        } else if (split == ';') {
            str += `${item.rule || ''} ${type} ${item.name} = ${item.id}${split} \n`;
        }
    }
    return str;
}
...
```
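For reference, running baseProtoModule over the CTest example would emit a module with the same shape as the hand-maintained a.js shown earlier, roughly:

```ts
// Illustrative output of baseProtoModule for CTest
const str = `
  message CTest {
    string market = 1;
    string stkcode = 2;
    double price = 3;
    int32 qty = 4;
    repeated DTest account = 5;
  }
  message DTest {
    string id = 1;
    double hold_amt = 2;
  }
`;

module.exports = {
  type: 'CTest',
  str,
};
```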
```js
// @filename confModule.js
function baseConfModule(apiList) {
    let str = '';
    let importStr = '';
    let hadRequire = {};
    for (let i = 0; i < apiList.length; i++) {
        const apiItem = apiList[i];
        // Some message bodies may be named like Test.EMsg, so the realName was
        // extracted from the file during parsing and injected into apiList
        if (!hadRequire[apiItem.realName]) {
            hadRequire[apiItem.realName] = true;
            // Auto-generated import
            importStr += `const ${apiItem.realName.toLocaleLowerCase()} = require('../proto/${apiItem.realName}.js');\n`;
        }
        // Generate the configuration items here, consistent with the PROTO_MAPPING
        // structure in parse.ts, so the generated conf can be dropped straight
        // into the export with zero manual wiring
        str += `[FUNAPI.${apiItem.name.toLocaleUpperCase()}]: method(${apiItem.realName.toLocaleLowerCase()}),\n`;
    }
    return `
        import FUNAPI from './FUNAPI';
        const protobuf = require('../../protocol-method');
        ${importStr}
        function method(xxx) {
            return protobuf(xxx);
        }
        export default {
            ${str}
        }
    `;
}
```
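Fed with the same example, baseConfModule would emit a conf.ts along these lines (the FUNAPI.CTEST key is illustrative; the real keys come from the function names extracted during parsing):

```ts
// Illustrative output of baseConfModule
import FUNAPI from './FUNAPI';
const protobuf = require('../../protocol-method');
const ctest = require('../proto/CTest.js');

function method(xxx: unknown) {
  return protobuf(xxx);
}

export default {
  [FUNAPI.CTEST]: method(ctest),
};
```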
  • Results

Add `"protoAutoTask": "node xxx/index.js"` to the `scripts` field of package.json, then run `npm run protoAutoTask`; the result is shown below.

Let's take a look at the generated files.

Partial screenshot of the generated proto message body definition file (proto/*.js)

Partial screenshot of the generated interface type definitions (type/*.ts)

Partial screenshot of the generated function configuration (funConf.ts)

Partial screenshot of a generated API request file (*.ts)

Partial screenshot of the generated docs

Ta-da! Done! That's the general flow of the automated code generation tool.

4. Conclusion

Why write this up? In the years since graduation I haven't taken the time to write proper summaries, and as a result a lot of things quietly get forgotten.

This is my first article since graduation and my first article of 2021. I hope to take this opportunity to improve my writing, learn to summarize, learn to share, and make progress together with everyone.

This is a newbie's first article, so please feel free to offer suggestions!

Topics that may be coming up include:

  • Scheduled synchronization of front-end and back-end code based on Jenkins
  • Webpack lazy loading: source code reading, principle and implementation
  • Implementing hot updates for Electron
  • Standardizing the code commit flow with Git hooks
  • Writing custom ESLint rules tailored to your business requirements
  • .