An introduction to Protobuf

Since information about protobuf interaction is scattered across the web, I have compiled notes on protobuf interaction between the front end and the back end for reference.

Google Protocol Buffers, or Protobuf for short, provides a flexible, efficient, automated mechanism for serializing structured data. It is comparable to XML, but smaller, faster, and simpler. You only need to define the data format you want once, and then the Protobuf compiler can automatically generate source code in various languages for reading and writing data in that format. It is language-independent and platform-independent, and you can extend an existing data format without breaking code that relies on the old format.

Front-end and back-end interaction

The front end and back end exchange information in binary form. Both sides share a file with the .proto suffix, which also serves as the interface document they communicate through.

The protobuf file format

The following is a demo protobuf file, test.proto. The structure of the file is quite straightforward.

enum FOO {
  BAR = 1;
}

message Test {
  required float num  = 1;
  required string payload = 2;
  optional string payloads = 3;
}

message AnotherOne {
  repeated FOO list = 1;
}
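To make "smaller, faster, and simpler than XML" concrete, here is a hand-encoding sketch of the protobuf wire format for the `payload` field of the `Test` message above. This is only an illustration of the byte layout in plain Node; in practice you let the generated code do the encoding.

```javascript
// Protobuf wire format: each field is prefixed with a tag byte,
// tag = (field_number << 3) | wire_type.
// payload is field 2 with wire type 2 (length-delimited) -> (2 << 3) | 2 = 0x12.
const payload = Buffer.from('hi', 'utf8');
const encoded = Buffer.concat([
  Buffer.from([0x12, payload.length]), // tag byte + length byte
  payload                              // the UTF-8 string bytes
]);
console.log(encoded); // <Buffer 12 02 68 69>
```

Four bytes for a two-character string field, versus something like `<payload>hi</payload>` in XML.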

Install the protobuf environment on the front and back ends

Using Node as an example, install bufferhelper and protocol-buffers to parse protobuf files:

npm install bufferhelper
npm install protocol-buffers

The front end also requires an environment that can compile protobuf. On macOS, use brew to install it; this requires Homebrew to be installed first (look up the Homebrew installation steps yourself). Setting up the environment on Windows is a bit different; search for it yourself.

brew install protobuf

Test whether the proto environment is installed; if a version number is printed, it is installed.

protoc --version

The front end needs to compile the proto file before the front and back ends can interact. test.proto is the proto file shared by the front and back ends, and it must be compiled to a JS file before use. Go to the proto directory of the Node project and run the following command to generate the test_pb.js file; the JS code then only needs to load this file. The same has to be done on the front end, because the front end and back end are separated into two projects, so both projects need to run the compilation.

protoc --js_out=import_style=commonjs,binary:. test.proto

The back end sends data to the front end

The back end fills in the fields defined in the proto file and passes them to the front end. The back end encodes the message into a protobuf binary, which must be JSON-encoded before being sent; otherwise the front end receives garbled text. The front end requests the following route.

app.get('/proto/get', function (req, res) {
  let protobuf = require('protocol-buffers')
  let messages = protobuf(fs.readFileSync('./proto/test.proto'))
  let buf = messages.Test.encode({
    num: 42,
    payload: 'Hello World Node JS and Javahhh',
    payloads: ' '
  })
  console.log(buf) // prints the binary stream
  res.send(JSON.stringify(buf)) // JSON-encode before sending; otherwise the browser tries to parse the bytes as text
})
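This JSON step is also why the front end later reads the bytes from `res.data.data`: when a Node `Buffer` goes through `JSON.stringify`, it serializes as an object whose `data` property holds the byte array. A minimal sketch:

```javascript
// A Node Buffer JSON-serializes via Buffer#toJSON as
// { "type": "Buffer", "data": [ ...bytes ] }.
const buf = Buffer.from([8, 42]);
const json = JSON.stringify(buf);
console.log(json); // {"type":"Buffer","data":[8,42]}

// On the receiving side, the byte array is under .data:
const parsed = JSON.parse(json);
console.log(parsed.data); // [ 8, 42 ]
```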

The front end receives the binary stream by importing the compiled test_pb.js file and the protobufjs plug-in.

  import awesome from '../../proto/test_pb.js'

The front end uses axios to request the /proto/get route; res.data in the callback function is the back end's return value. Perform the following operations. The message3 that is printed is the parsed message.

 axios({
      method: 'get',
      url: '/proto/get',
      headers: { 'contentType': 'application/x-protobuf' }
    }).then(res => {
      let message3 = awesome.Test.deserializeBinary(res.data.data)
      let nums = message3.getNum()
      console.log(nums) // nums = 42, the value set on the back end
      let pm = awesome.Test.deserializeBinary(res.data.data)
      let protoBuf = pm.toObject()
      console.log('protoBuf: ', protoBuf)
    }).catch((error) => { console.log(error) })

The front end sends data to the back end

The front end fills in the fields of the proto message, serializes them to binary, and sends them to the back end. First import the dependency files.

  import awesome from '../../proto/test_pb.js'
  import protobuf from 'protobufjs'
    let message = new awesome.Test() // instantiate the Test message
    message.setNum(23)
    message.setPayload('asd')
    // serialize
    let bytes = message.serializeBinary() // byte stream
    let blob = new Blob([bytes], { type: 'buffer' });
    axios({
      method: 'post',
      url: '/proto/send',
      data: blob,
      headers: {
        'Content-Type': 'application/octet-stream' // set according to what the back end expects
      }
    }).then(res => { console.log(res) }).catch((error) => { console.log(error) })

The back end needs to receive the file; import the dependency first.

let BufferHelper = require('bufferhelper');

Code to receive the byte stream:

app.post('/proto/send', function (req, res) {
  let bufferHelper = new BufferHelper();
  req.on("data", function (chunk) {
    bufferHelper.concat(chunk);
  });
  req.on('end', function () {
    let protobuf = require('protocol-buffers')
    let buffer = bufferHelper.toBuffer();
    console.log(buffer) // this is already binary
    let message3 = awesome.Test.deserializeBinary(buffer)
    console.log(message3.getNum()) // prints the 23 sent from the front end
    let pm = awesome.Test.deserializeBinary(buffer)
    let protoBuf = pm.toObject()
    console.log(protoBuf) // { num: 23, payload: 'asd', payloads: 'asds' }
    console.log(protoBuf.num) // prints 23
  });
})
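For reference, what bufferhelper is doing here can be sketched with plain Node `Buffer.concat`: collect the request's data chunks, then join them into one Buffer. The chunk values below are illustrative, not the library's actual internals.

```javascript
// Accumulate incoming chunks (as the 'data' handler above does) ...
const chunks = [];
chunks.push(Buffer.from([0x0d, 0x00]));       // first chunk of the byte stream
chunks.push(Buffer.from([0x00, 0x28, 0x42])); // second chunk
// ... then join them into a single Buffer (as toBuffer() does).
const whole = Buffer.concat(chunks);
console.log(whole.length); // 5
```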

So that’s protobuf’s front-end and back-end interaction. Please point out any errors.