What is RPC?

Remote Procedure Call (RPC) means calling a procedure that runs on another computer rather than on the local machine. The programmer does not need to care that the call goes over the network; it reads just like calling a local function.

Sounds fancy, but let's start with a simple sum example:

function sum(a, b) {
	return a + b
}

As a client, it does not know the logic of sum. It only needs to pass a and b to the server, and the server returns the result.
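
In code, the call site barely changes — the only difference is where the logic runs. A minimal sketch; rpcClient here is a made-up stand-in for whatever RPC library is used:

// Local call: the sum logic runs in this process.
const local = sum(1, 2); // 3

// Remote call: the sum logic runs on a server. The hypothetical rpcClient
// packs up a and b, sends them over the network, and hands back the result.
rpcClient.sum(1, 2).then((remote) => {
  console.log(remote); // also 3, but computed on another machine
});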

So the question is: why would we call a function remotely at all?

For example, if the client only holds a username and password, it has to make a remote call to obtain the user's details — it cannot produce them locally.

The relationship between RPC and HTTP protocols?

Now that we've explained this, you probably have a rough picture, but it raises a new question: this process looks a lot like the HTTP request/response model, so what is the relationship between the two? Broadly speaking, HTTP is one implementation of RPC: RPC is more of an idea, and an HTTP request plus its response is one way to realize it.
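
For instance, an ordinary HTTP call can already be read as a remote procedure call: the URL names the procedure, the request body carries the arguments, and the response carries the return value. A minimal sketch (the /sum endpoint is made up for illustration):

// Calling sum over plain HTTP -- hypothetical endpoint, just to show the idea
async function remoteSum(a, b) {
  const res = await fetch("http://localhost:3000/sum", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ a, b }),
  });
  const { result } = await res.json();
  return result; // computed by the server, used as if it were local
}

remoteSum(1, 2).then(console.log); // 3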

What is gRPC?

gRPC, which we turn to now, is one implementation of RPC — you can also call it a framework. Thrift is another well-known one in the industry, but gRPC is the one widely used in microservices, so it is the one we will learn.

gRPC describes itself as "a high performance, open source universal RPC framework". It has the following four characteristics:

  • Simple service definition
  • Across languages and platforms
  • Starts quickly and scales easily
  • Bi-directional streaming and integrated auth, based on HTTP/2

In plain terms:

  • Simple service definition: services are defined with Protocol Buffers (explained later)
  • Across languages and platforms: code can be generated from the proto definition for each language 🐂

This article focuses on these two points and illustrates them with examples.

What is a Protocol Buffer?

Tip: the vscode-proto3 extension for VS Code provides syntax highlighting for proto files.

Protocol Buffers can be understood as a small language, but its syntax is very simple and its purpose is clear: it is used to define functions, their parameters, and their response results, and those definitions can be converted into code for different languages from the command line. The basic syntax looks like this:

// user.proto

syntax = "proto3";

package user; // package name

// Request parameters
message LoginRequest {
  string username = 1;
  string password = 2;
}

// Response result
message LoginResponse {
  string access_token = 1;
  int32 expires = 2;
}

// User service
service User {
  // The login function
  rpc login(LoginRequest) returns (LoginResponse);
}

For comparison, here are the same definitions translated into TypeScript:

namespace user {
  interface LoginRequest {
    username: string;
    password: string;
  }

  interface LoginResponse {
    access_token: string;
    expires: number;
  }

  interface User {
    // the parameter needs a name in a TS function type, unlike in proto
    login: (request: LoginRequest) => LoginResponse;
  }
}

Comparing the two, we can see:

  • syntax = "proto3": declares that this file uses version 3 of the protocol. Everything uses proto3 now, so every proto file starts with this line
  • package: similar to a namespace or scope
  • message: roughly an interface in TS
  • service: also roughly a TS interface, but one that describes callable methods
  • string, int32: field types; int32 maps to number because TS does not distinguish integer widths
  • User: corresponds to a class or object in TS
  • login: corresponds to a method in TS
  • The numbers 1, 2: the most puzzling part is the number after each field. It is actually the key to how gRPC communicates: the field numbers drive how the data is encoded and decoded. Think of turning a JSON object into a string and back — the colons and commas are the serialization and deserialization rules there; in protobuf, the field numbers play that role (see the sketch below)
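
To make the role of those numbers concrete, here is a hand-worked sketch of how a LoginRequest is laid out on the wire (illustrative, following the standard protobuf encoding rules):

// LoginRequest { username: "ab", password: "cd" } serializes to roughly:
//
//   0x0A 0x02 0x61 0x62   0x12 0x02 0x63 0x64
//   tag  len  'a'  'b'    tag  len  'c'  'd'
//
// Each tag byte packs (field_number << 3) | wire_type, so field 1 (username,
// a length-delimited string) becomes 0x0A and field 2 (password) becomes 0x12.
// Field names never travel over the wire; both sides rely on the numbers in
// their copy of the proto to encode and decode, which is why a field's number
// must not change once it is in use.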

From proto definition to Node code

Dynamically loaded version

Dynamic loading means the proto file is loaded and parsed when the Node.js process starts, and data is then encoded and decoded according to that proto definition.

  • Create directories and files

gRPC is a framework for a client and a server to exchange information, so we create two JS files, one per side: the client sends the login request and the server responds.

.
├── client.js       # client
├── server.js       # server
├── user.proto      # the proto definition
└── user_proto.js   # shared code both client and server use to load the proto
  • Install dependencies
yarn add @grpc/grpc-js       # @grpc/grpc-js is the Node implementation of gRPC (each language has its own)
yarn add @grpc/proto-loader  # @grpc/proto-loader is used to load the proto file
  • Write user_proto.js

user_proto.js matters to both sides: through it the client knows the types of the parameters it sends, and the server knows the parameters it accepts, the result it must respond with, and the name of the function it has to implement.

// user_proto.js
// load the proto
const path = require('path')
const grpc = require('@grpc/grpc-js')
const protoLoader = require('@grpc/proto-loader')

const PROTO_PATH = path.join(__dirname, 'user.proto') // path to the proto file
const packageDefinition = protoLoader.loadSync(PROTO_PATH, {
  keepCase: true,
  longs: String,
  enums: String,
  defaults: true,
  oneofs: true,
})
const protoDescriptor = grpc.loadPackageDefinition(packageDefinition)

const user_proto = protoDescriptor.user // "user" is the package name from user.proto

module.exports = user_proto
  • Write server.js
// server.js
// the server
const grpc = require("@grpc/grpc-js"); // bring in the gRPC framework
const user_proto = require("./user_proto.js"); // the parsed proto

// Implementation of the User service
const userServiceImpl = {
  login: (call, callback) => {
    // call.request holds the request data
    const { request } = call;
    const { username, password } = request;

    // The first argument is the error, the second is the response
    callback(null, {
      access_token: `username = ${username}; password = ${password}`,
      expires: "zhang", // deliberately the wrong type -- see the note on coercion below
    });
  },
};

// Just like HTTP, we need to listen on a port and wait for connections
function main() {
  const server = new grpc.Server(); // create the gRPC server
  server.addService(user_proto.User.service, userServiceImpl); // add the service implementation

  // Start listening
  server.bindAsync("0.0.0.0:8081", grpc.ServerCredentials.createInsecure(), () => {
    server.start();
    console.log("grpc server started");
  });
}

main();

The proto only declares login; there is no real implementation, so we have to implement login in server.js. If we console.log(user_proto) we get:

{
  LoginRequest: {
    // ...
  },
  LoginResponse: {
    // ...
  },
  User: [class ServiceClientImpl extends Client] {
    service: { login: [Object] }
  }
}

So for server.addService we can pass in user_proto.User.service.

  • Write client.js
// client.js
const user_proto = require("./user_proto");
const grpc = require("@grpc/grpc-js");

// Use user_proto.User to create a client whose target address is localhost:8081,
// the address server.js is listening on
const client = new user_proto.User(
  "localhost:8081",
  grpc.credentials.createInsecure()
);

// Initiate a login request
function login() {
  return new Promise((resolve, reject) => {
    // pass the agreed-upon parameters (username is a number on purpose -- see the coercion note below)
    client.login(
      { username: 123, password: "abc123" },
      function (err, response) {
        if (err) {
          reject(err);
        } else {
          resolve(response);
        }
      }
    );
  });
}

async function main() {
  const res = await login();
  console.log(res);
}

main();
  • Start the service

node server.js starts the server and keeps it listening; then node client.js starts the client and sends the request.

And we see that we already have a response.

  • A small aside: type coercion

What if the data sent does not match the types defined in the proto? The answer: it is forcibly coerced to the proto types. For example, the server.js above returns the string "zhang" for the expires field, and it arrives as the number 0; likewise, the client sends the number 123 as the username, and it arrives as the string "123".
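
To see the coercion in one place, here is a minimal sketch reusing the client setup from above (the commented output is what I would expect given these rules, not captured output):

// coercion_demo.js -- assumes the dynamic-loading server above is running on :8081
const grpc = require("@grpc/grpc-js");
const user_proto = require("./user_proto");

const client = new user_proto.User("localhost:8081", grpc.credentials.createInsecure());

client.login(
  { username: 123, password: "abc123" }, // username is a number, but the proto says string
  (err, response) => {
    if (err) return console.error(err);
    // Expected shape: username arrived as the string "123", and the server's
    // expires: "zhang" (not a valid int32) came back as 0:
    // { access_token: 'username = 123; password = abc123', expires: 0 }
    console.log(response);
  }
);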

Statically compiled version

Dynamic loading parses the proto at run time. Static compilation instead compiles the proto file into a JS file ahead of time, so at run time we only load that JS file and skip parsing the proto — which is also the more common approach at work.

  • New project

Let's create a new project; this time the folder contains only four entries:

.
├── gen          # folder for the generated code
├── client.js    # client code
├── server.js    # server code
└── user.proto   # proto file (remember to copy the contents over)
  • Install dependencies
yarn global add grpc-tools                # the tool that compiles proto -> js
yarn add google-protobuf @grpc/grpc-js    # runtime dependencies
  • Generate JS code
grpc_tools_node_protoc \
--js_out=import_style=commonjs,binary:./gen/ \
--grpc_out=grpc_js:./gen/ user.proto

We see that two files user_pb.js and user_grpc_pb.js have been generated:

  • grpc_tools_node_protoc: the command-line tool that becomes available after installing grpc-tools
  • --js_out=import_style=commonjs,binary:./gen/: the option that generates user_pb.js
  • --grpc_out=grpc_js:./gen/: the option that generates user_grpc_pb.js

Pb is short for protobuf

If you look closely at both, you’ll find:

user_pb.js: implements the message definitions from the proto, i.e. LoginRequest and LoginResponse.

user_grpc_pb.js: defines the service from the proto, i.e. the User service and its methods.
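
Roughly, the generated API surface looks like this (a sketch; the method names follow the usual google-protobuf conventions, so check the generated files for the exact list):

const messages = require("./gen/user_pb");
const services = require("./gen/user_grpc_pb");

// user_pb.js: one class per message, with getters/setters and (de)serialization helpers
const req = new messages.LoginRequest();
req.setUsername("zhang");
req.getUsername();       // "zhang"
req.toObject();          // plain-object form of the message
req.serializeBinary();   // Uint8Array in protobuf wire format

// user_grpc_pb.js: the service definition plus a ready-made client class
services.UserService;    // passed to server.addService(...) on the server side
services.UserClient;     // instantiated with an address + credentials on the client side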

  • Write server.js
// server.js
const grpc = require("@grpc/grpc-js");

const services = require("./gen/user_grpc_pb");
const messages = require("./gen/user_pb");

const userServiceImpl = {
  login: (call, callback) => {
    const { request } = call;

    // Read the request parameters through the generated getters
    const username = request.getUsername();
    const password = request.getPassword();

    // Build the response with the generated message class
    const response = new messages.LoginResponse();
    response.setAccessToken(`username = ${username}; password = ${password}`);
    response.setExpires(7200);

    callback(null, response);
  },
};

function main() {
  const server = new grpc.Server();

  // Add the service using services.UserService
  server.addService(services.UserService, userServiceImpl);
  server.bindAsync(
    "0.0.0.0:8081",
    grpc.ServerCredentials.createInsecure(),
    () => {
      server.start();
      console.log("grpc server started");
    }
  );
}

main();

Compared with the dynamic version, addService now takes the exported UserService definition directly, and inside login we use the generated wrapper methods to read the request and build the response.

  • Write client.js
// client.js

const grpc = require("@grpc/grpc-js");

const services = require("./gen/user_grpc_pb");
const messages = require("./gen/user_pb");

// Initialize the client with services
const client = new services.UserClient(
  "localhost:8081",
  grpc.credentials.createInsecure()
);

// Initiate the login request
function login() {
  return new Promise((resolve, reject) => {
    // Initialize the parameters with the generated message class
    const request = new messages.LoginRequest();
    request.setUsername("zhang");
    request.setPassword("123456");

    client.login(request, function (err, response) {
      if (err) {
        reject(err);
      } else {
        resolve(response.toObject());
      }
    });
  });
}

async function main() {
  const res = await login();
  console.log(res);
}

main();

As the comments above show, we load everything directly from the generated JS files, which provide many wrapper methods and give us finer control over how parameters are passed.

From JS to TS

As we saw above, mismatched parameter types are only handled by coercion at run time and cannot be caught while writing the code. That is not great, so we generate TypeScript type definitions from the proto to solve it.

There are many ways to generate TS from proto; here we choose protoc + grpc_tools_node_protoc_ts + grpc-tools.

  • New project
mkdir grpc_demo_ts && cd grpc_demo_ts            # create the project directory

yarn global add typescript ts-node @types/node   # install ts and ts-node

tsc --init                                       # initialize ts
  • Install the Proto tool
yarn global add grpc-tools grpc_tools_node_protoc_ts   # install the proto tools globally
  • Install runtime dependencies
yarn add google-protobuf @grpc/grpc-js # runtime dependencies
  • Create a file
mkdir gen                              # create a directory to store the generated files
touch client.ts server.ts user.proto   # create the files
# remember to copy the user.proto contents from before
  • Install protoc

Next we need the protoc tool itself: go to the protobuf releases page on GitHub, download the build for your platform, and install it. After installing, remember to add it to your PATH so it can be used globally.

On macOS you can install it with brew install protobuf; the protoc command is available globally after installation.

  • Generate JS files and TS type definitions
# generate user_pb.js and user_grpc_pb.js
grpc_tools_node_protoc \
--js_out=import_style=commonjs,binary:./gen \
--grpc_out=grpc_js:./gen \
--plugin=protoc-gen-grpc=`which grpc_tools_node_protoc_plugin` \
./user.proto

# generate d.ts definitions
protoc \
--plugin=protoc-gen-ts=`which protoc-gen-ts` \
--ts_out=grpc_js:./gen \
./user.proto
  • Write server.ts
// server.ts

import * as grpc from "@grpc/grpc-js";
import { IUserServer, UserService } from "./gen/user_grpc_pb";
import messages from "./gen/user_pb";

// Implementation of the User service
const userServiceImpl: IUserServer = {
  // Implement the login interface
  login(call, callback) {
    const { request } = call;
    const username = request.getUsername();
    const password = request.getPassword();

    const response = new messages.LoginResponse();
    response.setAccessToken(`username = ${username}; password = ${password}`);
    response.setExpires(7200);
    callback(null, response);
  },
};

function main() {
  const server = new grpc.Server();

  // UserService is the definition, userServiceImpl is the implementation
  server.addService(UserService, userServiceImpl);
  server.bindAsync(
    "0.0.0.0:8081",
    grpc.ServerCredentials.createInsecure(),
    () => {
      server.start();
      console.log("grpc server started");
    }
  );
}

main();

The type hint is perfect 😄

  • Write client.ts
// client.ts

import * as grpc from "@grpc/grpc-js";
import { UserClient } from "./gen/user_grpc_pb";
import messages from "./gen/user_pb";

const client = new UserClient(
  "localhost:8081",
  grpc.credentials.createInsecure()
);

// Initiate a login request
const login = () => {
  return new Promise((resolve, reject) => {
    const request = new messages.LoginRequest();
    request.setUsername("zhang");
    request.setPassword("123456");

    client.login(request, function (err, response) {
      if (err) {
        reject(err);
      } else {
        resolve(response.toObject());
      }
    });
  });
};

async function main() {
  const data = await login();
  console.log(data);
}

main();

If we pass the wrong type, TypeScript now catches it at compile time.
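
For example (illustrative; the exact wording of the error depends on your TypeScript version):

const request = new messages.LoginRequest();

// The generated setter is typed to accept a string, so this no longer compiles:
// error TS2345: Argument of type 'number' is not assignable to parameter of type 'string'.
request.setUsername(123);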

  • Start the service

Start both with ts-node (ts-node server.ts first, then ts-node client.ts) and you will see them working together.

From Node to Go

So far both client and server have been written in JS/TS, but in practice Node more often acts as the client, aggregating interfaces written in other languages — the so-called BFF layer. Let's use Go for the server as an example.

  • Transform the original TS project

We reorganize the TS project above into two directories: client (the TS project, acting as the client) and server (a Go project, acting as the server). We delete the original server.ts and move user.proto up to the top level so both sides can share it.

.
├── client                  # client folder; same as the TS project above, minus server.ts
│   ├── client.ts
│   ├── gen
│   │   ├── user_grpc_pb.d.ts
│   │   ├── user_grpc_pb.js
│   │   ├── user_pb.d.ts
│   │   └── user_pb.js
│   ├── package.json
│   ├── tsconfig.json
│   └── yarn.lock
├── server                  # server folder (Go)
└── user.proto              # shared proto file
  • Install Go

Go to the official Go download page, find the latest version, download and install it: golang.google.cn/dl/

  • Set up the Go proxy

Just like with npm, Go pulls packages from a registry, and setting a mirror makes pulling them faster.

go env -w GOPROXY=https://goproxy.cn,direct
  • Initialize the Go project

Similar to yarn init -y.

cd server                    # enter the server directory
go mod init grpc_go_demo     # initialize the Go module
mkdir -p gen/user            # directory for the code generated later
  • Install the protoc Go plugins

These generate the Go code, playing the same role as grpc-tools and grpc_tools_node_protoc_ts did for Node.

go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
  • Install runtime dependencies

We also need runtime dependencies, which play the same role as google-protobuf and @grpc/grpc-js did on the Node side.

go get -u github.com/golang/protobuf/proto
go get -u google.golang.org/grpc
  • Modify user.proto
syntax = "proto3";

option go_package = "grpc_go_demo/gen/user"; // add this line

package user;

message LoginRequest {
  string username = 1;
  string password = 2;
}

message LoginResponse {
  string access_token = 1;
  int32 expires = 2;
}

service User {
  rpc login(LoginRequest) returns (LoginResponse);
}
  • Generate the Go code
protoc --go_out=./gen/user --go_opt=paths=source_relative \
  --go-grpc_out=./gen/user --go-grpc_opt=paths=source_relative \
  -I=../ ../user.proto
  • Install the VS Code plugin and reopen the project

When you open the generated user.pb.go or user_grpc.pb.go, VS Code will prompt you to install the Go plugin. You may then see errors about Go packages not being found; don't worry, just reopen the project with server as the workspace root.

  • Create main.go to write the server code
// server/main.go

package main

import (
	"context"
	"fmt"
	pb "grpc_go_demo/gen/user"
	"log"
	"net"

	"google.golang.org/grpc"
)

// Declare a type that will implement the service
type userServerImpl struct {
	pb.UnimplementedUserServer
}

// The type has a Login method
func (s *userServerImpl) Login(ctx context.Context, in *pb.LoginRequest) (*pb.LoginResponse, error) {
	// Return the response result
	return &pb.LoginResponse{
		AccessToken: fmt.Sprintf("go: username = %v, password = %v", in.GetUsername(), in.GetPassword()),
		Expires:     7200,
	}, nil
}

// Listen for connections and register the server object with the gRPC server
func main() {
	// Create a TCP listener
	lis, err := net.Listen("tcp", ":8081")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	// Create the gRPC server
	server := grpc.NewServer()

	// Register userServerImpl with the server
	pb.RegisterUserServer(server, &userServerImpl{})

	log.Printf("server listening at %v", lis.Addr())

	if err := server.Serve(lis); err != nil {
		log.Fatalf("failed to serve: %v", err)
	}
}

Why gRPC and not HTTP?

Most microservices architectures now use gRPC for communication between services, so why not use HTTP, which is familiar to us on the front end?

Some people point to efficiency: gRPC transmits binary data over TCP (HTTP/2), so it is fast. There is some efficiency loss with HTTP, yes, but the gap is not dramatic. For one thing, JSON encoding and decoding over HTTP is not that much worse than binary encoding in time or space; for another, within an intranet TCP and HTTP perform, in my experience, roughly the same (note that the gRPC website itself does not emphasize its efficiency relative to HTTP).

What the official site really emphasizes is language independence: with protobuf as the intermediate form, the same definition can be converted into code for every language, which keeps the interface consistent — unlike HTTP, where the interface contract lives in Swagger or some other documentation platform.

Conclusion

This article is only a starting point; how to combine gRPC with a Node framework in real development, and the deeper parts of gRPC, are left for you to explore.

Another bald day.