Using the WebAssembly virtual machine (SSVM), developers can turn any TensorFlow model into a serverless function with just a few lines of code, and then expose it as a web service.

We provide a template with both source code and configuration files, which you can view here. Out of the box, just deploy the template to Tencent Cloud from the command line and you get a working image classification web application.

Screencast | Online demo

You can fork and modify this template: switch to another TensorFlow model based on your needs, customize how the inference results are interpreted and presented, update the web UI, and so on. Finally, deploy to Tencent Cloud with the Serverless Framework in a few minutes. In this tutorial I will demonstrate how to make these adjustments.

Follow these simple instructions to install Rust, ssvmup, and the Serverless Framework. Make sure ssvmup is installed with the --enable-aot extension.

Alternatively, you can create or run the template using GitHub Codespaces or Docker.

Modify the template to use a different TensorFlow model

The src/main.rs file in the template is a Rust program. It takes the input image and then executes the TensorFlow Lite (TFLite) model on the image data.

Rust has been voted the most loved programming language by Stack Overflow users for the past five years. At first glance, it may seem complicated. However, as you can see from the example below, the SSVM Rust API is very simple and easy to get started with.

The relevant code is as follows:

```rust
fn main() {
    // 1. Load the TFLite model file and the probability label file
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");
    let labels = include_str!("aiy_food_V1_labelmap.txt");

    // 2. Load the uploaded image into the img_buf vector
    // ...

    // 3. Resize img_buf to the size required by the model's input tensor
    let flat_img = ssvm_tensorflow_interface::load_jpg_image_to_rgb8(&img_buf, 192, 192);

    // 4. Run the model with the image as the input tensor, and read the output tensor
    let mut session = ssvm_tensorflow_interface::Session::new(&model_data, ssvm_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 192, 192, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // 5. The output tensor is a list of probabilities (0 to 255), one per label
    //    in the labelmap.txt file. Skip ahead to the label with the highest probability.
    // ...
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
        label_lines.next();
    }

    let class_name = label_lines.next().unwrap().to_string();
    // ...
}
```

If you want to use another TensorFlow model, the following steps are required:

  1. Load your own TensorFlow model file and its associated data files. SSVM supports both TFLite and TF frozen model files. You can load your own trained MobileNet image classification model or a completely different model.
  2. Load the uploaded image or other model input. See the next section for details.
  3. Prepare the input data according to the requirements of the model's input tensor. For the MobileNet model, we resize the image and load the pixel values into a vector.
  4. Pass the input data, the input tensor name, and the output tensor name to the model.
  5. Retrieve the output tensor values from a vector and process them into a human-readable result. For the MobileNet model, the output values correspond to the classification probability of each label in the label map; output the label with the highest probability.
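To make step 5 concrete, here is a minimal, self-contained sketch of the post-processing: it finds the most probable entry in the output tensor and maps that index back to a line in the label map. The helper name `best_label` and the sample labels are illustrative assumptions, not part of the template.

```rust
// Find the label with the highest probability in the raw output tensor.
// Returns (index, raw probability 0-255, label text), or None if the tensor is empty.
fn best_label(res_vec: &[u8], labels: &str) -> Option<(usize, u8, String)> {
    // Index of the highest probability in the output tensor
    let (max_index, max_value) = res_vec
        .iter()
        .enumerate()
        .max_by_key(|&(_, v)| v)
        .map(|(i, v)| (i, *v))?;
    // The i-th line of the label map is the class name for output index i
    let class_name = labels.lines().nth(max_index)?.to_string();
    Some((max_index, max_value, class_name))
}

fn main() {
    let labels = "background\npizza\nsushi";  // hypothetical label map
    let res_vec: Vec<u8> = vec![10, 200, 30]; // hypothetical output tensor
    let (idx, conf, name) = best_label(&res_vec, labels).unwrap();
    println!("index={} confidence={} label={}", idx, conf, name);
}
```

The same pattern works for any classifier whose label map has one label per line, whatever model file you swap in.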

For more examples, see:

  • The face recognition example outputs a tensor containing the coordinates of the bounding box around each recognized face.
  • The TikTok logo recognition example outputs the probability that the input image contains the TikTok logo.

Input and output

Since this TensorFlow serverless function runs on Tencent Cloud infrastructure, it needs to interact with Tencent Cloud's API gateway to process web requests. The Tencent API Gateway encapsulates the entire incoming HTTP request in a JSON object and sends it to the function via STDIN. Therefore, the src/main.rs function needs to read from STDIN, parse the body field of the JSON object for the Base64-encoded image data, and then load the image into the img_buf vector.

Normally, you don’t need to change this part of the template, but it’s useful to know how it works.

```rust
fn main() {
    // ...
    let mut buffer = String::new();
    io::stdin().read_to_string(&mut buffer).expect("Error reading from STDIN");
    let obj: FaasInput = serde_json::from_str(&buffer).unwrap();
    let img_buf = base64::decode_config(&(obj.body), base64::STANDARD).unwrap();
    // ...
}

#[derive(Deserialize, Debug)]
struct FaasInput {
    body: String
}
```

The function uses the println! macro to return the inference result via STDOUT.

```rust
if max_value > 50 && max_index != 0 {
    println!("It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture", confidence.to_string(), class_name, class_name);
} else {
    println!("It does not appear to be any food item in the picture.");
}
```
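The confidence string above is derived from max_value, the highest raw probability (0 to 255) in the output tensor. A minimal sketch of such a mapping follows; the thresholds and phrases here are illustrative assumptions, not the template's exact values.

```rust
// Map the raw 0-255 output probability to a hedged confidence phrase.
// NOTE: the thresholds and wording are illustrative, not the template's exact values.
fn confidence(max_value: u8) -> &'static str {
    if max_value > 200 {
        "is very likely"
    } else if max_value > 125 {
        "looks like"
    } else {
        "might be"
    }
}

fn main() {
    println!("It {} a pizza in the picture", confidence(230));
}
```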

The Web application

The serverless function's optional web UI can be found in the template's website/content/index.html file. You can change it to suit your own application's needs. The key part of this user interface is the JavaScript code that converts the selected image file into a Base64 text string, and then makes an HTTP POST request that sends this Base64 string to the API gateway URL of the serverless function.

```javascript
var reader = new FileReader();
reader.readAsDataURL(document.querySelector('#select_file').files[0]);
reader.onloadend = function () {
  $.ajax({
    url: window.env.API_URL,
    type: "post",
    data: reader.result.split("base64,")[1],
    dataType: "text",
    success: function (data) {
      document.querySelector('#msg').innerHTML = data;
    },
    error: function (jqXHR, exception) {
      document.querySelector('#msg').innerHTML = 'Sorry, there is a problem. Try later';
    }
  });
};
```

As we discussed, the Tencent Cloud serverless runtime converts the POST body into the body field of a JSON object, which is then passed to the Rust function.
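For reference, the gateway event delivered to the function on STDIN looks roughly like this; the exact set of fields depends on your API gateway configuration, and only the body field is used by the template (the values shown are placeholders):

```json
{
  "headers": { "content-type": "text/plain" },
  "httpMethod": "POST",
  "body": "<Base64-encoded image data>"
}
```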

Creation and deployment

When building the application, we use the ssvmup tool to create a .so file. This is an AOT-compiled WebAssembly function that is both high-performance and secure. The default file name is pkg/scf.so.

```shell
$ ssvmup build --enable-ext --enable-aot
$ cp pkg/scf.so scf/
```

This template is organized according to the Serverless Framework structure.

  • The layer project adds the TensorFlow and SSVM libraries to the SCF custom runtime.
  • The scf project creates a custom serverless runtime and its associated API gateway.
  • The website project creates a web site that includes the content files. The deployment script automatically connects the web page's JavaScript to the SCF API gateway URL from the previous step.

To deploy the application on Tencent Cloud, simply run the following command with the Serverless Framework.

```shell
$ sls deploy
... ...
website: https://sls-website-ap-hongkong-kfdilz-1302315972.cos-website.ap-hongkong.myqcloud.com
vendorMessage:

0700-tensorflow-scf › "deploy" ran for 3 apps successfully
```

The web application can now be accessed at the deployed website URL.

What's next?

We also have other examples of deploying SSVM-based TensorFlow functions as a service on Tencent Cloud; click here.