YoMo is a programming framework that helps developers build geo-distributed cloud systems. Its communication layer is built on the QUIC protocol, which brings high-speed data transmission, and it features a built-in Streaming Serverless "stream function" that greatly improves the development experience for distributed cloud systems. Distributed cloud systems built with YoMo provide an ultra-high-speed communication mechanism between near-field computing power and terminals, and have been widely applied in the Metaverse, VR/AR, IoT, and other fields.

YoMo is written in Go, and its Streaming Serverless part uses Golang plugins and shared libraries to dynamically load user code. Relying on shared libraries brings some limitations for developers, especially those on Windows, which Go's plugin package does not support. Combined with the Serverless architecture's rigid requirement for isolation, this makes WebAssembly an excellent choice for running user-defined functions.
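To see why, consider how Go's standard plugin package loads a handler from a shared library. The sketch below is illustrative only: the handler.so file name and the Handler symbol are hypothetical, not YoMo's actual ones. The key point is that the plugin package only works on Linux, FreeBSD, and macOS, which is exactly what shuts out Windows developers.

package main

import "plugin"

func main() {
	// Open a pre-compiled shared library. plugin.Open is only available
	// on Linux, FreeBSD, and macOS; this code does not build on Windows.
	p, err := plugin.Open("handler.so")
	if err != nil {
		panic(err)
	}

	// Look up an exported symbol and assert it to the expected function type.
	sym, err := p.Lookup("Handler")
	if err != nil {
		panic(err)
	}
	handler, ok := sym.(func([]byte) []byte)
	if !ok {
		panic("Handler has an unexpected type")
	}

	_ = handler // the framework would now invoke handler on incoming data
}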

For example, during real-time AI inference in AR/VR or smart factories, a camera can send real-time unstructured data through YoMo to a computing node in a near-field MEC (Multi-access Edge Computing) device, which automatically executes the hosted AI inference function. When the inference is complete, YoMo sends the result to the terminal device in real time.

However, the challenge for YoMo is to merge and manage handler functions written by multiple external developers on edge computing nodes. This requires runtime isolation of these functions without sacrificing performance. Traditional software container solutions, such as Docker, are not up to this task because they are too heavy and too slow for real-time work.

WebAssembly provides a lightweight, high-performance software container. It is well suited as the runtime for YoMo's data processing handler functions.

In this article, we'll show you how to create a Rust function for TensorFlow-based image recognition, compile it into WebAssembly, and then run it as a streaming data handler with YoMo. We use WasmEdge as the WebAssembly runtime because it provides the best performance and maximum flexibility compared with other WebAssembly runtimes. WasmEdge is the only WebAssembly virtual machine with stable TensorFlow support. YoMo manages the WasmEdge VM instances and the contained WebAssembly bytecode applications through WasmEdge's Golang API.

GitHub source code:
https://github.com/yomorun/yomo-wasmedge-tensorflow

Prerequisites

Obviously, you will need Golang. We assume you already have it installed.

The Golang version needs to be 1.15 or later for our example to work.

You also need to install the YoMo CLI application, which orchestrates and coordinates the data flow and handler function calls.

$ go install github.com/yomorun/cli/yomo@latest
$ yomo version
yomo CLI version: v0.0.5

Next, install WasmEdge and its TensorFlow shared libraries. WasmEdge is a leading WebAssembly runtime hosted by the CNCF. We will use it to embed and run WebAssembly programs from YoMo.

# Install WasmEdge
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge.sh
$ chmod +x ./install_wasmedge.sh
$ sudo ./install_wasmedge.sh /usr/local

# Install WasmEdge Tensorflow extension
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_tensorflow_deps.sh
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_tensorflow.sh
$ chmod +x ./install_wasmedge_tensorflow_deps.sh
$ chmod +x ./install_wasmedge_tensorflow.sh
$ sudo ./install_wasmedge_tensorflow_deps.sh /usr/local
$ sudo ./install_wasmedge_tensorflow.sh /usr/local

# Install WasmEdge Images extension
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_image_deps.sh
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_image.sh
$ chmod +x ./install_wasmedge_image_deps.sh
$ chmod +x ./install_wasmedge_image.sh
$ sudo ./install_wasmedge_image_deps.sh /usr/local
$ sudo ./install_wasmedge_image.sh /usr/local

Finally, since our demo WebAssembly function is written in Rust, you will also need to install the Rust compiler and the rustwasmc toolchain.

For the rest of the demo, fork and clone the source code repository.

$ git clone https://github.com/yomorun/yomo-wasmedge-tensorflow.git

Image classification function

The image recognition function that handles YoMo's image stream is written in Rust. It uses the WasmEdge TensorFlow API to process the input images.

#[wasm_bindgen]
pub fn infer(image_data: &[u8]) -> String {
    // Load the TFLite model and its metadata (the text label for each recognized object number)
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");
    let labels = include_str!("aiy_food_V1_labelmap.txt");

    // Pre-process the image into the format this model expects
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(image_data, 192, 192);

    // Run the TFLite model using the WasmEdge TensorFlow API
    let mut session = wasmedge_tensorflow_interface::Session::new(
        &model_data,
        wasmedge_tensorflow_interface::ModelType::TensorFlowLite,
    );
    session
        .add_input("input", &flat_img, &[1, 192, 192, 3])
        .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // ... (elided) Find the object index in res_vec that has the greatest probability,
    // translate the probability into a confidence level, and translate the object
    // index into a label (food_name) from the model metadata in labels ...

    let ret_str = format!(
        "It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture",
        confidence, food_name, food_name
    );
    return ret_str;
}

You can use the rustwasmc tool to compile this function into WebAssembly bytecode.

Here, we require Rust compiler version 1.50 or earlier for the WebAssembly function to work with WasmEdge's Golang API. Once the WebAssembly Interface Types specification is finalized and supported, we will catch up with the latest Rust compiler version.

$ rustup default 1.50.0
$ cd flow/rust_mobilenet_food
$ rustwasmc build --enable-ext
# The output WASM will be pkg/rust_mobilenet_food_lib_bg.wasm.

# Copy the wasm bytecode file to the flow/ directory
$ cp pkg/rust_mobilenet_food_lib_bg.wasm ../

Integrating with YoMo

On the YoMo side, we use the WasmEdge Golang API to start and run the WasmEdge virtual machine for the image recognition function. The app.go file in the source code project looks like this:

package main

// ...

var (
	vm      *wasmedge.VM
	vmConf  *wasmedge.Configure
	counter uint64
)

func main() {
	// Initialize WasmEdge's VM
	initVM()
	defer vm.Delete()
	defer vmConf.Delete()

	// Connect to the Zipper service
	cli, err := client.NewServerless("image-recognition").Connect("localhost", 9000)
	if err != nil {
		log.Print("❌ Connect to zipper failure: ", err)
		return
	}

	defer cli.Close()
	cli.Pipe(Handler)
}

// Handler processes the data in the stream
func Handler(rxStream rx.RxStream) rx.RxStream {
	stream := rxStream.
		Subscribe(ImageDataKey).
		OnObserve(decode).
		Encode(0x11)
	return stream
}

// decode decodes the incoming bytes and performs image recognition
var decode = func(v []byte) (interface{}, error) {
	// get the image binary
	p, _, _, err := y3.DecodePrimitivePacket(v)
	if err != nil {
		return nil, err
	}
	img := p.ToBytes()

	// recognize the image
	res, err := vm.ExecuteBindgen("infer", wasmedge.Bindgen_return_array, img)

	// ... (result handling elided in the original) ...
	return hash, nil
}

// ...

// initVM initializes WasmEdge's VM
func initVM() {
	wasmedge.SetLogErrorLevel()
	vmConf = wasmedge.NewConfigure(wasmedge.WASI)
	vm = wasmedge.NewVMWithConfig(vmConf)

	var wasi = vm.GetImportObject(wasmedge.WASI)
	wasi.InitWasi(
		os.Args[1:],     // The args
		os.Environ(),    // The envs
		[]string{".:."}, // The mapping directories
		[]string{},      // The preopens will be empty
	)

	// Register the WasmEdge-tensorflow and WasmEdge-image extensions
	var tfobj = wasmedge.NewTensorflowImportObject()
	var tfliteobj = wasmedge.NewTensorflowLiteImportObject()
	vm.RegisterImport(tfobj)
	vm.RegisterImport(tfliteobj)
	var imgobj = wasmedge.NewImageImportObject()
	vm.RegisterImport(imgobj)

	// Instantiate the wasm file
	vm.LoadWasmFile("rust_mobilenet_food_lib_bg.wasm")
	vm.Validate()
	vm.Instantiate()
}
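The decode callback above elides its result handling. As a minimal, hypothetical sketch of that elided step (the helper name handleInferResult is ours, not the repo's): we assume that ExecuteBindgen, called with wasmedge.Bindgen_return_array, returns the Rust infer() string as a []byte, which can then be logged and passed downstream. This helper would live alongside decode in app.go, with "errors" and "log" imported.

// handleInferResult is a hypothetical helper illustrating the elided result
// handling in decode. It assumes ExecuteBindgen with Bindgen_return_array
// yields the Rust infer() string as a []byte.
func handleInferResult(res interface{}, err error) (interface{}, error) {
	if err != nil {
		return nil, err
	}
	result, ok := res.([]byte) // assumed return shape for Bindgen_return_array
	if !ok {
		return nil, errors.New("unexpected return type from infer")
	}
	log.Println("🤖 Inference result: ", string(result))
	return result, nil
}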

Run it

Finally, we start YoMo and see the entire data processing pipeline in action. Launch the YoMo CLI application from the project folder. The YAML file defines the port YoMo should listen on and the workflow handler that is triggered by incoming data. Note that the flow name image-recognition matches the name used in the app.go data handler mentioned above.

$ yomo serve -c ./zipper/workflow.yaml

Start the handler program by running the app.go program mentioned above. The tensorflow and image build tags enable the corresponding WasmEdge extensions in the Go bindings.

$ cd flow
$ go run --tags "tensorflow image" app.go

Start the simulated data source by sending data to YoMo. A video is a series of image frames. The WasmEdge function in app.go is called for every image frame in the video.

# Download a video file
$ wget -P source 'https://github.com/yomorun/yomo-wasmedge-tensorflow/releases/download/v0.1.0/hot-dog.mp4'

# Stream the video to YoMo
$ go run ./source/main.go ./source/hot-dog.mp4

You can see the output of the WasmEdge handler function in the console. It prints the name of the object detected in each image frame of the video.

Looking to the future

This article showed how to use the WasmEdge TensorFlow API and Golang SDK in the YoMo framework to process an image stream in near real time.

In collaboration with YoMo, we will soon deploy WasmEdge in production in smart factories for a variety of assembly-line tasks. WasmEdge is the software runtime for edge computing!