This article focuses on how stream processing is done on the Kettle platform. In a typical Internet of Things platform, data collected at the perception layer is uploaded to a message queue server; the service layer receives it with the corresponding client, processes it, and saves it to a database; the results are then presented through an app, mini-program, web page, or other channel. The core questions are: how do we receive data from the message-oriented middleware, and how do we process it efficiently once received? A platform that can connect to the middleware through quick configuration, transform and process the data, and persist it (to a database, Elasticsearch, or HBase) solves the data problems of the Internet of Things: a complex IoT project is reduced to an ordinary Internet project that only needs to present the data. This article uses the MQTT protocol as an example to demonstrate the stream processing capability of the Kettle platform.
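To make the scenario concrete, here is a minimal sketch of the kind of telemetry message a perception-layer sensor might publish to the message queue. The field names (`deviceId`, `ts`, `metrics`) are illustrative assumptions, not taken from the article:

```python
import json

# A hypothetical telemetry record collected at the perception layer.
# Field names are illustrative only.
telemetry = {
    "deviceId": "sensor-001",
    "ts": 1700000000,
    "metrics": {"temperature": 21.5, "humidity": 48.0},
}

# Serialize to JSON -- the form in which it would travel over MQTT
# to the message queue server for the service layer to consume.
payload = json.dumps(telemetry)
print(payload)
```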

1. Preparation

To set up an MQTT server, see www.emqx.com/zh (EMQX).

After a successful installation, the dashboard looks as follows:

The MQTT producer sends data:

Write the producer: the transformation file is as follows.

Start the producer and connect it to the MQTT server:
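For readers curious what the producer actually puts on the wire, the sketch below builds a minimal MQTT 3.1.1 PUBLISH packet (QoS 0) by hand, using only the standard library. The topic and payload are illustrative; a real producer (or Kettle's MQTT step) would of course use an MQTT client library rather than raw bytes:

```python
def encode_remaining_length(n: int) -> bytes:
    """MQTT variable-length encoding of the 'remaining length' field."""
    out = bytearray()
    while True:
        byte, n = n % 128, n // 128
        if n > 0:
            byte |= 0x80  # continuation bit: more length bytes follow
        out.append(byte)
        if n == 0:
            return bytes(out)

def publish_packet(topic: str, payload: bytes) -> bytes:
    """Build a minimal MQTT 3.1.1 PUBLISH packet (QoS 0, no DUP/RETAIN)."""
    t = topic.encode("utf-8")
    # Variable header (2-byte topic length + topic) followed by the payload.
    variable = len(t).to_bytes(2, "big") + t + payload
    # 0x30 = packet type PUBLISH with all flags clear.
    return b"\x30" + encode_remaining_length(len(variable)) + variable

pkt = publish_packet("iot/telemetry", b'{"temp":21.5}')
```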

2. Subscribing to MQTT data with the Kettle tool (PDI)

The MQTT consumer subscribes to a topic and receives its data:
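MQTT subscriptions are expressed as topic filters, where `+` matches exactly one level and `#` matches all remaining levels. The sketch below implements that matching rule in plain Python (topic names are illustrative):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Check an MQTT topic against a subscription filter with + and # wildcards."""
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":  # multi-level wildcard: matches everything from here on
            return True
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:  # + matches any single level
            return False
    return len(f_parts) == len(t_parts)

# A subscription to "iot/+/telemetry" receives "iot/device1/telemetry"
# but not "iot/device1/status".
```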

A sub-transformation processes the stream data: each received record is passed to the telemetry data REST interface for processing.
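The article does not give the REST interface's URL or schema, so the sketch below only shows the general shape of the call: wrapping one stream record in an HTTP POST with a JSON body. The endpoint and field names are hypothetical; sending is left as a comment so the sketch runs without a server:

```python
import json
import urllib.request

# Hypothetical endpoint -- the real telemetry REST interface's URL
# is not given in the article.
ENDPOINT = "http://localhost:8080/api/telemetry"

def build_request(record: dict) -> urllib.request.Request:
    """Wrap one stream record in the HTTP POST a sub-transformation could issue."""
    body = json.dumps(record).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request({"deviceId": "sensor-001", "temperature": 21.5})
# Actually sending it would be: urllib.request.urlopen(req)
```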

3. Processing MQTT data on the Kettle platform

Upload the transformations to the Kettle-based data processing platform, then start the producer and the consumer.

Telemetry data REST interface:

Stream data processing log:

4. Summary

This article has shown how the Kettle platform processes stream data. Besides MQTT, it also supports Kafka, RabbitMQ, Socket, and WebSocket sources.