I recently needed to simulate a server in my project: the server continuously sends images and their associated string information to the client, which receives the data, processes it, and sends the results back to the server.

The socket code I found online fell into two camps: either it could only send strings and did not support image transmission, or it had the client send the image to the server. Since the usual logic is to start the server first and then the client, simply relabelling things so that the image sender becomes the "server" also feels strange, because then the client has to be started first.

In the end I chose the second option, and the result was not very satisfying. First, I had to start the client before the server, which felt awkward. Second, the code felt a bit bloated and did not read well.

After a lot of effort, I finally found Python's WebSocket support (before that I only knew Java had WebSocket). The code is much simpler than raw sockets, and I liked it at first sight. Although the demo only transfers a simple string, I was happy to build on top of it.
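For reference, a string-only exchange with the websockets package looks roughly like this (a minimal sketch of my own in the same older API style as the scripts below, not the exact code from the demo article mentioned later; the port and the messages are placeholders):

# sketch of a string-only server
import asyncio
import websockets

async def echo(websocket, path):
    message = await websocket.recv()               # receive a plain string
    print('server got:', message)
    await websocket.send('hello from the server')  # send a plain string back

start_server = websockets.serve(echo, '127.0.0.1', 6666)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()

# sketch of the matching client
import asyncio
import websockets

async def hello(uri):
    async with websockets.connect(uri) as websocket:
        await websocket.send('hello from the client')
        reply = await websocket.recv()
        print('client got:', reply)

asyncio.get_event_loop().run_until_complete(hello('ws://127.0.0.1:6666'))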

My plan for the change was fairly clear: compress the image into a format that can be transmitted efficiently over the network. The middle steps are a bit fiddly, converting between formats back and forth. I borrowed from a blog post for this part, for which I am very grateful.

The final transformation scheme is as follows:

  • Encoding phase (server): numpy -> bytes -> base64 -> string (utf-8) -> dict -> json
  • Decoding phase (client): json -> dict -> base64 -> bytes -> numpy

I used the dict above to package the video frame together with other string information (such as the image's name and size). Everything else reuses the code framework mentioned earlier; that demo can be found in the previous article [Simple demo using WebSocket for communication].
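To make that chain concrete before looking at the two scripts, here is the whole round trip on a single frame with no networking involved (a minimal standalone sketch; the synthetic black frame and the name '1.jpg' are just placeholders):

import json
import base64
import numpy as np
import cv2

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a real video frame

# Encoding (server side): numpy -> bytes -> base64 -> string -> dict -> json
ok, encoded = cv2.imencode('.jpg', frame)
image_bytes = encoded.tobytes()
base64_string = base64.b64encode(image_bytes).decode('utf-8')
json_data = json.dumps({'image_name': '1.jpg', 'image_base64_string': base64_string})

# Decoding (client side): json -> dict -> base64 -> bytes -> numpy
dict_data = json.loads(json_data)
restored_bytes = base64.b64decode(dict_data['image_base64_string'])
restored_frame = cv2.imdecode(np.frombuffer(restored_bytes, dtype=np.uint8), 1)

print(dict_data['image_name'], restored_frame.shape)  # 1.jpg (480, 640, 3)

Base64 is needed here because raw JPEG bytes are not valid UTF-8, so they cannot be put into a JSON string directly.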


server.py

import asyncio
import websockets
import cv2
from base64 import b64encode
from json import dumps


def make_json(image_numpy_data, image_name):
    # These two lines can read an image file straight into bytes; note the 'rb' mode:
    # jpg_file = open(path, 'rb')
    # byte_content = jpg_file.read()

    # Convert the numpy image into bytes
    success, encoded_image = cv2.imencode(".jpg", image_numpy_data)
    image_bytes_data = encoded_image.tobytes()

    # Encode the raw bytes as base64 bytes
    base64_bytes = b64encode(image_bytes_data)

    # Decode the base64 bytes into a UTF-8 string
    base64_string = base64_bytes.decode("utf-8")

    # Pack the data into a dictionary
    dict_data = {}
    dict_data["image_name"] = image_name
    dict_data["image_base64_string"] = base64_string

    # Turn the dictionary into a JSON string
    json_data = dumps(dict_data, indent=2)
    return json_data


async def echo(websocket, path):
    video_path = 'D:/test/ccc/mp4/1.mp4'  # change to your own video path
    cap = cv2.VideoCapture(video_path)    # open the video
    fps = cap.get(cv2.CAP_PROP_FPS)
    k = 0
    while cap.isOpened():
        success, frame = cap.read()
        if success:
            k += 1
            if k % int(fps * 2) == 0:  # one frame roughly every two seconds of video
                data = make_json(frame, str(k) + '.jpg')
                await websocket.send(data)
                print('%3d.jpg has been successfully sent' % k)
                await asyncio.sleep(2)  # sleep for 2 seconds
        else:
            break
    cap.release()


if __name__ == '__main__':
    start_server = websockets.serve(echo, '127.0.0.1', 6666)  # change to your own address
    asyncio.get_event_loop().run_until_complete(start_server)
    asyncio.get_event_loop().run_forever()

client.py

import asyncio
import websockets
import json
import base64
import numpy as np
import cv2


# Parse the JSON message, restore the image as a numpy array and return it with its name
def get_json(json_data):
    # Restore the JSON string to a dict
    dict_data = json.loads(json_data)

    # Read the string fields back out of the dictionary
    image_name = dict_data['image_name']
    image_base64_string = dict_data['image_base64_string']

    # Decode base64 back into the raw image bytes
    image_bytes_data = base64.b64decode(image_base64_string)

    # The following two lines can save the byte-code image straight to disk; note the 'wb' mode:
    # with open('./new.jpg', 'wb') as jpg_file:
    #     jpg_file.write(image_bytes_data)

    # Convert the image bytes back into a numpy image
    image_buffer_numpy_data = np.frombuffer(image_bytes_data, dtype=np.uint8)
    image_numpy_data = cv2.imdecode(image_buffer_numpy_data, 1)
    return image_numpy_data, image_name


async def hello(uri):
    async with websockets.connect(uri) as websocket:
        while True:
            json_data = await websocket.recv()  # receive a message
            img, name = get_json(json_data)
            print('received successfully', name)
            cv2.imwrite('./save/' + name, img)  # the ./save/ directory must already exist


if __name__ == '__main__':
    asyncio.get_event_loop().run_until_complete(hello('ws://127.0.0.1:6666'))  # change to your own address

If you run server.py first and then client.py, the server prints a confirmation each time a frame is sent, and the client prints a confirmation and saves every received frame into the ./save/ directory.

References:

  • https://websockets.readthedoc…
  • https://www.cnblogs.com/zhumi…