In Python 3, sockets can only transmit data as bytes; the receiver must convert the bytes back to the original type itself.

Much of the socket code used to transmit pictures over the Internet is quite complex. This article aims to give a simple demo, which can be extended for other needs.

1. For images in NumPy format:

  • Encoding:

    • Step 1: _, img_encode = cv2.imencode('.jpg', img_numpy)
    • Step 2: img_bytes = img_encode.tobytes()
  • Decoding:

    • Step 1: img_buffer_numpy = np.frombuffer(img_bytes, dtype=np.uint8)
    • Step 2: img_numpy = cv2.imdecode(img_buffer_numpy, 1)

2. For information in string format:

  • Encoding:

    • msg_bytes = msg_str.encode()
  • Decoding:

    • msg_str = msg_bytes.decode()
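Both calls default to UTF-8, so they round-trip cleanly. A tiny sketch:

```python
msg_str = 'hello socket'

msg_bytes = msg_str.encode()   # str -> bytes (UTF-8 by default)
msg_back = msg_bytes.decode()  # bytes -> str

print(msg_back)  # 'hello socket'
```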

server.py

```python
import socket
import cv2
import numpy as np
import os


class VideoServer:
    def __init__(self):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind(('127.0.0.1', 8002))
        self.sock.listen(1)

    def Get(self):
        conn, addr = self.sock.accept()
        print(addr, 'connected...')
        os.makedirs('./save', exist_ok=True)
        while True:
            # This number must be larger than the byte size of one encoded image.
            # Note: a single recv() is not guaranteed to return one full message;
            # this is good enough for a simple demo.
            img_data = conn.recv(4073800)
            img_name = conn.recv(1024)
            # Convert the image bytes to a 1-D numpy array
            img_buffer_numpy = np.frombuffer(img_data, dtype=np.uint8)
            # Decode the 1-D numpy data into an image
            frame = cv2.imdecode(img_buffer_numpy, 1)
            name = img_name.decode()
            cv2.imwrite('./save/' + name, frame)
            print(name)
        self.sock.close()


if __name__ == '__main__':
    vs = VideoServer()
    vs.Get()
```

client.py

```python
import socket
import cv2
import time


class VideoClient:
    def __init__(self):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.connect(('127.0.0.1', 8002))

    def Send(self):
        cap = cv2.VideoCapture('D:/test/ccc/mp4/1.mp4')
        fps = cap.get(cv2.CAP_PROP_FPS)
        k = 0
        while True:
            success, frame = cap.read()
            if not success:
                break
            k += 1
            # Send one frame every 4 seconds of video
            if k % int(fps * 4) == 0:
                # Encode the frame to JPEG bytes
                _, img_encode = cv2.imencode('.jpg', frame)
                img_data = img_encode.tobytes()
                # Name the saved file after the frame index
                img_name = (str(k) + '.jpg').encode()
                self.sock.send(img_data)
                self.sock.send(img_name)
                time.sleep(1)
                print('%d.jpg sent, sleep 1 sec' % k)
        cap.release()
        self.sock.close()


if __name__ == '__main__':
    vc = VideoClient()
    vc.Send()
```
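One place the demo can be expanded: the server's single large recv() assumes each image and its name arrive as separate, complete messages, which TCP does not guarantee. A common fix is to prefix every message with its length. The helper names below (send_msg, recv_msg, recv_exact) are hypothetical, not part of the original code; the sketch demonstrates them over a local socket pair standing in for the client/server connection:

```python
import socket
import struct


def send_msg(sock, data):
    # Prefix each message with its 4-byte big-endian length
    sock.sendall(struct.pack('>I', len(data)) + data)


def recv_exact(sock, n):
    # Loop until exactly n bytes have been received
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('socket closed mid-message')
        buf += chunk
    return buf


def recv_msg(sock):
    # Read the 4-byte length header, then exactly that many payload bytes
    (length,) = struct.unpack('>I', recv_exact(sock, 4))
    return recv_exact(sock, length)


# Demo: two back-to-back messages arrive intact, in order
a, b = socket.socketpair()
send_msg(a, b'fake jpeg bytes')
send_msg(a, b'1.jpg')
msg1 = recv_msg(b)
msg2 = recv_msg(b)
print(msg1, msg2)
a.close()
b.close()
```

With this framing, the server would call recv_msg() once for the image data and once for the name, instead of guessing buffer sizes.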

Start server.py first, then client.py, and the effect is as follows: