This is the 13th day of my participation in the August More Text Challenge.

Android screen sharing: hard encoding and hard decoding

Screen sharing between Android devices may feel unfamiliar on first contact, but most people have at least heard of FFmpeg, which should look familiar. FFmpeg is an open-source suite for processing audio and video, but for developers with little C experience its build process and usage are quite complex. Is there a pure Java way to do this? There is.

1. Effect demonstration

  • GIF

  • The Demo interface

2. Soft decoding and hard decoding

  • Soft decoding

Soft decoding uses the CPU to do the computation, which is how FFmpeg decodes. Because everything runs on the CPU, it adds CPU load and increases power consumption.

  • Hard decoding

Hard decoding uses a dedicated video module on the phone's own chipset, such as a DSP, to do the encoding and decoding. It demands little of the CPU and depends mainly on the hardware, so decoding performance may differ between phones with different codec chips. The advantage is that hard decoding is faster than soft decoding, because the work runs on a separate, dedicated chip.

3. Code analysis

3.1 Android hard encoding

Hard encoding is implemented by accessing the underlying hardware codecs through MediaCodec, a class Android provides for encoding and decoding audio and video.

Overall, the process breaks down into the following steps:

record screen → capture screen data → hard encode → process the encoded data → transmit

The detailed code is as follows:

  1. Apply for screen-recording permission

```java
private void requestCapturePermission() throws Exception {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        mMediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mMediaProjectionManager.createScreenCaptureIntent(), REQUEST_MEDIA_PROJECTION);
    } else {
        throw new Exception("Android version < 5.0");
    }
}
```
  2. Obtain the MediaProjection in the confirmation callback

```java
MediaProjection mediaProjection = mMediaProjectionManager.getMediaProjection(resultCode, data);
```
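This call belongs in onActivityResult. A minimal sketch, assuming the REQUEST_MEDIA_PROJECTION request code used in step 1:

```java
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_MEDIA_PROJECTION && resultCode == RESULT_OK) {
        // The user confirmed the capture dialog; keep the projection for later use
        mMediaProjection = mMediaProjectionManager.getMediaProjection(resultCode, data);
    }
}
```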
  3. Configure and obtain the MediaCodec

```java
private MediaCodec prepareVideoEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC,
            mVideoEncodeConfig.width, mVideoEncodeConfig.height);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    // Bit rate
    format.setInteger(MediaFormat.KEY_BIT_RATE, (int) (mVideoEncodeConfig.width
            * mVideoEncodeConfig.height * mVideoEncodeConfig.rate * mVideoEncodeConfig.factor));
    // Frame rate
    format.setInteger(MediaFormat.KEY_FRAME_RATE, mVideoEncodeConfig.rate);
    // I-frame interval
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, mVideoEncodeConfig.i_frame);
    // BITRATE_MODE_CQ gives excellent clarity, but is not supported on the Huawei Nova 5i
    // running Android 10. Reference: https://www.jianshu.com/p/a0873b4a92b6
    // format.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);

    // ----- ADD BY XU,WANG: when the picture is static, repeat the last frame -----
    format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 1000000 / 45);
    // ----- MODIFY BY XU,WANG: fix the garbled ("flower") screen on MIUI 9.5 -----
    if (Build.MANUFACTURER.equalsIgnoreCase("XIAOMI")) {
        format.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
    } else {
        format.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR);
    }
    format.setInteger(MediaFormat.KEY_COMPLEXITY, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR);

    MediaCodec mediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface surface = mediaCodec.createInputSurface();
    mVirtualDisplay = mMediaProjection.createVirtualDisplay("-display",
            mVideoEncodeConfig.width, mVideoEncodeConfig.height, 1,
            DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, null);
    return mediaCodec;
}
```
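Because encoder support varies by chipset (see section 2), it can be worth probing support before configuring. A minimal sketch using MediaCodecList; this check is not part of the original demo:

```java
// True if some codec on this device can encode AVC at the given size
private boolean isAvcEncodeSupported(int width, int height) {
    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    return codecList.findEncoderForFormat(format) != null;
}
```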
  4. Start the MediaCodec

```java
mMediaCodec.start();
```
  5. Start a thread and encode continuously

```java
mVideoEncodeThread = new Thread(new Runnable() {
    @Override
    public void run() {
        while (mVideoCoding && !Thread.interrupted()) {
            try {
                ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
                int outputBufferId = mMediaCodec.dequeueOutputBuffer(vBufferInfo, 0);
                if (outputBufferId >= 0) {
                    ByteBuffer bb = outputBuffers[outputBufferId];
                    onEncodedAvcFrame(bb, vBufferInfo);
                    mMediaCodec.releaseOutputBuffer(outputBufferId, false);
                }
            } catch (Exception e) {
                e.printStackTrace();
                break;
            }
        }
    }
});
```
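Note that getOutputBuffers() has been deprecated since API 21; on Lollipop and above the per-index variant does the same job. A minimal sketch of the equivalent loop body:

```java
int outputBufferId = mMediaCodec.dequeueOutputBuffer(vBufferInfo, 0);
if (outputBufferId >= 0) {
    // Fetch only the buffer that just became available
    ByteBuffer bb = mMediaCodec.getOutputBuffer(outputBufferId);
    onEncodedAvcFrame(bb, vBufferInfo);
    mMediaCodec.releaseOutputBuffer(outputBufferId, false);
}
```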
  6. Process the encoded data in onEncodedAvcFrame

  7. Transmit the data (a combined sketch of these two steps follows this list)
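A minimal sketch of steps 6 and 7 together; the mClient field is an assumption standing in for a connected MWebSocketClient like the one shown in section 3.3:

```java
private void onEncodedAvcFrame(ByteBuffer bb, MediaCodec.BufferInfo vBufferInfo) {
    // Copy the encoded frame out of the codec's output buffer
    byte[] bytes = new byte[vBufferInfo.size];
    bb.position(vBufferInfo.offset);
    bb.limit(vBufferInfo.offset + vBufferInfo.size);
    bb.get(bytes);
    // Hand the frame to the transport layer (see section 3.3)
    if (mClient != null && mClient.isOpen()) {
        mClient.send(bytes);   // send(byte[]) transmits one binary WebSocket frame
    }
}
```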

3.2 Android hard decoding

Overall, the process breaks down into the following steps:

receive data → create a SurfaceView → associate the SurfaceView with the decoder → decode on the chip → render

Detailed code steps are as follows:

  1. Receive data

Pass the received binary data into the decoding utility class:

```java
@Override
public void onReceive(byte[] packet) {
    mediaDecodeUtil.decodeFrame(packet);
}
```
  2. Create the SurfaceView

After the SurfaceView is created, associate it with the decoder class:

```java
surface_view.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "surfaceCreated");
        try {
            if (mediaDecodeUtil != null) mediaDecodeUtil.onInit(surface_view);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Log.d(TAG, "surfaceChanged");
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "surfaceDestroyed");
        if (mediaDecodeUtil != null) mediaDecodeUtil.onDestroy();
    }
});
```
The decoder is then configured against the SurfaceView's Surface:

```java
private void configDecoder(MediaFormat newMediaFormat, SurfaceView surfaceView) {
    if (mediaCodec == null) return;
    try {
        // Stop the codec before reconfiguring it
        mediaCodec.stop();
        mediaFormat = newMediaFormat;
        // Bind the MediaCodec to the SurfaceView's Surface.
        // NOTE: this line must not be called until the SurfaceView has finished drawing!
        mediaCodec.configure(newMediaFormat, surfaceView.getHolder().getSurface(), null, 0);
        // BITRATE_MODE_CBR: the encoder tries to hold the output bit rate constant
        // mediaFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR);
        // BITRATE_MODE_CQ: do not control the bit rate at all
        mediaFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
        // BITRATE_MODE_VBR: adjust the output bit rate dynamically with image complexity
        // (i.e. how much the frames change) -- complex images get a higher bit rate
        // mediaFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR);
        mediaCodec.start();
        // Keep the video aspect ratio; this call only takes effect after configure() and start()
        mediaCodec.setVideoScalingMode(MediaCodec.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
```
  3. Decode on the chip

The logic here is: when data arrives, ask the codec for a free input buffer on the DSP decoding chip, then hand the frame over for processing. A short timeout is used when polling for a free buffer; a frame can only be decoded once a buffer is actually free, and feeding frames without one leads to a green or garbled screen.

```java
private void decodeFrameDetail(byte[] bytes) {
    // Ask for a free input buffer; returns -1 if none frees up within TIME_OUT_US
    int inIndex = mediaCodec.dequeueInputBuffer(TIME_OUT_US);
    if (inIndex >= 0) {
        // Take the available buffer at that index
        ByteBuffer byteBuffer = mediaCodec.getInputBuffer(inIndex);
        if (byteBuffer != null) {
            // Put one frame's worth of data into the buffer and queue it for decoding
            byteBuffer.put(bytes, 0, bytes.length);
            mediaCodec.queueInputBuffer(inIndex, 0, bytes.length, 0, 0);
            // mediaCodec.queueInputBuffer(inIndex, 0, bytes.length, System.currentTimeMillis(), 0);
        }
    } else {
        // No DSP input buffer is available. Consider retrying 5-10 times in a loop;
        // if none frees up, drop the frame.
        Log.d(TAG, "no DSP currently available");
        return;
    }
    // Fetch the decoded output
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    if (!isNeedContinuePlay) return;
    int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, TIME_OUT_US);
    // Release every decoded buffer; render == true draws it to the Surface
    while (outIndex >= 0) {
        mediaCodec.releaseOutputBuffer(outIndex, true);
        outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, TIME_OUT_US);
        if (mIsNeedFixWH) {
            fixHW();
            mIsNeedFixWH = false;
        }
    }
}
```
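dequeueOutputBuffer can also return negative status codes instead of a buffer index. A minimal sketch of handling the format-change case, which pairs with the fixHW() aspect-ratio fix above; this handling is an assumption, not part of the original demo:

```java
int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, TIME_OUT_US);
if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // The decoder reports the real output size here, e.g. after parsing SPS/PPS
    MediaFormat newFormat = mediaCodec.getOutputFormat();
    int width = newFormat.getInteger(MediaFormat.KEY_WIDTH);
    int height = newFormat.getInteger(MediaFormat.KEY_HEIGHT);
    Log.d(TAG, "output format changed: " + width + "x" + height);
} else if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No decoded output became available within TIME_OUT_US
}
```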

3.3 Transmission

The encoded byte stream is sent frequently, and each frame carries a fair amount of data, so using Android's native TCP Socket directly is troublesome: you have to deal with packet fragmentation and sticky packets yourself. Those problems are solvable, but for convenience the demo uses WebSocket instead: WebSocket is a message-based protocol on top of TCP's byte stream, so the framing work is already done for us.

  1. First add the dependency, since the library is not part of the framework

```groovy
implementation "org.java-websocket:Java-WebSocket:1.3.6"
```
  2. Encapsulate two utility classes, one client and one server, so the encoded data we obtain can simply be handed to them.

Client key code


```java
public class MWebSocketClient extends WebSocketClient {

    private final String TAG = "MWebSocketClient";
    private boolean mIsConnected = false;
    private CallBack mCallBack;

    public MWebSocketClient(URI serverUri, CallBack callBack) {
        super(serverUri);
        this.mCallBack = callBack;
    }

    @Override
    public void onOpen(ServerHandshake handshakedata) {
        // ...
    }

    @Override
    public void onMessage(String message) {
        // ...
    }

    @Override
    public void onMessage(ByteBuffer bytes) {
        // Binary frames arrive here; copy them out and hand them to the callback
        byte[] buf = new byte[bytes.remaining()];
        bytes.get(buf);
        if (mCallBack != null)
            mCallBack.onClientReceive(buf);
    }

    @Override
    public void onClose(int code, String reason, boolean remote) {
        // ...
    }

    @Override
    public void onError(Exception ex) {
        // ...
    }
}
```
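A minimal usage sketch; the address is just an example, and CallBack is the callback interface assumed by the class above:

```java
try {
    MWebSocketClient client = new MWebSocketClient(new URI("ws://192.168.1.100:8887"), new CallBack() {
        @Override
        public void onClientReceive(byte[] packet) {
            // Feed received binary frames to the decoder, as in section 3.2
            mediaDecodeUtil.decodeFrame(packet);
        }
    });
    client.connect();   // asynchronous; use connectBlocking() to wait for the handshake
} catch (URISyntaxException e) {
    e.printStackTrace();
}
```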

Server key code


```java
public class MWebSocketServer extends WebSocketServer {

    @Override
    public void onOpen(WebSocket webSocket, ClientHandshake handshake) {
    }

    @Override
    public void onClose(WebSocket conn, int code, String reason, boolean remote) {
    }

    @Override
    public void onMessage(WebSocket conn, String message) {
    }

    @Override
    public void onError(WebSocket conn, Exception ex) {
    }

    @Override
    public void onStart() {
    }
}
```
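A minimal usage sketch. The skeleton above does not show a constructor, so one that passes the listening address through is assumed here (the port is an example):

```java
// Assumed constructor inside MWebSocketServer:
//     public MWebSocketServer(InetSocketAddress address) { super(address); }
MWebSocketServer server = new MWebSocketServer(new InetSocketAddress(8887));
server.start();   // starts the listener thread

// Inside onOpen/onMessage, each connected WebSocket can be written to directly,
// e.g. to push an encoded frame to a viewer:
// conn.send(encodedFrameBytes);   // send(byte[]) transmits one binary frame
```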