Recently I learned how to build an educational live-streaming app with the TRTC SDK of Tencent Cloud real-time audio/video. One requirement comes up in almost every live-streaming scenario:

How to have multiple people live-streaming on screen at the same time

Here is the finished effect first:

(Please ignore the greasy face in the picture.)

So let's talk about how to implement this with Texture's ASCollectionNode.

To learn more about Texture, check out the Texture website

Before we start, a brief introduction to Tencent's real-time audio/video service, TRTC. With TRTC we can quickly render real-time video data onto views without having to implement low-latency audio/video interaction ourselves, which lets us focus on our own business logic.

Tencent Real-Time Communication (TRTC)

Tencent Real-Time Communication (TRTC) is offered to developers through Tencent Cloud in two scenarios, multi-person audio/video calls and low-latency interactive live streaming, and draws on Tencent's 21 years of accumulated experience in networking and audio/video technology. It aims to help developers quickly build low-cost, low-latency, high-quality interactive audio/video solutions.

TRTC focuses on multi-person audio/video calls and low-latency interactive live streaming across all platforms. SDKs are provided for Mini Programs, Web, Android, iOS, Electron, Windows, macOS, Linux, and more, so developers can quickly integrate with the TRTC cloud service backend. Through interaction with other Tencent Cloud products, TRTC can also be combined easily with Instant Messaging (IM), Cloud Streaming Services (CSS), Video on Demand (VOD), and others to cover more business scenarios.

The TRTC product architecture is shown in the figure below:

During development I found that integrating Tencent's TRTC SDK was very quick, and the latency of live video and voice stayed within an acceptable range. For the core features we use and a detailed introduction, see the TRTC product details on the official website.

Tencent Demo

Let's start with the demo that ships with Tencent's TRTC SDK. In it, each on-stage live view is a single UIView, and UIViews are added or removed dynamically as users go on or off stage. The relevant code:

@property (weak, nonatomic) IBOutlet UIView *renderViewContainer;
@property (nonatomic, strong) NSMutableArray *renderViews;

#pragma mark - Accessor
- (NSMutableArray *)renderViews
{
    if (!_renderViews) {
        _renderViews = [NSMutableArray array];
    }
    return _renderViews;
}

#pragma mark - render view
- (void)updateRenderViewsLayout
{
    NSArray *rects = [self getRenderViewFrames];
    if (rects.count != self.renderViews.count) {
        return;
    }
    for (int i = 0; i < self.renderViews.count; ++i) {
        UIView *view = self.renderViews[i];
        view.frame = [rects[i] CGRectValue];
        if (!view.superview) {
            [self.renderViewContainer addSubview:view];
        }
    }
}

- (NSArray *)getRenderViewFrames
{
    CGFloat height = self.renderViewContainer.frame.size.height;
    CGFloat width = self.renderViewContainer.frame.size.width / 5;
    CGFloat xOffset = 0;
    NSMutableArray *array = [NSMutableArray array];
    for (int i = 0; i < self.renderViews.count; i++) {
        CGRect frame = CGRectMake(xOffset, 0, width, height);
        [array addObject:[NSValue valueWithCGRect:frame]];
        xOffset += width;
    }
    return array;
}

- (TICRenderView *)getRenderView:(NSString *)userId streamType:(TICStreamType)streamType
{
    for (TICRenderView *render in self.renderViews) {
        if ([render.userId isEqualToString:userId] && render.streamType == streamType) {
            return render;
        }
    }
    return nil;
}

#pragma mark - event listener
- (void)onTICUserVideoAvailable:(NSString *)userId available:(BOOL)available
{
    if(available){
        TICRenderView *render = [[TICRenderView alloc] init];
        render.userId = userId;
        render.streamType = TICStreamType_Main;
        [self.renderViewContainer addSubview:render];
        [self.renderViews addObject:render];
        [[[TICManager sharedInstance] getTRTCCloud] startRemoteView:userId view:render];
    }
    else{
        TICRenderView *render = [self getRenderView:userId streamType:TICStreamType_Main];
        [self.renderViews removeObject:render];
        [render removeFromSuperview];
        [[[TICManager sharedInstance] getTRTCCloud] stopRemoteView:userId];
    }
    [self updateRenderViewsLayout];
}

- (void)onTICUserSubStreamAvailable:(NSString *)userId available:(BOOL)available
{
    if(available){
        TICRenderView *render = [[TICRenderView alloc] init];
        render.userId = userId;
        render.streamType = TICStreamType_Sub;
        [self.renderViewContainer addSubview:render];
        [self.renderViews addObject:render];
        [[[TICManager sharedInstance] getTRTCCloud] startRemoteSubStreamView:userId view:render];
    }
    else{
        TICRenderView *render = [self getRenderView:userId streamType:TICStreamType_Sub];
        [self.renderViews removeObject:render];
        [render removeFromSuperview];
        [[[TICManager sharedInstance] getTRTCCloud] stopRemoteSubStreamView:userId];
    }
    [self updateRenderViewsLayout];
}

I'm not sure what the advantage of this approach is, and the code doesn't sit well with me, although it works and is easy enough to follow: when someone comes on stage, add a UIView, adjust the frames, and let Tencent's TRTC SDK render the remote or local stream onto that UIView.

However, in our specific business scenario, each live view contains more than the video stream; it also carries other interactive elements and state, such as the on-stage user's nickname, whether they have permission to speak, and whether they are currently talking. Each live view is therefore much more like a UICollectionView item.

So I needed to modify, or rather "refactor," this piece of code.

Now it’s time for our hero: ASCollectionNode

ASCollectionNode

ASCollectionNode is equivalent to UIKit’s UICollectionView and can be used in place of any UICollectionView.

For an introduction to its usage, see the official documentation; anyone who has used UICollectionView will find it very simple. See the official website link.

Initialization

@interface ZJRendersView : UIView <ASCollectionDataSourceInterop, ASCollectionDelegate, ASCollectionViewLayoutInspecting>

@property (nonatomic, strong) ASCollectionNode *collectionNode;
@property (nonatomic, strong) NSMutableDictionary<NSString *, NSMutableDictionary *> *onlineUsers;

We create a mutable dictionary, onlineUsers, keyed by user ID, to hold each on-stage user's information.

- (instancetype)init {
    self = [super init];
    if (self) {
        _onlineUsers = [NSMutableDictionary new];

        UICollectionViewFlowLayout *flowLayout = [[UICollectionViewFlowLayout alloc] init];
        flowLayout.minimumInteritemSpacing = 0.1;
        flowLayout.minimumLineSpacing = 0.1;

        _collectionNode = [[ASCollectionNode alloc] initWithCollectionViewLayout:flowLayout];
        _collectionNode.dataSource = self;
        _collectionNode.delegate = self;
        _collectionNode.backgroundColor = UIColorClear;
        _collectionNode.layoutInspector = self;
        [self addSubnode:_collectionNode];

        [_collectionNode.view mas_makeConstraints:^(MASConstraintMaker *make) {
            make.left.mas_equalTo(self);
            make.top.mas_equalTo(self);
            make.right.mas_equalTo(self);
            make.bottom.mas_equalTo(self);
        }];
    }
    return self;
}

Initialization is fairly straightforward; the layout is done in code with Masonry, which saves a lot of effort. Since this is a team project, we avoid storyboards as much as possible and build layout and views entirely in code.

ASCollectionDataSource

#pragma mark - ASCollectionNode data source.

- (ASCellNodeBlock)collectionNode:(ASCollectionNode *)collectionNode nodeBlockForItemAtIndexPath:(NSIndexPath *)indexPath
{
    NSString *key = _keys[indexPath.item];
    NSDictionary *user = _onlineUsers[key];
    ASCellNode *(^cellNodeBlock)(void) = ^ASCellNode *() {
        return [[ZJRenderNode alloc] initWithHashID:key user:user];
    };
    return cellNodeBlock;
}

// The below 2 methods are required by ASCollectionViewLayoutInspecting, but ASCollectionLayout
// and its layout delegate are the ones that really determine the size ranges and directions.
// TODO: Remove these methods once a layout inspector is no longer required under ASCollectionLayout mode.
- (ASSizeRange)collectionView:(ASCollectionView *)collectionView constrainedSizeForNodeAtIndexPath:(NSIndexPath *)indexPath
{
    CGSize size = CGSizeMake([UIScreen mainScreen].bounds.size.width / 7.0, self.bounds.size.height);
    return ASSizeRangeMake(size, size);
}

- (ASScrollDirection)scrollableDirections
{
    return ASScrollDirectionHorizontalDirections;
}

- (NSInteger)numberOfSectionsInCollectionNode:(ASCollectionNode *)collectionNode
{
    return 1;
}

- (NSInteger)collectionNode:(ASCollectionNode *)collectionNode numberOfItemsInSection:(NSInteger)section
{
    return _keys.count;
}

Here, to meet our business requirement that all live views sit in one horizontal row, the scroll direction is set to ASScrollDirectionHorizontalDirections and numberOfSectionsInCollectionNode returns 1. Each item is constrained to CGSizeMake([UIScreen mainScreen].bounds.size.width / 7.0, self.bounds.size.height), i.e. one seventh of the screen width at full height.
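The data source above reads from _keys and _onlineUsers, which have to be kept in sync with the collection node as users go on and off stage. A minimal sketch of how that pairing might look; the method names userDidEnter:/userDidLeave: are my own, not from the demo:

```objectivec
// Hypothetical helpers (names are assumptions) keeping _keys/_onlineUsers in
// sync with the collection node. ASCollectionNode's insert/delete APIs mirror
// UICollectionView's batch-update methods.
- (void)userDidEnter:(NSString *)key info:(NSMutableDictionary *)info
{
    if (_onlineUsers[key]) { return; } // already on stage
    _onlineUsers[key] = info;
    [_keys addObject:key];
    NSIndexPath *path = [NSIndexPath indexPathForItem:_keys.count - 1 inSection:0];
    [_collectionNode insertItemsAtIndexPaths:@[ path ]];
}

- (void)userDidLeave:(NSString *)key
{
    NSUInteger index = [_keys indexOfObject:key];
    if (index == NSNotFound) { return; }
    [_onlineUsers removeObjectForKey:key];
    [_keys removeObjectAtIndex:index];
    NSIndexPath *path = [NSIndexPath indexPathForItem:index inSection:0];
    [_collectionNode deleteItemsAtIndexPaths:@[ path ]];
}
```

Inserting and deleting individual items this way avoids a full reloadData and lets the collection node animate stage changes.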

Next is how to do the layout of each item.

ZJRenderNode

As shown in the picture below, each live broadcast interface contains many elements, such as the lecturer’s mark, user name, voice volume bar, number of trophies won, etc.

ASButtonNode, ASAbsoluteLayoutSpec, and ASInsetLayoutSpec were introduced in previous articles.

We’re going to look at some of the others that we use today.

- (instancetype)initWithHashID:(NSString *)key user:(NSDictionary *)user {
    self = [super init];
    if (self) {
        _backgroundNode = [[ASDisplayNode alloc] init];
        [self addSubnode:_backgroundNode];

        _bottomBackgroundNode = [[ASDisplayNode alloc] init];
        _bottomBackgroundNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0.522];
        [self addSubnode:_bottomBackgroundNode];

        _nicknameNode = [[ASTextNode alloc] init];
        _nicknameNode.maximumNumberOfLines = 1;
        _nicknameNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [_bottomBackgroundNode addSubnode:_nicknameNode];

        _permissionNode = [ASImageNode new];
        _permissionNode.image = UIImageMake(@"icon_permission");
        _permissionNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [self addSubnode:_permissionNode];

        _microNode = [ASImageNode new];
        _microNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [_bottomBackgroundNode addSubnode:_microNode];

        _zanNode = [[ASButtonNode alloc] init];
        [_zanNode setImage:UIImageMake(@"icon_zan") forState:UIControlStateNormal];
        [_zanNode setContentHorizontalAlignment:ASHorizontalAlignmentMiddle];
        [_zanNode setContentSpacing:2];
        _zanNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        _zanNode.hidden = YES;
        [_bottomBackgroundNode addSubnode:_zanNode];

        _volumnNode = [[ASDisplayNode alloc] init];
        _volumnNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [self addSubnode:_volumnNode];

        _teacherIconNode = [ASImageNode new];
        _teacherIconNode.image = UIImageMake(@"icon_jiangshi");
        _teacherIconNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [self insertSubnode:_teacherIconNode aboveSubnode:_volumnNode];

        [self updatePermission:user];
    }
    return self;
}

There are three main layouts to consider.

The first is the backgroundNode, which receives the video stream (remote or local) and displays the live picture. In our design the video stream acts as a background layer, with the other elements added on top, so we use ASBackgroundLayoutSpec.

ASBackgroundLayoutSpec

ASBackgroundLayoutSpec lays out a component (blue), stretching another component behind it as a backdrop (red).

The background spec’s size is calculated from the child’s size. In the diagram below, the child is the blue layer. The child’s size is then passed as the constrainedSize to the background layout element (red layer). Thus, it is important that the child (blue layer) must have an intrinsic size or a size set on it.

ASInsetLayoutSpec *backgroundInsetLayoutSpec =
    [ASInsetLayoutSpec insetLayoutSpecWithInsets:UIEdgeInsetsMake(0, 0, 0, 0)
                                           child:_backgroundNode];

return [ASBackgroundLayoutSpec backgroundLayoutSpecWithChild:contentSpec background:backgroundInsetLayoutSpec];

The second is the bottom view, bottomBackgroundNode, which lays out the microphone icon, nickname, likes, and so on. For this layout we use Masonry constraints.

dispatch_async(dispatch_get_main_queue(), ^{
    NSString *voiceIcon = [_user[@"voice"] boolValue] ? @"icon_microphone_good" : @"icon_microphone_bad";
    _microNode.image = UIImageMake(voiceIcon);

    if ([_key isEqualToString:_my_key]) {
        if ([_user[@"voice"] boolValue]) {
            [[[TICManager sharedInstance] getTRTCCloud] startLocalAudio];
        } else {
            [[[TICManager sharedInstance] getTRTCCloud] stopLocalAudio];
        }
        [[[TICManager sharedInstance] getTRTCCloud] muteLocalAudio:![_user[@"voice"] boolValue]];
    }

    // Update the thumbs-up count
    if (_user && [_user[@"zan"] intValue] > 0) {
        _zanNode.hidden = NO;
        [_zanNode setTitle:_user[@"zan"] withFont:UIFontMake(10) withColor:UIColor.ZJ_tintColor forState:UIControlStateNormal];
    }

    // User nickname
    if (_user[@"nickname"] != nil) {
        NSString *nickname = [_user[@"nickname"] stringValue].length > 7
            ? [[_user[@"nickname"] stringValue] substringWithRange:NSMakeRange(0, 7)]
            : [_user[@"nickname"] stringValue];
        _nicknameNode.attributedText = [[NSAttributedString alloc] initWithString:nickname attributes:@{
            NSFontAttributeName : UIFontMake(10),
            NSForegroundColorAttributeName : UIColor.ZJ_tintColor,
        }];
    }

    _teacherIconNode.hidden = ![_user[@"isteacher"] boolValue];
    _permissionNode.hidden = [_user[@"isteacher"] boolValue] || ![_user[@"board"] boolValue];

    [_permissionNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.top.mas_equalTo(self.view.mas_top).offset(4);
        make.right.mas_equalTo(self.view.mas_right).offset(-4);
        make.width.mas_equalTo(11);
        make.height.mas_equalTo(10);
    }];

    [_microNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);
        make.left.mas_equalTo(_bottomBackgroundNode.view).offset(4);
        make.width.mas_equalTo(7.5);
        make.height.mas_equalTo(9);
    }];

    [_zanNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);
        make.right.mas_equalTo(_bottomBackgroundNode.view.mas_right).offset(-4);
        make.width.mas_equalTo(18);
        make.height.mas_equalTo(13.5);
    }];

    CGSize size = [_nicknameNode calculateSizeThatFits:CGSizeMake(20, 16)];
    [_nicknameNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.left.mas_equalTo(self.microNode.view.mas_right).offset(4);
        make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);
        make.right.mas_equalTo(_zanNode.view.mas_left);
        make.height.mas_equalTo(size.height);
    }];
});
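One caveat with the nickname truncation above: NSString length counts UTF-16 code units, so substringWithRange:NSMakeRange(0, 7) can cut an emoji or other composed character in half. A hedged alternative, with a helper name of my own, clamps the cut to whole composed character sequences:

```objectivec
#import <Foundation/Foundation.h>

// Hypothetical helper: truncate a nickname without splitting composed
// characters (emoji, etc.). The range is extended to cover whole composed
// character sequences, so the result may be slightly longer than maxLength.
static NSString *ZJSafeTruncate(NSString *nickname, NSUInteger maxLength)
{
    if (nickname.length <= maxLength) {
        return nickname;
    }
    NSRange safe = [nickname rangeOfComposedCharacterSequencesForRange:NSMakeRange(0, maxLength)];
    return [nickname substringWithRange:safe];
}
```

For pure ASCII nicknames this behaves exactly like the fixed-index cut; it only differs when the seventh UTF-16 unit falls inside a surrogate pair or combining sequence.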

When [_user[@"voice"] boolValue] is NO, i.e. the user has no permission to speak, we stop local audio:

[[[TICManager sharedInstance] getTRTCCloud] stopLocalAudio];

Otherwise, open local audio:

[[[TICManager sharedInstance] getTRTCCloud] startLocalAudio];
[[[TICManager sharedInstance] getTRTCCloud] muteLocalAudio:![_user[@"voice"] boolValue]];

All the controls in the bottom bar are vertically centered:

make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);

The important thing to note here is the _nicknameNode layout: you need to calculate its size with calculateSizeThatFits: before you can constrain it.

The layout here needs to be executed on the main thread:

dispatch_async(dispatch_get_main_queue(), ^{});

The third is the layout of our voice volume bar

[_volumnNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
    make.left.equalTo(self.view.mas_left).offset(5);
    make.bottom.mas_equalTo(_bottomBackgroundNode.view.mas_top);
    make.height.mas_equalTo(30);
    make.width.mas_equalTo(5.5);
}];

for (NSUInteger i = 0; i < 10; i++) {
    ASImageNode *itemView = [[ASImageNode alloc] init];
    itemView.image = UIImageMake(@"icon_voiced");
    [itemView setHidden:YES];
    [_volumnNode addSubnode:itemView];
    [_renderNodes addObject:itemView];
    [_renderViews addObject:itemView.view];
}

[_renderViews mas_distributeViewsAlongAxis:MASAxisTypeVertical withFixedSpacing:0.5 leadSpacing:0 tailSpacing:0];
[_renderViews mas_updateConstraints:^(MASConstraintMaker *make) {
    // The vertical axis is handled by the distribution; center each bar horizontally
    make.centerX.mas_equalTo(self.volumnNode.view.mas_centerX);
    make.width.mas_equalTo(5.5);
    make.height.mas_equalTo(2.5);
}];

We divide the volume into 10 equal segments, each an ASImageNode, and stack them vertically with mas_distributeViewsAlongAxis using a fixed spacing of 0.5. Each segment is 2.5 points high, so ten segments plus nine gaps come to 29.5 points, which fills the 30-point-high volumnNode.

Complete the layout

- (ASLayoutSpec *)layoutSpecThatFits:(ASSizeRange)constrainedSize
{
    NSMutableArray *mainStackContent = [[NSMutableArray alloc] init];

    if ([_user[@"isteacher"] boolValue]) {
        _teacherIconNode.style.preferredSize = CGSizeMake(22, 22.5);
        _teacherIconNode.style.layoutPosition = CGPointMake(0, 0);
        UIEdgeInsets insets = UIEdgeInsetsMake(0, 0, 0, 0);
        ASInsetLayoutSpec *teacherIconSpec = [ASInsetLayoutSpec insetLayoutSpecWithInsets:insets child:_teacherIconNode];
        [mainStackContent addObject:teacherIconSpec];
    }

    _volumnNode.style.preferredSize = CGSizeMake(8.5, 50);
    _volumnNode.style.layoutPosition = CGPointMake(5, 20);

    _bottomBackgroundNode.style.preferredSize = CGSizeMake(constrainedSize.max.width, 16);
    _bottomBackgroundNode.style.layoutPosition = CGPointMake(0, constrainedSize.max.height - 16);

    [mainStackContent addObject:_volumnNode];
    [mainStackContent addObject:_bottomBackgroundNode];

    ASAbsoluteLayoutSpec *contentSpec = [ASAbsoluteLayoutSpec absoluteLayoutSpecWithChildren:mainStackContent];
    ASInsetLayoutSpec *backgroundInsetLayoutSpec = [ASInsetLayoutSpec insetLayoutSpecWithInsets:UIEdgeInsetsMake(0, 0, 0, 0)
                                                                                         child:_backgroundNode];

    return [ASBackgroundLayoutSpec backgroundLayoutSpecWithChild:contentSpec background:backgroundInsetLayoutSpec];
}

Because the layout structure is simple and the positions are fixed, we use ASAbsoluteLayoutSpec, which was introduced in the previous article and won't be covered again here.

Combining with TRTC

With the ASCollectionNode layout in place, the next step is to combine it with TRTC for stream pushing and the on-stage/off-stage logic.

Initializing TRTC

# Podfile
use_frameworks!
pod 'TEduBoard_iOS', '2.4.6.1'
pod 'TXIMSDK_iOS', '4.6.101'
pod 'TXLiteAVSDK_TRTC', '6.9.8341'

Following TIC, the education solution provided by Tencent Cloud, it is recommended to install the three pods above (whiteboard, IM chat, and TRTC real-time audio/video).

Initialize in the AppDelegate:

[[TICManager sharedInstance] init:sdkAppid callback:^(TICModule module, int code, NSString *desc) {
    if (code == 0) {
        [[TICManager sharedInstance] addStatusListener:self];
    }
}];

The helper code from the official demo was brought in directly and extended for our business needs, without secondary processing, which makes it easy to track updates and iterations of the official SDK.

Note: the networking dependency used by the official demo is CocoaAsyncSocket; see robbiehanson/CocoaAsyncSocket for reference.

The next step is to log into the room.

[[TICManager sharedInstance] login:userId userSig:userSig callback:^(TICModule module, int code, NSString *desc) {
    if (code == 0) {
        [JMLoadingHUD hide];
        [QMUITips showSucceed:@"Login successful" inView:[[UIApplication sharedApplication] keyWindow] hideAfterDelay:3];
        ZJClassRoomViewController *classRoom = [ZJClassRoomViewController new];
        TICClassroomOption *option = [[TICClassroomOption alloc] init];
        option.classId = (UInt32)[json[@"room"][@"id"] intValue];
        classRoom.option = option;
        [ws.navigationController pushViewController:classRoom animated:YES];
    }
    else {
        [JMLoadingHUD hide];
        [[JMToast sharedToast] showDialogWithMsg:[NSString stringWithFormat:@"Login failed: %d,%@", code, desc]];
    }
}];

The userSig here must be generated in coordination with the backend; refer to the official generation rules and API documentation.

[[TICManager sharedInstance] addMessageListener:self];
[[TICManager sharedInstance] addEventListener:self];
__weak typeof(self) ws = self;
[[TICManager sharedInstance] joinClassroom:option callback:^(TICModule module, int code, NSString *desc) {
    if (code == 0) {
//        [JMLoadingHUD hide];
        [QMUITips showSucceed:@"Class is ready." inView:[[UIApplication sharedApplication] keyWindow] hideAfterDelay:3];
        // Other business code
    }
    else {
        if (code == 10015) {
            [[JMToast sharedToast] showDialogWithMsg:@"Class does not exist, please \"create class\""];
        }
        else {
            [[JMToast sharedToast] showDialogWithMsg:[NSString stringWithFormat:@"Failed to join class: %d %@", code, desc]];
        }
        [JMLoadingHUD hide];
    }
}];

Joining the class mainly involves initializing the whiteboard and joining the IM group; refer to the demo provided by Tencent:

- (void)joinClassroom:(TICClassroomOption *)option callback:(TICCallback)callback
{
    _option = option;
    _enterCallback = callback;

    // Whiteboard initialization
    __weak typeof(self) ws = self;
    void (^createBoard)(void) = ^(void) {
        TEduBoardAuthParam *authParam = [[TEduBoardAuthParam alloc] init];
        authParam.sdkAppId = ws.sdkAppId;
        authParam.userId = ws.userId;
        authParam.userSig = ws.userSig;
        TEduBoardInitParam *initParam = option.boardInitParam;
        if (!initParam) {
            initParam = [[TEduBoardInitParam alloc] init];
        }
        [ws report:TIC_REPORT_INIT_BOARD_START];
        ws.boardController = [[TEduBoardController alloc] initWithAuthParam:authParam roomId:ws.option.classId initParam:initParam];
        [ws.boardController addDelegate:ws];
        if (option.boardDelegate) {
            [ws.boardController addDelegate:option.boardDelegate];
        }
    };

    [self report:TIC_REPORT_JOIN_GROUP_START];
    void (^succ)(void) = ^{
        [ws report:TIC_REPORT_JOIN_GROUP_END];
        createBoard();
    };
    void (^fail)(int, NSString *) = ^(int code, NSString *msg) {
        [ws report:TIC_REPORT_JOIN_GROUP_END code:code msg:msg];
        TICBLOCK_SAFE_RUN(callback, TICMODULE_IMSDK, code, msg);
    };

    [self joinIMGroup:[@(_option.classId) stringValue] succ:^{
        if (ws.option.compatSaas) {
            NSString *chatGroup = [self getChatGroup];
            [self joinIMGroup:chatGroup succ:^{
                succ();
            } fail:^(int code, NSString *msg) {
                fail(code, msg);
            }];
        }
        else {
            succ();
        }
    } fail:^(int code, NSString *msg) {
        fail(code, msg);
    }];
}

The whiteboard

The whiteboard is a core feature of educational live streaming. Lecturers, and users who have been granted permission, can draw on the whiteboard and collaborate through it:

UIView *boardView = [[[TICManager sharedInstance] getBoardController] getBoardRenderView];
[[[TICManager sharedInstance] getBoardController] setDrawEnable:NO];
boardView.frame = self.boardBackgroudView.bounds;
[self.boardBackgroudView addSubview:boardView];
[[[TICManager sharedInstance] getBoardController] addDelegate:self];

Some whiteboard functions required in real business scenarios:

/** @brief Set the whiteboard tool */
- (void)onSelectToolType:(int)toolType {
    [[[TICManager sharedInstance] getBoardController] setToolType:(TEduBoardToolType)toolType];
}

/** @brief Set the brush color, used for all doodle drawing */
- (void)onSelectBrushColor:(UIColor *)color {
    [[[TICManager sharedInstance] getBoardController] setBrushColor:color];
}

/** @brief Set the brush thickness; the actual pixel value is (thin * whiteboard height / 10000) px, with a 1 px minimum */
- (void)onBrushThinChanged:(float)thin {
    [[[TICManager sharedInstance] getBoardController] setBrushThin:thin];
}

/** @brief Set the text color */
- (void)onSelectTextColor:(UIColor *)color {
    [[[TICManager sharedInstance] getBoardController] setTextColor:color];
}

/** @brief Set the background color of the current page; the default after creation comes from SetDefaultBackgroundColor */
- (void)onSelectBackgroundColor:(UIColor *)color {
    [[[TICManager sharedInstance] getBoardController] setBackgroundColor:color];
}

/** @brief Set the text size; the actual pixel value is (size * whiteboard height / 10000) px */
- (void)onTextSizeChanged:(float)size {
    [[[TICManager sharedInstance] getBoardController] setTextSize:size];
}

/** @brief Set whether doodling on the whiteboard is allowed */
- (void)onDrawEnableChanged:(BOOL)state {
    [[[TICManager sharedInstance] getBoardController] setDrawEnable:state];
}

/** @brief Set whether whiteboard data synchronization is enabled (on by default after creation) */
- (void)onSyncDataChanged:(BOOL)state {
    [[[TICManager sharedInstance] getBoardController] setDataSyncEnable:state];
}

/** @brief Set the background H5 page of the current whiteboard; mutually exclusive with SetBackgroundImage */
- (void)onSetBackgroundH5:(NSString *)url {
    [[[TICManager sharedInstance] getBoardController] setBackgroundH5:url];
}

/** @brief Set the text style */
- (void)onSetTextStyle:(int)style {
    [[[TICManager sharedInstance] getBoardController] setTextStyle:(TEduBoardTextStyle)style];
}

/** @brief Undo the last action on the whiteboard */
- (void)onUndo {
    [[[TICManager sharedInstance] getBoardController] undo];
}

/** @brief Redo the last undone action on the whiteboard */
- (void)onRedo {
    [[[TICManager sharedInstance] getBoardController] redo];
}

/** @brief Clear the current whiteboard page */
- (void)onClear {
    [[[TICManager sharedInstance] getBoardController] clear];
}

/** @brief Clear the doodles only */
- (void)onClearDraw {
    [[[TICManager sharedInstance] getBoardController] clearDraws];
}

/** @brief Reset the whiteboard; this deletes all whiteboard pages and files */
- (void)onReset {
    [[[TICManager sharedInstance] getBoardController] reset];
}

/** @brief Set the background image of the current page (UTF-8 URL); a valid local file path is uploaded to COS first */
- (void)onSetBackgroundImage:(NSString *)path {
    [[[TICManager sharedInstance] getBoardController] setBackgroundImage:path mode:TEDU_BOARD_IMAGE_FIT_MODE_CENTER];
}

Pushing and pulling video streams

When a remote stream is pushed, it fires our event callback:

/**
 * @param userId    The user ID
 * @param available Whether the user's video is available
 */
- (void)onTICUserVideoAvailable:(NSString *)userId available:(BOOL)available
{
    NSLog(@"onTICUserVideoAvailable userId: %@, available = %d", userId, available);
    [self.rendersView onTICUserVideoAvailable:userId available:available];
}

In response, we start or stop receiving the remote stream:

- (void)onTICUserVideoAvailable:(NSString *)userId available:(BOOL)available {
    [[[TICManager sharedInstance] getTRTCCloud] muteRemoteVideoStream:userId mute:!available];
}

When our server pushes a user on stage, we first add an ASCollectionNode item; the stream itself is turned on and off inside ZJRenderNode:

- (void)updateVideoStatus:(bool)available {
    dispatch_async(dispatch_get_main_queue(), ^{
        // Local user: preview our own camera; remote user: render their stream
        if ([_key isEqualToString:_my_key]) {
            if (available) {
                NSLog(@"startLocalPreview:");
                [[[TICManager sharedInstance] getTRTCCloud] startLocalPreview:YES view:_backgroundNode.view];
            } else {
                NSLog(@"stopLocalPreview:");
                [[[TICManager sharedInstance] getTRTCCloud] stopLocalPreview];
            }
        } else {
            if (available) {
                [[[TICManager sharedInstance] getTRTCCloud] startRemoteView:_hash_id view:_backgroundNode.view];
            } else {
                [[[TICManager sharedInstance] getTRTCCloud] stopRemoteView:_hash_id];
            }
        }
    });
}

Finally, when a server push arrives and you are in the off-stage list, your local stream push is closed directly:

// Stop pushing video and audio and stop operating the whiteboard
if ([key isEqualToString:_my_key]) {
    // Stop the local video push
    [[[TICManager sharedInstance] getTRTCCloud] stopLocalPreview];
    [[[TICManager sharedInstance] getTRTCCloud] stopLocalAudio];
    [[[TICManager sharedInstance] getBoardController] setDrawEnable:NO];
}

Handling audio volume

// TRTC volume callback
- (void)onUserVoiceVolume:(NSArray<TRTCVolumeInfo *> *)userVolumes totalVolume:(NSInteger)totalVolume
{
    [self.rendersView onUserVoiceVolume:userVolumes totalVolume:totalVolume];
}

// ZJRendersView.m
- (void)onUserVoiceVolume:(NSArray<TRTCVolumeInfo *> *)userVolumes totalVolume:(NSInteger)totalVolume
{
    for (TRTCVolumeInfo *info in userVolumes) {
        if (keys[info.userId]) {
            ZJRenderNode *node = [_collectionNode nodeForItemAtIndexPath:keys[info.userId]];
            [node updateVolumn:(info.volume / 10)];
        }
    }
}

// ZJRenderNode: update the volume UI, showing `count` bars from the bottom
- (void)updateVolumn:(NSUInteger)count
{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSUInteger i = 0;
        for (i = 0; i < 10 - count; ++i) {
            [_renderNodes[i] setHidden:YES];
        }
        for (NSUInteger j = i; j < 10; ++j) {
            [_renderNodes[j] setHidden:NO];
        }
    });
}

Conclusion

At this point our core features are complete. The missing piece is IM chat, which you can try building by combining this with the chat interface design from the previous article.

Combining ASCollectionNode with Tencent Cloud's real-time audio/video TRTC SDK, we built a multi-person interactive educational live broadcast. Judging from the experience and the live-stream quality, Tencent Cloud's real-time audio/video capability is very good: multi-person live streaming is easy to get going and latency stays within an acceptable few hundred milliseconds, so it's recommended.