1. Foreword

Gesture event collection is at the core of iOS click event collection. The idea behind it is not complicated, but it involves a number of tricky problems, and this article presents a solution for each of them.

Let's take a look at how to implement gesture event collection in iOS.

2. Gesture Introduction

Apple provides the UIGestureRecognizer[1] family of classes for handling gesture operations. Common gestures include:

  • UITapGestureRecognizer: tap;
  • UILongPressGestureRecognizer: long press;
  • UIPinchGestureRecognizer: pinch;
  • UIRotationGestureRecognizer: rotation.

The UIGestureRecognizer class defines a set of common behaviors that can be configured on all concrete gesture recognizers.

A gesture recognizer responds to touches on a specific view, so you need to associate the view with the gesture through UIView's -addGestureRecognizer: method.

A gesture recognizer can have multiple target-action pairs, which are independent of each other; after the gesture is recognized, a message is sent to each target-action pair.
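To make the target-action model concrete, here is a minimal sketch of how a developer typically wires up a gesture (the view and handler names below are illustrative, not taken from the article):

```objectivec
// Create a tap gesture recognizer with an initial target-action pair
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];

// Add a second, independent target-action pair to the same recognizer;
// both handlers are messaged when the tap is recognized
[tap addTarget:self action:@selector(logTap:)];

// Associate the gesture recognizer with a view
[self.contentView addGestureRecognizer:tap];
```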

3. Acquisition scheme

Because each gesture recognizer can be associated with multiple target-action pairs, we can combine this with Method Swizzling from the Objective-C Runtime: whenever the user adds a target-action to a gesture, we add an extra target-action pair of our own to collect the event.

Figure 3-1 shows the overall process:

Figure 3-1 Gesture event collection process

Let's look at the specific code implementation.

  1. Method Swizzling:

```objectivec
- (void)enableAutoTrackGesture {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        [UIGestureRecognizer sa_swizzleMethod:@selector(initWithTarget:action:)
                                   withMethod:@selector(sensorsdata_initWithTarget:action:)
                                        error:NULL];
        [UIGestureRecognizer sa_swizzleMethod:@selector(addTarget:action:)
                                   withMethod:@selector(sensorsdata_addTarget:action:)
                                        error:NULL];
    });
}
```
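The sa_swizzleMethod:withMethod:error: helper itself is not shown in the article. A minimal sketch based on the public Objective-C runtime APIs might look like the following (this is an assumption about its shape; the SDK's actual helper may handle errors and edge cases differently):

```objectivec
#import <objc/runtime.h>

+ (BOOL)sa_swizzleMethod:(SEL)origSel withMethod:(SEL)altSel error:(NSError **)error {
    Method origMethod = class_getInstanceMethod(self, origSel);
    Method altMethod = class_getInstanceMethod(self, altSel);
    if (!origMethod || !altMethod) {
        return NO; // error reporting omitted in this sketch
    }
    // Try to add both implementations to this class first, so that a method
    // inherited from a superclass is swizzled on this class only.
    class_addMethod(self, origSel,
                    method_getImplementation(origMethod), method_getTypeEncoding(origMethod));
    class_addMethod(self, altSel,
                    method_getImplementation(altMethod), method_getTypeEncoding(altMethod));
    method_exchangeImplementations(class_getInstanceMethod(self, origSel),
                                   class_getInstanceMethod(self, altSel));
    return YES;
}
```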

  2. Add a target-action for collecting events:

```objectivec
- (void)sensorsdata_addTarget:(id)target action:(SEL)action {
    self.sensorsdata_gestureTarget = [SAGestureTarget targetWithGesture:self];
    [self sensorsdata_addTarget:self.sensorsdata_gestureTarget
                         action:@selector(trackGestureRecognizerAppClick:)];
    [self sensorsdata_addTarget:target action:action];
}
```

  3. Gesture event collection:

```objectivec
- (void)trackGestureRecognizerAppClick:(UIGestureRecognizer *)gesture {
    // Gesture event collection...
}
```

Through Method Swizzling we can collect gesture events as intended, but there is a problem: many system behaviors are also implemented through gestures, so they would be collected as well, whereas our intention is to collect only gestures added by developers. Table 3-1 lists some private gestures:

Table 3-1 Partial private gestures

Avoiding the collection of system private gesture events thus becomes an urgent problem to solve.

3.1 Distinguishing by Bundle Path

There is no essential difference between system private gestures and public gestures: they all inherit, directly or indirectly, from UIGestureRecognizer.

Once a target-action is added to a gesture, we can determine whether the current gesture is a system private one based on the bundle of the class that the Target object belongs to.

The bundle path of a system library has the following format:

/System/Library/PrivateFrameworks/UIKitCore.framework
/System/Library/Frameworks/xxx.framework

The bundle path of a developer app has the following format (the UUID segment varies per install):

/private/var/containers/Bundle/Application/&lt;UUID&gt;/CDDStoreDemo.app

The implementation is as follows:

```objectivec
- (BOOL)isPrivateClassWithObject:(NSObject *)obj {
    if (!obj) {
        return NO;
    }
    NSString *bundlePath = [[NSBundle bundleForClass:[obj class]] bundlePath];
    if ([bundlePath hasPrefix:@"/System/Library"]) {
        return YES;
    }
    return NO;
}
```

One caveat: this method does not work on the simulator.

This scheme can distinguish system private gestures, but when the added Target is the UIGestureRecognizer instance itself, it cannot tell whether the gesture event should be collected, so this scheme is not feasible.

3.2 Collecting Only Tap and Long-Press Gestures

During debugging we found that most system private gestures are subclasses of the public gesture classes, while developers rarely subclass gesture classes themselves. So we can collect only UITapGestureRecognizer and UILongPressGestureRecognizer gestures and skip their subclasses.

When creating the Target object, we check the gesture and return a valid Target object only if the gesture meets the conditions:

```objectivec
+ (SAGestureTarget * _Nullable)targetWithGesture:(UIGestureRecognizer *)gesture {
    if ([gesture isMemberOfClass:UITapGestureRecognizer.class] ||
        [gesture isMemberOfClass:UILongPressGestureRecognizer.class]) {
        return [[SAGestureTarget alloc] init];
    }
    return nil;
}
```

4. Overcoming the Difficulties

So far, collecting tap and long-press gestures seems to work fine. In reality, there are still some difficulties to resolve:

  • Scenario 1: the developer adds a target-action and later removes it;
  • Scenario 2: the Target is released in some scenarios after the developer adds a target-action;
  • Scenario 3: although only UITapGestureRecognizer and UILongPressGestureRecognizer are collected, some system private gestures are not subclasses and are collected by mistake;
  • Scenario 4: UIAlertController click event collection requires special handling;
  • Scenario 5: some gesture states require special handling.

4.1 Managing Target-Action

For scenarios 1 and 2, the SDK should not collect gesture events. However, since the SDK itself has added a target-action, during collection it must determine whether any valid target-action other than the SDK's own exists. If there is none, the gesture event should not be collected.

UIGestureRecognizer does not provide a public API to get all target-actions of the current gesture. Although they can be obtained through the private API `_targets`, using it might affect customers' apps. So we record the target-actions ourselves by hooking the relevant methods.

We add a SAGestureTargetActionModel class to manage the Target and Action:

```objectivec
@interface SAGestureTargetActionModel : NSObject

@property (nonatomic, weak) id target;
@property (nonatomic, assign) SEL action;
@property (nonatomic, assign, readonly) BOOL isValid;

- (instancetype)initWithTarget:(id)target action:(SEL)action;
+ (SAGestureTargetActionModel * _Nullable)containsObjectWithTarget:(id)target
                                                         andAction:(SEL)action
                                                        fromModels:(NSArray<SAGestureTargetActionModel *> *)models;

@end
```
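The interface declares isValid but does not show its implementation. Because target is declared weak, a plausible implementation (an assumption, not the SDK's actual code) is simply a nil check, which covers scenario 2 where the developer's target is released:

```objectivec
- (BOOL)isValid {
    // target is weak, so it becomes nil once the developer's
    // target object is released
    return self.target != nil;
}
```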

We record the target-actions in -addTarget:action: and -removeTarget:action::

```objectivec
- (void)enableAutoTrackGesture {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        ...
        [UIGestureRecognizer sa_swizzleMethod:@selector(removeTarget:action:)
                                   withMethod:@selector(sensorsdata_removeTarget:action:)
                                        error:NULL];
    });
}

- (void)sensorsdata_addTarget:(id)target action:(SEL)action {
    if (self.sensorsdata_gestureTarget) {
        if (![SAGestureTargetActionModel containsObjectWithTarget:target
                                                        andAction:action
                                                       fromModels:self.sensorsdata_targetActionModels]) {
            SAGestureTargetActionModel *resultModel =
                [[SAGestureTargetActionModel alloc] initWithTarget:target action:action];
            [self.sensorsdata_targetActionModels addObject:resultModel];
            [self sensorsdata_addTarget:self.sensorsdata_gestureTarget
                                 action:@selector(trackGestureRecognizerAppClick:)];
        }
    }
    [self sensorsdata_addTarget:target action:action];
}

- (void)sensorsdata_removeTarget:(id)target action:(SEL)action {
    if (self.sensorsdata_gestureTarget) {
        SAGestureTargetActionModel *existModel =
            [SAGestureTargetActionModel containsObjectWithTarget:target
                                                       andAction:action
                                                      fromModels:self.sensorsdata_targetActionModels];
        if (existModel) {
            [self.sensorsdata_targetActionModels removeObject:existModel];
        }
    }
    [self sensorsdata_removeTarget:target action:action];
}
```

During event collection, verify whether the collection conditions are met:

```objectivec
- (void)trackGestureRecognizerAppClick:(UIGestureRecognizer *)gesture {
    if ([SAGestureTargetActionModel filterValidModelsFrom:gesture.sensorsdata_targetActionModels].count == 0) {
        return;
    }
    // Gesture event collection...
}
```

4.2 Blacklist

For scenario 3, the Sensors Analytics SDK adds a blacklist configuration and blocks collection of these gestures by configuring the View type:

```json
{
    "public": [
        "UIPageControl",
        "UITextView",
        "UITabBar",
        "UICollectionView",
        "UISearchBar"
    ],
    "private": [
        "_UIContextMenuContainerView",
        "_UIPreviewPlatterView",
        "UISwitchModernVisualElement",
        "WKContentView",
        "UIWebBrowserView"
    ]
}
```

When comparing types, we distinguish between public and private classes:

  • public class names are checked with -isKindOfClass:;
  • private class names are checked with string matching.

```objectivec
- (BOOL)isIgnoreWithView:(UIView *)view {
    ...
    // Public class names: check with -isKindOfClass:
    id publicClasses = info[@"public"];
    if ([publicClasses isKindOfClass:NSArray.class]) {
        for (NSString *publicClass in (NSArray *)publicClasses) {
            if ([view isKindOfClass:NSClassFromString(publicClass)]) {
                return YES;
            }
        }
    }
    // Private class names: check with string matching
    id privateClasses = info[@"private"];
    if ([privateClasses isKindOfClass:NSArray.class]) {
        if ([(NSArray *)privateClasses containsObject:NSStringFromClass(view.class)]) {
            return YES;
        }
    }
    return NO;
}
```

4.3 UIAlertController Click event collection

UIAlertController implements user interaction internally through gestures, but the View that the gesture is installed on is not the View the user operates, and the internal implementation differs slightly across system versions.

We handle this particular logic by using different processors.

We add a factory class, SAGestureViewProcessorFactory, to decide which processor to use:

```objectivec
@implementation SAGestureViewProcessorFactory

+ (SAGeneralGestureViewProcessor *)processorWithGesture:(UIGestureRecognizer *)gesture {
    NSString *viewType = NSStringFromClass(gesture.view.class);
    if ([viewType isEqualToString:@"_UIAlertControllerView"]) {
        return [[SALegacyAlertGestureViewProcessor alloc] initWithGesture:gesture];
    }
    if ([viewType isEqualToString:@"_UIAlertControllerInterfaceActionGroupView"]) {
        return [[SANewAlertGestureViewProcessor alloc] initWithGesture:gesture];
    }
    return [[SAGeneralGestureViewProcessor alloc] initWithGesture:gesture];
}

@end
```

Then we handle the differences in the specific processors:

```objectivec
#pragma mark - Adapt Alert before iOS 10
@implementation SALegacyAlertGestureViewProcessor

- (BOOL)isTrackable {
    if (![super isTrackable]) {
        return NO;
    }
    UIViewController *viewController =
        [SAAutoTrackUtils findNextViewControllerByResponder:self.gesture.view];
    if ([viewController isKindOfClass:UIAlertController.class] &&
        [viewController.nextResponder isKindOfClass:SAAlertController.class]) {
        return NO;
    }
    return YES;
}

- (UIView *)trackableView {
    NSArray<UIView *> *visualViews =
        sensorsdata_searchVisualSubView(@"_UIAlertControllerCollectionViewCell", self.gesture.view);
    CGPoint currentPoint = [self.gesture locationInView:self.gesture.view];
    for (UIView *visualView in visualViews) {
        CGRect rect = [visualView convertRect:visualView.bounds toView:self.gesture.view];
        if (CGRectContainsPoint(rect, currentPoint)) {
            return visualView;
        }
    }
    return nil;
}

@end

#pragma mark - Adapt Alert on iOS 10 and later
@implementation SANewAlertGestureViewProcessor

- (BOOL)isTrackable {
    if (![super isTrackable]) {
        return NO;
    }
    UIViewController *viewController =
        [SAAutoTrackUtils findNextViewControllerByResponder:self.gesture.view];
    if ([viewController isKindOfClass:UIAlertController.class] &&
        [viewController.nextResponder isKindOfClass:SAAlertController.class]) {
        return NO;
    }
    return YES;
}

- (UIView *)trackableView {
    NSArray<UIView *> *visualViews =
        sensorsdata_searchVisualSubView(@"_UIInterfaceActionCustomViewRepresentationView", self.gesture.view);
    CGPoint currentPoint = [self.gesture locationInView:self.gesture.view];
    for (UIView *visualView in visualViews) {
        CGRect rect = [visualView convertRect:visualView.bounds toView:self.gesture.view];
        if (CGRectContainsPoint(rect, currentPoint)) {
            return visualView;
        }
    }
    return nil;
}

@end
```

4.4 Handling gesture status

A gesture recognizer is driven by a state machine. The default state is UIGestureRecognizerStatePossible, which means it is ready to start processing events.

Figure 4-1 shows the transitions between states.

Figure 4-1 Gesture state transition [2]

For full auto-tracking, gesture events should be collected whether the gesture state is UIGestureRecognizerStateEnded or UIGestureRecognizerStateCancelled:

```objectivec
- (void)trackGestureRecognizerAppClick:(UIGestureRecognizer *)gesture {
    if (gesture.state != UIGestureRecognizerStateEnded &&
        gesture.state != UIGestureRecognizerStateCancelled) {
        return;
    }
    // Gesture event collection...
}
```

5. Summary

This article introduces a concrete implementation of iOS gesture event collection and how to handle several of its difficulties. For more details, please refer to the Sensors Analytics iOS SDK source code [3]. If you have better ideas, you are welcome to join the open source community discussion.

References:

[1] developer.apple.com/documentati…
[2] developer.apple.com/documentati…
[3] github.com/sensorsdata…