Preface

The iPhone delivers an excellent interactive experience, which comes from iOS's efficient processing of and quick response to interaction events. Handling user interaction is also very convenient for app developers, thanks to the way iOS and UIKit encapsulate user actions and provide default handling. This article explores the specifics of how iOS delivers and processes events.

Main text

What is an event?

The events discussed here are abstractions of user interaction, such as IOHIDEvent and UIEvent, which encapsulate the interaction at different processing stages.

IOHIDEvent is the system-level encapsulation of an iOS event; if you are interested, see the source files IOHIDEvent.h and IOHIDEvent.cpp (HID stands for Human Interface Device).

UIEvent is the object UIKit uses to describe the type of a user operation, covering touch events, motion events, remote-control events, press events, and so on. Different event types are handled differently in the responder chain; here we focus on the delivery and processing of touch events.

The user clicks on the screen of the phone

Outside the app: user taps -> hardware responds -> parameters are quantized -> data is forwarded -> the app receives it.

After the user touches the screen, the screen hardware registers the action and collects the key parameters, which are passed to IOKit. IOKit packages the data and passes it to SpringBoard.app, which then forwards it to the foreground app.

Inside the app: a background thread receives the event -> the main thread encapsulates it into a UIEvent -> UIWindow runs hitTest to determine the target view -> UIApplication sends the event -> the touches callbacks fire.

At startup, the app launches a com.apple.uikit.eventfetch-thread thread that is responsible for receiving the data forwarded by SpringBoard.app (monitored through a runloop source1; the stack frame is __CFRunLoopDoSource1). The data is encapsulated into IOHIDEvent objects and forwarded to the main thread.

Similarly, the main runloop listens on a source0 during startup, receives the IOHIDEvent data sent by the eventfetch thread, encapsulates it into a UIEvent, and decides from the UIEvent's type whether hit-testing is needed. Motion events do not require hit-testing, and some touch events do not either, such as touch-termination events.

Once the target view is identified, UIApplication sends the event, delivering the UITouch and UIEvent to the target view and triggering its touches methods.

UIKit’s process of finding the target view

The search relies on two UIView methods: -hitTest:withEvent: and -pointInside:withEvent:.

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event

The hitTest method returns the view that responds to the given point and event;

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event

The pointInside method returns whether the point, for the given event, lies within the view's own bounds;
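As a quick illustration of pointInside:, it is often overridden to enlarge a view's tappable area. A minimal sketch (PaddedButton and the 10-point padding are hypothetical choices, not from the original article):

```objectivec
#import <UIKit/UIKit.h>

// Hypothetical subclass that grows its tappable area by 10 points per side.
@interface PaddedButton : UIButton
@end

@implementation PaddedButton
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Expand the bounds outward before testing the point.
    CGRect hitArea = CGRectInset(self.bounds, -10.0, -10.0);
    return CGRectContainsPoint(hitArea, point);
}
@end
```

Because hitTest consults pointInside, this single override is enough to make hit-testing treat the larger area as part of the button.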

UIView provides default implementations of both methods. By default, hitTest first calls pointInside, and then calls hitTest on each of its subviews until one of them returns a view.

UIKit starts looking for the target view from the UIWindow: it calls UIWindow's hitTest method to ask for a responding view, and that hitTest method first calls UIWindow's pointInside method to ask whether the touch point falls within it.

a. If pointInside returns NO, UIWindow cannot respond to the event, and hitTest immediately returns nil.
b. If pointInside returns YES, UIWindow can respond to the event, and hitTest then calls hitTest on UIWindow's subviews:
  b1. If a subview's hitTest returns a view, UIWindow's hitTest returns that view.
  b2. If no subview's hitTest returns a view, UIWindow's hitTest returns UIWindow itself.

UIWindow is a subclass of UIView, and the implementation of UIView’s hitTest method is the same as above.

Question: when UIView calls hitTest on its subviews, which subview is asked first?

hitTest is called starting from the end of the subviews array: the lower a view's index in the array, the lower it sits in the view hierarchy, so the topmost views are asked first.
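Putting the rules above together, the default behavior of -hitTest:withEvent: can be sketched roughly as follows. This is a reconstruction for illustration only, not Apple's actual implementation:

```objectivec
#import <UIKit/UIKit.h>

// Rough reconstruction of UIView's default -hitTest:withEvent: logic.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Hidden, non-interactive, or nearly transparent views are skipped.
    if (self.hidden || !self.userInteractionEnabled || self.alpha < 0.01) {
        return nil;
    }
    // Case a: the point is not inside this view, so return nil.
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Case b: ask subviews, topmost (last in the array) first.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [self convertPoint:point toView:subview];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit) {
            return hit;  // b1: a subview claims the touch
        }
    }
    return self;  // b2: no subview responded; this view is the target
}
```

Note the coordinate conversion before recursing: each subview tests the point in its own coordinate space.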

UIKit determines the target view

When UIKit determines the target view, it creates a UITouch whose window and view properties are the UIWindow and target view found in the procedure above.

Then UIApplication calls sendEvent:, and UIWindow calls sendTouchesForEvent: in sendEvent:, as shown below:

UIWindow's sendTouchesForEvent: calls the four familiar touches methods: -touchesBegan:withEvent:, -touchesMoved:withEvent:, -touchesEnded:withEvent:, and -touchesCancelled:withEvent:. The calls start from the target view: the target view's touches method is called first, then its superview's, then that view's superview's, and so on; if a view is a view controller's view property, the view controller comes next. The chain continues up to UIWindow, UIApplication, and finally the UIApplicationDelegate (the AppDelegate we created).

(Responder chains in an app)
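The chain works because each default touches implementation forwards the event to the next responder. If you override one of these methods in a view but still want the chain to continue, call super. A minimal sketch (LoggingView is a hypothetical subclass):

```objectivec
#import <UIKit/UIKit.h>

// Hypothetical view that logs touches but keeps the responder chain intact.
@interface LoggingView : UIView
@end

@implementation LoggingView
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    NSLog(@"touch began at %@",
          NSStringFromCGPoint([touch locationInView:self]));
    // Calling super forwards the event to the next responder
    // (superview -> view controller -> ... -> UIApplication).
    [super touchesBegan:touches withEvent:event];
}
@end
```

Omitting the call to super would swallow the event at this view and stop the chain shown above.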

At what stage does gesture processing take place?

UIGestureRecognizer is an important part of interaction on the iPhone. Apple's Gesture Recognition documentation introduces how to recognize gestures and even add custom ones.

UIGestureRecognizer also has its own touches methods:

We can see when gesture processing happens by breaking in the gesture's touchesBegan:withEvent: method. With a breakpoint set there, the stack looks like this:

Notice the sendEvent: method of UIApplication in the stack; sendEvent happens after UIKit has found the target view. Put another way, the touchesBegan method uses the UITouch, and the UITouch's view property is the target view, so gesture handling must also come after UIKit finds the target view.

When the gesture's touchesBegan:withEvent: processing is complete, the target view's touchesBegan method is triggered.

However, once the gesture is successfully recognized, the remaining touch delivery is cancelled by default: the responder chain starting from the target view receives touchesCancelled instead of the normal touchesEnded. The stack is as follows:

This behavior can be avoided by setting cancelsTouchesInView = NO on the gesture recognizer, so that touchesCancelled is not triggered.
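A minimal sketch of that setting (targetView and handleTap: are assumed placeholders for your own view and action):

```objectivec
#import <UIKit/UIKit.h>

// Keep the target view's normal touchesEnded: instead of touchesCancelled:
// after the tap is recognized.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTap:)];
tap.cancelsTouchesInView = NO;  // do not cancel the view's touch delivery
[targetView addGestureRecognizer:tap];
```

With cancelsTouchesInView left at its default of YES, the recognizer's success would convert the pending touches into a cancellation for the view.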

Notice that a UIGestureEnvironment class appears in the stack, both for the touchesBegan method at the start of gesture processing and for the touchesCancelled method after successful recognition. This is a private UIKit class.

@interface UIGestureEnvironment : NSObject {
    NSMutableArray *_delayedPresses;
    NSMutableArray *_delayedPressesToSend;
    NSMutableArray *_delayedTouches;
    NSMutableArray *_delayedTouchesToSend;
    UIGestureGraph *_dependencyGraph;
    NSMutableArray *_dirtyGestureRecognizers;
    bool _dirtyGestureRecognizersUnsorted;
    struct __CFRunLoopObserver { } *_gestureEnvironmentUpdateObserver;
    NSMutableSet *_gestureRecognizersNeedingRemoval;
    NSMutableSet *_gestureRecognizersNeedingReset;
    NSMutableSet *_gestureRecognizersNeedingUpdate;
    NSMapTable *_nodesByGestureRecognizer;
    bool _updateExclusivity;
}

- (void)addGestureRecognizer:(id)arg1;
- (void)addRequirementForGestureRecognizer:(id)arg1 requiringGestureRecognizerToFail:(id)arg2;
- (bool)gestureRecognizer:(id)arg1 requiresGestureRecognizerToFail:(id)arg2;
- (id)init;
- (void)removeGestureRecognizer:(id)arg1;
...

From these method declarations we can roughly tell that this is a gesture management class: gestures are added, removed, and responded to internally.

Think about:

1. How is a UIButton click callback implemented?
2. If you add a tap gesture to a UIButton, will the tap gesture or the button's own tap callback be triggered when the button is clicked?

Conclusion

In summary, the whole process is roughly as follows:

  1. Find the targetView: UIApplication->UIWindow->ViewController->View->targetView
  2. Gesture recognition: UIGestureEnvironment-> UIGestureRecognizer
  3. Response chain callback: targetView->View->ViewController->UIWindow->UIApplication

User interaction on iOS is very complex. Given limited time, this article only covers event delivery and processing, in order to establish a basic understanding.

Appendix

References

Gesture recognition: developer.apple.com/documentati…

Responder chain introduction: developer.apple.com/documentati…

To consider

1. How is a UIButton click callback implemented?

UIButton is a subclass of UIControl, which tracks touch events and maps them to the UIControlEvents defined by UIControl. A UIButton click is delivered through the UIControlEvents callbacks, which ultimately rely on the touches methods of the responder chain.
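A sketch of wiring such a callback via target-action (MyViewController and buttonTapped: are hypothetical names for illustration):

```objectivec
#import <UIKit/UIKit.h>

@implementation MyViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    UIButton *button = [UIButton buttonWithType:UIButtonTypeSystem];
    // UIButton's touches methods track the touch; when it lifts inside the
    // bounds, UIControlEventTouchUpInside fires and this target-action runs.
    [button addTarget:self
               action:@selector(buttonTapped:)
     forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:button];
}

- (void)buttonTapped:(UIButton *)sender {
    NSLog(@"button tapped");
}

@end
```

The target-action pair is invoked by UIControl itself, driven by the same touches callbacks described earlier in the responder chain.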

2. If you add a tap gesture to a UIButton, will the tap gesture or the button's own tap callback be triggered when the button is clicked?

The tap gesture is processed before the UIButton's touches callbacks complete. If the button listens for the usual UIControlEventTouchUpInside, its callback will not fire; if it listens for UIControlEventTouchCancel, that event is fired after the tap gesture is recognized, so that callback will be received.
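A small experiment to observe this behavior (button, self, and the handler selectors are assumed placeholders):

```objectivec
#import <UIKit/UIKit.h>

// Add a tap gesture on top of a button and watch which callbacks fire.
[button addTarget:self action:@selector(touchUpInside:)
 forControlEvents:UIControlEventTouchUpInside];   // suppressed by the tap
[button addTarget:self action:@selector(touchCancelled:)
 forControlEvents:UIControlEventTouchCancel];     // fires after recognition

UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTap:)];
[button addGestureRecognizer:tap];
```

With this setup, tapping the button should invoke handleTap: and touchCancelled: but not touchUpInside:, matching the cancellation behavior described above.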