This article analyzes the entire process from the moment a finger touches the screen until the touch event is handled, focusing on the part where the current application processes touch events.

Before I begin, I think it's important to understand some inheritance relationships. NSObject is the base class we deal with most every day. On iOS you can roughly think of the Foundation framework as representing the data layer: the NS-prefixed classes such as NSDictionary and NSArray are almost all direct or indirect subclasses of NSObject, and they rely on the NSObject base class for memory management and so on. What about the UI layer? UIView and its relatives are subclasses of UIResponder, and UIResponder is a subclass of NSObject. UIButton inherits from UIControl, and UIControl inherits from UIView. UIView and its subclasses can act as responders because they inherit from UIResponder, and they are called responders because they override UIResponder's touches… (responding to touch events), presses… (responding to press events), and motion… (responding to motion events) families of methods. The UIControl family responds to user events through the target-action mechanism. Those are the inheritance relationships; anyone who habitually command-clicks through headers in daily development probably has them memorized already. Below we expand on all of this in detail!

IOKit.framework/SpringBoard

IOKit.framework is a low-level framework for communicating with hardware and kernel services. Although it is a public framework, Apple discourages developers from using it, and any app that does use it will be rejected by the App Store.

SpringBoard.app is the application launcher of the iPhone. It provides all application launch services, icon management, status-bar control, and more.

SpringBoard.app is the fundamental program that iOS and iPadOS use to manage the home screen, start WindowServer, launch applications (a program that does this is called an application launcher), and apply certain device settings at boot. "Home screen" is sometimes used as shorthand for SpringBoard. It mainly handles events such as button presses (lock screen/mute, etc.), touches, and accelerometer/proximity sensor input (UIEventTypeMotion), which it then forwards to the appropriate app through Mach port inter-process communication.

Mac OS X uses Launchpad, which lets users launch applications by clicking icons in an iOS-like interface resembling SpringBoard. Before Launchpad, users launched apps from the Dock, Finder, Spotlight, or Terminal. Launchpad doesn't take over the entire home screen, though; it is more like a Space (similar to Dashboard).

The flow of iOS touch events

When a hardware event (touch/lock/shake, etc.) occurs, IOKit.framework first generates an IOHIDEvent, which is received by SpringBoard. (Details of this process can be found in IOHIDFamily.) SpringBoard receives only events such as button presses (lock screen/mute), touches, acceleration, and the proximity sensor (UIEventTypeMotion). SpringBoard then determines whether a foreground application should receive the event. If not (for example, a page swipe on the home screen), the event triggers a source0 callback in SpringBoard's own run loop and is consumed by SpringBoard itself. If there is a foreground app, the event is forwarded to that app's process through a Mach port.

Now let's look at which Mach port of the foreground app receives the SpringBoard message and wakes up the app's main thread.

com.apple.uikit.eventfetch-thread

After launch, the app creates a thread named com.apple.uikit.eventfetch-thread and starts its run loop directly. In the run loop's kCFRunLoopDefaultMode, it adds a source1 whose callback function is __IOHIDEventSystemClientQueueCallback, used to receive the messages sent by SpringBoard (described above) via the Mach port.

When SpringBoard wakes the com.apple.uikit.eventfetch-thread thread of the foreground app process through the specified Mach port, the thread executes the __IOHIDEventSystemClientQueueCallback callback of that source1. The callback sets the signalled flag of the main run loop's source0 (whose callback is __handleEventQueue) to YES, marking it pending, and wakes the main run loop. The main thread then calls __handleEventQueue to handle the event (IOHIDEvent).

ibireme's article says that _UIApplicationHandleEventQueue wraps the IOHIDEvent into a UIEvent for processing or distribution, but adding a symbolic breakpoint fails to find this method; I guess IOHIDEvent is now processed inside the __handleEventQueue function. After that, [UIApplication sendEvent:] distributes the UIEvent within our app.

After creating a Single View App with Xcode and running it, click the pause button at the bottom of Xcode; the current com.apple.main-thread thread stops in mach_msg_trap. Below it is a thread named com.apple.uikit.eventfetch-thread. Select it, then type po [NSRunLoop currentRunLoop] in the Xcode console to print the thread's run loop. You can see that it has only one mode, kCFRunLoopDefaultMode, marked common, containing a single source0 and three source1s whose callback events are all related to IOHIDEvent. Here we list only the source1 whose callback function is __IOHIDEventSystemClientQueueCallback.

sources1 = <CFBasicHash 0x600001fff330 [0x7fff80617cb0]>{type = mutable set, count = 3,
entries =>
...
1 : <CFRunLoopSource 0x6000024e4540 [0x7fff80617cb0]>{signalled = No, valid = Yes, order = 0, context = <CFMachPort 0x6000026ec210 [0x7fff80617cb0]>{valid = Yes, port = 440b, source = 0x6000024e4540, callout = __IOHIDEventSystemClientQueueCallback (0x7fff25e91d1d), context = <CFMachPort context 0x7fb555601c50>}}
...

Then we use po [NSRunLoop mainRunLoop] to print the main run loop. It can be seen that its UITrackingRunLoopMode and kCFRunLoopDefaultMode modes contain the same source0, whose callback function is __handleEventQueue, and that both Tracking mode and Default mode are marked common.

2 : <CFRunLoopMode 0x6000028101a0 [0x7fff80617cb0]>{name = UITrackingRunLoopMode, port set = 0x2a03, queue = 0x600003d1ca00, source = 0x600003d1cb00 (not fired), timer port = 0x2c03,
sources0 = <CFBasicHash 0x600001d319b0 [0x7fff80617cb0]>{type = mutable set, count = 4,
entries =>
...
4 : <CFRunLoopSource 0x600002618240 [0x7fff80617cb0]>{signalled = No, valid = Yes, order = -1, context = <CFRunLoopSource context>{version = 0, info = 0x6000026103c0, callout = __handleEventQueue (0x7fff48126d97)}}}
...
4 : <CFRunLoopMode 0x600002814410 [0x7fff80617cb0]>{name = kCFRunLoopDefaultMode, port set = 0x2503, queue = 0x600003d10e80, source = 0x600003d10f80 (not fired), timer port = 0x1e03,
sources0 = <CFBasicHash 0x600001d31a10 [0x7fff80617cb0]>{type = mutable set, count = 4,
entries =>
...
4 : <CFRunLoopSource 0x600002618240 [0x7fff80617cb0]>{signalled = No, valid = Yes, order = -1, context = <CFRunLoopSource context>{version = 0, info = 0x6000026103c0, callout = __handleEventQueue (0x7fff48126d97)}}}
...

In the Single View App above, we created a custom UIView subclass named CustomView, overrode the view's touchesBegan:withEvent: method, and set a breakpoint inside it. We then added a CustomView instance to the root view of the app, ran the program, and tapped the CustomView. Typing bt in the console and pressing Enter shows the following function call stack:

(lldb) bt
* thread #1, queue = 'com.apple.main-thread', stop reason = breakpoint 5.1
  * frame #0: 0x0000000101522cbd EmptySimpleApp`-[CustomView touchesBegan:withEvent:](self=0x00007ff88fc0b5e0, _cmd="touchesBegan:withEvent:", touches=1 element, event=0x000060000190c500) at CustomView.m:22:30
    frame #1: 0x00007fff480ce8de UIKitCore`-[UIWindow _sendTouchesForEvent:] + 1867
    frame #2: 0x00007fff480d04c6 UIKitCore`-[UIWindow sendEvent:] + 4596
    frame #3: 0x00007fff480ab53b UIKitCore`-[UIApplication sendEvent:] + 356
    frame #4: 0x0000000103724bd4 UIKit`-[UIApplicationAccessibility sendEvent:] + 85
    frame #5: 0x00007fff4812c71a UIKitCore`__dispatchPreprocessedEventFromEventQueue + 6847
    frame #6: 0x00007fff4812f1e0 UIKitCore`__handleEventQueueInternal + 5980
    frame #7: 0x00007fff23bd4471 CoreFoundation`__CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 17
    frame #8: 0x00007fff23bd439c CoreFoundation`__CFRunLoopDoSource0 + 76
    frame #9: 0x00007fff23bd3b74 CoreFoundation`__CFRunLoopDoSources0 + 180
    frame #10: 0x00007fff23bce87f CoreFoundation`__CFRunLoopRun + 1263
    frame #11: 0x00007fff23bce066 CoreFoundation`CFRunLoopRunSpecific + 438
    frame #12: 0x00007fff384c0bb0 GraphicsServices`GSEventRunModal + 65
    frame #13: 0x00007fff48092d4d UIKitCore`UIApplicationMain + 1621
    frame #14: 0x0000000101523104 EmptySimpleApp`main(argc=1, argv=0x00007ffeee6dbd38) at main.m:18:12
    frame #15: 0x00007fff5227ec25 libdyld.dylib`start + 1
    frame #16: 0x00007fff5227ec25 libdyld.dylib`start + 1
(lldb) 
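For reference, the CustomView used above can be as minimal as the following sketch (the actual test project may differ slightly; the breakpoint mentioned earlier sits inside touchesBegan:withEvent:):

#import <UIKit/UIKit.h>

@interface CustomView : UIView
@end

@implementation CustomView

// Override the UIResponder touch callback inherited through UIView.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesBegan on %@", self); // set the breakpoint here
    [super touchesBegan:touches withEvent:event];
}

@end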

Here [UIApplication sendEvent:] begins the distribution of the UIEvent inside the app and, as the stack shows, passes it on to UIWindow. What follows involves finding the first responder and having the responder chain handle the event, which we will continue to study section by section.

[UIApplication sendEvent:]

Dispatches an event to the appropriate responder objects in the application.

- (void)sendEvent:(UIEvent *)event;

Event: A UIEvent object that encapsulates information about the event, including the touches involved.

You can intercept incoming events by subclassing UIApplication and overriding this method. For every event you intercept, you must call [super sendEvent:event] after handling it in your implementation.
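As a hedged sketch of such an interception (EventLoggingApplication is a hypothetical subclass name, not part of the project above):

#import <UIKit/UIKit.h>

@interface EventLoggingApplication : UIApplication
@end

@implementation EventLoggingApplication

- (void)sendEvent:(UIEvent *)event {
    // Inspect or log the event before UIKit distributes it.
    if (event.type == UIEventTypeTouches) {
        NSLog(@"intercepted touch event: %@", event.allTouches);
    }
    // Always call super so normal event dispatch continues.
    [super sendEvent:event];
}

@end

For UIKit to use such a subclass, its class name would be passed as the third argument to UIApplicationMain in main.m.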

UIApplication will then send events to UIWindow based on the function call stack above.

[UIWindow sendEvent:]

Dispatches the specified event to its views.

- (void)sendEvent:(UIEvent *)event; // called by UIApplication to dispatch events to views inside the window

event: the event to be dispatched.

The UIApplication object calls this method to dispatch events to the window. The window object dispatches touch events to the view in which the touch occurred, and dispatches other types of events to the most appropriate target object. You can also call this method in your application as needed to dispatch custom events that you create; for example, you can call this method to dispatch a custom event to the window's responder chain.

So we have seen that UIApplication sends a UIEvent to UIWindow, but what exactly is a UIEvent? Let's branch off to read the UIEvent documentation, and then come back to our main thread of discussion.

UIEvent

An object that describes a single user interaction with your application.

UIKIT_EXTERN API_AVAILABLE(ios(2.0)) @interface UIEvent : NSObject

Applications can receive many different types of events, including touch events, motion events, remote-control events, and press events.

  • Touch events are the most common and are delivered to the view in which the touch originally occurred.
  • Motion events are triggered by UIKit and are distinct from the motion events reported by the Core Motion framework.
  • Remote-control events allow a responder object to receive commands from an external accessory or headset, for example to play a video or skip to the next audio track.
  • Press events represent interactions with a game controller, AppleTV remote, or other device that has physical buttons.

You can use the type and subtype properties to determine the kind of event.

A touch event object contains the touches (that is, the fingers on the screen) related to the event. A touch event object may contain one or more touches, each represented by a UITouch object. When a touch event occurs, the system routes it to the appropriate responder and calls the corresponding method, such as touchesBegan:withEvent:; the responder then uses the touches to determine an appropriate course of action.

In multi-touch sequences, UIKit reuses the same UIEvent object as it passes updated touch data to your application. You should never retain a UIEvent object or any object returned by a UIEvent object. If you need to keep the data outside of the responder method used to process it, copy the data from the UITouch or UIEvent object to a local data structure.
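As a small sketch of that rule, copy plain values out of the touch instead of retaining the objects (recordedPoints is an assumed NSMutableArray<NSValue *> property on the view):

// In a UIView subclass; UIKit reuses the UITouch/UIEvent objects, so never store them.
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    CGPoint location = [touch locationInView:self]; // a plain CGPoint is safe to keep
    [self.recordedPoints addObject:[NSValue valueWithCGPoint:location]];
    // Wrong: self.savedTouch = touch; // the object will be mutated and reused by UIKit
}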

For more information on how to handle events in UIKit applications, see the Event Handling Guide for UIKit Apps. (The UIKit documentation is too extensive; here we will only read the Handling Touches in Your View document)

UIEventType

Specifies the general type of the event.

typedef NS_ENUM(NSInteger, UIEventType) {
    UIEventTypeTouches,
    UIEventTypeMotion,
    UIEventTypeRemoteControl,
    UIEventTypePresses API_AVAILABLE(ios(9.0)),
};

You can get the kind of event from the type property. To further identify the event, you may also need to determine its subtype, which is obtained from the subtype property (a small sketch follows the list below).

  • UIEventTypeTouches: the event is related to touches on the screen.
  • UIEventTypeMotion: the event is related to motion of the device, such as the user shaking it.
  • UIEventTypeRemoteControl: the event is a remote-control event. Remote-control events originate from commands received from a headset or external accessory and are used to control multimedia on the device.
  • UIEventTypePresses: the event is related to pressing a physical button.
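As referenced above, a minimal sketch of branching on an event's type, for example inside a sendEvent: override of a hypothetical UIApplication subclass:

- (void)sendEvent:(UIEvent *)event {
    switch (event.type) {
        case UIEventTypeTouches:       NSLog(@"touch event");                                            break;
        case UIEventTypeMotion:        NSLog(@"motion event, subtype = %ld", (long)event.subtype);         break;
        case UIEventTypeRemoteControl: NSLog(@"remote-control event, subtype = %ld", (long)event.subtype); break;
        case UIEventTypePresses:       NSLog(@"press event");                                            break;
        default:                       break; // newer/private event types
    }
    [super sendEvent:event];
}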

UIEventSubtype

Specifies the subtype of the event relative to its normal type.

typedef NS_ENUM(NSInteger, UIEventSubtype) {
    // available in iPhone OS 3.0
    UIEventSubtypeNone                              = 0,
    
    // for UIEventTypeMotion, available in iPhone OS 3.0
    UIEventSubtypeMotionShake                       = 1,
    
    // for UIEventTypeRemoteControl, available in iOS 4.0
    UIEventSubtypeRemoteControlPlay                 = 100,
    UIEventSubtypeRemoteControlPause                = 101,
    UIEventSubtypeRemoteControlStop                 = 102,
    UIEventSubtypeRemoteControlTogglePlayPause      = 103,
    UIEventSubtypeRemoteControlNextTrack            = 104,
    UIEventSubtypeRemoteControlPreviousTrack        = 105,
    UIEventSubtypeRemoteControlBeginSeekingBackward = 106,
    UIEventSubtypeRemoteControlEndSeekingBackward   = 107,
    UIEventSubtypeRemoteControlBeginSeekingForward  = 108,
    UIEventSubtypeRemoteControlEndSeekingForward    = 109,
};

You can get the event subtype from the subtype property.

  • UIEventSubtypeNone: the event has no subtype. This is the subtype for events of the general type UIEventTypeTouches.
  • UIEventSubtypeMotionShake: the event is related to the user shaking the device. It is a subtype of the UIEventTypeMotion general event type.
  • UIEventSubtypeRemoteControlPlay: a remote-control event for playing audio or video. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlPause: a remote-control event for pausing audio or video. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlStop: a remote-control event for stopping audio or video playback. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlTogglePlayPause: a remote-control event for toggling between playing and pausing audio or video. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlNextTrack: a remote-control event for skipping to the next audio or video track. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlPreviousTrack: a remote-control event for skipping to the previous audio or video track. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlBeginSeekingBackward: a remote-control event for beginning to seek backward through audio or video media. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlEndSeekingBackward: a remote-control event for ending a backward seek through audio or video media. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlBeginSeekingForward: a remote-control event for beginning to seek forward through audio or video media. It is a subtype of the UIEventTypeRemoteControl general event type.
  • UIEventSubtypeRemoteControlEndSeekingForward: a remote-control event for ending a forward seek through audio or video media. It is a subtype of the UIEventTypeRemoteControl general event type.

type

Returns the type of the event.

@property(nonatomic,readonly) UIEventType type API_AVAILABLE(ios(3.0));

The UIEventType constant returned by this property indicates the general type of the event, for example, whether it is a touch or motion event.

subtype

Returns the subtype of the event.

@property(nonatomic,readonly) UIEventSubtype  subtype API_AVAILABLE(ios(3.0));

The UIEventSubtype constant returned by this property indicates the subtype of the event, relative to the general type returned by the type property.

timestamp

The time when the event occurred.

@property(nonatomic,readonly) NSTimeInterval  timestamp;

This property contains the number of seconds that have elapsed since the system was started. For a description of this time value, see the description of the systemUptime method of the NSProcessInfo class.

allTouches

Returns all touches associated with the event. (Note: the allTouches property contains only the last touch in the touch sequence.)

@property(nonatomic, readonly, nullable) NSSet <UITouch *> *allTouches; // Read-only collection

A set of UITouch objects that represent all Touches associated with the event.

If the touches of the event originate from different views and windows, the UITouch objects obtained from this property will be associated with different responder objects.

Note that touchesBegan:withEvent: receives the same touches contained in the event's allTouches, while [UIApplication sendEvent:] passes only an event object. The touches… functions can also be driven by the touches alone, with the event argument passed as nil, because hit-testing determines whether a view should respond using only the touch locations. (If you want to call such a method from outside your event-handling code, you can specify nil for the event.)

touchesForWindow:

Returns the UITouch object belonging to the specified window from the UIEvent.

- (nullable NSSet <UITouch *> *)touchesForWindow:(UIWindow *)window;

window: the UIWindow object in which the touches originally occurred.

A set of UITouch objects that represent the touches that belong to the specified window.

touchesForView:

Returns the UITouch object belonging to the specified view from the UIEvent.

- (nullable NSSet <UITouch *> *)touchesForView:(UIView *)view;

view: the UIView object in which the touches originally occurred.

A set of UITouch objects that represent touches belonging to a specified view.
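A hedged example of both lookup methods inside a responder method:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // Touches that were hit-tested to this view.
    NSSet<UITouch *> *ownTouches = [event touchesForView:self];
    // Touches belonging to the window that contains this view.
    NSSet<UITouch *> *windowTouches = [event touchesForWindow:self.window];
    NSLog(@"%lu touch(es) on this view, %lu in its window",
          (unsigned long)ownTouches.count, (unsigned long)windowTouches.count);
    [super touchesBegan:touches withEvent:event];
}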

touchesForGestureRecognizer:

Returns the UITouch object to be passed to the designated Gesture Recognizer.

- (nullable NSSet <UITouch *> *)touchesForGestureRecognizer:(UIGestureRecognizer *)gesture API_AVAILABLE(ios(3.2));

gesture: an instance of a subclass of the abstract base class UIGestureRecognizer. The gesture-recognizer object must be attached to a view so that it receives the touches hit-tested to that view and its subviews.

A set of UITouch objects representing the touches being delivered to the specified gesture recognizer for the event represented by the receiver.

The following two functions are used to obtain the coalesced UITouch objects associated with a main touch of the UIEvent, and the array of predicted UITouch objects. They are mainly used with Apple Pencil to obtain high-precision touch input (when Apple Pencil is used on an iPad, the screen refresh rate rises to 120 Hz) and to predict the trajectory of a touch to improve the user experience. (Feel free to skip this part if you are not interested.)

coalescedTouchesForTouch:

Returns all touches associated with the specified main touch. (A set of auxiliary UITouch objects for touch events that were not delivered with the given main touch; this also includes an auxiliary copy of the main touch itself.)

- (nullable NSArray <UITouch *> *)coalescedTouchesForTouch:(UITouch *)touch API_AVAILABLE(ios(9.0));

touch: the main touch object reported with the event. The touch object you specify determines the sequence of additional touches to return.

Return Value: an array of UITouch objects representing all of the touches reported for the specified touch since the last delivered event. The order of the objects in the array matches the order in which the touches were reported to the system, with the last touch in the array being a copy of the same touch you specified in the touch parameter. If the object in the touch parameter is not associated with the current event, the return value is nil.

Use this method to get any additional touches that were received by the system but not delivered in the previous UIEvent object. Some devices collect touch data at rates as high as 240 Hz, which is usually higher than the rate at which touches are delivered to applications. Although this additional touch data offers greater precision, many applications do not need it and do not want the overhead of processing it. Applications that do want the increased precision can use this method to retrieve the additional touch objects. For example, a drawing app can use these touches to obtain a more accurate record of the user's drawing input and apply the extra touch data to its content. Use this method to get either the coalesced touches or the set of touches delivered to the responder methods, but do not mix the two sets together. This method returns the complete sequence of touches reported since the last event, including a copy of the touch delivered to the responder method; the event delivered to the responder method contains only the last touch in the sequence, and likewise the allTouches property contains only the last touch in the sequence. (Recall that the iPad screen refreshes at up to 120 Hz when the Apple Pencil is in use.)
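For a drawing view, a hedged sketch of consuming coalesced touches (appendPoint: is an assumed helper that extends the current stroke):

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *mainTouch = touches.anyObject;
    // Replay every intermediate touch the system collected since the last delivered event.
    for (UITouch *coalesced in [event coalescedTouchesForTouch:mainTouch]) {
        [self appendPoint:[coalesced locationInView:self]]; // hypothetical helper
    }
    [self setNeedsDisplay];
}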

predictedTouchesForTouch:

Returns an array of touches that are predicted to occur for the specified touch. (A set of auxiliary UITouch objects predicting the touch events that will occur for a given main touch. These predictions may not exactly match the actual behavior of the touch and should therefore be treated as estimates.)

- (nullable NSArray <UITouch *> *)predictedTouchesForTouch:(UITouch *)touch API_AVAILABLE(ios(9.0));

touch: the main touch object reported with the event. The touch object you specify is used to determine the sequence of additional touches to return.

Return Value: an array of UITouch objects representing the touches the system predicts will occur next. The order of the objects in the array matches the order in which the touches are expected to be delivered to your application. The array does not include the original touch you specified in the touch parameter. If the object in the touch parameter is not associated with the current event, the return value is nil.

Use this method to minimize the apparent latency between the user's touch input and the rendering of content onscreen. Processing touch input from the user and converting it into drawing commands takes time, and turning those drawing commands into rendered content takes additional time. If the user's finger or Apple Pencil moves fast enough, these delays can produce a noticeable gap between the current touch location and the rendered content. To minimize the perceived latency, use the predicted touches from this method as additional, temporary input for your content.

The touches returned by this method represent where the system estimates the user's touch input will be, based on the user's past input. Attach these touches only temporarily to the structures used for drawing or updating content, and discard them as soon as a new event with new touches arrives. Combined with coalesced touches and efficient drawing code, this can create the feeling that the user's input is processed immediately with little delay, which improves the experience in drawing apps or any app where the user manipulates objects directly on screen.
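A hedged sketch of using predictions only as temporary input (appendCommittedPoint:, resetPredictedPoints, and appendPredictedPoint: are assumed helpers):

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *mainTouch = touches.anyObject;
    [self appendCommittedPoint:[mainTouch locationInView:self]];
    // Predicted points are temporary: discard the previous predictions on every new event.
    [self resetPredictedPoints];
    for (UITouch *predicted in [event predictedTouchesForTouch:mainTouch]) {
        [self appendPredictedPoint:[predicted locationInView:self]];
    }
    [self setNeedsDisplay];
}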

That is all of the UIEvent documentation; it's not much, actually. We shouldn't assume UIEvent is complicated. It is actually quite simple, and what deserves our attention most are the finger touch positions a UIEvent carries; this touch-point information is what matters. Next, let's look at the documentation for UITouch.

UITouch

An object that represents the location, size, movement, and force (for 3D Touch and Apple Pencil) of a touch occurring on the screen.

UIKIT_EXTERN API_AVAILABLE(ios(2.0)) @interface UITouch : NSObject

Printing a UITouch object shows something like the following:

<UITouch: 0x7f9089614b70>
phase: Moved
tap count: 1
force: 0.000

window: <UIWindow: 0x7f908950d170; frame = (0 0; 375 812); gestureRecognizers = <NSArray: 0x600003c5ff90>; layer = <UIWindowLayer: 0x60000321e4e0>>
view: <CustomView: 0x7f90897078d0; frame = (112.667 331; 150 150); autoresize = RM+BM; layer = <CALayer: 0x600003219a80>>

location in window: {219.33332824707031, 428.66665649414062}
previous location in window: {220, 429}
location in view: {106.66666158040363, 97.666656494140625}
previous location in view: {107.33333333333331, 98}

You access touch objects through the allTouches property of the UIEvent object passed to a responder object (UIResponder or one of its subclasses) for event handling. A touch object provides access to:

  • The view or window in which the touch occurred
  • The location of the touch within the view or window
  • The approximate radius of the touch
  • The force of the touch (on devices that support 3D Touch or Apple Pencil)

A touch object also contains a timestamp indicating when the touch occurred, an integer representing the number of times the user tapped the screen, and the touch's phase, a constant describing whether the touch began, moved, or ended, or whether the system cancelled it.

To learn how to use swipes, read Handling Swipe and Drag Gestures in the Event Handling Guide for UIKit Apps.

A touch object persists throughout a multi-touch sequence. You may store a reference to a touch while handling a multi-touch sequence, as long as you release that reference when the sequence ends. If you need to store information about a touch beyond the multi-touch sequence, copy that information from the touch.

A touch's gestureRecognizers property contains the gesture recognizers currently handling the touch. Each gesture recognizer is an instance of a concrete subclass of UIGestureRecognizer.

locationInView:

Returns the current position of the touch in the coordinate system of the given view. (Returns the position of a UITouch object in the view's coordinate system as a CGPoint.)

- (CGPoint)locationInView:(nullable UIView *)view;

view: the view object in whose coordinate system you want the touch located. A custom view that is handling the touch can specify self to get the touch position in its own coordinate system. Pass nil to get the touch position in the window's coordinate system.

Return Value: A point that specifies the position of the UITouch in the view.

This method returns the current position of the touch in the specified view's coordinate system. Because the touch object may have been forwarded from another view, this method performs any necessary conversion of the touch position into the specified view's coordinate system.

previousLocationInView:

Returns the previous position of the UITouch in the given view coordinate system.

- (CGPoint)previousLocationInView:(nullable UIView *)view;

view: the view object in whose coordinate system you want the touch located. A custom view that is handling the touch can specify self to get the touch position in its own coordinate system. Pass nil to get the touch position in the window's coordinate system.

Return Value: the previous position of the touch in the specified view's coordinate system. Because the touch object may have been forwarded from another view, this method performs any necessary conversion of the touch position into the specified view's coordinate system.
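A classic example that combines the two methods is dragging a view by the finger's delta; a minimal sketch:

// In a UIView subclass: move the view by the finger's movement since the previous event.
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    CGPoint current  = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}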

view

The view, if any, to which the touches are being delivered.

@property(nullable,nonatomic,readonly,strong) UIView *view;

The value of this property is the view object to which the touches are being delivered, which is not necessarily the view the touch is currently in. For example, when a gesture recognizer recognizes the touch, this property is nil because no view is receiving the touch anymore.

window

The window in which the touch originally occurred.

@property(nullable,nonatomic,readonly,strong) UIWindow *window;

The value of this property is the window in which the touch originally occurred. This window may be different from the window currently containing the touch.

majorRadius

The radius of the touch (in points).

@property(nonatomic,readonly) CGFloat majorRadius API_AVAILABLE(ios(8.0));

Use the value in this property to determine the size of the touch as reported by the hardware. This value is an approximation and can vary by the amount specified in the majorRadiusTolerance property.

majorRadiusTolerance

The tolerance of the touch's radius (in points).

@property(nonatomic,readonly) CGFloat majorRadiusTolerance API_AVAILABLE(ios(8.0));

This value determines the accuracy of the value in the majorRadius attribute. Add this value to the radius to get the maximum touch radius. Subtract this value to obtain the minimum touch radius.

preciseLocationInView:

Returns the exact position of the UITouch (if available).

- (CGPoint)preciseLocationInView:(nullable UIView *)view API_AVAILABLE(ios(9.1));

View: The view containing the touch.

Return Value: Indicates the exact position of the touch.

Use this method to get extra precision for the touch, if available. Do not use the returned point for hit testing: in some cases a hit test may indicate the touch is inside the view, while a hit test at the more precise location may indicate it is outside.

precisePreviousLocationInView:

Returns the exact previous position of the touch, if available.

- (CGPoint)precisePreviousLocationInView:(nullable UIView *)view API_AVAILABLE(ios(9.1));

Use this method to get extra precision for the touch's previous position, if available. Do not use the returned point for hit testing: in some cases a hit test may indicate the touch is inside the view, while a hit test at the more precise location may indicate it is outside.

tapCount

The number of taps associated with a given touch.

@property(nonatomic,readonly) NSUInteger tapCount; // the number of taps at a given point within a certain period of time

The value of this property is an integer that contains the number of hits on this touch that occurred within a predefined period of time. Use this property to evaluate whether the user is single-clicking, double-clicking, or even triple-clicking a particular view or window.
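A hedged sketch of distinguishing single and double taps with tapCount:

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    if (touch.tapCount == 2) {
        NSLog(@"double tap");
    } else if (touch.tapCount == 1) {
        NSLog(@"single tap (may still become a double tap)");
    }
    [super touchesEnded:touches withEvent:event];
}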

timestamp

The time when the touch occurred or when it was last mutated.

@property(nonatomic,readonly) NSTimeInterval timestamp;

The value of this property is the time, in seconds since system startup, at which the touch either originated or was last changed. You can store the value of this property and compare it to the timestamp in a subsequent UITouch object to determine the duration of the touch and, if the touch is a swipe, its speed of movement. For a definition of the time after system startup, see the description of the systemUptime method of the NSProcessInfo class.
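A hedged sketch of that speed calculation (lastPoint and lastTimestamp are assumed properties on the view):

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    CGPoint point = [touch locationInView:self];
    NSTimeInterval elapsed = touch.timestamp - self.lastTimestamp;
    if (elapsed > 0) {
        // Distance moved divided by elapsed time gives the swipe speed.
        CGFloat distance = hypot(point.x - self.lastPoint.x, point.y - self.lastPoint.y);
        NSLog(@"speed: %.1f points/second", distance / elapsed);
    }
    self.lastPoint = point;
    self.lastTimestamp = touch.timestamp;
}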

UITouchType

Touch type.

typedef NS_ENUM(NSInteger, UITouchType) {
    UITouchTypeDirect,   // A direct touch from a finger (on a screen)
    UITouchTypeIndirect, // An indirect touch (not a screen)
    UITouchTypePencil API_AVAILABLE(ios(9.1)),                     // Add pencil name variant
    UITouchTypeStylus API_AVAILABLE(ios(9.1)) = UITouchTypePencil, // A touch from a stylus (deprecated name, use pencil)
    
    // A touch representing a button-based, indirect input device describing the input sequence from button press to button release
    UITouchTypeIndirectPointer API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos),
} API_AVAILABLE(ios(9.0));
  • UITouchTypeDirect: Touch generated by direct contact with the screen. Direct contact occurs when a user’s finger touches the screen.
  • UITouchTypeIndirect: Touches that are not caused by touching the screen. Indirect touch is produced by a touch input device separated from the screen. The trackpad of the Apple TV remote, for example, produces indirect touch.
  • UITouchTypePencil: The touch of the Apple Pencil. Pencil Touch occurs when the Apple Pencil interacts with the device’s screen.
  • UITouchTypeStylus: Obsolete, use UITouchTypePencil instead.

type

A property representing the type of the touch.

@property(nonatomic,readonly) UITouchType type API_AVAILABLE(ios(9.0));

For the complete list of touch types, see UITouchType.

UITouchPhase

Touch phase.

typedef NS_ENUM(NSInteger, UITouchPhase) {
    UITouchPhaseBegan,      // whenever a finger touches the surface.
    UITouchPhaseMoved,      // whenever a finger moves on the surface.
    UITouchPhaseStationary, // whenever a finger is touching the surface but hasn't moved since the previous event.
    UITouchPhaseEnded,      // whenever a finger leaves the surface.
    UITouchPhaseCancelled,  // whenever a touch doesn't end but we need to stop tracking (e.g., putting the device to your face to answer a call, or the touch being recognized as a gesture).
    
    UITouchPhaseRegionEntered API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos), // whenever a touch is entering the region of a user interface
    
    // when a touch is inside the region of a user interface, but hasn't yet made contact or left the region
    UITouchPhaseRegionMoved API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos),
    
    UITouchPhaseRegionExited API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos), // whenever a touch is exiting the region of a user interface
};

The phase of a UITouch instance changes as the system receives updates during the event. Access this value through the phase property (a small sketch follows the list below).

  • UITouchPhaseBegan: a touch for the given event was pressed on the screen.
  • UITouchPhaseMoved: a touch for the given event has moved on the screen.
  • UITouchPhaseStationary: a touch for the given event is pressed on the screen but hasn't moved since the previous event.
  • UITouchPhaseEnded: a touch for the given event has lifted from the screen.
  • UITouchPhaseCancelled: the system cancelled tracking of the touch, for example when the user moves the device against their face.
  • UITouchPhaseRegionEntered: a touch for the given event has entered a window on the screen. The UITouchPhaseRegionEntered, UITouchPhaseRegionMoved, and UITouchPhaseRegionExited phases do not always align with the state property of UIHoverGestureRecognizer; a hover gesture recognizer's state applies only within the context of the gesture's view, while these touch phases apply to a window.
  • UITouchPhaseRegionMoved: a touch for the given event is within a window on the screen but has not yet pressed down.
  • UITouchPhaseRegionExited: a touch for the given event has left a window on the screen.
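As referenced above, a minimal sketch of branching on a touch's phase:

- (void)logPhaseOfTouch:(UITouch *)touch {
    switch (touch.phase) {
        case UITouchPhaseBegan:      NSLog(@"began");      break;
        case UITouchPhaseMoved:      NSLog(@"moved");      break;
        case UITouchPhaseStationary: NSLog(@"stationary"); break;
        case UITouchPhaseEnded:      NSLog(@"ended");      break;
        case UITouchPhaseCancelled:  NSLog(@"cancelled");  break;
        default:                     break; // the region phases require iOS 13.4 or later
    }
}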

phase

Touch phase. The property value is a constant indicating whether the touch began, moved, ended, or was cancelled. For the possible values of this property, see UITouchPhase.

@property(nonatomic,readonly) UITouchPhase phase;

gestureRecognizers

The gesture recognizers currently receiving the touch object.

@property(nullable,nonatomic,readonly,copy)   NSArray <UIGestureRecognizer *> *gestureRecognizers API_AVAILABLE(ios(3.2));

The objects in the array are instances of subclasses of the abstract base class UIGestureRecognizer. This property contains an empty array if no Gesture recognizers are currently receiving touch.

⬇️⬇️ Below is some content related to 3D Touch and Apple Pencil. If you are not interested, you can ignore it.

force

The force of touch, where the value 1.0 represents the average touch force (predetermined by the system, not specific to the user).

@property(nonatomic,readonly) CGFloat force API_AVAILABLE(ios(9.0));

This property is available on devices that support 3D Touch or Apple Pencil. To check whether the device supports 3D Touch at run time, read the value of the forceTouchCapability property on the trait set of any object in your application that has a trait environment.

The forces reported by the Apple Pencil are measured along the axis of the Pencil. If you want the force perpendicular to the device, you need to calculate the value using the altitudeAngle value.

The force reported by Apple Pencil is initially an estimate and may not always be updated. To determine whether an update is needed, check the estimatedPropertiesExpectingUpdates property for the UITouchPropertyForce flag. In that case, the estimationUpdateIndex property contains a non-nil value that you can associate with the original touch when the update arrives. When no force updates are expected, the entire touch sequence usually receives none, so a custom tool-specific force curve can be applied to the touch sequence.

maximumPossibleForce

The maximum possible force of a touch.

@property(nonatomic,readonly) CGFloat maximumPossibleForce API_AVAILABLE(ios(9.0));

The value of this attribute is high enough to provide a wide dynamic range for the value of the force attribute.

This property is available on devices that support 3D Touch or Apple Pencil. To check whether the device supports 3D Touch at run time, read the value of the forceTouchCapability property on the trait set of any object in your application that has a trait environment.

altitudeAngle

The altitude angle of the Apple Pencil (in radians). This property applies only to touches of type UITouchTypePencil.

@property(nonatomic,readonly) CGFloat altitudeAngle API_AVAILABLE(ios(9.1));

A value of 0 radians means the Apple Pencil is parallel to the surface. This property has the value Pi/2 when the Apple Pencil is perpendicular to the surface.
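Tying this back to the force property above, a hedged sketch of the perpendicular-force calculation the documentation mentions (force is reported along the pencil axis, so project it with sin(altitudeAngle)):

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    if (touch.type == UITouchTypePencil) {
        // Project the along-axis force onto the screen normal.
        CGFloat perpendicularForce = touch.force * sin(touch.altitudeAngle);
        NSLog(@"perpendicular force: %.3f", perpendicularForce);
    }
}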

The following two methods seem to be temporarily unavailable.

azimuthAngleInView:

Returns the azimuth in radians of the Apple Pencil.

- (CGFloat)azimuthAngleInView:(nullable UIView *)view API_AVAILABLE(ios(9.1));

In the screen plane, azimuth refers to the direction in which the stylus is pointing. This property has a value of 0 radians when the tip of the stylus touches the screen and the cap tip of the stylus (the end opposite the tip) points to the positive X axis of the device screen. The azimuth increases when the user swings the stylus tip clockwise around the tip.

Note: Obtaining azimuth (as opposed to azimuth unit vector) is more expensive, but also more convenient.

azimuthUnitVectorInView:

Returns the unit vector pointing in the azimuth direction of the Apple Pencil.

- (CGVector)azimuthUnitVectorInView:(nullable UIView *)view API_AVAILABLE(ios(9.1));

It is cheaper to get the unit vector of azimuth than to get the azimuth. Unit vectors are also more useful if you’re creating transformation matrices.

UITouchProperties

Some bitmasks for touch properties that may be updated.

typedef NS_OPTIONS(NSInteger, UITouchProperties) {
    UITouchPropertyForce = (1UL << 0),
    UITouchPropertyAzimuth = (1UL << 1),
    UITouchPropertyAltitude = (1UL << 2),
    UITouchPropertyLocation = (1UL << 3), // For predicted Touches
} API_AVAILABLE(ios(9.1));
  • UITouchPropertyForce: Indicates the touch property of force in the bitmask.
  • UITouchPropertyAzimuth: The touch property of azimuth in the bitmask. (For Apple Pencil)
  • UITouchPropertyAltitude: The touch attribute in the bitmask that represents Altitude. (For Apple Pencil)
  • UITouchPropertyLocation: The touch property representing the location in the bitmask.

estimatedProperties

A set of touch properties whose values contain only estimates.

@property(nonatomic,readonly) UITouchProperties estimatedProperties API_AVAILABLE(ios(9.1));

This property contains a bitmask of constants indicating which touch properties could not be reported immediately. For example, the Apple Pencil records the force of a touch, but that information must be transmitted over the air to the paired iPad. Delays in transmitting the data can cause the information to arrive after the touch has been reported to your application.

There is no guarantee that the values in this property will be updated later. For the list of properties whose values are expected to be updated, see estimatedPropertiesExpectingUpdates.

estimatedPropertiesExpectingUpdates

A set of Touch properties whose values are expected to be updated in the future.

@property(nonatomic,readonly) UITouchProperties estimatedPropertiesExpectingUpdates API_AVAILABLE(ios(9.1));

This property contains a bitmask of constants indicating which touch properties could not be reported immediately and will be updated later. When this property contains a non-empty set, you can expect UIKit to later call the responder's or gesture recognizer's touchesEstimatedPropertiesUpdated: method with updated values for the given properties. Save the value of the estimationUpdateIndex property along with your application's copy of the touch data; when UIKit later calls touchesEstimatedPropertiesUpdated:, use the new touch's estimation update index to locate and update your copy of the data.

When this property contains an empty set, no more updates are required. In this scenario, the estimated or updated value is the final value.

estimationUpdateIndex

An index number used to associate the updated touch with the original touch.

@property(nonatomic,readonly) NSNumber * _Nullable estimationUpdateIndex API_AVAILABLE(ios(9.1));

This property contains a unique token for the current touch data. When a touch contains an estimated attribute, this index is saved in the application’s data structure along with the rest of the touch data. When the system later reports the actual touch value, this index is used to locate the original data in the application data structure and replace the previously stored estimate. For example, when a touch contains an estimation attribute, this attribute can be used as a key in a dictionary whose value is the object used to store the touch data.

The value of this property increases monotonically for each touch that contains estimated properties. When the touch object contains no estimated or updated properties, the value of this property is nil.
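Putting the last three properties together, a hedged sketch of the update flow (pendingForces is an assumed NSMutableDictionary<NSNumber *, NSNumber *> property keyed by estimationUpdateIndex):

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    if ((touch.estimatedPropertiesExpectingUpdates & UITouchPropertyForce) && touch.estimationUpdateIndex != nil) {
        // Store the estimated value so the later update can find and replace it.
        self.pendingForces[touch.estimationUpdateIndex] = @(touch.force);
    }
}

// UIKit calls this later with the final values for previously estimated touches.
- (void)touchesEstimatedPropertiesUpdated:(NSSet<UITouch *> *)touches {
    for (UITouch *touch in touches) {
        if (touch.estimationUpdateIndex != nil) {
            // Replace the stored estimate with the actual value.
            self.pendingForces[touch.estimationUpdateIndex] = @(touch.force);
        }
    }
}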

The most important things to record about a touch are its coordinates in the view and in the window, and its phase; these will come up again when we study UIResponder. A responder calls a different response function for a UITouch depending on its phase (touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent:, touchesCancelled:withEvent:).

We'll leave the UIResponder content for later. Now that we're done with UIEvent and UITouch, we can continue down from [UIApplication sendEvent:] and [UIWindow sendEvent:], which need to find the first responder layer by layer. So how is the first responder found? Let's move on.

Hit-Testing

Determining which view contains a touch point involves the following methods of the UIView class. (UIWindow inherits from UIView, as you probably know.)

hitTest:withEvent:

Returns the farthest descendant of the view in the view hierarchy (possibly the view itself) that contains the specified point.

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;

Point: the point specified in UIView’s local coordinate system (bounds). Event: The event for which this method needs to be called. If you want to call this method from outside the event handling code, you can specify nil.

Return Value: the view object that is the farthest descendant of the current view and contains point. Returns nil if the point lies completely outside the view's hierarchy.

This method traverses the view hierarchy by calling the pointInside:withEvent: method of each subview to determine which subview should receive the touch event. If pointInside:withEvent: returns YES, the subview's hierarchy is traversed in the same way until the frontmost view containing the point is found; if a view does not contain the point, its branch of the view hierarchy is ignored. You rarely need to call this method yourself, but you can override it to hide touch events from subviews, or to expand a view's response area.

This method ignores view objects that are hidden (hidden set to YES), have user interaction disabled (userInteractionEnabled set to NO), or have an alpha level of less than 0.01. This method does not take the view's content into account when determining a hit, so a view can still be returned even if the specified point lies in a transparent portion of its content.

Points that lie outside the view's bounds are never reported as hits, even if they actually fall within one of the receiver's subviews. This can happen when the current view's clipsToBounds property is set to NO and a subview extends beyond the view's bounds. (For example, if a button extends beyond its parent view's bounds, tapping the part of the button inside the parent's bounds triggers the tap, while tapping the part outside does not.)

The process by which hitTest:withEvent: finds the view containing a point can be understood as follows:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Three states in which a view cannot respond to events:
    // 1): userInteractionEnabled is NO, i.e. user interaction is disabled.
    // 2): hidden is YES, i.e. the view is hidden.
    // 3): alpha is less than or equal to 0.01, i.e. the view is (nearly) transparent.
    if (self.userInteractionEnabled == NO || self.hidden == YES || self.alpha <= 0.01) return nil;
    
    // If the touch point is not inside the current view, the view cannot respond to the event.
    if ([self pointInside:point withEvent:event] == NO) return nil;
    
    // ⬇️⬇️⬇️ Traverse the subview array from back to front (in reverse order).
    int count = (int)self.subviews.count;
    for (int i = count - 1; i >= 0; i--) {
        // Get the subview.
        UIView *childView = self.subviews[i];
        
        // Coordinate conversion: convert the touch point from the current view's coordinates to the subview's coordinates.
        CGPoint childP = [self convertPoint:point toView:childView];
        
        // Ask the subview hierarchy for the best responding view (recursive).
        UIView *fitView = [childView hitTest:childP withEvent:event];
        
        if (fitView) {
            // If a more suitable subview exists, return it.
            return fitView;
        }
    }
    
    // If no more suitable responding view is found among the subviews, the view itself is the most suitable.
    return self;
}

pointInside:withEvent:

Returns a Boolean value indicating whether UIView contains point.

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;

Point: the point specified in UIView’s local coordinate system (bounds). Event: The event for which this method needs to be called. If you want to call this method from outside the event handling code, you can specify nil.

Returns YES if point is within the view's bounds; otherwise, NO.
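As a minimal sketch of expanding a view's response area with this method (mentioned under hitTest:withEvent: above):

// In a UIView subclass: extend the tappable area 20 points beyond the bounds on every side.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    CGRect expandedBounds = CGRectInset(self.bounds, -20.0, -20.0);
    return CGRectContainsPoint(expandedBounds, point);
}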

convertPoint:toView:

Converts a point from the view's coordinate system to the coordinate system of the specified view, returning a CGPoint.

- (CGPoint)convertPoint:(CGPoint)point toView:(UIView *)view;

Point: the point specified in UIView’s local coordinate system (bounds).

The methods above find, starting from a given view, the farthest descendant view that contains the touch point. So continuing down from [UIApplication sendEvent:] and [UIWindow sendEvent:]: the search starts from our app's root window ([UIApplication sharedApplication].keyWindow; since iOS 13 introduced UISceneSession the way to obtain the root window has changed, but here we still use the old approach) and the view of its root view controller ([UIApplication sharedApplication].keyWindow.rootViewController.view), then walks down the view hierarchy, hit-testing each view to find the first responder. If the view found by hit-testing implements the touches… series of methods, it handles the event directly; if it does not implement them, or its implementation calls super, the event is passed up the responder chain in search of a responder that can respond to it. If none is found in the end, the event is discarded. So next we need to look at the responder and the responder chain. What are they? Read on…
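A small hedged sketch that walks the chain (call it from touchesBegan:withEvent: with self, for example) can make the responder chain visible:

// Print the responder chain starting from any responder, e.g. the hit-tested view.
- (void)logResponderChainFrom:(UIResponder *)responder {
    for (UIResponder *current = responder; current != nil; current = current.nextResponder) {
        NSLog(@"%@", NSStringFromClass([current class]));
    }
}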

So let’s take a look at the UIResponder documentation.

Reference links 🔗

  • Using Responders and the Responder Chain to Handle Events
  • Handling Touches in Your View
  • Responder object
  • Events (iOS)
  • Target-Action
  • Count the flow of iOS touch events
  • iOS responder chain and event handling
  • iOS development series: touch events, gesture recognition, shake events, and headphone remote control