Learn more about touch events in iOS

The iOS event mechanism is an important part of iOS human-computer interaction. This article follows the history of the iOS event mechanism and explores its details, to understand how the whole mechanism works. Along the way, we'll also try to uncover the design philosophy and thinking behind Apple's choices.

Apple takes a nuanced approach to the user experience. Across the whole human-computer interaction system, Apple has designed many ingenious mechanisms and defined many rules. The result is that the iOS user experience is consistently good, while developers can easily adopt these features and deliver a consistent experience.

iOS touch event state machine

Let's first look at the flow of a single-finger swipe:

The stages it passes through:

UITouch state machine

  • Began: a finger touched down on the screen
  • Moved: the finger moved
  • Ended: the finger lifted off the screen
  • Cancelled: the touch was interrupted by something external or internal (an incoming call, for example)
  • Stationary: the finger is resting on the screen without moving. The system does not report this state to upper-layer applications, and we generally don't care about it, but it is part of the state machine.

iOS touch event model

Now that we know the state machine for a typical touch flow, let’s take a look at what objects are in the iOS touch event model.

  • UITouch
  • UIEvent
  • UIResponder
    • UIWindow
    • UIViewController
    • UIView
    • UIControl
  • UIGestureRecognizer

UITouch

Represents a touch source, usually a finger.

Remember, a UITouch is not literally a single touch. It represents the finger itself and follows it through its life cycle; the state of the UITouch is the state of the finger at each point in time.

UITouch data is captured by the hardware, usually sampled at 60 Hz. Starting around the iPad Air 2, devices with a 120 Hz touch-sampling rate appeared. Although such hardware captures twice as many touch samples, iOS remains compatible: events are still delivered at the usual rate, and you can retrieve the intermediate samples through UIEvent's [UIEvent coalescedTouchesForTouch:] method.
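
For example, on hardware that samples at 120 Hz, you can pull the extra samples out of the event. A minimal sketch, assuming a UIView subclass with a hypothetical -appendPointToStroke: drawing helper:

// Inside a UIView subclass. -appendPointToStroke: is a hypothetical helper.
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // coalescedTouchesForTouch: (iOS 9+) returns every sample captured since
    // the last delivered event, including ones never delivered individually.
    for (UITouch *coalesced in [event coalescedTouchesForTouch:touch]) {
        [self appendPointToStroke:[coalesced locationInView:self]];
    }
}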

tapCount is an important UITouch property. When the user taps twice in quick succession, the hardware decides, within a short latency window, whether to report two separate UITouch sequences or a single UITouch with its tapCount incremented. This property is the key to detecting double or multiple taps.
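
For instance, a minimal sketch of detecting a double tap with tapCount in a UIView subclass:

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // The same UITouch is reported again with tapCount incremented,
    // so a double tap can be detected without any timing code of our own.
    if (touch.tapCount == 2) {
        NSLog(@"double tap");
    }
    [super touchesEnded:touches withEvent:event];
}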

UIEvent

A UIEvent refers to a touch event.

iOS was designed with extensibility in mind. A UIEvent is not always a touch event, although it usually is, and this article covers only touch events. See the UIEvent type and subtype enumerations for details.

A UIEvent represents the full life cycle of an interaction. While it is in progress, fingers may lift off or new fingers may touch down at any time. Only when every finger has left the screen and a short latency has passed is the UIEvent over.

// UIEvent
@property (nonatomic, readonly, nullable) NSSet<UITouch *> *allTouches;

The allTouches property gives you access to all UITouch objects for this interaction.

- (nullable NSSet <UITouch *> *)touchesForWindow:(UIWindow *)window;
- (nullable NSSet <UITouch *> *)touchesForView:(UIView *)view;

You can also get all UITouch objects bound to a window or view using the above two methods.

A diagram of the relationship between UIEvent and the other objects:

UIResponder

All objects that inherit from UIResponder can handle UITouch events. The tree of UIWindow, UIViewController, and UIView objects makes up the Responder Chain.

Responder Chain diagram:

Events start at the Hit Test View and are forwarded along the responder chain (visually, from the frontmost view backwards) until some responder stops the propagation. The default behavior of a UIResponder touch method is to pass the event up the chain, so if you don't want it to propagate, simply don't call super.
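
A minimal sketch of both choices in a UIView subclass (wantsToConsumeTouches is a hypothetical flag):

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    if (self.wantsToConsumeTouches) {
        // Handle the touch here and stop: not calling super
        // keeps the event from reaching the next responder.
        return;
    }
    // Default behavior: forward the event to the next responder in the chain.
    [super touchesBegan:touches withEvent:event];
}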

UIWindow

If your application contains multiple UIWindows, then after the system receives a UITouch event, it walks the UIWindow hierarchy, performing the hit test layer by layer until one of the UIWindows returns a UIView object.

UIWindow then parses the UIEvent object and passes the corresponding UITouch objects either to the Hit Test View or to the UIGestureEnvironment.

UIViewController

As a UITouch event travels up the Responder Chain, if the root view of a UIViewController does not respond to it, the system dispatches it to the corresponding UIViewController first, rather than directly to the view's superview; only afterwards is it sent on to the superview.

Overriding the touch methods in a UIViewController means you don't need to create UIButtons or subclass UIView just to receive event callbacks; a classic example is tapping a blank area to dismiss the keyboard.
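
A minimal sketch of that keyboard case, directly in a UIViewController subclass:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // Tapping any blank area resigns first responder on the active text field.
    [self.view endEditing:YES];
    [super touchesBegan:touches withEvent:event];
}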

UIView

The most common Touch event handling unit.

For a drawing app, we need to override the touch methods directly. Outside of cases like that, doing so is rare.

Overriding the hitTest:withEvent: and pointInside:withEvent: methods lets you manipulate the hit-test procedure, dynamically returning a different Hit Test View depending on the app's state. We'll talk more about the Hit Test later.
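
For example, a common trick is enlarging a small control's tappable area by overriding pointInside:withEvent: (a sketch; the 10-point inset is arbitrary):

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Treat points up to 10pt outside our bounds as inside,
    // growing the hit area without changing the frame.
    CGRect expandedBounds = CGRectInset(self.bounds, -10, -10);
    return CGRectContainsPoint(expandedBounds, point);
}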

UIControl

UIControl is a special UIView that implements many of the common touch interactions and communicates with your code through the Target-Action design pattern.

When it coexists with a UIGestureRecognizer, it is also handled differently from a normal UIView. We'll talk about that later.

// UIControlEvents

UIControlEventTouchDown
UIControlEventTouchDownRepeat
UIControlEventTouchDragInside
UIControlEventTouchDragOutside
UIControlEventTouchDragEnter
UIControlEventTouchDragExit
UIControlEventTouchUpInside
UIControlEventTouchUpOutside
UIControlEventTouchCancel

The event types above cover almost every scenario you'll need, so the easiest approach is usually to add a UIControl or UIButton object rather than overriding the touch methods yourself.
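
For example, wiring up a tap takes one Target-Action call instead of any touch-method overrides (a sketch; -didTapButton: is an assumed handler):

UIButton *button = [UIButton buttonWithType:UIButtonTypeSystem];
// UIControl turns the raw touch sequence into a semantic "tap" for us.
[button addTarget:self
           action:@selector(didTapButton:)
 forControlEvents:UIControlEventTouchUpInside];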

UIGestureRecognizer

UIGestureRecognizer is the focus of this article.

UIGestureRecognizer was first introduced by Apple in iOS 3.2 (2010). It is an abstraction over a series of finger operations, and a powerful tool.

UIGestureRecognizer is built on top of UITouch and encapsulates the concrete touch handling, letting you work with abstract taps, pinches, and pans without processing touch events yourself. We should use these higher-level tools whenever possible, unless we need custom behavior.
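
For example, handling a double tap becomes a few lines (a sketch; -handleTap: is an assumed handler):

UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
tap.numberOfTapsRequired = 2; // double tap, no tapCount bookkeeping needed
[self.view addGestureRecognizer:tap];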

A two-step iOS touch event mechanism

Now that we understand the key objects in the basic iOS event model, let's look at how the system makes them work together.

iOS devices use hardware to capture touch coordinates and state. The system then has to deliver this information accurately to our visual objects (views). That happens in two steps.

Hit Test

iOS walks the tree of views to find the frontmost view that contains the touch point.

The system uses [UIView hitTest:withEvent:] to find the touched view. It is a recursive procedure: each UIView calls it on its own subviews, until [UIView pointInside:withEvent:] returns YES for a view none of whose subviews contain the point.

The pseudo-code looks roughly like this (a sketch of the documented behavior, not Apple's actual source):
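
// A sketch of the default hit-testing logic, not Apple's actual implementation.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Hidden, non-interactive, or nearly transparent views are skipped entirely.
    if (self.hidden || !self.userInteractionEnabled || self.alpha < 0.01) {
        return nil;
    }
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Ask subviews from front to back; the frontmost hit wins.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit) {
            return hit;
        }
    }
    // No subview claimed the point, so this view is the Hit Test View.
    return self;
}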

Flow chart:

iOS exposes two key methods that drive the Hit Test process, which lets us control it dynamically.

The two methods differ in what they can do, so we can exert control along different dimensions.

These two methods are also the earliest point at which we can customize the iOS event mechanism.

Distribution of the message

Typically, iOS distributes these touch messages to the corresponding view. If that view does not handle them, the system by default forwards the messages along the responder chain.

Distribution goes through UIApplication and UIWindow:

// UIApplication
- (void)sendEvent:(UIEvent *)event;

// UIWindow
- (void)sendEvent:(UIEvent *)event;
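
One practical consequence: you can observe every event in flight by subclassing UIWindow and overriding sendEvent:. A minimal sketch (ObservingWindow is a hypothetical subclass; always call super, or no view will ever receive the event):

@interface ObservingWindow : UIWindow
@end

@implementation ObservingWindow
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        NSLog(@"touches in flight: %lu", (unsigned long)event.allTouches.count);
    }
    [super sendEvent:event]; // keep the normal distribution going
}
@end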

Let's take a look at the UITouch event distribution process:

In the figure, UIGestureEnvironment is the class responsible for managing all UIGestureRecognizers.

Multi-touch

The multipleTouchEnabled property controls whether a UIView supports multi-touch. The default is NO.

Usually we don't need to handle several fingers' touch events at once. In that case, for the entire UIEvent cycle, only the first UITouch to land on the view gets its view property set correctly; every subsequent UITouch on that view has its view property set to nil, even though it is still attached to the UIEvent.

Because of this, you won't notice a difference when you see an anyObject call: by default there is only that one relevant object. As shown below:

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
  // You often see people accessing the UITouch object this way.
  UITouch *touch = [touches anyObject];
  // ...
}

If multipleTouchEnabled is YES, all of these UITouch objects are delivered to the view's touch methods.
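
With multi-touch enabled, iterate over the whole set instead of grabbing anyObject. A minimal sketch:

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // With multipleTouchEnabled = YES, the set can contain several fingers.
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:self];
        NSLog(@"finger at %@", NSStringFromCGPoint(location));
    }
}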

UIGestureRecognizer

Why did Apple design the UIGestureRecognizer?

A case study

If you had to implement two-finger pinch-to-zoom and pan logic yourself, what would you do? You'd be doing a lot of number crunching. That's right:

Here, from the coordinates of the two fingers, you manually compute the values of the transform matrix: the first four values drive the linear transformation, and the last two the offset. For more on matrix transformations, you can go back to your advanced mathematics course 😂.
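
A minimal sketch of that math (not the WWDC demo's actual code; the four point parameters are assumed to be the two fingers' previous and current locations):

// Derive an affine transform from two moving touches. Simplified: rotation and
// scale are taken about the origin rather than the fingers' midpoint.
static CGAffineTransform transformFromTouches(CGPoint prev1, CGPoint prev2,
                                              CGPoint curr1, CGPoint curr2) {
    // Scale: ratio of the distances between the two fingers.
    CGFloat prevDistance = hypot(prev2.x - prev1.x, prev2.y - prev1.y);
    CGFloat currDistance = hypot(curr2.x - curr1.x, curr2.y - curr1.y);
    CGFloat scale = prevDistance > 0 ? currDistance / prevDistance : 1;

    // Rotation: change in the angle of the line joining the two fingers.
    CGFloat prevAngle = atan2(prev2.y - prev1.y, prev2.x - prev1.x);
    CGFloat currAngle = atan2(curr2.y - curr1.y, curr2.x - curr1.x);
    CGFloat rotation = currAngle - prevAngle;

    // Translation: movement of the midpoint between the two fingers.
    CGFloat tx = (curr1.x + curr2.x - prev1.x - prev2.x) / 2;
    CGFloat ty = (curr1.y + curr2.y - prev1.y - prev2.y) / 2;

    // a, b, c, d carry the linear part; tx, ty carry the offset.
    CGAffineTransform t = CGAffineTransformMakeTranslation(tx, ty);
    t = CGAffineTransformRotate(t, rotation);
    return CGAffineTransformScale(t, scale, scale);
}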

And this handles at most two fingers. With three or more, the code would need further work.

Most of the logic above comes from the MultiTouchDemo in WWDC 2009. View the full Demo

In fact, prior to iOS 3.2, this is how UIScrollView implemented zooming; UIGestureRecognizer was not involved.

Apple's purpose in providing UIGestureRecognizer is to encapsulate these mathematical operations, making it easier for developers to interpret user touches while avoiding the inconsistent experiences that hand-rolled implementations would produce.

(To be continued…)