UIGestureRecognizer

The abstract base class for concrete gesture recognizers. A gesture recognizer object, or simply a gesture recognizer, decouples the logic for recognizing a sequence of touches (or other input) from the view that receives them, and acts on that recognition. When one of these objects recognizes a common gesture or, in some cases, a change in the gesture, it sends an action message to each designated target object.

The concrete subclasses of UIGestureRecognizer are as follows:

  • UITapGestureRecognizer tap gesture recognizer
  • UIPinchGestureRecognizer pinch gesture recognizer
  • UIRotationGestureRecognizer rotation gesture recognizer
  • UISwipeGestureRecognizer swipe gesture recognizer
  • UIPanGestureRecognizer pan gesture recognizer
  • UIScreenEdgePanGestureRecognizer screen-edge pan gesture recognizer
  • UILongPressGestureRecognizer long-press gesture recognizer

The UIGestureRecognizer class defines a common set of behaviors that can be configured for all concrete gesture recognizers. A gesture recognizer can also communicate with its delegate (an object adopting the UIGestureRecognizerDelegate protocol) to customize some of its behavior more finely.

A gesture recognizer performs its touch recognition on a specific view and all of that view's subviews, so it must be associated with that view. To establish the association, call UIView's addGestureRecognizer: method. A gesture recognizer does not participate in the view's responder chain.

A gesture recognizer has one or more target-action pairs associated with it. If there are multiple target-action pairs, they are discrete rather than cumulative: recognition of the gesture causes an action message to be dispatched to the target of each associated pair. The action method invoked must conform to one of the following signatures:

- (void)handleGesture;

- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer;

Methods conforming to the latter signature allow the target to query the gesture recognizer that sent the message for additional information. For example, a target can ask a UIRotationGestureRecognizer for the rotation angle of the gesture (in radians) since the last action-method call. Clients of a gesture recognizer can also ask for the location of the gesture by calling locationInView: or locationOfTouch:inView:.
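
As an illustrative sketch (the handler name is an assumption, not part of the original text), an action method matching the second signature can query the recognizer that sent it:

```objc
// Hypothetical action method for a rotation recognizer attached to some view.
- (void)handleRotation:(UIRotationGestureRecognizer *)recognizer {
    // Rotation of the gesture, in radians, since the recognizer was last reset
    CGFloat angle = recognizer.rotation;
    // Centroid of the touches, in the coordinate system of the attached view
    CGPoint location = [recognizer locationInView:recognizer.view];
    NSLog(@"rotation %f at %@", angle, NSStringFromCGPoint(location));
}
```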

Gestures interpreted by gesture recognizers can be discrete or continuous. A discrete gesture, such as a double tap, occurs only once in a multi-touch sequence, resulting in an action being sent. However, when a gesture recognizer interprets a continuous gesture (such as a rotation gesture), it sends an action message for each incremental change until the multi-touch sequence ends.

A window delivers touch events to a gesture recognizer before delivering them to the hit-tested view to which the recognizer is attached. Generally, if a gesture recognizer analyzes the stream of touches in a multi-touch sequence but does not recognize its gesture, the view receives the full complement of touches. If a gesture recognizer recognizes its gesture, the remaining touches for the view are cancelled. The usual sequence of actions in gesture recognition follows a path determined by the default values of the cancelsTouchesInView, delaysTouchesBegan, and delaysTouchesEnded properties:

cancelsTouchesInView – If a gesture recognizer recognizes its gesture, it unbinds the remaining touches of that gesture from their view (so the window won't deliver them). The window cancels the previously delivered touches with a touchesCancelled:withEvent: message. If the gesture recognizer doesn't recognize its gesture, the view receives all touches in the multi-touch sequence.

delaysTouchesBegan – As long as the gesture recognizer is still analyzing touch events and hasn't failed to recognize its gesture, the window withholds touch objects in the UITouchPhaseBegan phase from the attached view. If the gesture recognizer then recognizes its gesture, the view never receives these touch objects. If the gesture recognizer fails to recognize its gesture, the window delivers the objects in a call to the view's touchesBegan:withEvent: method (and possibly a follow-up touchesMoved:withEvent: call to inform the view of the touches' current locations).

delaysTouchesEnded – As long as the gesture recognizer is still analyzing touch events and hasn't failed to recognize its gesture, the window withholds touch objects in the UITouchPhaseEnded phase from the attached view. If the gesture recognizer subsequently recognizes its gesture, those touches are cancelled (in a touchesCancelled:withEvent: message). If the gesture recognizer fails to recognize its gesture, the window delivers the objects in a call to the view's touchesEnded:withEvent: method.
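
As a minimal sketch of configuring these three properties explicitly (the handleTap: selector is a hypothetical handler):

```objc
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(handleTap:)];
// Keep delivering touches to the view even after the tap is recognized (default is YES)
tap.cancelsTouchesInView = NO;
// Withhold began-phase touches until recognition fails (default is NO)
tap.delaysTouchesBegan = YES;
// Deliver ended-phase touches immediately instead of withholding them (default is YES)
tap.delaysTouchesEnded = NO;
[self.view addGestureRecognizer:tap];
```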

Commonly used attributes

@property(nullable, nonatomic,readonly) UIView *view;

Property description: The view to which the gesture recognizer is attached. Attach (or add) a gesture recognizer to a UIView object with the addGestureRecognizer: method.


@property(nonatomic, getter=isEnabled) BOOL enabled;

Property description: A Boolean value indicating whether the gesture recognizer is enabled. Disable a gesture recognizer so that it does not receive touches. The default value is YES. If you change this property to NO while the gesture recognizer is currently recognizing a gesture, the gesture recognizer transitions to the cancelled state.


@property(nonatomic,readwrite) UIGestureRecognizerState state;

Property description: The current state of the gesture recognizer. The possible states of a gesture recognizer are represented by constants of type UIGestureRecognizerState. Some of these states do not apply to discrete gestures.


UIGestureRecognizerState provides the following enumeration values:

typedef NS_ENUM(NSInteger, UIGestureRecognizerState) {
    // The recognizer has not yet recognized its gesture, but may be evaluating touch events. This is the default state.
    UIGestureRecognizerStatePossible,
    // The recognizer has received touches recognized as the beginning of the gesture. The action method will be called on the next turn of the run loop.
    UIGestureRecognizerStateBegan,
    // The recognizer has received touches recognized as a change to the gesture. The action method will be called on the next turn of the run loop.
    UIGestureRecognizerStateChanged,
    // The recognizer has received touches recognized as the end of the gesture. The action method will be called on the next turn of the run loop, and the recognizer will be reset to UIGestureRecognizerStatePossible.
    UIGestureRecognizerStateEnded,
    // The recognizer has received touches resulting in cancellation of the gesture. The action method will be called on the next turn of the run loop, and the recognizer will be reset to UIGestureRecognizerStatePossible.
    UIGestureRecognizerStateCancelled,
    // The recognizer has received a touch sequence that cannot be recognized as the gesture. No action method will be called, and the recognizer will be reset to UIGestureRecognizerStatePossible.
    UIGestureRecognizerStateFailed,
    // Discrete gestures: a recognizer that recognizes a discrete event but does not report changes (for example, a tap) does not transition through the Began and Changed states, and cannot fail or be cancelled. The recognizer has received touches recognized as the gesture. The action method will be called on the next turn of the run loop, and the recognizer will be reset to UIGestureRecognizerStatePossible.
    UIGestureRecognizerStateRecognized = UIGestureRecognizerStateEnded
};

@property(nullable,nonatomic,weak) id <UIGestureRecognizerDelegate> delegate;

Property description: The delegate of the gesture recognizer. Gesture recognizers maintain weak references to their delegates. The delegate must adopt the UIGestureRecognizerDelegate protocol and implement one or more of its methods.


Commonly used methods

- (CGPoint)locationInView:(nullable UIView *)view;

Function description: Returns the point computed as the location, in the given view, of the gesture represented by the receiver. The returned value is a generic single-point location for the gesture computed by the UIKit framework; it is usually the centroid of the touches involved in the gesture. For objects of the UISwipeGestureRecognizer and UITapGestureRecognizer classes, the location returned by this method has special significance for the gesture.

Parameters:

view: A UIView object on which the gesture took place. Specify nil to indicate the window.

Return value: The point that identifies the gesture position in the view’s local coordinate system. If the view is specified as nil, this method returns the gesture position in the window’s base coordinate system.


- (void)addTarget:(id)target action:(SEL)action;

Function description: Adds targets and actions to a gesture recognizer object. This method can be called multiple times to specify multiple target-action pairs. However, if a request is made to add an already added target-action pair, the request is ignored.

Parameters:

target: An object that is a recipient of the action messages sent by the receiver when the represented gesture occurs. nil is not a valid value.

action: A selector identifying the method of the target to be invoked by the action messages. NULL is not a valid value.


- (void)requireGestureRecognizerToFail:(UIGestureRecognizer *)otherGestureRecognizer;

Function description: Creates a dependency relationship between the receiver and another gesture recognizer when the objects are created. This works well when the gesture recognizers are not created elsewhere in the application or a framework and the set of gesture recognizers stays the same. If the failure requirement needs to be set up lazily or across different view hierarchies, use the gestureRecognizer:shouldRequireFailureOfGestureRecognizer: and gestureRecognizer:shouldBeRequiredToFailByGestureRecognizer: delegate methods instead. (Note that the subclass methods shouldRequireFailureOfGestureRecognizer: and shouldBeRequiredToFailByGestureRecognizer: let subclasses define class-wide failure requirements.)

This method creates a relationship with another gesture recognizer that delays the receiver's transition out of UIGestureRecognizerStatePossible. The state the receiver transitions to depends on what happens to the other recognizer:

If the other gesture recognizer transitions to UIGestureRecognizerStateFailed, the receiver transitions to its normal next state.

If the other gesture recognizer transitions to UIGestureRecognizerStateRecognized or UIGestureRecognizerStateBegan, the receiver transitions to UIGestureRecognizerStateFailed.

An example of when this method might be used is when you want a single-tap gesture to be recognized only after a double-tap gesture fails.

Parameters:

otherGestureRecognizer: Another gesture recognizer object (an instance of a subclass of UIGestureRecognizer).


UIGestureRecognizerDelegate

- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer;

Function description: Asks the delegate whether the gesture recognizer should begin interpreting touches. This method is called when a gesture recognizer attempts to transition out of the UIGestureRecognizerStatePossible state. Returning NO causes the gesture recognizer to transition to the UIGestureRecognizerStateFailed state.

Parameters:

gestureRecognizer: An instance of a subclass of the abstract base class UIGestureRecognizer. This gesture recognizer object is about to begin processing touches to determine whether its gesture is occurring.

Return value: YES (the default) to tell the gesture recognizer to proceed with interpreting touches; NO to prevent it from attempting to recognize its gesture.

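
A sketch of one possible delegate implementation (the _horizontalPan ivar is an assumption for illustration), allowing a pan to begin only when the initial movement is mostly horizontal:

```objc
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer == _horizontalPan) {
        // Velocity at the moment the recognizer tries to leave the Possible state
        CGPoint v = [(UIPanGestureRecognizer *)gestureRecognizer velocityInView:gestureRecognizer.view];
        // Begin only when the movement is mostly horizontal; otherwise the recognizer fails
        return fabs(v.x) > fabs(v.y);
    }
    return YES; // default behavior for any other recognizer
}
```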

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer;

Function description: Asks the delegate whether two gesture recognizers should be allowed to recognize their gestures simultaneously. This method is called when recognition of a gesture by either gestureRecognizer or otherGestureRecognizer would block the other from recognizing its gesture. Note that returning YES is guaranteed to allow simultaneous recognition; returning NO, however, is not guaranteed to prevent it, because the other gesture recognizer's delegate may return YES.

Parameters:

gestureRecognizer: An instance of a subclass of the abstract base class UIGestureRecognizer. This is the object sending the message to the delegate.

otherGestureRecognizer: An instance of a subclass of the abstract base class UIGestureRecognizer.

Return value: YES to allow both gesture recognizers to recognize their gestures simultaneously. The default implementation returns NO; no two gestures can be recognized simultaneously.

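
For example, a delegate might allow a pinch and a rotation on the same view to run together (a sketch, not tied to the exercises below):

```objc
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Allow pinch + rotation to be recognized at the same time; everything else keeps the default
    BOOL isPinch         = [gestureRecognizer isKindOfClass:[UIPinchGestureRecognizer class]];
    BOOL isRotation      = [gestureRecognizer isKindOfClass:[UIRotationGestureRecognizer class]];
    BOOL otherIsPinch    = [otherGestureRecognizer isKindOfClass:[UIPinchGestureRecognizer class]];
    BOOL otherIsRotation = [otherGestureRecognizer isKindOfClass:[UIRotationGestureRecognizer class]];
    return (isPinch && otherIsRotation) || (isRotation && otherIsPinch);
}
```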

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRequireFailureOfGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer API_AVAILABLE(ios(7.0));

Function description: Asks the delegate whether the gesture recognizer should require another gesture recognizer to fail before it can recognize its gesture. This method is called once per recognition attempt, so the failure requirement can be determined lazily and can be set up between recognizers across view hierarchies. Note that returning YES is guaranteed to set up the failure requirement; returning NO, however, is not guaranteed to prevent or remove one, because the other gesture recognizer's subclass or delegate methods may set up the requirement themselves.

Parameters:

gestureRecognizer: An instance of a subclass of the abstract base class UIGestureRecognizer. This is the object sending the message to the delegate.

otherGestureRecognizer: An instance of a subclass of the abstract base class UIGestureRecognizer.

Return value: YES to set up a dynamic failure requirement between gestureRecognizer and otherGestureRecognizer. The default implementation returns NO; gestureRecognizer does not require otherGestureRecognizer to fail.

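
As a sketch, a delegate can set up the single-tap-waits-for-double-tap requirement lazily this way, instead of calling requireGestureRecognizerToFail: up front:

```objc
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRequireFailureOfGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Make a tap recognizer wait for any tap recognizer that requires more taps
    if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]] &&
        [otherGestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        UITapGestureRecognizer *tap = (UITapGestureRecognizer *)gestureRecognizer;
        UITapGestureRecognizer *otherTap = (UITapGestureRecognizer *)otherGestureRecognizer;
        return tap.numberOfTapsRequired < otherTap.numberOfTapsRequired;
    }
    return NO; // default: no failure requirement
}
```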

UITapGestureRecognizer — tap gesture recognizer

A concrete subclass of UIGestureRecognizer that looks for single or multiple taps. For the gesture to be recognized, the specified number of fingers must tap the view the specified number of times. Although taps are discrete gestures, they are discrete for each state of the gesture recognizer; the associated action message is therefore sent when the gesture begins and for each intermediate state up to (and including) the gesture's end state, so code that handles tap gestures should test the state of the gesture. To handle the gesture, call UIGestureRecognizer's locationInView: method to get the overall location of the gesture; for multiple taps this is the location of the first tap, and for multiple touches it is the centroid where the fingers tapped the view. Clients can get the location of a particular touch by calling locationOfTouch:inView:; if multiple taps are allowed, this is the location of that touch in the first tap.

Commonly used attributes

@property (nonatomic) NSUInteger numberOfTapsRequired;

Attribute description: The number of taps required to recognize the gesture. The default value is 1.


@property (nonatomic) NSUInteger numberOfTouchesRequired API_UNAVAILABLE(tvOS);

Attribute description: The number of fingers required to recognize the gesture. The default value is 1.


Example code snippet (adding single-tap and double-tap gestures to a view):

@interface GestureRecognizerViewController () <UIGestureRecognizerDelegate>
@end

@implementation GestureRecognizerViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.navigationItem.title = @"Gesture recognition";
    [self createUI];
}

- (void)createUI {
    // Test view
    UIView *testView = [[UIView alloc] initWithFrame:CGRectMake(CGRectGetMaxX(self.view.frame) / 2 - 100 / 2,
                                                                CGRectGetMaxY(self.view.frame) / 2 - 100 / 2,
                                                                100, 100)];
    testView.backgroundColor = [UIColor redColor];
    [self.view addSubview:testView];

    // Single-tap gesture recognizer
    UITapGestureRecognizer *singleClick = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleClickEvent)];
    // Number of taps required to recognize the gesture
    singleClick.numberOfTapsRequired = 1;
    // Number of fingers required to recognize the gesture
    singleClick.numberOfTouchesRequired = 1;
    // Add the single-tap gesture recognizer to the view
    [testView addGestureRecognizer:singleClick];

    // Double-tap gesture recognizer
    UITapGestureRecognizer *doubleClick = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleClickEvent)];
    // Number of taps required to recognize the gesture
    doubleClick.numberOfTapsRequired = 2;
    // Number of fingers required to recognize the gesture
    doubleClick.numberOfTouchesRequired = 1;
    // Add the double-tap gesture recognizer to the view
    [testView addGestureRecognizer:doubleClick];

    // Recognize the single tap only after the double tap fails
    [singleClick requireGestureRecognizerToFail:doubleClick];
}

- (void)singleClickEvent {
    UIAlertController *alertController = [UIAlertController alertControllerWithTitle:@"Alert"
                                                                             message:@"Single-tap response"
                                                                      preferredStyle:UIAlertControllerStyleAlert];
    UIAlertAction *yesAction = [UIAlertAction actionWithTitle:@"YES"
                                                        style:UIAlertActionStyleCancel
                                                      handler:^(UIAlertAction *action) {
        NSLog(@"Tap YES Button");
    }];
    [alertController addAction:yesAction];
    [self presentViewController:alertController animated:YES completion:nil];
}

- (void)doubleClickEvent {
    UIAlertController *alertController = [UIAlertController alertControllerWithTitle:@"Alert"
                                                                             message:@"Double-tap response"
                                                                      preferredStyle:UIAlertControllerStyleAlert];
    UIAlertAction *yesAction = [UIAlertAction actionWithTitle:@"YES"
                                                        style:UIAlertActionStyleCancel
                                                      handler:^(UIAlertAction *action) {
        NSLog(@"Tap YES Button");
    }];
    [alertController addAction:yesAction];
    [self presentViewController:alertController animated:YES completion:nil];
}

@end

The effect is as follows:

UIPinchGestureRecognizer — pinch gesture recognizer

A concrete subclass of UIGestureRecognizer that looks for pinching gestures involving two touches. When the user moves the two fingers toward each other, the conventional meaning is zoom out; when the user moves them apart, the conventional meaning is zoom in.

A pinch is a continuous gesture. It begins (UIGestureRecognizerStateBegan) when the two touches have moved enough to be considered a pinch. It changes (UIGestureRecognizerStateChanged) when a finger moves while both fingers remain pressed. It ends (UIGestureRecognizerStateEnded) when both fingers lift from the view.

Commonly used attributes

@property (nonatomic) CGFloat scale;

Property description: The scale factor relative to the points of the two touches in screen coordinates. You can set the scale factor, but doing so resets the velocity. The scale factor approaches 0 when pinching in, exceeds 1 when pinching out, and is 1 at normal size; it is reset each time the gesture is recognized.


@property (nonatomic,readonly) CGFloat velocity;

Property description: The velocity of the pinch, in scale factor per second. It is negative when the fingers move toward each other and positive when they move apart.


Example code snippet (pinch-zooming a view):

@interface GestureRecognizerViewController () <UIGestureRecognizerDelegate>
@end

@implementation GestureRecognizerViewController {
    // Scale factor of the last pinch gesture
    CGFloat lastRecognizerScale;
    // Maximum scale of the view
    CGFloat viewMaxScale;
    // Minimum scale of the view
    CGFloat viewMinScale;
    UIImageView *imageView;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.navigationItem.title = @"Gesture recognition";
    [self initScale];
    [self createUI];
}

- (void)initScale {
    // Initialize the scale factor of the last gesture; default is 1
    lastRecognizerScale = 1;
    // Initialize the maximum view scale; default is 2
    viewMaxScale = 2;
    // Initialize the minimum view scale; default is 1
    viewMinScale = 1;
}

- (void)createUI {
    // Image view used to test the pinch gesture recognizer
    imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"tian_kong_long"]];
    imageView.frame = CGRectMake(0, 0, CGRectGetWidth(self.view.frame), CGRectGetHeight(self.view.frame));
    // Set the content display mode
    imageView.contentMode = UIViewContentModeScaleAspectFit;
    // Enable user interaction; the default for UIImageView is NO
    imageView.userInteractionEnabled = YES;
    // Add the UIImageView to the view
    [self.view addSubview:imageView];

    // Initialize the pinch gesture recognizer
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchEvent:)];
    // Add the pinch gesture recognizer
    [imageView addGestureRecognizer:pinch];
}

/// Pinch event
- (void)pinchEvent:(UIPinchGestureRecognizer *)recognizer {
    switch (recognizer.state) {
        case UIGestureRecognizerStateBegan:   // pinch began
        case UIGestureRecognizerStateChanged: // pinch changed
        {
            // Get the current scale of the view
            CGFloat currentViewScale = [[imageView.layer valueForKeyPath:@"transform.scale"] floatValue];
            // Change of the scale factor since the last call (add 1 so the new factor is greater than 0)
            CGFloat newRecognizerScale = recognizer.scale - lastRecognizerScale + 1;
            // Clamp to the maximum view scale
            newRecognizerScale = MIN(newRecognizerScale, viewMaxScale / currentViewScale);
            // Clamp to the minimum view scale
            newRecognizerScale = MAX(newRecognizerScale, viewMinScale / currentViewScale);
            // Apply the scale to the view
            imageView.transform = CGAffineTransformScale(imageView.transform, newRecognizerScale, newRecognizerScale);
            // Record the scale factor of the two touch points as the last scale factor
            lastRecognizerScale = recognizer.scale;
        }
            break;
        case UIGestureRecognizerStateEnded: // pinch ended
        {
            // Each time a pinch gesture ends, reset the recorded scale factor to 1
            lastRecognizerScale = 1;
        }
            break;
        default:
            break;
    }
}

@end

The effect is as shown (in the simulator, hold the Option key to bring up the two-finger pinch):

UIRotationGestureRecognizer — rotation gesture recognizer

A concrete subclass of UIGestureRecognizer that looks for rotation gestures involving two touches. When the user moves the fingers opposite each other in a circular motion, the underlying view should rotate in a corresponding direction and speed. Rotation is a continuous gesture: it begins when the two touches have moved enough to be considered a rotation, changes when a finger moves while both fingers are down, and ends when both fingers have lifted. At each stage of the gesture, the gesture recognizer sends its action message.

Commonly used attributes

@property (nonatomic) CGFloat rotation;

Attribute description: The rotation of the gesture in radians. The rotation can be set to any value; however, setting it resets the velocity. The rotation is a single value that varies over time, signed according to the rotation direction: positive when the two fingers rotate clockwise and negative when they rotate counterclockwise. It is not the delta since the rotation was last reported, and it is reset each time the gesture is recognized.


@property (nonatomic,readonly) CGFloat velocity;

Property description: The velocity of the rotation gesture, in radians per second.


Example code snippet (rotating a view):

@interface GestureRecognizerViewController () <UIGestureRecognizerDelegate>
@end

@implementation GestureRecognizerViewController {
    // Image view used to test scaling and rotation
    UIImageView *imageView;
    // Rotation value at the end of the last rotation gesture
    CGFloat lastRotationValue;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.navigationItem.title = @"Gesture recognition";
    [self createUI];
}

- (void)createUI {
    // Image view used to test the rotation gesture recognizer
    imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"tian_kong_long"]];
    imageView.frame = CGRectMake(0, 300, CGRectGetWidth(self.view.frame), 200);
    // Set the content display mode
    imageView.contentMode = UIViewContentModeScaleAspectFit;
    // Enable user interaction; the default for UIImageView is NO
    imageView.userInteractionEnabled = YES;
    // Add the UIImageView to the view
    [self.view addSubview:imageView];

    // Initialize the rotation gesture recognizer
    UIRotationGestureRecognizer *rotation = [[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                                                         action:@selector(rotationEvent:)];
    // Add the rotation gesture recognizer to the view
    [imageView addGestureRecognizer:rotation];
}

/// Rotation event
- (void)rotationEvent:(UIRotationGestureRecognizer *)rotationGestureRecognizer {
    switch (rotationGestureRecognizer.state) {
        case UIGestureRecognizerStateBegan:   // rotation began
        case UIGestureRecognizerStateChanged: // rotation changed
        {
            NSLog(@"%f", rotationGestureRecognizer.rotation);
            // Compute the new rotation from the value recorded at the end of the last gesture
            CGFloat newRotationValue = rotationGestureRecognizer.rotation + lastRotationValue;
            // Get the view the gesture recognizer is attached to
            UIView *view = rotationGestureRecognizer.view;
            // Get the view's layer
            CALayer *layer = view.layer;
            // 3D rotation effect
            CATransform3D rotationAndPerspectiveTransform = CATransform3DIdentity;
            rotationAndPerspectiveTransform.m34 = 1.0 / -500;
            // Rotate by newRotationValue radians around the vector (x, y, z); if the vector has zero length, the behavior is undefined
            rotationAndPerspectiveTransform = CATransform3DRotate(rotationAndPerspectiveTransform, newRotationValue, 1.0f, 0.0f, 0.0f);
            // Apply the transform to the layer
            layer.transform = rotationAndPerspectiveTransform;
        }
            break;
        case UIGestureRecognizerStateEnded: // rotation ended
        {
            // At the end of each rotation gesture, record the rotation value where the gesture finished
            lastRotationValue += rotationGestureRecognizer.rotation;
        }
            break;
        default:
            break;
    }
}

@end

The effect is as follows:

UISwipeGestureRecognizer — swipe gesture recognizer

A concrete subclass of UIGestureRecognizer that looks for swipe gestures in one or more directions. A swipe is a discrete gesture, so the associated action message is sent only once per gesture. UISwipeGestureRecognizer recognizes a swipe when the specified number of touches (numberOfTouchesRequired) have moved mostly in the allowed direction (direction) far enough to be considered a swipe. Swipes can be fast or slow: a slow swipe requires high directional precision over a short distance, while a fast swipe requires low directional precision over a long distance. You can determine where a swipe began by calling the UIGestureRecognizer methods locationInView: and locationOfTouch:inView:; the former gives the centroid if more than one touch is involved in the gesture, while the latter gives the location of a particular touch.

Commonly used attributes

@property(nonatomic) NSUInteger numberOfTouchesRequired API_UNAVAILABLE(tvOS);

Attribute description: The number of fingers that must touch the view to recognize the swipe gesture. The default value is 1.


@property(nonatomic) UISwipeGestureRecognizerDirection direction;

Description: The required direction of the swipe. The default is UISwipeGestureRecognizerDirectionRight. Multiple directions can be specified if they result in the same behavior (for example, UITableView swipe-to-delete).


UISwipeGestureRecognizerDirection provides the following options:

typedef NS_OPTIONS(NSUInteger, UISwipeGestureRecognizerDirection) {
    // The touch swipes right. This is the default.
    UISwipeGestureRecognizerDirectionRight = 1 << 0,
    // The touch swipes left.
    UISwipeGestureRecognizerDirectionLeft  = 1 << 1,
    // The touch swipes up.
    UISwipeGestureRecognizerDirectionUp    = 1 << 2,
    // The touch swipes down.
    UISwipeGestureRecognizerDirectionDown  = 1 << 3
};

Example code snippet:

@interface GestureRecognizerViewController () <UIGestureRecognizerDelegate>
// Image view used to test the swipe gesture
@property(nonatomic, strong) UIImageView *imageView;
// Image names
@property(nonatomic, copy) NSMutableArray *imageNameArray;
// Index of the current image
@property(nonatomic, assign) NSInteger currentIndex;
@end

@implementation GestureRecognizerViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.navigationItem.title = @"Gesture recognition";
    [self initScale];
    [self createUI];
}

- (void)initScale {
    // Initialize the image name array
    self.imageNameArray = [[NSMutableArray alloc] initWithObjects:@"tian_kong_long", @"yi_shen_long", @"ju_shen_bing", nil];
}

- (void)createUI {
    self.imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"tian_kong_long"]];
    self.imageView.frame = CGRectMake(0, CGRectGetMidY(self.view.frame) - 100, CGRectGetWidth(self.view.frame), 200);
    // Set the content display mode
    self.imageView.contentMode = UIViewContentModeScaleAspectFit;
    // Enable user interaction; the default for UIImageView is NO
    self.imageView.userInteractionEnabled = YES;
    // Add the UIImageView to the view
    [self.view addSubview:self.imageView];

    // Initialize the swipe gesture recognizer (right swipe; a separate recognizer is needed for each direction)
    UISwipeGestureRecognizer *rightSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeEvent:)];
    // Direction of the swipe
    rightSwipe.direction = UISwipeGestureRecognizerDirectionRight;
    // Number of fingers required for the swipe
    rightSwipe.numberOfTouchesRequired = 2;
    // Set the delegate
    rightSwipe.delegate = self;
    // Add the swipe gesture recognizer
    [self.imageView addGestureRecognizer:rightSwipe];

    // Initialize the swipe gesture recognizer (left swipe)
    UISwipeGestureRecognizer *leftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeEvent:)];
    // Direction of the swipe
    leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
    // Number of fingers required for the swipe
    leftSwipe.numberOfTouchesRequired = 2;
    // Set the delegate
    leftSwipe.delegate = self;
    // Add the swipe gesture recognizer
    [self.imageView addGestureRecognizer:leftSwipe];
}

/// Swipe event
- (void)swipeEvent:(UISwipeGestureRecognizer *)swipeGestureRecognizer {
    switch (swipeGestureRecognizer.direction) {
        case UISwipeGestureRecognizerDirectionRight: // right swipe
            [self transitionAnimation:0];
            break;
        case UISwipeGestureRecognizerDirectionLeft:  // left swipe
            [self transitionAnimation:1];
            break;
        default:
            break;
    }
}

/// Transition animation
- (void)transitionAnimation:(BOOL)isNext {
    [UIView animateWithDuration:1.0f animations:^{
        // Get the image view's layer
        CALayer *layer = self.imageView.layer;
        // 3D rotation effect
        CATransform3D rotationAndPerspectiveTransform = CATransform3DIdentity;
        // Perspective; values between 1.0 / -500 and 1.0 / -1000 usually look good
        rotationAndPerspectiveTransform.m34 = 1.0 / -500;
        if (isNext) { // left swipe
            // Rotate around the vector (x, y, z); if the vector has zero length, the behavior is undefined
            rotationAndPerspectiveTransform = CATransform3DRotate(rotationAndPerspectiveTransform, -M_E, 0.0f, 1.0f, 0.0f);
        } else {      // right swipe
            rotationAndPerspectiveTransform = CATransform3DRotate(rotationAndPerspectiveTransform, M_E, 0.0f, 1.0f, 0.0f);
        }
        // Apply the transform to the layer
        layer.transform = rotationAndPerspectiveTransform;
    } completion:^(BOOL finished) {
        // Switch the image
        self.imageView.image = [self getImage:isNext];
        // Reset the layer's transform after switching; otherwise the transform would affect subsequent swipes
        self.imageView.layer.transform = CATransform3DIdentity;
    }];
}

/// Get the next or previous image
- (UIImage *)getImage:(BOOL)isNext {
    if (isNext) {
        // Left swipe: next image
        self.currentIndex = (self.currentIndex + 1) % self.imageNameArray.count;
    } else {
        // Right swipe: previous image
        self.currentIndex = (self.currentIndex - 1 + self.imageNameArray.count) % self.imageNameArray.count;
    }
    NSString *imageName = self.imageNameArray[self.currentIndex];
    return [UIImage imageNamed:imageName];
}

@end

The effect is shown below. (To test a two-finger swipe in the simulator, hold the Option key to bring up two touch points, adjust their position, hold Shift to lock them in place, then drag the mouse.)

UIPanGestureRecognizer — Panning gesture recognizer

A concrete subclass of UIGestureRecognizer that looks for pan (drag) gestures. The user must press one or more fingers on a view while panning it. A client implementing the gesture recognizer's action method can query the gesture's current translation and velocity.

A pan gesture is continuous. It begins (UIGestureRecognizerStateBegan) when the minimum number of fingers allowed (minimumNumberOfTouches) has moved far enough to be considered a pan. It changes (UIGestureRecognizerStateChanged) as the fingers move while at least the minimum number of fingers remains pressed. It ends (UIGestureRecognizerStateEnded) when all fingers are lifted.

A client of this class can query the UIPanGestureRecognizer object in its action method for the current translation of the gesture (translationInView:) and its velocity (velocityInView:). The client can specify the view whose coordinate system should be used for the translation and velocity values, and can also reset the translation to a desired value.
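As a hedged sketch of that query pattern (the handler name `panEvent:` and the idea of dragging the recognizer's own view are illustrative assumptions, not from the original post), a typical action method applies the total translation and then resets it, so each callback effectively delivers a delta:

```objectivec
/// Sketch: move a view to follow the finger during a pan gesture.
- (void)panEvent:(UIPanGestureRecognizer *)pan {
    // Total translation since the gesture began, in the superview's coordinate system
    CGPoint translation = [pan translationInView:pan.view.superview];
    // Move the view by the translation, then reset the translation to zero so
    // the next callback reports only the movement since this one
    CGPoint center = pan.view.center;
    pan.view.center = CGPointMake(center.x + translation.x, center.y + translation.y);
    [pan setTranslation:CGPointZero inView:pan.view.superview];

    if (pan.state == UIGestureRecognizerStateEnded) {
        // Velocity in points per second; useful for momentum or flick effects
        CGPoint velocity = [pan velocityInView:pan.view.superview];
        NSLog(@"final velocity: %@", NSStringFromCGPoint(velocity));
    }
}
```

Resetting the translation this way is the usual alternative to accumulating the reported value yourself.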

Commonly used attributes

@property (nonatomic) NSUInteger minimumNumberOfTouches API_UNAVAILABLE(tvOS);

Property description: The minimum number of fingers that must touch the view for this gesture to be recognized. The default value is 1.


@property (nonatomic) NSUInteger maximumNumberOfTouches API_UNAVAILABLE(tvOS);

Property description: The maximum number of fingers that can touch the view for this gesture to be recognized. The default value is NSUIntegerMax.


Commonly used functions

– (CGPoint)translationInView:(nullable UIView *)view;

Function description: The translation of the pan gesture in the coordinate system of the specified view. The x and y values report the total translation over time, not incremental deltas since the last time the handler was called, so do not accumulate this value yourself on each call.

Parameters:

view: The view in whose coordinate system the translation of the pan gesture should be computed. If you want to adjust a view's position to keep it under the user's finger, request the translation in the coordinate system of that view's superview.

Return value: A point identifying the new location of a view in the coordinate system of its designated superview.


– (void)setTranslation:(CGPoint)translation inView:(nullable UIView *)view;

Function description: Sets the translation value in the coordinate system of the specified view. Changing the translation value resets the velocity of the pan.

Parameters:

translation: A point that identifies the new translation value.

view: The view in whose coordinate system the translation is specified.


– (CGPoint)velocityInView:(nullable UIView *)view;

Function description: The velocity of the pan gesture in the coordinate system of the specified view.

Parameters:

view: The view in whose coordinate system the velocity of the pan gesture is computed.

Return value: The velocity of the pan gesture, expressed in points per second, with a horizontal component and a vertical component.


Practice code snippet (interactive slide-to-go-back for a view controller):

```objectivec
@interface GestureRecognizerViewController () <UIGestureRecognizerDelegate>
@end

@implementation GestureRecognizerViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.navigationItem.title = @"Gesture recognition";
    [self createUI];
    // UINavigationController+FDFullscreenPopGesture: keep the navigation bar visible
    self.fd_prefersNavigationBarHidden = NO;
}

- (void)createUI {
    // Initialize the test button
    UIButton *testCodeButton = [UIButton buttonWithType:UIButtonTypeCustom];
    testCodeButton.frame = CGRectMake(CGRectGetMaxX(self.view.frame) / 2 - 75 / 2, CGRectGetMinY(self.view.frame) + HEAD_BAR_HEIGHT, 75, 30);
    // Background color of the button
    testCodeButton.backgroundColor = [UIColor blueColor];
    // Title font size
    testCodeButton.titleLabel.font = [UIFont systemFontOfSize:15];
    // Button title
    [testCodeButton setTitle:@"Test code" forState:UIControlStateNormal];
    // Title color
    [testCodeButton setTitleColor:[UIColor whiteColor] forState:UIControlStateNormal];
    // Add the click handler
    [testCodeButton addTarget:self action:@selector(testCode) forControlEvents:UIControlEventTouchUpInside];
    // Add the button to the view
    [self.view addSubview:testCodeButton];
}

/// Test button click handler
- (void)testCode {
    // Show a hint
    [self.view makeToast:@"You can now slide back" duration:1 position:CSToastPositionTop];
    // Initialize the pan gesture recognizer
    UIPanGestureRecognizer *panGestureRecognizer = [[UIPanGestureRecognizer alloc] init];
    // Maximum number of fingers that may touch the view for this gesture
    panGestureRecognizer.maximumNumberOfTouches = 1;
    // Attach the pan gesture to the view of the built-in gesture recognizer that
    // pops the top view controller off the navigation stack
    [self.navigationController.interactivePopGestureRecognizer.view addGestureRecognizer:panGestureRecognizer];
    // Use KVC to get the targets array of the built-in gesture
    NSArray *internalTargets = [self.navigationController.interactivePopGestureRecognizer valueForKey:@"targets"];
    // Get the internal target object of that gesture
    id internalTarget = [internalTargets.firstObject valueForKey:@"target"];
    // Get the selector of the internal handleNavigationTransition: method
    SEL internalAction = NSSelectorFromString(@"handleNavigationTransition:");
    // Set the delegate
    panGestureRecognizer.delegate = self;
    // Add the internal target-action pair to our gesture
    [panGestureRecognizer addTarget:internalTarget action:internalAction];
}

// The built-in pop gesture is disabled because FDFullscreenPopGesture is used.
// UINavigationController+FDFullscreenPopGesture: disable its interactive pop gesture
- (BOOL)fd_interactivePopDisabled {
    return YES;
}

/// Ask the delegate whether the gesture recognizer should begin interpreting touches.
/// This prevents the transition animation from freezing.
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    // Do not allow the gesture when the current controller is the root controller
    if (self.navigationController.viewControllers.count <= 1) {
        return NO;
    }
    // Do not allow the gesture while a push or pop animation is in flight (private property)
    if ([[self.navigationController valueForKey:@"_isTransitioning"] boolValue]) {
        return NO;
    }
    return YES;
}
@end
```

The effect is as follows:

UIScreenEdgePanGestureRecognizer – Screen-edge pan gesture recognizer

A gesture recognizer that looks for pan (drag) gestures that start near an edge of the screen. In some cases, the system uses screen-edge gestures to initiate view controller transitions. You can use this class to replicate the same gesture behavior for your own actions.

After creating the screen edge panning gesture recognizer, assign an appropriate value to the edges property before attaching the gesture recognizer to the view. Use this property to specify which side a gesture can start from. The gesture recognizer ignores any touch other than the first one.
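That setup order can be sketched as follows (the handler name `edgePanEvent:` is an assumption for illustration, not from the original post):

```objectivec
// Sketch: create the recognizer, configure `edges`, then attach it.
UIScreenEdgePanGestureRecognizer *edgePan =
    [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(edgePanEvent:)];
// Gesture may start only from the left edge of the screen
edgePan.edges = UIRectEdgeLeft;
// Attach after `edges` has been configured
[self.view addGestureRecognizer:edgePan];
```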

Commonly used attributes

@property (readwrite, nonatomic, assign) UIRectEdge edges;

Property description: The edges from which a gesture may start. The specified edges are always relative to the app's current interface orientation. This behavior ensures that the gesture always starts from the same place in the user interface, regardless of the device's current orientation.


UIRectEdge provides the following values (in my own testing, the screen-edge pan gesture recognizer only responds to the left and right edges):

```objectivec
typedef NS_OPTIONS(NSUInteger, UIRectEdge) {
    // No edges
    UIRectEdgeNone   = 0,
    // The top edge of the rectangle
    UIRectEdgeTop    = 1 << 0,
    // The left edge of the rectangle
    UIRectEdgeLeft   = 1 << 1,
    // The bottom edge of the rectangle
    UIRectEdgeBottom = 1 << 2,
    // The right edge of the rectangle
    UIRectEdgeRight  = 1 << 3,
    // All edges of the rectangle
    UIRectEdgeAll    = UIRectEdgeTop | UIRectEdgeLeft | UIRectEdgeBottom | UIRectEdgeRight
} API_AVAILABLE(ios(7.0));
```

Practice code snippet:

```objectivec
@interface TestCodeController () <UIGestureRecognizerDelegate>
@end

@implementation TestCodeController

- (void)viewDidLoad {
    [super viewDidLoad];
    // A thin image strip anchored to the right edge of the screen
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(CGRectGetMaxX(self.view.frame) - 10, 0, 10, 200)];
    imageView.userInteractionEnabled = YES;
    imageView.image = [UIImage imageNamed:@"universe"];
    [self.view addSubview:imageView];

    // Initialize the screen-edge pan gesture recognizer
    UIScreenEdgePanGestureRecognizer *screenEdgePanGestureRecognizer = [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(screenEdgeEvent:)];
    // Set the edge from which the gesture may start
    screenEdgePanGestureRecognizer.edges = UIRectEdgeRight;
    // Attach the screen-edge pan gesture recognizer
    [imageView addGestureRecognizer:screenEdgePanGestureRecognizer];
}

/// Screen-edge pan event handler
- (void)screenEdgeEvent:(UIPanGestureRecognizer *)gestureRecognizer {
    UIImageView *imageView = (UIImageView *)gestureRecognizer.view;
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan ||
        gestureRecognizer.state == UIGestureRecognizerStateChanged) {
        [UIView animateWithDuration:0.5 animations:^{
            imageView.frame = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.width, 200);
        }];
    }
}
@end
```

The effect is as follows:

UILongPressGestureRecognizer – Long-press gesture recognizer

Commonly used attributes

@property (nonatomic) NSUInteger numberOfTapsRequired;

Property description: The number of taps on the view required before the gesture can be recognized. The default is 0. (For example, with a value of 1 the user must tap once and then press and hold; with the default of 0 the press is recognized directly.)


@property (nonatomic) NSUInteger numberOfTouchesRequired API_UNAVAILABLE(tvOS);

Property description: The number of fingers that must press on the view for the gesture to be recognized. The default is 1 finger.


@property (nonatomic) NSTimeInterval minimumPressDuration;

Property description: The minimum time that the fingers must press on the view for the gesture to be recognized. The interval is in seconds. The default duration is 0.5 seconds.


@property (nonatomic) CGFloat allowableMovement;

Property description: The maximum movement of the fingers on the view before the gesture fails. The allowable distance is measured in points. The default distance is 10 points.

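Because a long press is a continuous gesture, an action method that takes the recognizer as a parameter fires for each state change (began, changed, ended). A hedged sketch (the handler name is an assumption for illustration) that checks the state to avoid reacting more than once per press:

```objectivec
/// Sketch: react once when the press is recognized, once when the finger lifts.
- (void)longPressEvent:(UILongPressGestureRecognizer *)longPress {
    switch (longPress.state) {
        case UIGestureRecognizerStateBegan:
            // Fires once, after minimumPressDuration has elapsed
            NSLog(@"long press recognized");
            break;
        case UIGestureRecognizerStateChanged:
            // Finger moved while staying within allowableMovement
            break;
        case UIGestureRecognizerStateEnded:
            NSLog(@"finger lifted");
            break;
        default:
            break;
    }
}
```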

Practice code snippet:

```objectivec
@interface TestCodeController () <UIGestureRecognizerDelegate>
@end

@implementation TestCodeController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Initialize the long-press gesture recognizer
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPressEvent)];
    // Minimum press duration, in seconds
    longPress.minimumPressDuration = 1.0;

    // A red test view in the middle of the screen
    UIView *testView = [[UIView alloc] initWithFrame:CGRectMake(CGRectGetMaxX(self.view.frame) / 2 - 100 / 2, CGRectGetMidY(self.view.frame) - 50, 100, 100)];
    testView.backgroundColor = [UIColor redColor];
    testView.userInteractionEnabled = YES;
    [self.view addSubview:testView];
    [testView addGestureRecognizer:longPress];
}

/// Long-press event handler
- (void)longPressEvent {
    UIAlertController *alertController = [UIAlertController alertControllerWithTitle:@"Alert"
                                                                             message:@"Long-press response"
                                                                      preferredStyle:UIAlertControllerStyleAlert];
    UIAlertAction *yesAction = [UIAlertAction actionWithTitle:@"YES"
                                                        style:UIAlertActionStyleCancel
                                                      handler:^(UIAlertAction *action) {
        NSLog(@"Top YES Button");
    }];
    [alertController addAction:yesAction];
    [self presentViewController:alertController animated:YES completion:nil];
}
@end
```

The effect is as follows: