Non-nested sliding

The Android system always dispatches touch events from the parent layout down to the topmost child View, which can limit the complexity of the interactions we can design.

TouchEventBus addresses non-nested sliding conflicts, such as multiple Fragments at the same level all handling touch events: the touch event first reaches the onTouch method of the topmost Fragment, and each layer in turn decides whether to consume it; if none of them do, the event finally reaches the bottom Fragment. These layers are not nested and have no parent-child relationship, which means adjusting event distribution through onInterceptTouchEvent() or requestDisallowInterceptTouchEvent() is impossible.
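The layer-by-layer consumption described above can be sketched as a simple dispatch loop. This is only an illustration of the idea, not the library's actual internals, and all names in it are made up:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal sketch of layer-by-layer dispatch: handlers are tried top-down,
 * and the first one that consumes the event stops propagation.
 */
class TouchBusSketch {

    interface Handler {
        /** @return true if the event is consumed */
        boolean onTouch(String event);
    }

    private final List<Handler> handlers = new ArrayList<>(); // topmost first

    void add(Handler h) {
        handlers.add(h);
    }

    /** Returns the index of the handler that consumed the event, or -1. */
    int dispatch(String event) {
        for (int i = 0; i < handlers.size(); i++) {
            if (handlers.get(i).onTouch(event)) {
                return i; // consumed: handlers below never see the event
            }
        }
        return -1;
    }
}
```

The key property is that a handler returning false passes the same event on to the next layer, mirroring how the bottom Fragment only sees events that every layer above declined.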

Touch events between sibling Views

The following is the preview page of the YY mobile app:

There is a lot of touch-event handling on this page, including but not limited to:

  • Tapping the screen triggers camera focus (where the yellow box appears)
  • Pinching with two fingers triggers camera zoom
  • Swiping left or right switches the ViewPager, toggling between the Live and Play tabs
  • The list on the “Play Games” tab can be swiped
  • Controls on the Live tab can be clicked (the Start button, add pictures…)
  • Because the preview page and the broadcast page share the same Activity, this Activity also hosts many other Fragments; for example, there is touch handling on the public chat screen, and so on

The hierarchy of the View Tree and the level of touch processing can be determined visually:

On the left is the UI hierarchy, with the button controls and the ViewPager at the top and the Fragment for the video stream at the bottom. On the right is the touch-processing hierarchy. Two-finger zoom, View clicks and focus clicks must come before the ViewPager, otherwise the ViewPager will consume the events; yet in the UI the ViewPager sits above the video Fragment. This is the core contradiction of non-nested sliding conflicts:

The level of business logic is not consistent with the UI level that the user sees

Redistribution of touch events

There are many Fragments in the YY mobile live room, and because of the plug-in architecture, each business plug-in can dynamically add or remove its own Fragments in the room. These Fragments sit at the same level, are not nested, and carry relatively independent business logic, yet each may need to handle events such as clicks and swipes. Because business scenarios are complex, the top-to-bottom order of the Fragments can change dynamically, so when the business template changes it is easy for some Fragments to miss touch events or to have them consumed by other businesses.

TouchEventBus is used to redistribute touch events in this scenario, and we can decide the hierarchical order of business logic at will.

Each gesture is handled by a TouchEventHandler: CameraZoomHandler for zoom, CameraClickHandler for focus, and PreviewSlideHandler for the ViewPager slide. These handlers can be reordered to pass the MotionEvent along as the business requires. Each TouchEventHandler then corresponds to a piece of UI: bind or unbind it with the handler's attach/dettach methods. The UI can be a concrete Fragment or an abstract interface, i.e. any business that responds to touch events.

For example, for the focus-click handling on the broadcast preview page, first define the UI interface:

public interface CameraClickView {
    /**
     * Displays a yellow rectangular focus box centered at the specified position
     *
     * @param x the x coordinate touched by the finger
     * @param y the y coordinate touched by the finger
     */
    void showVideoClickFocus(float x, float y);

    /**
     * Passes the touch event to the VideoSdk to focus the camera at the specified coordinates
     *
     * @param e the touch event
     */
    void onTouch(MotionEvent e);
}

Then there’s the definition of TouchEventHandler:

public class CameraClickHandler extends TouchEventHandler<CameraClickView> {

    private boolean performClick = false;
    // ...

    @Override
    public boolean onTouch(@NonNull CameraClickView v, MotionEvent e, boolean hasBeenIntercepted) {
        super.onTouch(v, e, hasBeenIntercepted);
        if (!isCameraFocusEnable()) { // Camera focusing is disabled for some special services
            return false;
        }
        // Use the MotionEvent to decide whether performClick should be true
        switch (e.getAction()) {
            case MotionEvent.ACTION_DOWN:
                // ...
                break;
            case MotionEvent.ACTION_MOVE:
                // ...
                break;
            case MotionEvent.ACTION_UP:
                // ...
                break;
            default:
                break;
        }

        if (performClick) { // Call the UI interface as a click action
            v.showVideoClickFocus(e.getRawX(), e.getRawY());
            v.onTouch(e);
        }
        return performClick; // Consume the touch event when it is a click
    }
}
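The elided performClick logic typically amounts to a tap check: the finger must stay within a small slop radius between ACTION_DOWN and ACTION_UP. Below is a minimal sketch of that check using plain coordinates instead of a MotionEvent; the class and the slop value are illustrative (real code would read the slop from ViewConfiguration):

```java
/**
 * Sketch of tap detection: a gesture counts as a click only if the finger
 * stays within a small slop radius between down and up.
 */
class ClickDetector {

    private static final float TOUCH_SLOP = 16f; // px; illustrative value

    private float downX, downY;
    private boolean withinSlop;

    void onDown(float x, float y) {
        downX = x;
        downY = y;
        withinSlop = true;
    }

    void onMove(float x, float y) {
        float dx = x - downX, dy = y - downY;
        if (dx * dx + dy * dy > TOUCH_SLOP * TOUCH_SLOP) {
            withinSlop = false; // moved too far: not a click
        }
    }

    /** @return true if the up event completes a click */
    boolean onUp(float x, float y) {
        onMove(x, y);
        return withinSlop;
    }
}
```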

Finally, bind the TouchEventHandler to its UI:

public class MobileLiveVideoComponent extends Fragment implements CameraClickView {

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        // ...
        // Bind the CameraClickHandler to the current Fragment
        TouchEventBus.of(CameraClickHandler.class).attach(this);
    }

    @Override
    public void onDestroyView() {
        // ...
        // Unbind the CameraClickHandler from the current Fragment
        TouchEventBus.of(CameraClickHandler.class).dettach(this);
    }

    @Override
    public void showVideoClickFocus(float x, float y) {
        // todo: display the yellow focus box UI
    }

    @Override
    public void onTouch(MotionEvent e) {
        // todo: call the camera focus of the SDK
    }
}

When the user gestures on the UI, the MotionEvent is distributed along the handler sequence in TouchEventBus. If no other handler consumes the event before CameraClickHandler, it can process the event in its onTouch method and then respond in the UI.

The distribution order of events

A distribution order needs to be defined among multiple TouchEventHandlers, so that a handler that receives the touch event earlier can intercept the handlers after it. It is hard to fix an absolute distribution route, because the Fragment hierarchy changes as the live-broadcast template changes, so TouchEventBus uses a relative ordering: each handler declares which other handlers it should precede. For example, to rank CameraClickHandler before several other handlers:

public class CameraClickHandler extends AbstractTouchEventHandler<CameraClickView> {
    // ...

    @Override
    public boolean onTouch(@NonNull CameraClickView v, MotionEvent e, boolean hasBeenIntercepted) {
        // ...
    }

    /**
     * Defines which handlers need to come after me
     */
    @Override
    protected void defineNextHandlers(@NonNull List<Class<? extends AbstractTouchEventHandler<?>>> handlers) {
        // The following handlers will come after CameraClickHandler,
        // but the order among themselves is undefined
        handlers.add(CameraZoomHandler.class);
        handlers.add(MediaMultiTouchHandler.class);
        handlers.add(PreviewSlideHandler.class);
        handlers.add(VideoControlTouchEventHandler.class);
    }
}

Each Handler specifies the handlers that come after it, forming a directed graph, and a distribution path can then be obtained dynamically through topological sorting. In the following figure, an arrow “A->B” indicates that A must be placed before B:
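The ordering can be derived with Kahn's topological sort. The following is a minimal sketch under the assumption that each edge "A -> B" means A must be dispatched before B; it is not the library's actual implementation, and handler names are plain strings here for illustration:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Sketch of deriving a dispatch order from "A before B" edges
 * via Kahn's topological sort.
 */
class HandlerOrder {

    /** edges: the key must be dispatched before every handler in its value list */
    static List<String> sort(Map<String, List<String>> edges) {
        // Count incoming edges for every node in the graph
        Map<String, Integer> inDegree = new HashMap<>();
        for (String node : edges.keySet()) inDegree.putIfAbsent(node, 0);
        for (List<String> nexts : edges.values())
            for (String n : nexts) inDegree.merge(n, 1, Integer::sum);

        // Nodes with no predecessors are ready to be dispatched first
        Deque<String> ready = new ArrayDeque<>();
        for (Map.Entry<String, Integer> e : inDegree.entrySet())
            if (e.getValue() == 0) ready.add(e.getKey());

        List<String> order = new ArrayList<>();
        while (!ready.isEmpty()) {
            String node = ready.poll();
            order.add(node);
            for (String next : edges.getOrDefault(node, List.of()))
                if (inDegree.merge(next, -1, Integer::sum) == 0)
                    ready.add(next);
        }
        if (order.size() != inDegree.size())
            throw new IllegalStateException("Cycle in handler order");
        return order;
    }
}
```

A cycle (two handlers each claiming to precede the other) has no valid order, which is why the sketch throws in that case.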

Any Handler can be dynamically added to or removed from this diagram during live broadcast template switching without affecting the normal operation of other services.

Nested Views use the Android system's own touch distribution

TouchEventBus is only needed at the non-nested Fragment level; inside each Fragment, Android's default touch distribution applies. The red arrows show TouchEventBus distribution, which calls the handlers layer by layer in topological order. The blue arrows show distribution inside a Fragment's ViewTree, which follows the Android system's order: from the parent layout down to the child Views, with each layer deciding whether to consume the event.

Usage example

Run the TouchSample module of this project for a simple demo of TouchEventBus.

  • Swipe left and right with one finger to switch tabs
  • Scale the “Tab%_subTab%” text box in the middle with two fingers
  • Swipe left and right to switch between background images
  • Slide the left side of the screen to pull out the side panel

UI hierarchy: Activity -> Background -> Side panel -> Tabs -> Text box

Touch processing order: Side panel -> Text zoom -> Background slide -> Bottom navigation click -> Tab slide

Another trick here is that the bottom navigation click does not consume touch events, so swiping left or right in the bottom navigation area switches the primary Tab, while swiping left or right in the background area switches the secondary Tab.
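This fall-through behaviour can be illustrated with a toy version of the bus: a handler that reacts to an event but returns false lets the same event continue to the handlers below it. All names here are made up:

```java
import java.util.ArrayList;
import java.util.List;

/** A handler reacts to the event and reports whether it consumed it. */
interface MiniHandler {
    boolean onTouch(String event, List<String> log);
}

/**
 * Toy dispatch loop: handlers run top-down until one consumes the event,
 * so a non-consuming handler can react AND let lower handlers react too.
 */
class FallThroughDemo {
    static List<String> run(String event, List<MiniHandler> handlers) {
        List<String> log = new ArrayList<>();
        for (MiniHandler h : handlers) {
            if (h.onTouch(event, log)) break;
        }
        return log;
    }
}
```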

Configuration

  1. Add the repository address to the project build.gradle

    allprojects {
        repositories {
            maven { url 'https://jitpack.io' }
        }
    }
  2. Add dependencies for corresponding modules

    dependencies {
        compile 'com.github.YvesCheung.TouchEventBus:TouchEventBus:1.4.3'
    }

The project address

Github.com/YvesCheung/…