Demand analysis

Recently, while collaborating with app client engineers on H5 development, we often needed to encapsulate common mobile events, such as tap, double-tap, long-press, and drag events, into interfaces for the client. However, browsers natively provide only three touch events: touchstart, touchmove, and touchend. In practice, the common solution is to implement custom mobile events by listening to touchstart and touchend together with a timer. To make these common custom events reusable, we encapsulated them into user-friendly utility functions; that was the original motivation for implementing mt-events.
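To make the timer-based idea concrete, here is a minimal, framework-free sketch of tap vs. long-press detection. The 500ms threshold and all names here are our own illustration, not mt-events' actual implementation:

```javascript
// Record when touchstart fires, then decide on touchend whether the
// gesture was a tap or a long press. 500ms is an arbitrary threshold.
const LONG_PRESS_MS = 500

function createGestureDetector (longPressMs = LONG_PRESS_MS) {
  let startTime = null
  return {
    // call this from a touchstart listener
    touchstart (timestamp) {
      startTime = timestamp
    },
    // call this from a touchend listener; returns the detected gesture
    touchend (timestamp) {
      if (startTime === null) return null
      const duration = timestamp - startTime
      startTime = null
      return duration >= longPressMs ? 'longtap' : 'tap'
    }
  }
}

const detector = createGestureDetector()
detector.touchstart(0)
console.log(detector.touchend(600)) // 'longtap'
```

Wired up, `el.addEventListener('touchstart', e => detector.touchstart(e.timeStamp))` plus the matching touchend listener would drive it; mt-events packages this kind of logic behind a friendlier API.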

mt-events stands for Mobile Terminal Events. Our initial positioning for this library was to encapsulate common mobile events, such as double-tap, long-press, and swipe events, to make mobile development easier. Later, as the project iterated, mt-events evolved toward a general front-end event binding tool. Because we integrated event delegation, you can use mt-events much like jQuery's on method, which makes event binding and delegation more convenient and makes custom mobile events as friendly as native ones. This is our project's GitHub address: github.com/jerryOnlyZR…

Next, we will walk you through the process of building a tool library, the roles that ES6's new features Map, Proxy, Reflect, and WeakMap play in it, and the appeal of our open source tool library mt-events.


mt-events

Let’s take a look at the features of the MT-Events toolkit:

  • Generality: encapsulates commonly used mobile events:
    • Tap
    • Double tap
    • Long press
    • Swipe
    • Drag
    • Pinch (zoom)
    • Rotate
  • Convenience: the utility function is mounted globally, so binding an event is as smooth as $.on(); event delegation is encapsulated, and you simply pass in a selector for the delegated element, just as in jQuery.
  • New syntax: Map is used to map user callbacks to event callbacks, making event removal easy; WeakMap establishes weak references between DOM elements and callbacks to prevent memory leaks.
  • Lightweight: about 2KB after code minification + gzip.

So how do we use it? There are two ways to reference the library; the most common, of course, is a script tag in HTML:

<script src="http://mtevents.jerryonlyzrj.com/mtevents.min.js"></script>

Our utility function mtEvents will then be mounted on the window object. You can type mtEvents in your browser's developer tools console and execute it; if the following text is printed, you have successfully imported our tool library:

Or, if you develop with a front-end framework such as Vue, you can also import our tools as an npm dependency, and our tool library will follow your Vue files into the bundle.

First, install our library as a project dependency:

npm i mt-events --save

It can then be used directly in our .vue and other files:

//test.vue
<script>
const mtEvents = require('mt-events')
export default {
    ...,
    mounted(){
        mtEvents('#bindTarget', 'click', e => console.log('click'))
    }
}
</script>

You can refer to our GitHub user documentation for specific usage.

How to build an open source library of our own

Choose an appropriate testing tool

Untested code is not persuasive. If you browse someone else's open source library, you will find a test folder in the root directory containing the project's test files. For a tool library in particular, testing is an indispensable step.

There is a wide variety of testing tools on the market, such as Jest, Karma, Mocha, and Tape, and there is no need to limit yourself to any single one. Here we compare these frameworks.

  • Jest
    • Facebook’s open source JS unit testing framework
    • Integrates JSDOM; since mt-events mainly targets mobile, JSDOM lets us better simulate mobile events
    • The integrated Istanbul coverage tool generates a coverage directory under the project with an elegant test coverage report, so coverage is clearly visible
    • Out of the box, with very little configuration: install via npm and run; the output is clear and easy to read
    • Tests multiple files in parallel, so it runs fast in large projects
    • Jasmine syntax is built in, and many new features have been added
    • Built-in Auto Mock, with mock API
    • Assertion and emulation are supported without the need to introduce a third party assertion library
    • Tests in isolation, supports snapshot tests
    • Used mostly in React projects (but widely supported)
    • Relatively new, the community is not very mature
  • Karma
    • Google Angular team’s open source JavaScript test execution process management tool
    • Simple and convenient configuration
    • Powerful adapters: Jasmine, Mocha, and other unit testing frameworks can be configured on top of Karma
    • Provides a real browser environment; can be configured to run in Chrome, Firefox, and other browsers
    • Developers can control the entire test automation process, e.g. running all test cases whenever we edit and save
    • High scalability, support plug-in development
    • Support CI services
    • Fast execution
    • Supports remote control and debugging
  • Mocha
    • The cost of learning is high, but with it comes greater flexibility and scalability
    • The community is mature, you can find a variety of plugins or extensions for specific scenarios on the community
    • It requires a lot of configuration and is relatively troublesome to configure
    • Not highly integrated by itself; it needs third-party libraries (usually Enzyme and Chai) for assertions, mocks, and spies
    • A global test structure is created by default
    • Terminal display friendly
    • The most widely used library today
  • Tape
    • Developers simply need to execute a JS script with Node and call the API directly
    • Most compact, minimal in size, providing simple structure and assertions
    • Only the lowest level, most basic API is provided
    • Global variables are not defined, and developers can change the test code at will
    • No CLI or client environment is required; any environment that can run JS can run Tape

To sum up: Jest works out of the box; Karma is recommended if you need a framework to get up and running quickly on a large project; Mocha has the most users and the most mature community, and is flexible, configurable, and easy to extend; Tape is the most streamlined, providing only the most basic, lowest-level API.

Here’s an example of how to use Jest:

  • Install Jest
$ npm i jest -D
  • Add a configuration file:
// jest.config.js (configures test case paths for Jest)
module.exports = {
    testURL: 'http://localhost',
    testMatch: ['<rootDir>/test/*.js'],           // test file matching path (all JS files in the test folder)
    coverageDirectory: '<rootDir>/test/coverage', // test coverage report output path
    coverageThreshold: {                          // test coverage pass thresholds
        global: {
            branches: 90,
            functions: 90,
            lines: 90,
            statements: 90
        }
    }
}
  • Configure package test scripts
// package.json
{
    ...,
    "scripts": {
        ...,
        "test": "jest"
    }
}
  • Write test cases
describe('test dbtap events', () => {  // test group
    test('test 1+1', () => {           // test case
        expect(1 + 1).toBe(2)          // assertion
    })
})
  • Perform the test
$ npm t
  • Results output

This is the basic flow of configuring tests.

Use ESLint to standardize team code

In team development, code maintenance often takes more time than developing new features. It is therefore important to establish code conventions that the whole team follows: they not only avoid basic syntax errors to a large extent, but also keep the code readable and easy to maintain.

Programs are written for people to read, and only occasionally for computers to execute. –Donald Knuth

As we all know, ESLint is an open source JavaScript code checking tool. It can be used to validate our code and to define a specification that team members develop against, ensuring the code stays standardized. Using ESLint brings many benefits: it helps us avoid low-level mistakes, such as small syntax issues that could otherwise take a long time to track down, and in teamwork it ensures everyone develops in the same style, making each other's code easier to understand and improving development efficiency.

In addition, ESLint was originally designed to let developers create their own code checking rules, so it is extensible enough to detect problems during coding. For ease of use, ESLint also ships with built-in rules, and you can add custom rules on top of them. To generate an initial configuration, run:

eslint --init  
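After answering its prompts, `eslint --init` writes a configuration file. A minimal `.eslintrc.js` might look like the following; the specific rules here are an illustrative assumption, not mt-events' actual configuration:

```javascript
// .eslintrc.js (illustrative sketch, not mt-events' real config)
module.exports = {
  env: {
    browser: true,   // allow window, document, etc.
    es6: true
  },
  extends: 'eslint:recommended',
  parserOptions: {
    ecmaVersion: 2018,
    sourceType: 'module'
  },
  rules: {
    'no-unused-vars': 'warn',  // flag variables declared but never used
    semi: ['error', 'never']   // match a no-semicolon style
  }
}
```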

Choose the build tool you are most familiar with

During development, we often use new syntax features such as ES6 to make our work easier, or ES6 modules to organize our modular code. However, some new features are not yet supported by Node.js or browsers, so we need to compile and package the development code. To push automation further, there are many good build tools to choose from, such as the front-end giant Webpack, the streaming build tool Gulp, or Rollup with its tree-shaking feature. Each build tool has its own advantages, and we can choose the one best suited to our business needs. What build tools do is automate a series of complex operations and ultimately convert source code into executable JavaScript, CSS, and HTML. Let's compare these tools.

  • Grunt
    • Grunt has a large number of reusable plug-ins that encapsulate common build tasks
    • Flexibility is high
    • Not highly integrated; configuration is cumbersome; not out of the box
    • It’s an evolution of Npm Scripts
    • The pattern is similar to Webpack, and as Webpack continues to iterate and improve its functionality, it can be prioritized to use Webpack
  • Gulp
    • A flow-based automated build tool
    • You can manage and execute tasks
    • You can monitor file changes and read and write files, streaming processing tasks
    • It can be used with other tools
    • Not highly integrated; configuration is cumbersome; not out of the box
  • Webpack
    • A packaged modular JavaScript tool
    • Convert files by loader, inject hooks by Plugin, and finally output files composed of multiple modules.
    • Focused on modular projects; not suitable for non-modular projects
    • Rich and complete, but also can be extended by Plugin
    • Out of the box, the development experience is good
    • The community is mature and active, and you can find plug-in extensions for various special scenarios in the community
  • Rollup
    • A packaging tool similar to WebPack but focused on ES6 modules
    • Tree Shaking ES6 source code, removing code that is defined but not used
    • Performs scope hoisting on ES6 source code to reduce the output file size and improve runtime performance
    • Simple to configure and use, but not as complete as WebPack
    • The community ecosystem is not yet mature; solutions are hard to find for many special scenarios

For our mt-events project, we chose both Rollup and Webpack: after making the code "isomorphic" we needed to split the JS output by target, which takes advantage of Rollup's tree-shaking; and we use Webpack to handle the build that produces the minified min.js file for release.

Configure JSDoc to clear the way for later generations

Project maintenance is the key means of extending a project's life cycle. Reading other people's source code is laborious for everyone, especially when the original author is not around or cannot give you any guidance. Well-written comments are therefore a good habit to develop. To improve the maintainability of our code, we comment the trunk code thoroughly, and there is a tool on the market that automatically converts comments into an API document and generates a visual page. Sounds amazing.

This tool is called JSDoc. It generates API documentation for JavaScript applications, libraries, and modules based on the comment annotations in JavaScript files. What JSDoc analyzes in the source code are the Docblock-format comments we write; it intelligently generates beautiful API documentation pages from them, and all we need to do is run a single jsdoc command.
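To give a feel for the Docblock format JSDoc parses, here is a small hypothetical helper annotated the way JSDoc expects; running jsdoc over a file like this renders the tags as a typed parameter table:

```javascript
/**
 * Computes the distance between two touch points.
 * JSDoc reads the @param/@return tags below and renders them
 * in the generated API page. (This helper is illustrative only,
 * not part of mt-events.)
 * @param {{x: number, y: number}} a first touch point
 * @param {{x: number, y: number}} b second touch point
 * @return {number} Euclidean distance between a and b
 */
function touchDistance (a, b) {
  return Math.hypot(b.x - a.x, b.y - a.y)
}

console.log(touchDistance({ x: 0, y: 0 }, { x: 3, y: 4 })) // 5
```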

Here is the MT-Events API documentation page (beautiful, isn’t it? These are generated automatically by JSDoc) :

The simple style is quite pleasant, and if a later maintainer wants a quick look at the general architecture and functionality of your project, presenting a developer document like this is far better than just throwing a copy of the source code at them.

Lint and test submitted code using Git hooks

To ensure the online code is not polluted, we configured ESLint, so before each team member pushes code, lint and test must be run to guarantee the online code is clean and valid. But can this tedious work be automated?

The solution is to use git hooks to automate Lint and test:

  1. Install Git's hook management tool, husky:

    npm install -D husky
  2. Next, add our hook command to our project’s package.json:

    In the mt-events project, we execute lint on the commit hook and test on the push hook, as follows:

    {
        ...,
        "scripts": {
            ...,
            "precommit": "lint-staged",
            "prepush": "npm t"
        }
    }
  3. Optimisation: Lint-only changed files — Lint-staged:

    Before lint-staged, every lint run in a hook had to walk through all files, which was inefficient. lint-staged solves this with a diff-style feature: ESLint runs only on changed (staged) files, and you only need to add configuration to package.json. An mt-events example:

    First, install Lint-staged dependencies:

    npm install -D lint-staged

    Next, add the appropriate configuration to package.json:

    {
        ...,
        "lint-staged": {
            "core/*.js": [              // changed files that lint listens for
                "prettier --write",     // automatically format the code
                "eslint --fix",         // automatically fix code that violates the rules
                "git add"               // re-add the changes to the staging area
            ]
        }
    }

    With hooks, we should see output like this:

    • Commit hooks

    • Push the hook

Let continuous integration tools help you automate your deployment

Every time we generate the production files locally, we have to upload them to our server via scp or rsync. Manually executing these commands for every release definitely goes against our idea of engineering automation. To deploy automatically, we can let continuous integration tools take the project live for us.

There are plenty of mature continuous integration tools out there, but Travis CI and Jenkins are among the most highly regarded. As a fixture of GitHub, Travis CI is the go-to tool in the open source space, so if we version our project on GitHub, it is a great choice. The focus of this article is how Travis CI fits into our automation workflow.

Travis CI features:
  • Travis CI provides a continuous integration service that only supports Github and does not support other code hosting.
  • It needs to be tied to a project on Github, and it also needs to contain build or test scripts.
  • As soon as there is new code, it is automatically fetched; Travis then provides a virtual machine environment, runs the tests, completes the build, and deploys to the server.
  • Run builds and tests automatically whenever the code changes and feed back the results. After making sure that you are as expected, integrate the new code into the trunk.
  • Every time the code changes even slightly, the running result can be seen, so small changes accumulate continuously instead of a large chunk of code being merged at the end of the development cycle. This greatly improves the efficiency of developing the mt-events library: as soon as there is an update, users can pull the latest JS code. That is what continuous delivery means here.

In fact, Travis CI can be used in three simple steps, as shown in the picture on the homepage:

  1. Check the projects that you need continuous integration in the Travis CI dashboard
  2. Add a configuration file named .travis.yml to your project root directory
  3. Finally, all you have to do is push your code and wait

In fact, the difficulty lies in writing the .travis.yml configuration file and sorting out the specific continuous integration steps. First, here is our project's configuration file:

language: node_js               # a Node project is declared this way
node_js:
- 8.11.2                        # project environment
cache:                          # cache node_js dependencies to speed up the next build
  directories:
  - node_modules
before_install:                 # generated automatically when we encrypt the key; these two commands recover a usable key
- openssl aes-256-cbc -K $encrypted_81d1fc7fdfa5_key -iv $encrypted_81d1fc7fdfa5_iv
  -in mtevents_travis_key.enc -out mtevents_travis_key -d
- chmod 600 mtevents_travis_key
after_success:                  # custom actions after a successful build
- npm run codecov               # generate the Codecov badge on the GitHub home page
- scp -i mtevents_travis_key -P $DEPLOY_PORT -o stricthostkeychecking=no -r dist/mtevents.min.js
  $DEPLOY_USER@$DEPLOY_HOST:/usr/local/nginx/html   # scp the generated production file to the server

First, we update the open source project and push it. Travis listens for our push and automatically pulls the project code to a Travis virtual machine to execute the build process. That's the idea. In fact, we could also implement a simple continuous integration tool ourselves with Shelljs.
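To illustrate that last point, a homegrown deploy step boils down to composing and executing the same scp command Travis runs. The sketch below only builds the command string (all names and paths are hypothetical); a Shelljs script would then hand it to shell.exec:

```javascript
// Illustrative only: compose the scp command a minimal CI script would run.
// keyFile, host, and dest are made-up placeholders, not real deployment values.
function buildDeployCommand ({ keyFile, port, user, host, file, dest }) {
  return [
    'scp',
    '-i', keyFile,                       // identity file for auth
    '-P', String(port),                  // remote SSH port
    '-o', 'stricthostkeychecking=no',    // skip host-key prompt in CI
    file,                                // local build artifact
    `${user}@${host}:${dest}`            // remote destination
  ].join(' ')
}

const cmd = buildDeployCommand({
  keyFile: 'mtevents_travis_key',
  port: 22,
  user: 'deploy',
  host: 'example.com',
  file: 'dist/mtevents.min.js',
  dest: '/usr/local/nginx/html'
})
console.log(cmd)
// e.g. with shelljs: require('shelljs').exec(cmd)
```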

In CI for large projects such as websites and web apps, we usually use rsync instead of scp, because scp uploads all files, while rsync has a built-in diff feature and only "synchronizes" the changed files, which greatly improves CI efficiency. But since our library project produces only one min.js file, scp is sufficient.

Add an open source license to your project

Every open source project needs an appropriate open source license to inform everyone who browses the project what permissions they have. For specific license selection, please refer to this chart drawn by Ruan Yifeng:

So how do we add a license to our project? In fact, GitHub already provides us with a very convenient visual flow:

  1. Open our open source project and switch to the Insights panel
  2. Click on the Community TAB
  3. If no license has been added to your project, the Checklist will prompt you to Add License; click the Add button to enter the visual flow

Add some icons you like to decorate your project

When we spend a lot of energy to build and improve our project, we hope that more people will pay attention to and use our project. How can we present our project to others at this point? Adding a nice logo to your project is a great way to make it refreshing.

Open the mt-events README and you can see many beautiful little icons at the top. Many large projects use these small icons to decorate themselves; they not only convey some of the project's key information but also reflect its professionalism.

So how do we add such little icons to our own open source project? The site to use is shields.io, where you can choose your favorite badges to polish your project. Common badges include continuous integration status, code test coverage, project version information, download counts, open source license type, project language, and so on. Below, using our project as an example, is a simple walkthrough of how to generate them.

  • Continuous integration state

    • For continuous integration we use Travis CI, as recommended in the previous section. Add a .travis.yml configuration file to your project that tells Travis CI how to compile or test it; the details were covered in the previous section.

    • Then the logo picture address is

      http://img.shields.io/travis/{GitHub username}/{project name}.svg

      Replace {GitHub username} and {project name} in the URL above with your own. Finally, paste the assembled Markdown into your project.

    • The result looks like this:

  • Test coverage

You've worked so hard to get your project to 100% test coverage, so it would be frustrating not to show it off. If you want to add a small test coverage badge to your project on GitHub, we recommend Codecov.

All you need to do is add the project whose test coverage you want to collect to Codecov's dashboard, just as you added projects to Travis CI, and then install the codecov dependency in your project:

$ npm install codecov --save-dev

Codecov works by automatically finding and collecting the test coverage documents in your project after you run the tests, presenting them on a page and generating the little badge, so you just need to execute the codecov command after your project's tests. Since our codecov is installed locally, we need to go into package.json and configure its execution command:

  // package.json
  {
      ...,
      "scripts": {
          ...,
          "codecov": "codecov"
      }
  }

Now you know what npm run codecov does in our .travis.yml configuration file.

Next, we can add the resulting badge to our GitHub home page.

  • Project Version Information

    • Project release badges depend on the release tool. On shields.io/#/examples/… you can find the badge image addresses for different publishing tools.

    • Here we use our library as an example, published as NPM, so the address of the icon is:

      https://img.shields.io/npm/v/{project name}.svg
    • The result looks like this:

  • Project Downloads

    • Download counts are calculated separately for different platforms. On shields.io/#/examples/… you can find the badge image addresses for the various statistics platforms.

    • Here’s an example of our library, published in NPM, in terms of weekly downloads:

      https://img.shields.io/npm/dw/{project name}.svg
    • The result looks like this:

mt-events from 0 to 1

The directory structure

mt-events
├── core                     # source folder
│   ├── events.js            # custom event handlers, including long-press, double-tap, swipe, and drag events
│   ├── index.js             # mtEvents entry, with the bind and remove event methods
│   ├── proxy.js             # event delegation proxy
│   ├── touch.js             # touch event simulation helpers
│   └── weakmap.js           # weak references between user callbacks and event-bound elements
├── dist
│   └── mtevents.min.js      # mt-events production build
├── docs
│   ├── developer            # developer docs, generated with the command `$ npm run docs`
│   └── user                 # user docs
├── lib                      # temporary build folder
│   ├── events.js
│   ├── index-browser.js
│   ├── index-npm.js
│   ├── proxy.js
│   └── weakmap.js
├── test
│   ├── coverage             # test coverage report
│   └── index.js             # test cases
├── .travis.yml              # Travis CI configuration file
├── jest.config.js           # Jest configuration file
├── package.json
├── rollup.config.js         # Rollup configuration file
└── webpack.config.js        # Webpack configuration file

This is a fairly unremarkable project structure, and you will surely spot many familiar names. We have already explained the purpose of the files in this listing; the key details and highlights will be distilled later in the article.

Engineering practice

Tool selection

Build tools: Webpack 4, Rollup
Testing tool: Jest
Continuous integration: Travis CI
API documentation generator: JSDoc
Code style: ESLint, Prettier, lint-staged
Version control: Git
JavaScript module bundler Rollup

Rollup is already used by many major JavaScript libraries. It builds on the standardized ES6 module format, which gives you the freedom to seamlessly use just the standalone functions you need from a lib. Rollup also helped mt-events achieve a simple "isomorphism": by distinguishing how users reference the library, we split the output into index-npm.js and index-browser.js, so you can load it in a browser or require it as an npm dependency.

Beyond ES6 modules, Rollup's signature tree-shaking statically analyzes imported modules and removes redundancy, helping us weed out unused branches of code:

// index.js  
if (process.env.PLATFORM === 'Browser') {
  window.mtEvents = mtEventsFun
} else {
  module.exports = mtEventsFun
}
// rollup.config.js  
export default {
    entry: './core/index.js',
    output: {
        file: `lib/index-${platform}.js`,
        format: 'cjs'
    },
    plugins: [
        replace({
            "process.env.PLATFORM": JSON.stringify(platform)
        }),
        copy({
            './core/events.js': 'lib/events.js',
            './core/proxy.js': 'lib/proxy.js',
            './core/weakmap.js': 'lib/weakmap.js'
        })
    ]
}

// package.json: generates the corresponding index-npm.js or index-browser.js file
// based on the platform argument, with unused code removed from index-${platform}.js
{
    "build:browser": "rollup --config --platform Browser",
    "build:npm": "rollup --config --platform npm"
}

// index-npm.js
{
  module.exports = mtEventsFun;
}

// index-Browser.js
{
  window.mtEvents = mtEventsFun;
}
Unit testing tool Jest

As the project iterates, relying on manual regression testing is prone to errors and omissions. To ensure the quality of the mt-events library and achieve automated testing, we introduced Jest, because it integrates JSDOM, which is perfect for simulating how our event library behaves in a browser environment. Jest is also easy to use: out of the box, almost zero configuration, and comprehensive in functionality.

However, we encountered a problem at the beginning of testing: for the browser's native mobile events there is no method like click() that we can call directly to simulate event firing. How do we solve this?

Using the globally available TouchEvent constructor, we tried creating the user's touch events ourselves, and the method proved feasible. Here is the core code for simulating touch events:

// touch.js
createTouchEvent (type) {
    return new window.TouchEvent(type, {
        bubbles: true,
        cancelable: true
    })
}
dispatchTouchEvent (eventTarget, event) {
    if (typeof eventTarget === 'string') {
        eventTarget = document.querySelector(eventTarget)
    }
    eventTarget.dispatchEvent(event)
    return eventTarget
}

Here are the coverage and results of the code we tested using Jest:

Continuous integration

Based on the configuration mentioned above, we can see the continuous integration results of our project on the Travis CI home page:

The online min.js file has also been updated to the latest version.

Source analysis

The mt-events source code is written to ES6 code standards. Let's experience its appeal from the following aspects:

A utility function that is both a Function and an Object

Such a bizarre data type may seem strange, but you have certainly seen it before without noticing; it is one of our oldest friends. Remember the $ in jQuery? You have surely used $("#myDom") to get an element, and also called methods attached to $ to send requests, like $.ajax(...). So $ is both a Function and an Object. The implementation principle is simple: point the function's prototype at the class instance's prototype. Users can then call the mtEvents function directly, without having to retrieve the bind method from mtEvents first. The implementation code is as follows:

// index.js
let mtEvents = new MTEvents()
const mtEventsPrototype = Object.create(MTEvents.prototype)
const mtEventsFun = mtEvents.bind.bind(mtEvents)
Object.setPrototypeOf(mtEventsFun, mtEventsPrototype)
Object.keys(mtEvents).map(keyItem => {
  mtEventsFun[keyItem] = mtEvents[keyItem]
})
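To see the trick in isolation, here is a stripped-down, runnable sketch; the class and property names are ours for illustration, not mt-events' internals:

```javascript
// A callable object: calling it invokes the instance's bind method,
// yet it also carries the instance's properties and prototype methods.
class Binder {
  constructor () {
    this.version = '1.0'
  }
  bind (eventName) {
    return `bound ${eventName} (v${this.version})`
  }
  remove (eventName) {
    return `removed ${eventName}`
  }
}

const binder = new Binder()
// 1. bind.bind(binder): a plain function that forwards to binder.bind
const binderFun = binder.bind.bind(binder)
// 2. graft the class prototype onto the function so methods like remove work
Object.setPrototypeOf(binderFun, Object.create(Binder.prototype))
// 3. copy the instance's own properties across
Object.keys(binder).map(key => { binderFun[key] = binder[key] })

console.log(binderFun('tap'))        // 'bound tap (v1.0)'
console.log(binderFun.remove('tap')) // 'removed tap'
console.log(binderFun.version)       // '1.0'
```

The same three steps are exactly what the index.js excerpt above performs with mtEvents.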

Removing an event requires passing in the callback: how do we map the user's callback to the event callback we bind to the element?

In custom events, we determine which event the user triggered by listening to both touchstart and touchend, and execute the callback passed in by the user at the right point. So what happens when a user needs to remove a previously bound event? The user can only pass in the callback they supplied, not the event callback we actually bound to the element.

At this point, we need to map the user's callback to the callback we bound to the event listener, so that from the user's callback we can find the bound event callback that needs to be removed. For a mapping, the first thing we think of is an object, but in traditional JS object keys can only be strings, while we need them to be functions. This is where ES6's new Map type comes in: its keys are not limited to strings, which is exactly what we want.

We define userCallback2Handler as a Map that maps the user-defined callback to the eventHandler we bind, and remove the event binding accordingly in remove. The same applies to custom events.
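The pattern can be sketched independently of the DOM. The store below is a hedged approximation of the idea (the names are ours, and the real mt-events handler performs gesture detection before invoking the user callback):

```javascript
// Map a user's callback to the wrapped handler we actually register,
// so that remove(callback) can find and unregister the right handler.
class HandlerStore {
  constructor () {
    this.userCallback2Handler = new Map()  // key: the user's callback (a function!)
  }
  bind (userCallback) {
    const handler = e => {
      // ...gesture detection would happen here...
      userCallback(e)
    }
    this.userCallback2Handler.set(userCallback, handler)
    return handler   // this is what addEventListener would receive
  }
  remove (userCallback) {
    const handler = this.userCallback2Handler.get(userCallback)
    this.userCallback2Handler.delete(userCallback)
    return handler   // this is what removeEventListener would receive
  }
}

const store = new HandlerStore()
const cb = e => e
const handler = store.bind(cb)
console.log(store.remove(cb) === handler)    // true
console.log(store.userCallback2Handler.size) // 0
```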

What if the user forgets to remove the bound events when removing a DOM element? With WeakMap's weak references, say goodbye to memory leaks!

WeakMap was created to solve this problem, and its key names refer to objects that are weak references, meaning that the garbage collection mechanism does not take that reference into account. Therefore, as long as all other references to the referenced object are cleared, the garbage collection mechanism frees the memory occupied by the object. In other words, once it is no longer needed, the key name object and the corresponding key value pair in WeakMap will disappear automatically, without manually deleting the reference. — From Ruan Yifeng’s Introduction to ECMAScript 6

The significance of weakmap.js is to establish weak references between DOM elements and their callbacks: when a DOM element is removed, the callbacks bound to it can also be reclaimed by GC, preventing memory leaks.

// weakmap.js

/**
 * weakMapCreator WeakMap generator
 * @param {HTMLElement} htmlElement DOM element
 * @param {Function} callback event listener callback
 * @return {WeakMap} WeakMap instance
 */
function weakMapCreator (htmlElement, callback) {
    let weakMap = new WeakMap()
    weakMap.set(htmlElement, callback)
    return weakMap
}

Event delegation code had to be rewritten for every event binding; Proxy and Reflect make quick work of it

During development, we found that we often wrote repetitive code to implement event delegation. To reduce this repetition, we thought of using ES6's Proxy and Reflect to proxy the event callback and perform the delegation-related operations inside the proxy.

In the proxy.js source we define the event delegation handler _delegateEvent and the delegation proxy generator delegateProxyCreator, so that when an event listener callback executes, the corresponding event delegation is handled through our delegation proxy. This not only greatly reduces code repetition and makes the code more concise, it also makes locating problems and bugs much easier: you only need to fix a bug at its single source.

/**
 * _delegateEvent event delegation handler
 * @param {String(Selector)|HTMLElement} bindTarget event binding element
 * @param {String(Selector)} delegateTarget event delegation target element
 * @param {Object} target the target of the native event object (i.e. e.target)
 * @return {Object|null} if delegation applies and the event occurred on a delegated element, returns that element; otherwise null
 */
function _delegateEvent (bindTarget, delegateTarget, target) {
  if (!delegateTarget) return null
  const delegateTargets = new Set(document.querySelectorAll(delegateTarget))
  while (target !== bindTarget) {
    if (delegateTargets.has(target)) {
      return target
    } else {
      target = target.parentNode
    }
  }
  return null
}

/**
 * delegateProxyCreator event delegation Proxy generator
 * @param {String(Selector)|HTMLElement} bindTarget event binding element
 * @param {String(Selector)} delegateTarget event delegation target element
 * @param {Object} e native event object
 * @param {Function} callback the callback to proxy
 * @return {Proxy} the proxied callback
 */
function delegateProxyCreator (bindTarget, delegateTarget, e, callback) {
  const handler = {
    apply (callback, ctx, args) {
      const target = _delegateEvent(bindTarget, delegateTarget, e.target)
      if ((delegateTarget && target) || !delegateTarget) {
        return Reflect.apply(...arguments)
      }
    }
  }
  return new Proxy(callback, handler)
}
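The apply trap at work can be demonstrated without a DOM. This standalone sketch guards a callback behind a predicate, mirroring how delegateProxyCreator only forwards the call when the event occurred on a delegated element (the predicate here is a stand-in we invented for illustration):

```javascript
// Wrap a callback in a Proxy whose apply trap decides whether to forward
// the call, mirroring the delegation check in delegateProxyCreator.
function guardedCallback (shouldRun, callback) {
  const handler = {
    apply (target, ctx, args) {
      if (shouldRun(...args)) {
        return Reflect.apply(target, ctx, args)
      }
      return null  // swallowed: the "event" did not hit a delegated element
    }
  }
  return new Proxy(callback, handler)
}

let calls = 0
const onTap = guardedCallback(
  tag => tag === 'LI',              // stand-in for the delegation test
  () => { calls += 1; return 'ran' }
)
console.log(onTap('LI'))   // 'ran'
console.log(onTap('DIV'))  // null
console.log(calls)         // 1
```

When the guard passes, the trap reduces to Reflect.apply, which is exactly the forwarding step in the source above.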