Two years ago, I wrote a well-received blog post about a front-end API request caching scheme for business data, covering how to cache data and promises, how to delete entries on timeout, and how to build decorators. If you are not familiar with these topics, you can start with that post.

But the earlier code and solutions were simple and quite intrusive. That is not good, so I began to revisit and think about Proxy.

Proxy can be understood as a layer of "interception" set up in front of the target object: all external access to the object must pass through this layer first. It therefore provides a mechanism to filter and rewrite external access. The word Proxy originally means "agent", and it is used here in the sense that the object "proxies" certain operations. For an introduction to Proxy and its usage, see the Proxy chapter of ECMAScript 6 Getting Started.
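As a quick, minimal sketch (not part of the library): every property read and write on a proxied object passes through the handler's traps first, which is exactly the interception mechanism described above.

```typescript
const target = { size: 0 }

const proxied = new Proxy(target, {
  // Intercept property reads
  get(obj, property) {
    console.log(`reading ${String(property)}`)
    return Reflect.get(obj, property)
  },
  // Intercept property writes; a real handler could validate or reject here
  set(obj, property, value) {
    return Reflect.set(obj, property, value)
  }
})

proxied.size = 10          // goes through the set trap
console.log(proxied.size)  // goes through the get trap, prints 10
```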

Evolution of the project

No project is built in one go. What follows are the ideas behind writing this Proxy-based cache library. I hope they are helpful to you.

Adding a cache to the Proxy handler

The handler parameter of a Proxy is itself an object, and since it is an object, we can add data to it. This lets us write a Map-based memoize that improves the performance of recursive algorithms.

type TargetFun<V> = (...args: any[]) => V

function memoize<V>(fn: TargetFun<V>) {
  return new Proxy(fn, {
    // Add a middle layer to integrate the Proxy and the cache object.
    // Attach the cache to the handler object.
    // @ts-ignore
    cache: new Map<string, V>(),
    apply(target, thisArg, argsList: any[]) {
      const currentCache: Map<string, V> = (this as any).cache
      // Generate a key from the argument list
      const cacheKey: string = argsList.toString();
      // This call is not cached yet: call the function and cache the result
      if (!currentCache.has(cacheKey)) {
        currentCache.set(cacheKey, target.apply(thisArg, argsList));
      }
      // Return the cached data
      return currentCache.get(cacheKey);
    }
  });
}

We can try memoizing Fibonacci: the proxied function gets a huge performance boost (visible to the naked eye):

const fibonacci = (n: number): number => (n <= 1 ? 1 : fibonacci(n - 1) + fibonacci(n - 2));
const memoizedFibonacci = memoize<number>(fibonacci);

for (let i = 0; i < 100; i++) fibonacci(30); // ~5000ms
for (let i = 0; i < 100; i++) memoizedFibonacci(30); // ~50ms

Customize function parameters

We can still generate the unique key with the function described in the previous blog post, except that we no longer need the function name:

const generateKeyError = new Error("Can't generate key from function argument")

// Generate a unique key from the function arguments
function generateKey(argument: any[]): string {
  try {
    return `${Array.from(argument).join(',')}`
  } catch (_) {
    throw generateKeyError
  }
}

While the library itself can provide unique values based on function parameters, this is certainly not sufficient for a variety of different businesses, and the user needs to be able to customize parameter serialization.

// If a normalizer function is configured, use it; otherwise use the default
const normalizer = options?.normalizer ?? generateKey

return new Proxy<any>(fn, {
  // @ts-ignore
  cache,
  apply(target, thisArg, argsList: any[]) {
    const cache: Map<string, any> = (this as any).cache
    // Generate the unique key from the normalizer
    const cacheKey: string = normalizer(argsList);
    if (!cache.has(cacheKey))
      cache.set(cacheKey, target.apply(thisArg, argsList));
    return cache.get(cacheKey);
  }
});

Add Promise cache

In the previous blog post, I mentioned the drawback of caching only data: multiple simultaneous calls will fire multiple requests because the first request has not returned yet. So we also need to cache promises.

if (!currentCache.has(cacheKey)) {
  let result = target.apply(thisArg, argsList)

  // If the result is a promise, cache the promise itself.
  // Basic check for a promise: it has a then method.
  if (result?.then) {
    result = Promise.resolve(result).catch(error => {
      // If the promise rejects, delete the cache entry so the next call retries.
      // The catch callback runs asynchronously, so this delete necessarily
      // happens after the set below.
      currentCache.delete(cacheKey)
      // Rethrow so the caller still sees the error
      return Promise.reject(error)
    })
  }
  currentCache.set(cacheKey, result);
}
return currentCache.get(cacheKey);

At this point, we can cache not only the data but also the Promise data request.

Adding expiration deletion

We can attach a timestamp to each cache item when the data is stored:

export default class ExpiredCacheItem<V> {
  data: V;
  cacheTime: number;

  constructor(data: V) {
    this.data = data
    // Record the current system timestamp
    this.cacheTime = (new Date()).getTime()
  }
}

// Check whether the cache item has timed out
isOverTime(name: string) {
  const data = this.cacheMap.get(name)

  // No data (the stored values are ExpiredCacheItem), so treat it as timed out
  if (!data) return true

  // Get the current timestamp
  const currentTime = (new Date()).getTime()

  // Milliseconds elapsed between now and the stored time
  const overTime = currentTime - data.cacheTime

  // If the elapsed time is greater than the configured timeout
  if (Math.abs(overTime) > this.timeout) {
    // Not deleting here would also work, but deleting saves a check
    // the next time this entry is accessed
    this.cacheMap.delete(name)
    return true
  }

  // Not timed out
  return false
}

// Whether the cache item is usable
has(name: string) {
  // Usable if it has not timed out
  return !this.isOverTime(name)
}

At this point, we can do everything described in the previous blog post. However, stopping here would be no fun. Let's keep learning from other libraries' features to improve this one.

Adding manual management

Typically, cache libraries offer manual management, so here I also provide manual cache management for business needs. We use the Proxy get trap to intercept property reads.

return new Proxy(fn, {
  // @ts-ignore
  cache,
  get: (target: TargetFun<V>, property: string) => {
    // If manual management is configured
    if (options?.manual) {
      const manualTarget = getManualActionObjFormCache<V>(cache)
      // If the property is on the manual-management object, use it there;
      // otherwise fall through to the original object. Even if the original
      // function happens to have a property or method with the same name,
      // manual management takes precedence because the user configured it.
      if (property in manualTarget) {
        return manualTarget[property]
      }
    }
    // Manual management is not configured: access the original object directly
    return target[property]
  }
})

export default function getManualActionObjFormCache<V>(
  cache: MemoizeCache<V>
): CacheMap<string | object, V> {
  const manualTarget = Object.create(null)

  // Delegate set/get/delete/clear to the underlying cache
  manualTarget.set = (key: string | object, val: V) => cache.set(key, val)
  manualTarget.get = (key: string | object) => cache.get(key)
  manualTarget.delete = (key: string | object) => cache.delete(key)
  manualTarget.clear = () => cache.clear!()

  return manualTarget
}

The case here is simple, so we access the target directly; in more complicated cases, using Reflect is recommended.

Add WeakMap

We can also let the cache use a WeakMap (note that WeakMap has no clear method or size property), so I extracted a BaseCache base class.

export default class BaseCache<V> {
  readonly weak: boolean;
  cacheMap: MemoizeCache<V>

  constructor(weak: boolean = false) {
    // Whether to use a WeakMap
    this.weak = weak
    this.cacheMap = this.getMapOrWeakMapByOption()
  }

  // Get a Map or a WeakMap depending on the configuration
  getMapOrWeakMapByOption<T>(): Map<string, T> | WeakMap<object, T> {
    return this.weak ? new WeakMap<object, T>() : new Map<string, T>()
  }
}

After that, I add various types of cache classes that use this as a base class.

Adding a cleanup function

When a cache item is deleted, its value may need to be cleaned up, so we provide a dispose function. This class inherits from BaseCache and offers the dispose call.

export const defaultDispose: DisposeFun<any> = () => void 0

export default class BaseCacheWithDispose<V, WrapperV> extends BaseCache<WrapperV> {
  readonly weak: boolean
  readonly dispose: DisposeFun<V>

  constructor(weak: boolean = false, dispose: DisposeFun<V> = defaultDispose) {
    super(weak)
    this.weak = weak
    this.dispose = dispose
  }

  // Clean up a single value (called before deleting it)
  disposeValue(value: V | undefined): void {
    if (value) {
      this.dispose(value)
    }
  }

  // Clean up all values (called before clearing the cache)
  disposeAllValue<V>(cacheMap: MemoizeCache<V>): void {
    for (let mapValue of (cacheMap as any)) {
      this.disposeValue(mapValue?.[1])
    }
  }
}

If the current cache is a WeakMap, there is no clear method and no iterator. Personally I would like to add a middle layer to handle all of this (still thinking about it, not done yet). If clear is called on a WeakMap cache, I simply replace it with a new WeakMap.

clear() {
  if (this.weak) {
    // WeakMap has no clear method, so swap in a fresh one
    this.cacheMap = this.getMapOrWeakMapByOption()
  } else {
    this.disposeAllValue(this.cacheMap)
    this.cacheMap.clear!()
  }
}

Adding reference counting

While studying the memoizee library, I saw the following usage:

memoized = memoize(fn, { refCounter: true });

memoized("foo", 3);           // refs: 1
memoized("foo", 3);           // cache hit, refs: 2
memoized("foo", 3);           // cache hit, refs: 3
memoized.deleteRef("foo", 3); // refs: 2
memoized.deleteRef("foo", 3); // refs: 1
memoized.deleteRef("foo", 3); // refs: 0, clears the "foo" cache entry
memoized("foo", 3);           // re-executed, refs: 1

So I followed suit and added a RefCache as well.

export default class RefCache<V> extends BaseCacheWithDispose<V, V> implements CacheMap<string | object, V> {
  // Add a reference counter
  cacheRef: MemoizeCache<number>

  constructor(weak: boolean = false, dispose: DisposeFun<V> = () => void 0) {
    super(weak, dispose)
    // Generate a Map or WeakMap according to the configuration
    this.cacheRef = this.getMapOrWeakMapByOption<number>()
  }

  // get, clear, etc. are the same as before and omitted here

  delete(key: string | object): boolean {
    this.disposeValue(this.get(key))
    this.cacheRef.delete(key)
    this.cacheMap.delete(key)
    return true;
  }

  set(key: string | object, value: V): this {
    this.cacheMap.set(key, value)
    // On set, also increment the reference count
    this.addRef(key)
    return this
  }

  addRef(key: string | object) {
    if (!this.cacheMap.has(key)) {
      return
    }
    const refCount: number | undefined = this.cacheRef.get(key)
    this.cacheRef.set(key, (refCount ?? 0) + 1)
  }

  getRefCount(key: string | object) {
    return this.cacheRef.get(key) ?? 0
  }

  deleteRef(key: string | object): boolean {
    if (!this.cacheMap.has(key)) {
      return false
    }

    const refCount: number = this.getRefCount(key)
    if (refCount <= 0) {
      return false
    }

    const currentRefCount = refCount - 1

    // If the count is still greater than 0, just update it;
    // otherwise delete both the counter and the cached data
    if (currentRefCount > 0) {
      this.cacheRef.set(key, currentRefCount)
    } else {
      this.cacheRef.delete(key)
      this.cacheMap.delete(key)
    }
    return true
  }
}

Then modify the main proxy function:

if (!currentCache.has(cacheKey)) {
  let result = target.apply(thisArg, argsList)
  if (result?.then) {
    result = Promise.resolve(result).catch(error => {
      currentCache.delete(cacheKey)
      return Promise.reject(error)
    })
  }
  currentCache.set(cacheKey, result);
} else if (options?.refCounter) {
  // The call is already cached: just increment the reference count
  currentCache.addRef?.(cacheKey)
}

Adding LRU

LRU stands for Least Recently Used: when the cache is full, the entries used least recently are evicted first. For caching, LRU generally gives a better hit rate than simpler eviction strategies.

Consider supporting max as well as maxAge (here I implement LRU with two Maps, which uses more memory but performs better).

When the number of stored items reaches max, we simply move the current cacheMap to oldCacheMap and create a fresh cacheMap.

set(key: string | object, value: V) {
  const itemCache = new ExpiredCacheItem<V>(value)
  // If the key already exists, just overwrite it; otherwise go through _set
  this.cacheMap.has(key)
    ? this.cacheMap.set(key, itemCache)
    : this._set(key, itemCache);
  return this
}

private _set(key: string | object, value: ExpiredCacheItem<V>) {
  this.cacheMap.set(key, value);
  this.size++;
  if (this.size >= this.max) {
    this.size = 0;
    this.oldCacheMap = this.cacheMap;
    this.cacheMap = this.getMapOrWeakMapByOption()
  }
}

When getting a value: if the key is not in cacheMap, look in oldCacheMap. If it is there and not expired, delete the old entry and move the data into the new cacheMap (using the _set method). If it is in neither, return undefined.

get(key: string | object): V | undefined {
  // If it is in cacheMap, return the value directly
  if (this.cacheMap.has(key)) {
    const item = this.cacheMap.get(key);
    return this.getItemValue(key, item!);
  }

  // If it is in oldCacheMap
  if (this.oldCacheMap.has(key)) {
    const item = this.oldCacheMap.get(key);
    // If it has not expired
    if (!this.deleteIfExpired(key, item!)) {
      // Move it to the new data and delete the old entry
      this.moveToRecent(key, item!);
      return item!.data as V;
    }
  }
  return undefined
}

private moveToRecent(key: string | object, item: ExpiredCacheItem<V>) {
  // Delete the old entry
  this.oldCacheMap.delete(key);
  // Set the new entry. Important: if the size reaches max here,
  // _set replaces oldCacheMap, so the data never exceeds max
  this._set(key, item);
}

private getItemValue(key: string | object, item: ExpiredCacheItem<V>): V | undefined {
  // If maxAge is configured, check for expiration; otherwise return directly
  return this.maxAge ? this.getOrDeleteIfExpired(key, item) : item?.data;
}

private getOrDeleteIfExpired(key: string | object, item: ExpiredCacheItem<V>): V | undefined {
  const deleted = this.deleteIfExpired(key, item);
  return !deleted ? item.data : undefined;
}

private deleteIfExpired(key: string | object, item: ExpiredCacheItem<V>) {
  if (this.isOverTime(item)) {
    return this.delete(key);
  }
  return false;
}

The memoize function

At this point, we can look at the interfaces and the main function built on these features, setting aside the implementation details.

export interface BaseCacheMap<K, V> {
  delete(key: K): boolean;
  get(key: K): V | undefined;
  has(key: K): boolean;
  set(key: K, value: V): this;
  clear?(): void;
  addRef?(key: K): void;
  deleteRef?(key: K): boolean;
}

export interface MemoizeOptions<V> {
  /** Serialize the arguments */
  normalizer?: (args: any[]) => string;
  /** Whether to use a WeakMap */
  weak?: boolean;
  /** Maximum age in milliseconds */
  maxAge?: number;
  /** Maximum number of entries */
  max?: number;
  /** Manually manage memory */
  manual?: boolean;
  /** Whether to use reference counting */
  refCounter?: boolean;
  /** Callback when cached data is deleted */
  dispose?: DisposeFun<V>;
}

export interface ResultFun<V> extends Function {
  delete?(key: string | object): boolean;
  get?(key: string | object): V | undefined;
  has?(key: string | object): boolean;
  set?(key: string | object, value: V): this;
  clear?(): void;
  deleteRef?(): void
}

The final memoize function is pretty much the same as the original one; it only does three things:

  • Check the parameters and throw an error
  • Get the appropriate cache based on the parameters
  • Return the proxy
export default function memoize<V>(fn: TargetFun<V>, options?: MemoizeOptions<V>): ResultFun<V> {
  // Check the parameters and throw errors if they are invalid
  checkOptionsThenThrowError<V>(options)

  // Fix the serialization function
  const normalizer = options?.normalizer ?? generateKey

  // Get the appropriate cache based on the options
  let cache: MemoizeCache<V> = getCacheByOptions<V>(options)

  return new Proxy(fn, {
    // @ts-ignore
    cache,
    get: (target: TargetFun<V>, property: string) => {
      // Add manual management
      if (options?.manual) {
        const manualTarget = getManualActionObjFormCache<V>(cache)
        if (property in manualTarget) {
          return manualTarget[property]
        }
      }
      return target[property]
    },
    apply(target, thisArg, argsList: any[]): V {
      const currentCache: MemoizeCache<V> = (this as any).cache

      const cacheKey: string | object = getKeyFromArguments(argsList, normalizer, options?.weak)

      if (!currentCache.has(cacheKey)) {
        let result = target.apply(thisArg, argsList)
        // Cache the promise itself if the result is a promise
        if (result?.then) {
          result = Promise.resolve(result).catch(error => {
            currentCache.delete(cacheKey)
            return Promise.reject(error)
          })
        }
        currentCache.set(cacheKey, result);
      } else if (options?.refCounter) {
        currentCache.addRef?.(cacheKey)
      }
      return currentCache.get(cacheKey) as V;
    }
  }) as any
}

The complete code is in memoizee-proxy. Feel free to clone it and play with it yourself.

Next steps

Testing

Test coverage isn't everything, but the Jest testing library helped me a lot while implementing this one: it made me rethink the functionality and parameter validation that each class and function should have. In my earlier code I only validated at the project's main entry, without thinking carefully about the parameters of each class or function. That isn't robust enough, because you don't get to decide how users will use your library.

Going deeper into Proxy

There is really no limit to the scenarios where proxies can be used; Ruby has already proved this (see Ruby metaprogramming).

Developers can use it to create a variety of encoding patterns, such as (but far from limited to) tracking property access, hiding properties, preventing modification or deletion of properties, function parameter validation, constructor parameter validation, data binding, and observable objects.
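As a sketch of one of these patterns (my own illustration, not from the library), function argument validation can be done with the apply trap, so the wrapped function never sees invalid input:

```typescript
// Validate function arguments with a Proxy `apply` trap
function withNumberArgs<T>(fn: (...args: number[]) => T) {
  return new Proxy(fn, {
    apply(target, thisArg, argsList: any[]) {
      // Reject the call before it ever reaches the target function
      if (argsList.some(arg => typeof arg !== 'number')) {
        throw new TypeError('all arguments must be numbers')
      }
      return Reflect.apply(target, thisArg, argsList)
    }
  })
}

const sum = withNumberArgs((...nums: number[]) => nums.reduce((a, b) => a + b, 0))

console.log(sum(1, 2, 3)) // 6
// sum(1, '2' as any, 3)  // throws TypeError
```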

Proxy arrived with ES6, but the API still requires a fairly recent browser. There is proxy-polyfill, but its functionality is limited. Now that it is 2021, though, I believe it is time to study Proxy further.

Digging into caching

Caching is dangerous! There is no doubt about that. But it is so fast! So we need to understand the business better: which data needs to be cached, and which data can safely be cached.

The current cache works only at the level of a single method. Could later projects return data at a finer granularity? Or should we go even further and write a full caching layer?

Development approach

During development I took small steps, iterated quickly, and constantly refactored. At first, the code only went as far as the expiration-deletion feature.

But every time I finished a new feature, I went back over the library's logic and flow, trying to make the code elegant each time, because I cannot think everything through on the first pass. I hope to keep improving in future work; that also reduces code rework.

Other notes

Creating functions

When adding manual management to the library, I considered copying the function directly, since a function is itself an object, and adding methods such as set to the copy. But there is no way to copy the scope chain.

It didn't work out, but I learned something. Here are two ways of creating functions.

We usually use new Function to create functions, but the browser does not directly expose a constructor for async functions, so we have to get it manually:

AsyncFunction = (async x => x).constructor

foo = new AsyncFunction('x, y, p', 'return x + y + await p')

foo(1, 2, Promise.resolve(3)).then(console.log) // 6

For global functions, we can also clone the function from fn.toString(), and async functions can be constructed this way directly:

function cloneFunction<T>(fn: (...args: any[]) => T): (...args: any[]) => T {
  return new Function('return ' + fn.toString())();
}

A word of encouragement

If you found this article helpful, I hope you will give me some encouragement by starring my blog repo on GitHub.

Blog address

References

Front-end API request caching scheme

ECMAScript 6 Getting Started: Proxy

memoizee

memoizee-proxy