Video lessons & Debug demos

The purpose of this video course is to quickly get you comfortable running the React source code — the Scheduler, Reconciler, Renderer, Fiber, and so on — and to debug and analyze the source in detail so each process becomes clearer.

Video course: Enter the course

Demos: demo

Course Structure:

  1. Still struggling with the React source code?
  2. The React mental model
  3. Fiber (I'm the DOM in memory)
  4. Start with Legacy or Concurrent mode (start at the entry point, then move toward the future)
  5. The state update process (what goes on inside setState)
  6. Render phase (awesome, I have the skill to create Fibers)
  7. Commit phase (I heard the reconciler marked things for us; let's map them to real nodes)
  8. Diff algorithm (Mom doesn't worry about my Diff interviews anymore)
  9. How Function Components save state
  10. Scheduler & lane model (watch tasks get paused, resumed, and queue-jumped)
  11. What is Concurrent Mode?
  12. Handwriting a mini React (short and sharp, that's me)

concurrent mode

React 17 started supporting Concurrent Mode: a set of new features, built on Fiber, Scheduler, and Lane, that adjust an application's responsiveness based on the user's hardware performance and network conditions. Its core is enabling asynchronous, interruptible updates, and it is the direction of major React iterations going forward.

  • CPU: lets the time-consuming reconcile process yield JS execution to higher-priority tasks, such as user input.
  • IO: relies on Suspense.

Fiber

Fiber has been introduced before; here we look at what Fiber means in Concurrent Mode. In React 15 and earlier, reconciliation was executed synchronously, so a large component tree could occupy the main thread for longer than a frame and drop frames. To solve this problem, React needed asynchronous, interruptible updates: time-consuming computation yields JS execution to higher-priority tasks and resumes when the browser is idle. That requires a data structure that describes the real DOM together with pending update information, and that can be reconciled in memory at the appropriate time. This data structure is Fiber.
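As a rough sketch, a fiber node is just a JS object that links into a tree via `child` / `sibling` / `return` pointers and carries both DOM info (`stateNode`) and update info (`lanes`, `updateQueue`). The field names below mirror the real `FiberNode`, but this plain-object factory is only an illustration, not React's implementation:

```javascript
// Illustrative sketch of a fiber node's shape (field names follow the real
// FiberNode constructor; the factory itself is not React's code).
function createFiber(tag, pendingProps) {
  return {
    // Instance
    tag,                 // what kind of fiber (function component, host component, ...)
    key: null,
    type: null,          // e.g. 'div' or the component function
    stateNode: null,     // the real DOM node (for host components)

    // Tree links — a linked tree that can be walked, paused, and resumed
    return: null,        // parent fiber
    child: null,         // first child
    sibling: null,       // next sibling

    // Update info
    pendingProps,
    memoizedProps: null,
    memoizedState: null,
    updateQueue: null,
    lanes: 0,            // priority lanes of pending work

    // Double buffering: current tree <-> work-in-progress tree
    alternate: null,
  };
}

// Build a tiny tree in memory: <div><p/><span/></div>
const div = createFiber(5, {});
const p = createFiber(5, {});
const span = createFiber(5, {});
div.child = p;
p.return = div;
p.sibling = span;
span.return = div;
```

Because the tree is expressed with pointers rather than recursion, a traversal can stop at any node and pick up again later — which is exactly what interruptible reconciliation needs.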

Scheduler

Scheduler is independent of React itself and ships as a separate package. Its significance: when a CPU-bound computation is large, we compute the length of one frame from the device's FPS and run the CPU work within that budget. When a task is about to exceed one frame, it is paused to give the browser time to reflow and repaint, and resumed when appropriate.

We know that in JS a generator can also pause and resume tasks, but we also need to prioritize tasks, which generators cannot do. In Scheduler, time slices are implemented with MessageChannel, and tasks are prioritized with a min-heap, which together achieve asynchronous, interruptible updates.
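The pause-and-resume idea can be modeled in a few lines. This is a simplified sketch, not React's real code: the clock is injectable so the behavior is deterministic here, whereas the real Scheduler reads `performance.now()` and posts a MessageChannel message to resume the loop in the next macrotask:

```javascript
// Simplified model of Scheduler's time-sliced work loop (illustrative only).
const FRAME_BUDGET = 5; // ms of work per slice, like Scheduler's default yield interval

function workLoop(taskQueue, now) {
  const deadline = now() + FRAME_BUDGET;
  while (taskQueue.length > 0) {
    if (now() >= deadline) {
      // Time slice used up: yield to the browser. The real Scheduler would
      // call port.postMessage(null) here so workLoop runs again later.
      return true; // more work remains
    }
    const task = taskQueue.shift();
    task();
  }
  return false; // all work finished
}

// Fake clock: pretend each task "costs" 3ms.
let fakeTime = 0;
const now = () => fakeTime;
const done = [];
const tasks = [1, 2, 3].map((n) => () => { done.push(n); fakeTime += 3; });

// Runs tasks 1 and 2 (6ms of fake time), then hits the 5ms deadline and pauses.
const hasMore = workLoop(tasks, now);
```

After one slice, `done` is `[1, 2]` and one task is still queued, waiting for the next slice.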

The Scheduler uses expiration time to represent priority.

The higher the priority, the shorter the timeout, so the expiration time is closer to the current time — meaning the task is executed sooner.

The lower the priority, the longer the timeout and the further away the expiration time — in other words, the task can wait a long time before being executed.
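The mapping from priority to expiration time can be sketched as follows. The timeout constants below match Scheduler's defaults (ImmediatePriority expires immediately at -1ms, UserBlockingPriority at 250ms, NormalPriority at 5000ms, IdlePriority effectively never); the queue itself is a simplified stand-in — the real Scheduler uses a min-heap keyed on the task's sort index, while this sketch just sorts for clarity:

```javascript
// Timeout constants modeled on Scheduler's defaults.
const ImmediatePriorityTimeout = -1;      // already expired: run ASAP
const UserBlockingPriorityTimeout = 250;
const NormalPriorityTimeout = 5000;
const IdlePriorityTimeout = 1073741823;   // effectively never expires on its own

function scheduleTask(queue, callback, timeout, currentTime) {
  const task = { callback, expirationTime: currentTime + timeout };
  queue.push(task);
  // Stand-in for the min-heap: the smallest expirationTime comes out first.
  queue.sort((a, b) => a.expirationTime - b.expirationTime);
  return task;
}

const queue = [];
const t = 1000; // pretend "now" is 1000ms
scheduleTask(queue, () => 'normal', NormalPriorityTimeout, t);
scheduleTask(queue, () => 'immediate', ImmediatePriorityTimeout, t);
scheduleTask(queue, () => 'user-blocking', UserBlockingPriorityTimeout, t);

// Higher priority => shorter timeout => earlier expirationTime => runs first.
const order = queue.map((task) => task.callback());
```

Even though the normal-priority task was scheduled first, the immediate task's earlier expiration time puts it at the front of the queue.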

lane

Lane represents task priority with binary bits, which makes priority computation convenient. Different priorities occupy 'tracks' at different bit positions, and there is a concept of a batch: the lower the priority, the more 'tracks' it occupies. Letting high priority interrupt low priority, and deciding what priority to assign to new tasks, are the problems Lane needs to solve.
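A few bitwise operations show why bits are convenient for this. The lane constants below follow the shape of React's `ReactFiberLane` (exact values vary slightly across versions), and `lanes & -lanes` is the real trick React uses to isolate the lowest set bit — which, because higher priority lanes sit at lower bit positions, is the highest-priority lane:

```javascript
// Lane-style bitmasks (illustrative; exact values differ across React versions).
const NoLanes             = 0b0000000000000000000000000000000;
const SyncLane            = 0b0000000000000000000000000000001;
const InputContinuousLane = 0b0000000000000000000000000000100;
const DefaultLane         = 0b0000000000000000000000000010000;

// Merging lanes is a single OR; checking overlap is a single AND.
function mergeLanes(a, b) { return a | b; }
function includesLane(set, lane) { return (set & lane) !== NoLanes; }

// x & -x isolates the lowest set bit: the highest-priority pending lane.
function getHighestPriorityLane(lanes) { return lanes & -lanes; }

const pending = mergeLanes(DefaultLane, SyncLane);
```

With one integer holding the whole set of pending priorities, questions like "is there sync work?" or "what should run next?" each cost a single CPU instruction.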

batchedUpdates

In simple terms, when multiple updates are triggered in the same context, they are merged into one update. For example:

onClick() {
  this.setState({ count: this.state.count + 1 });
  this.setState({ count: this.state.count + 1 });
}

In previous versions of React, if you put multiple setState calls inside a setTimeout, they would not be merged, because the callback runs outside the batched context. Multiple setState calls in the same event-handler context run while executionContext contains BatchedContext, so they are merged; when executionContext equals NoContext, the SyncCallbackQueue is flushed synchronously. So multiple setState calls inside setTimeout are not merged, and each one executes synchronously.

onClick() {
  setTimeout(() => {
    this.setState({ count: this.state.count + 1 });
    this.setState({ count: this.state.count + 1 });
  });
}
export function batchedUpdates<A, R>(fn: (a: A) => R, a: A): R {
  const prevExecutionContext = executionContext;
  executionContext |= BatchedContext; // mark the context as batched
  try {
    return fn(a);
  } finally {
    executionContext = prevExecutionContext;
    if (executionContext === NoContext) {
      resetRenderTimer();
      // Execute tasks in SyncCallbackQueue synchronously if executionContext is NoContext
      flushSyncCallbackQueue();
    }
  }
}
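The effect of that context flag can be modeled with a toy setState — this is a simplified illustration, not React's real code: updates queued while the context is "batched" flush once at the end, while updates with no context flush immediately, one render per call:

```javascript
// Simplified model of BatchedContext-based batching (illustrative only).
const NoContext = 0;
const BatchedContext = 1;

let executionContext = NoContext;
let pendingUpdates = [];
let renderCount = 0;

function flush() {
  if (pendingUpdates.length > 0) {
    pendingUpdates = [];
    renderCount += 1; // one "render" per flush
  }
}

function setState(update) {
  pendingUpdates.push(update);
  if (executionContext === NoContext) flush(); // legacy sync path: flush right away
}

function batchedUpdates(fn) {
  const prev = executionContext;
  executionContext |= BatchedContext;
  try {
    fn();
  } finally {
    executionContext = prev;
    if (executionContext === NoContext) flush(); // flush once at the end
  }
}

// Event handler path (wrapped in batchedUpdates): both updates, one render.
batchedUpdates(() => { setState(1); setState(2); });

// setTimeout path: the wrapper has already exited, so each call renders.
setState(3);
setState(4);
```

After running this, the batched pair costs one render and the two "setTimeout" calls cost one render each, matching the behavior described above.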

In Concurrent mode, the above example is also combined into one update; the root cause is in the simplified source below. When setState is called multiple times, the priorities of the resulting callbacks are compared: if they are equal, the function returns early and reuses the existing callback, without scheduling another render phase.

function ensureRootIsScheduled(root: FiberRoot, currentTime: number) {
  const existingCallbackNode = root.callbackNode; // the callback scheduled by a previous setState
  // ...
  if (existingCallbackNode !== null) {
    const existingCallbackPriority = root.callbackPriority;
    // The new setState callback has the same priority as the previous one,
    // so we reuse it — this is the batchedUpdate logic
    if (existingCallbackPriority === newCallbackPriority) {
      return;
    }
    cancelCallback(existingCallbackNode);
  }
  // Schedule the start of the render phase
  newCallbackNode = scheduleCallback(
    schedulerPriorityLevel,
    performConcurrentWorkOnRoot.bind(null, root),
  );
  // ...
}

In Concurrent mode, when setState is called multiple times inside setTimeout, each call goes through requestUpdateLane with the same inputs: on first entry currentEventWipLanes === NoLanes, so it is set from workInProgressRootIncludedLanes and stays the same for subsequent calls. In findUpdateLane, the schedulerLanePriority is also the same (same scheduling priority), so the same lane is returned each time.

export function requestUpdateLane(fiber: Fiber): Lane {
  // ...
  if (currentEventWipLanes === NoLanes) {
    // On first entry, currentEventWipLanes === NoLanes
    currentEventWipLanes = workInProgressRootIncludedLanes;
  }
  // ...
  // schedulerLanePriority and currentEventWipLanes are the same for each
  // setState in the setTimeout, so the returned lane is the same
  lane = findUpdateLane(schedulerLanePriority, currentEventWipLanes);
  // ...

  return lane;
}

Suspense

Suspense can display a pending fallback while data is being requested and render the data once the request succeeds. The reason this works: the components inside Suspense are given low priority, while the fallback is given high priority. Once the promise resolves, the render phase is rescheduled for the components inside Suspense. This happens in updateSuspenseComponent; see the Suspense video for details.
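The contract Suspense is built on can be sketched with a toy resource. This is an illustration of the "throw a thenable" pattern, not React's implementation — in particular, `resolve` here flips the status synchronously so the example stays deterministic, whereas a real resource resolves asynchronously:

```javascript
// Toy resource illustrating the thrown-thenable contract (not React's code).
function createResource() {
  let status = 'pending';
  let result;
  // A minimal thenable; real code would subscribe to it to retry the render.
  const thenable = { then(onFulfilled) {} };
  return {
    read() {
      if (status === 'pending') throw thenable; // Suspense catches this and shows the fallback
      return result;                            // after resolve, render the data
    },
    resolve(value) { status = 'resolved'; result = value; },
  };
}

const resource = createResource();

// First render attempt: read() throws the thenable, so the fallback shows.
let thrown = null;
try { resource.read(); } catch (t) { thrown = t; }

// Data arrives; React would now reschedule the render phase for the content.
resource.resolve('data');
const rendered = resource.read();
```

The thrown thenable is how the low-priority content signals "not ready yet", and the rescheduled render after resolve is what swaps the fallback out for the data.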