This article aims to provide a more in-depth overview of the Flutter architecture, including its core principles and concepts at the design level.

During development, a Flutter application runs in a VM (virtual machine) so that changes can be hot-reloaded while preserving state, without a full recompile. For release, the application is compiled directly to machine code (Intel x64 or ARM instruction sets), or to JavaScript when targeting the web. The framework code is open source under the BSD license and has a thriving third-party package ecosystem that complements the core library functionality.

We will learn about the Flutter architecture in the following parts:

  1. Introduction: What is Flutter? What does it do? Who uses it?
  2. Architecture layer: The building blocks of Flutter.
  3. Reactive user interface: The core concept of Flutter user interface development.
  4. Layout and Rendering: How Flutter transforms an interface layout into pixels.
  5. Widgets Tree: The building block of the Flutter user interface.
  6. Platform embedding layer: Code that enables Flutter applications to run on mobile and desktop operating systems.
  7. Integration with other code: Learn more about the technologies available to Flutter applications.

Introduction to Flutter

Flutter is Google’s portable UI toolkit for building high-quality native apps for mobile, web, and desktop. Flutter works with existing code, is used by developers and organizations around the world, and is free and open source.

For users, Flutter makes beautiful application UIs come alive. For developers, Flutter lowers the barrier to building mobile apps: it speeds up development and reduces the cost and complexity of producing apps for Android, iOS, and other platforms from one codebase. For designers, Flutter ensures that design intent is implemented faithfully, without loss of fidelity or forced compromises; in designers' hands it can also serve as an efficient prototyping tool.

Flutter is for developers who want to build elegant mobile apps quickly, or who want to reach as many users as possible from a single codebase. Flutter also suits engineering managers who lead development teams: it lets them create a unified application development team spanning mobile, web, and desktop, which helps deliver more functionality faster, ship the same features across multiple platforms, and reduce maintenance costs. Ultimately, if you want beautiful applications, pleasing motion and animation, and a UI with a sense of personality and identity, you are the target user for Flutter.

For developers, Flutter lowers the barrier to entry for application development: it speeds up the development process and reduces the cost and complexity of cross-platform development. For designers, Flutter provides a canvas for high-fidelity user experiences; Fast Company described it as delivering one of the top design ideas of the decade, with the ability to translate concepts into production code without the compromises imposed by typical frameworks. For engineering managers and businesses, Flutter allows app developers for different platforms to be unified into a single mobile, front-end, and desktop team, building a brand's applications for multiple platforms from a single codebase. Flutter speeds up cross-platform development and simultaneous release processes.

During development, the Flutter application runs in a VM that provides stateful hot reload of changes without requiring a full recompile. For release, the Flutter application compiles directly to machine code, whether Intel x64 or ARM instructions, or to JavaScript if targeting the web. The framework is open source with a permissive BSD license and has a thriving ecosystem of third-party packages that complement the core library functionality.

II. Flutter architecture layer

Flutter is designed as an extensible, layered system. It can be thought of as a series of independent libraries, with upper layers depending on lower layers. No layer has privileged access to the layer below it, and every part of the framework level is designed to be optional and replaceable.



The architecture of Flutter is divided into three layers: the Framework (Dart), the Engine (C/C++), and the Embedder (platform-specific).

  1. The Framework is implemented in Dart and includes Material Design-style widgets, Cupertino-style widgets (for iOS), basic UI/text/image/button widgets, rendering, animation, gestures, and so on. The core code of this layer lives in the flutter package of the flutter repository, and in the io, async, ui, and other packages of the sky_engine repository (the dart:ui library provides the interface between the Flutter framework and the engine).

  2. The Engine is implemented in C/C++ and includes Skia, the Dart runtime, and text rendering.

  • Skia is an open source two-dimensional graphics library that provides a common API for a variety of hardware and software platforms. It has been used as a graphics engine for Google Chrome, Chrome OS, Android, Mozilla Firefox, Firefox OS and many other products. It also supports Windows, macOS, iOS, Android, Ubuntu and many others.
  • The Dart part includes the Dart runtime, garbage collection (GC), and JIT (just-in-time) compilation support in Debug mode. In Release and Profile modes, AOT (ahead-of-time) compilation produces native ARM code and there is no JIT.
  • Text is text rendering, which is layered as follows: libtxt, derived from Minikin, handles font selection and line breaking; HarfBuzz handles glyph selection and shaping; Skia acts as the render/GPU back end, using FreeType to render fonts on Android and Fuchsia, and CoreGraphics to render fonts on iOS.
  3. The Embedder is the embedding layer that allows Flutter to be embedded into various platforms. Its main tasks include render surface setup, thread setup, and plugins. From this we can see that the platform-dependent part of Flutter is very thin: a platform (such as iOS) only provides a canvas, and all of the remaining rendering logic lives inside Flutter, which makes Flutter highly consistent across platforms.

As you can see from the architecture diagram, Flutter rewrites a cross-platform UI framework from the ground up, including UI controls, rendering logic, and even the development language. The rendering engine is built on the cross-platform Skia graphics library and relies on the operating system only for graphics-rendering interfaces, which ensures a consistent experience across platforms and devices to the greatest possible extent. Application logic is written in Dart, which supports AOT compilation and executes far more efficiently than JavaScript.

To the underlying operating system, Flutter applications are packaged in the same way as any other native application. A platform-specific embedder provides an entry point; coordinates with the underlying operating system to access services such as render surfaces, accessibility, and input; and manages the message event loop. Embedders are written in a language appropriate to each platform: currently Java and C++ for Android, Objective-C/Objective-C++ for iOS and macOS, and C++ for Windows and Linux. Using an embedder, Flutter code can be integrated into an existing application as a module, or it can be the entire content of the application. Flutter ships embedders for the common target platforms, but other embedders exist as well.

The core of Flutter is the Flutter engine, written mostly in C++, which supports all the primitives required by Flutter applications. The engine is responsible for rasterizing composited scenes whenever a new frame needs to be painted. It provides the low-level implementation of Flutter's core APIs, including graphics (via Skia), text layout, file and network I/O, accessibility support, the plugin architecture, and the Dart runtime and compile toolchain.

The engine is exposed to the Flutter framework through dart:ui, which wraps the underlying C++ code in Dart classes. This library exposes the lowest-level primitives, such as the classes that drive the input, graphics, and text-rendering subsystems. Typically, developers interact with Flutter through the Flutter framework, which provides a modern, reactive framework written in Dart. It includes a rich set of platform, layout, and foundation libraries, organized as a series of layers. From bottom to top, they are:

  • Basic foundational classes and building-block services, such as animation, painting, and gestures, which provide commonly used abstractions on top of the underlying foundation.
  • The render layer provides an abstraction for handling the layout. From this layer, you can build a tree of renderable objects. You can manipulate these objects dynamically, and the tree automatically updates the layout to reflect your changes.
  • The widgets layer is a composition abstraction. Each render object in the rendering layer has a corresponding class in the widgets layer. In addition, the widgets layer allows you to define reusable combinations of classes. This is the layer at which the reactive programming model is introduced.
  • The Material and Cupertino libraries provide comprehensive sets of controls that use the widget layer's composition primitives to implement the Material or iOS design languages.

The Flutter framework is relatively small; many higher-level features that developers are likely to use are implemented as packages, including platform plugins such as camera and webview, as well as platform-agnostic features such as characters, http, and animations, which build on the core Dart and Flutter libraries. Some of these packages come from the broader ecosystem and cover services such as in-app payments, Apple authentication, and animations.

The Flutter framework consists of many levels of abstraction. At the top are the Material and Cupertino widgets that we use most of the time. Below the widgets layer is the rendering layer, which simplifies layout and painting; it is an abstraction over dart:ui. dart:ui is the lowest layer of the framework and handles communication with the engine. In short, the higher the layer, the easier it is to use; the lower the layer, the more APIs are exposed and the more customization is possible.

2.1 The dart:ui library

The dart:ui library exposes the lowest-level services used to bootstrap the application, such as the input, text, layout, and rendering subsystems.

In theory you can build a Flutter app just by instantiating classes from the dart:ui library (such as Canvas, Paint, and Paragraph). But if you have ever drawn directly on a canvas, you know that producing a UI with these low-level APIs is difficult and tedious. Now consider everything that isn't drawing, such as layout and hit testing. It means manually computing every coordinate in your layout, then mixing in drawing and hit testing to capture user input, and doing all of that for every frame while keeping track of it all. This works for a simple app, such as one that shows text in a blue box. For more complex apps or even a simple game, it quickly becomes painful, to say nothing of animations, scrolling, and the fancy UI effects that product managers love most. From years of development experience I can tell you that these are an endless nightmare for developers.
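To make that concrete, here is a minimal sketch of painting a single frame with nothing but dart:ui. It assumes the classic ui.window entry point (newer Flutter versions route this through PlatformDispatcher, so treat it as illustrative rather than canonical): every coordinate is computed by hand, and there is no layout, widget, or hit-testing help at all.

// A minimal sketch: painting one frame using only dart:ui.
// Assumes the classic `ui.window` entry point; illustrative, not canonical.
import 'dart:ui' as ui;

void beginFrame(Duration timeStamp) {
  final ui.Rect bounds = ui.Offset.zero & ui.window.physicalSize;
  final ui.PictureRecorder recorder = ui.PictureRecorder();
  final ui.Canvas canvas = ui.Canvas(recorder, bounds);

  // All coordinates are hand-computed; there is no layout system here.
  canvas.drawRect(
    ui.Rect.fromLTWH(100, 100, 300, 120),
    ui.Paint()..color = const ui.Color(0xFF2196F3),
  );

  final ui.SceneBuilder sceneBuilder = ui.SceneBuilder()
    ..pushOffset(0, 0)
    ..addPicture(ui.Offset.zero, recorder.endRecording())
    ..pop();
  ui.window.render(sceneBuilder.build());
}

void main() {
  ui.window.onBeginFrame = beginFrame;
  ui.window.scheduleFrame();
}

Everything the higher layers normally do for you (layout, animation, input routing) would have to be added by hand to this loop, which is exactly the pain described above.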

2.2 Rendering library

The Flutter rendering tree. The RenderObject hierarchy is what the Flutter widgets library uses to implement its layout and painting back end. In general, although you may subclass RenderBox to implement custom effects in your application, most of the time your only interaction with RenderObjects is when debugging layout issues.

The rendering library is the first abstraction layer on top of dart:ui. It does all the heavy math for you (such as tracking the coordinates that constantly need to be recomputed), using RenderObjects to handle this work. You can think of RenderObjects as the car engine that does all the work of getting your app onto the screen. All RenderObjects in the rendering tree are laid out and painted by Flutter. To optimize this complex process, Flutter uses smart algorithms and caches objects that are expensive to instantiate, achieving the best possible performance. In most cases you will find Flutter using RenderBox rather than RenderObject directly, because the framework's authors found that an effective and stable UI can be built from a simple box-constraint layout model. Imagine that every widget is placed in its own box; the box's parameters are computed, and then the box is placed among the other boxes that have already been arranged. So if only one widget in your layout changes, only the box that contains it needs to be recomputed by the system.

2.3 The Widgets library

Flutter Widgets framework

The widgets library is perhaps the most interesting. It is another abstraction layer that provides ready-made widgets out of the box. Every widget in this library falls into one of three categories, each handled by an appropriate RenderObject:

  1. Layout, such as the Column and Row widgets, which help us easily handle the layout of other widgets.
  2. Painting, such as Text and Image Widgets, allows us to display (draw) something on the screen.
  3. Hit-testing, such as GestureDetector, allows us to recognize different gestures, such as clicks and swipes.

Most of the time we combine a few "basic" widgets to build the widget we need. For example, we can wrap a Container in a GestureDetector and put content inside the Container to build a button that handles taps. This is composition rather than inheritance. Beyond letting you build every UI component yourself, the Flutter team has also created two libraries of commonly used Material- and Cupertino-style widgets.
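Here is a minimal sketch of that idea; the widget and parameter names are illustrative, not from any library. A GestureDetector (hit testing) wraps a Container (layout and painting) that wraps a Text (painting), composing a tappable "button" out of basic widgets.

// A minimal sketch of composition (illustrative names, not a library API).
import 'package:flutter/material.dart';

class TapLabel extends StatelessWidget {
  const TapLabel({super.key, required this.label, required this.onTap});

  final String label;
  final VoidCallback onTap;

  @override
  Widget build(BuildContext context) {
    // Hit testing (GestureDetector) wraps layout/painting (Container, Text):
    // composition, not inheritance.
    return GestureDetector(
      onTap: onTap,
      child: Container(
        padding: const EdgeInsets.all(12),
        color: Colors.blue,
        child: Text(label),
      ),
    );
  }
}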

2.4 Material & Cupertino library

The libraries of widgets implementing the Material and Cupertino design specifications.

Flutter created this layer of Widgets in the Material and Cupertino styles to reduce the burden on developers.

Reactive user interfaces

On the surface, Flutter is a reactive, pseudo-declarative UI framework: the developer provides a mapping from application state to interface state, and the framework updates the interface at runtime when the application state changes. This model is inspired by Facebook's React framework, which rethinks many traditional design principles.

In most traditional UI frameworks, the initial state of the interface is described once and then updated separately by user code at run time in response to events. One of the big challenges here is that, as the application grows in complexity, the developer needs a holistic understanding of how state flows through the entire UI. Consider the following UI:



You can change state in many places: color boxes, color sliders, radio buttons. As the user interacts with the UI, state changes can affect every location. To make matters worse, a small change to the UI can lead to a chain reaction of unrelated code, especially if the developer doesn’t pay attention to the connection.

One solution is an MVC-style approach, where the developer pushes data changes to the model via the controller, and the model pushes the new state to the view via the controller. The problem is that creating and updating UI elements become two separate steps that can easily fall out of sync.

Flutter, like other reactive frameworks, addresses this problem by explicitly decoupling the user interface from its underlying state. With React-style APIs you simply create a description of the UI, and the framework takes care of using that single configuration to both create and gracefully update the user interface.

In Flutter, widgets (similar to components in React) are immutable classes used to configure a tree of objects. These widgets manage a separate tree of objects for layout, which in turn manages a separate tree of objects for compositing. At its heart, Flutter is a series of efficient mechanisms for walking the modified parts of these trees, converting one tree of objects into a lower-level tree, and propagating changes from tree to tree.

A widget declares its user interface by overriding the build() method, which converts state to UI:

UI = f(state) 

The build() method can be called whenever the framework needs it (potentially once per rendered frame), so by design it should return quickly and be free of side effects. This design relies on certain runtime characteristics of the language, in particular fast object instantiation and cleanup. Fortunately, Dart is well suited for the job.
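As a minimal sketch of this idea (the widget name is illustrative), a StatelessWidget's build() is simply a function from its immutable configuration to a widget subtree; when the state changes, a new widget is built rather than the old one being mutated.

// A minimal sketch of UI = f(state); the widget name is illustrative.
import 'package:flutter/material.dart';

class Greeting extends StatelessWidget {
  const Greeting({super.key, required this.name});

  // Immutable configuration ("state" from this widget's point of view).
  final String name;

  @override
  Widget build(BuildContext context) {
    // Called whenever the framework needs it; fast and free of side effects.
    return Text('Hello, $name!');
  }
}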

Layout and rendering of Flutter

Have you ever wondered: since Flutter is a cross-platform framework, how does it offer performance comparable to native platform frameworks? How does it go from a hierarchy of widgets to the actual pixels painted on the screen? What steps does it go through?

Let's start with native Android apps. When you draw something, you call Java code in the Android framework. The Android system libraries provide components that draw themselves onto a Canvas object, and Android then uses Skia, a graphics engine written in C/C++, to drive the CPU and GPU and complete the drawing on the device.

Typically, cross-platform frameworks create a layer of abstraction on top of the underlying UI libraries of Android and iOS that tries to smooth over the differences between the systems. In this case, the application code is often written in interpreted languages such as JavaScript and interacts with Java-based Android and Objective-C-based iOS systems to display the UI. All of these processes add significant overhead, especially when the UI and application logic interact in complex ways.

By contrast, Flutter minimizes that abstraction overhead by bypassing the system UI component libraries in favor of its own widget set. The Dart code that paints Flutter's visuals is compiled to machine code and rendered with Skia. Flutter also embeds its own copy of Skia, so an app can stay up to date with the latest rendering improvements and performance even when the device has not been updated to the newest system version.

4.1 From User Operations to GPU

The guiding principle of Flutter's rendering mechanism is that it should be simple and fast. Flutter provides a straightforward pipeline for data to flow to the system, as shown in the following flow chart:

  • User input: response to input gestures (keyboard, touch screen)
  • Animation: user interface changes triggered by timers
  • Build: application code that creates the widgets shown on screen
  • Layout: positions and sizes the elements on the screen
  • Paint: converts the elements into a visual representation
  • Composition: turns the output into GPU rendering instructions

Let’s take a closer look at some of these stages.

4.2 Build: From Widget to Element

Take a look at the following code snippet, which represents a simple widget structure:

Container(
  color: Colors.blue,
  child: Row(
    children: [
      Image.network('https://www.example.com/1.png'),
      const Text('A'),
    ],
  ),
);

When Flutter needs to render this fragment, the framework calls the build() method, which returns a subtree of widgets that renders the UI based on the current application state. During this process, the build() method can introduce new widgets as needed, based on the state. The Container's color and child properties above are a typical example: looking at the Container source code, you can see that when the color property is not null, a ColoredBox is inserted to paint the color.

if (color != null)
  current = ColoredBox(color: color!, child: current);

Correspondingly, the Image and Text widgets introduce RawImage and RichText children during the build process. As a result, the resulting widget structure is deeper than what the code shows, as in this scenario:



This is why, when you use the Flutter inspector in Dart DevTools to debug the widget tree, you will find that the actual structure is deeper than the one in your original code.

During the build phase, Flutter translates the widgets described in code into a corresponding tree of Elements, one Element for each widget. Each Element represents a specific instance of a widget at a particular location in the tree hierarchy. There are two basic types of elements:

  • ComponentElement, host to other elements.
  • RenderObjectElement, an Element that participates in the layout or drawing phase.



RenderObjectElement is the bridge between the underlying RenderObject and the corresponding widget, which we’ll cover later.

The Element for any widget can be referenced through its BuildContext, which is a handle to the widget's location in the tree. It is the context in calls such as Theme.of(context), and it is what gets passed to the build() method as a parameter.
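As a small illustrative sketch (the widget name is made up), the context received in build() is the Element behind the widget, and inherited lookups such as Theme.of(context) resolve against that position in the tree:

// A minimal sketch of using BuildContext; the widget name is illustrative.
import 'package:flutter/material.dart';

class ThemedLabel extends StatelessWidget {
  const ThemedLabel({super.key, required this.text});

  final String text;

  @override
  Widget build(BuildContext context) {
    // `context` is the Element for this widget at this location in the tree;
    // Theme.of(context) walks up from that location to the nearest Theme.
    final TextStyle? style = Theme.of(context).textTheme.bodyMedium;
    return Text(text, style: style);
  }
}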

Because widgets, including their parent/child relationships, are immutable, any change to the widget tree (such as replacing Text('A') with Text('B')) effectively returns a new set of widget objects. But that does not mean the underlying representation must be rebuilt. The Element tree persists across frames, and that persistence plays a crucial role in performance: it lets Flutter behave as though the widget tree were completely disposable while caching the underlying representation. By walking only the widgets that changed, Flutter can rebuild just the parts of the Element tree that need reconfiguration.

4.3 Layout and Rendering

Few applications draw only a single widget. An important part of any UI framework is therefore the ability to efficiently lay out a hierarchy of widgets, determining the size and position of each Element before it is rendered on screen.

In the render tree, the base class for every node is RenderObject, which defines an abstract model for layout and painting. This model is deliberately general: it does not commit to a fixed number of dimensions or even to Cartesian coordinates (a polar coordinate system is one example of what it can express). Each RenderObject knows its parent, but knows little about its children beyond how to visit them and how to obtain their layout constraints. This design keeps RenderObject an efficient abstraction that can handle a wide variety of use cases.

During the build phase, Flutter creates or updates, for each RenderObjectElement in the Element tree, an object that inherits from RenderObject. RenderObjects are really primitives; higher-level implementations include RenderParagraph, which renders text, RenderImage, which renders an image, and RenderTransform, which applies a transform before painting its child.



Most Flutter widgets are rendered by an object that inherits from RenderBox, a subclass representing a RenderObject of fixed size in two-dimensional Cartesian space. RenderBox provides the basis of the box constraint model, establishing minimum and maximum widths and heights for each widget to be rendered.

To perform layout, Flutter walks the render tree in a depth-first traversal (DFS), passing constraints down from parent to child. In determining its own size, a child must respect the constraints passed by its parent. The child responds by passing a size back up to the parent, in a bottom-up fashion, within the constraints the parent established.



After a single walk through the tree, every object has a defined size within its parent's constraints and is ready to be painted by a call to its paint() method.

The box constraint model is very powerful as a way of laying out objects in O(n) time:

A parent can dictate the size of a child object by setting maximum and minimum constraints. For example, in a phone app the topmost render object constrains its children to the size of the screen. (Children can still choose how to use that space; for example, they might center themselves within the given constraints.)

A parent can dictate a child's width while leaving the child flexible in its height (or dictate the height while leaving the width flexible). A real-world example is flowed text, which often fills its horizontal constraint and determines its own height based on the amount of text content.
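As a rough sketch of this "constraints go down, sizes come up" protocol (the render object below is hypothetical, not part of the framework), a custom RenderBox receives its parent's constraints in performLayout(), picks a size within them, and reports that size back up:

// A minimal sketch of a custom RenderBox obeying the box constraint model.
// The class is hypothetical and only for illustration.
import 'package:flutter/rendering.dart';

class RenderHalfHeightBox extends RenderBox {
  @override
  void performLayout() {
    // The parent's constraints arrive via `constraints`; the child must pick
    // a size within them and report it back up by setting `size`.
    final double width =
        constraints.hasBoundedWidth ? constraints.maxWidth : 200.0;
    size = constraints.constrain(Size(width, width / 2));
  }

  @override
  void paint(PaintingContext context, Offset offset) {
    context.canvas.drawRect(
      offset & size,
      Paint()..color = const Color(0xFF4CAF50),
    );
  }
}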

The box constraint model also covers scenarios where a child object needs to know how much space is available to render its content. Using the LayoutBuilder widget, a child can inspect the constraints passed down from above and use them appropriately, as follows:

Widget build(BuildContext context) {
  return LayoutBuilder(
    builder: (context, constraints) {
      if (constraints.maxWidth < 600) {
        return const OneColumnLayout();
      } else {
        return const TwoColumnLayout();
      }
    },
  );
}

More information about the constraint and layout system, along with reference examples, can be found in the article on understanding Flutter layout constraints.

The root of all RenderObjects is the RenderView, which represents the overall output of the render tree. When the platform needs to render a new frame (for example, in response to a vsync signal or the completion of a texture update), the compositeFrame() method, which is part of RenderView, is called. It creates a SceneBuilder to trigger an update of the scene. When the scene is complete, RenderView passes the composited scene to the window.render() method in dart:ui, which hands control to the GPU to render it.

More details about the composition and rasterization stages of the rendering pipeline are beyond the scope of this article, but can be found in the discussion of the Flutter rendering process.

5. The Widget tree

How does Flutter create a layout? How do RenderObjects connect to widgets? What is an Element? Let's look at a simple example to see how they relate to each other.







The app we've built is very simple. It consists of three StatelessWidgets: SimpleApp, SimpleContainer, and SimpleText. So what happens when we call Flutter's runApp() method? When runApp() is called, the following happens behind the scenes.

  1. Flutter builds the Widgets tree that contains these three Widgets.
  2. The Flutter traverses the Widget tree and calls createElement() to create the corresponding Element object based on the Widget. The object is then assembled into the Element tree.
  3. A third tree is created, containing the RenderObjects produced by calling createRenderObject() on the widget corresponding to each Element. Here is the state of Flutter after these three steps:






Flutter creates three different trees, one for Widget, one for Element, and one for RenderObject. Each Element has a reference to the corresponding Widget and RenderObject.

So what is a RenderObject? A RenderObject contains all the logic needed to render its widget instance: it is responsible for layout, painting, and hit-testing. It is expensive to create, so we want to cache RenderObjects as much as possible, keeping them in memory as long as we can and even recycling them (because instantiating them is genuinely costly). That is where Element comes in.



Element is the bridge between the immutable Widget tree and the mutable RenderObject tree. Elements excel at comparing two objects, the Widget and the RenderObject: an Element holds the widget's position in the tree and keeps references to the corresponding RenderObject and Widget. Why use three trees instead of one? In short, for performance. When the widget tree changes, Flutter uses the Element tree to compare the new widget tree with the existing RenderObject tree. If the widget and RenderObject types at a given position differ, the RenderObject must be recreated. If the types match, only the RenderObject's configuration needs to change, avoiding a performance-costly RenderObject instantiation.
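A rough sketch of how a single widget spans all three trees is shown below; the names are hypothetical, in the spirit of the SimpleText-style widgets above. The widget is the immutable configuration; createElement() (inherited from LeafRenderObjectWidget) produces the Element; createRenderObject() builds the heavyweight RenderObject once; and updateRenderObject() is the cheap "same type, new configuration" path just described.

// A minimal sketch (hypothetical names) of one widget's role in the three trees.
import 'package:flutter/rendering.dart';
import 'package:flutter/widgets.dart';

class SimpleColorBox extends LeafRenderObjectWidget {
  const SimpleColorBox({super.key, required this.color});

  final Color color; // immutable configuration

  // createElement() is inherited: it returns a LeafRenderObjectElement
  // that ties this widget to its RenderObject at this position in the tree.

  @override
  RenderObject createRenderObject(BuildContext context) =>
      RenderSimpleColorBox(color: color);

  @override
  void updateRenderObject(
      BuildContext context, RenderSimpleColorBox renderObject) {
    // Same widget type at the same position: reuse the RenderObject and
    // just push the new configuration into it.
    renderObject.color = color;
  }
}

class RenderSimpleColorBox extends RenderBox {
  RenderSimpleColorBox({required Color color}) : _color = color;

  Color _color;
  set color(Color value) {
    if (value == _color) return;
    _color = value;
    markNeedsPaint(); // repaint only; no relayout, no re-instantiation
  }

  @override
  void performLayout() {
    // Pick a size within the parent's constraints.
    size = constraints.constrain(const Size(100, 100));
  }

  @override
  void paint(PaintingContext context, Offset offset) {
    context.canvas.drawRect(offset & size, Paint()..color = _color);
  }
}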



Because widgets are lightweight and cheap to instantiate, they are the best tool for describing the state (that is, the configuration) of your app. Heavyweight RenderObjects, which are expensive to create, should be created as rarely as possible and reused as much as possible. As Simon put it: the whole Flutter app is like a RecyclerView. In practice the framework hides Elements from you, so you rarely deal with them directly. The context passed to each widget's build(BuildContext context) method is the Element, which implements the BuildContext interface; this is why two widgets of the same type can still differ, since each is backed by its own Element at its own position in the tree.



Because widgets are immutable, the entire widget tree needs to be rebuilt whenever a widget's configuration changes. For example, when we change the color of a Container to red, the framework triggers a rebuild of the whole widget tree. Then, with the help of the Element tree, Flutter compares the type of the first widget in the new widget tree with the type of the first RenderObject in the RenderObject tree, then the second widget with the second RenderObject, and so on, until the whole widget tree has been compared against the RenderObject tree.







Flutter follows one basic principle: check whether the new widget is of the same type as the old one. If not, remove the Widget, Element, and RenderObject (including their subtrees) from their trees and create new objects. If the type is the same, just update the configuration of the RenderObject and keep going. In our example, the SimpleApp widget is of the same type as before and has the same configuration as the old SimpleAppRender, so nothing happens. The next item in the widget tree is the SimpleContainer widget, whose type is unchanged but whose color has changed, so the RenderObject's configuration has changed. Because SimpleContainer still needs a SimpleContainerRender to render it, Flutter simply updates SimpleContainerRender's color property and asks it to re-render. All other objects remain unchanged.
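As a small sketch of this from the application side (names are made up, not from the example above), toggling a color in setState() rebuilds the lightweight widgets, but the underlying render objects are reused and only their configuration changes:

// A minimal sketch: widgets rebuild cheaply while render objects are reused.
import 'package:flutter/material.dart';

class ColorToggle extends StatefulWidget {
  const ColorToggle({super.key});

  @override
  State<ColorToggle> createState() => _ColorToggleState();
}

class _ColorToggleState extends State<ColorToggle> {
  bool _red = false;

  @override
  Widget build(BuildContext context) {
    // Each tap builds new widget objects; the render objects backing them are
    // kept and only reconfigured, because the types at each position match.
    return GestureDetector(
      onTap: () => setState(() => _red = !_red),
      child: Container(
        width: 100,
        height: 100,
        color: _red ? Colors.red : Colors.blue,
      ),
    );
  }
}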





This process is very fast because Flutter is very good at creating lightweight Widgets. Those heavyweight RenderObjects remain unchanged until their corresponding widgets are removed from the Widget tree. What happens if the Widget’s type changes?







As before, Flutter traverses the widget tree from the top, comparing each widget's type to the type of the corresponding RenderObject in the RenderObject tree.







Because SimpleButton's type differs from that of the corresponding Element in the Element tree (what is actually compared is the RenderObject's type), Flutter removes that Element and its corresponding SimpleTextRender from their trees. Flutter then rebuilds the Element and RenderObject that correspond to SimpleButton.







Once the new RenderObject tree has been rebuilt, the layout is computed and drawn on the screen. Flutter uses many internal optimizations and caching strategies to handle all of this, so you never need to do it manually.



Platform embedder

We already know that the building, layout, compositing, and painting of a Flutter interface are all handled by Flutter itself rather than being translated into the native components of each platform. But the mechanism for obtaining the rendering texture and for hooking into the application lifecycle of the underlying operating system inevitably varies by platform. The Flutter engine itself is platform-agnostic; it presents a stable ABI (application binary interface) to a platform embedding layer, through which the embedder sets up and uses Flutter.

The platform embedding layer is the native OS application that hosts all Flutter content; it acts as the glue between the host operating system and Flutter. When you start a Flutter app, the embedding layer provides the entry point, initializes the Flutter engine, obtains threads for the UI and rasterization, and creates a texture that Flutter can write to. The embedding layer is also responsible for the application lifecycle, including input gestures (such as mouse, keyboard, and touch), window size changes, thread management, and platform messages. Flutter includes platform embedding layers for Android, iOS, Windows, macOS, and Linux; developers can also create custom embedders, as in the available example that supports remoting Flutter sessions to a VNC-style framebuffer.

Each platform has its own set of APIs and constraints. Some brief platform-specific notes:

  • On iOS and macOS, Flutter is loaded into the embedding layer via a UIViewController or NSViewController, respectively. The embedding layer creates a FlutterEngine, which hosts the Dart VM and your Flutter runtime, and a FlutterViewController, which attaches to that FlutterEngine to pass UIKit or Cocoa input events into Flutter and to display frames rendered by the FlutterEngine using Metal or OpenGL.
  • On Android, Flutter is loaded into the embedding layer by default as an Activity. The view is controlled by a FlutterView, which renders Flutter content either as a view or as a texture, depending on the compositing and z-ordering requirements of the Flutter content.
  • On Windows, Flutter is hosted in a traditional Win32 application, with content rendered via ANGLE, a library that translates OpenGL API calls into their DirectX 11 equivalents. Efforts are underway to offer a UWP-based embedding layer for Windows and to replace ANGLE with direct GPU access via DirectX 12.

VII. Integration with other code

Flutter provides multiple mechanisms for interacting with other code, whether you are calling code or APIs written in a language such as Kotlin or Swift, calling a C-based API, embedding native controls in a Flutter application, or embedding Flutter in an existing application.

7.1 Platform Channel

For mobile and desktop applications, Flutter provides platform channels, the ability to call custom platform code: a very simple mechanism for Dart code to communicate with the platform code of the host application. By creating a common channel (encapsulating a name and a codec), developers can send and receive messages between Dart and platform components written in languages such as Kotlin and Swift. Data is serialized from a Dart type (such as Map) into a standard format and then deserialized into the equivalent type in Kotlin (such as HashMap) or Swift (such as Dictionary).



The following is a simple example of a Dart platform channel call and the corresponding receiving handlers in Kotlin (Android) and Swift (iOS):

// Dart side
const channel = MethodChannel('foo');
final String greeting = await channel.invokeMethod('bar', 'world');
print(greeting);
// Android (Kotlin)
val channel = MethodChannel(flutterView, "foo")
channel.setMethodCallHandler { call, result ->
  when (call.method) {
    "bar" -> result.success("Hello, ${call.arguments}")
    else -> result.notImplemented()
  }
}
// iOS (Swift)
let channel = FlutterMethodChannel(name: "foo", binaryMessenger: flutterView)
channel.setMethodCallHandler {
  (call: FlutterMethodCall, result: FlutterResult) -> Void in
  switch (call.method) {
    case "bar": result("Hello, \(call.arguments as! String)")
    default: result(FlutterMethodNotImplemented)
  }
}

More examples of using platform channels, including examples for macOS, can be found in the flutter/plugins repository.

7.2 Render native content in the Flutter application

Because Flutter content is drawn onto a single texture and its widget tree is entirely internal, there is no place for something like an Android view to exist within Flutter's internal model, nor any way to interleave native views with Flutter widgets. This is a problem for developers who want to display native components, such as an embedded browser, inside a Flutter application.

Flutter solves this problem by introducing platform view widgets (AndroidView and UiKitView) that let developers embed native content on each platform. Platform views can be composed with other Flutter content; each of these widgets acts as a bridge between the underlying operating system and Flutter. On Android, for example, AndroidView performs three main functions:

  • Making a copy of the graphics texture rendered by the native view and presenting it to the Flutter rendering layer for compositing on every Flutter frame.
  • Responding to hit testing and input gestures, and translating them into the equivalent native input events.
  • Creating an analog of the accessibility tree and passing commands and responses between the native layer and the Flutter layer.

Inevitably, however, this synchronization carries some overhead. The approach is therefore generally best suited to complex controls, such as Google Maps, that are impractical to reimplement in Flutter.

Usually a Flutter application instantiates these widgets in its build() method, based on a platform check. For example, in the google_maps_flutter plugin:

if (defaultTargetPlatform == TargetPlatform.android) {
  return AndroidView(
    viewType: 'plugins.flutter.io/google_maps',
    onPlatformViewCreated: onPlatformViewCreated,
    gestureRecognizers: gestureRecognizers,
    creationParams: creationParams,
    creationParamsCodec: const StandardMessageCodec(),
  );
} else if (defaultTargetPlatform == TargetPlatform.iOS) {
  return UiKitView(
    viewType: 'plugins.flutter.io/google_maps',
    onPlatformViewCreated: onPlatformViewCreated,
    gestureRecognizers: gestureRecognizers,
    creationParams: creationParams,
    creationParamsCodec: const StandardMessageCodec(),
  );
}
return Text(
    '$defaultTargetPlatform is not yet supported by the maps plugin');

As mentioned above, AndroidView and UiKitView typically communicate with native code using the platform channel mechanism. Platform views are not yet supported on desktop platforms, but this is not an architectural limitation; desktop support may be added in the future.

7.3 Hosting Flutter content in a parent application

The opposite of the previous scenario is embedding a Flutter widget into an existing Android or iOS application. As described earlier, a newly created Flutter app on mobile runs in an Android Activity or an iOS UIViewController. Developers can use the same embedding API to integrate Flutter content into an existing Android or iOS app.

The Flutter module template is designed to be easy to embed: it can be consumed as a source-code dependency in an existing Gradle or Xcode build definition, or packaged as an Android Archive (AAR) or iOS Framework binary so that other developers can use it without installing Flutter. The Flutter engine needs a short initialization period to load the Flutter shared libraries, initialize the Dart runtime, create and run a Dart isolate, and attach the rendering layer to the UI. To minimize any delay in showing the Flutter interface, it is best to initialize the Flutter engine during overall application initialization, or at least before the first Flutter page is shown, so that users do not experience a sudden pause when the first Flutter page loads. In addition, because the Flutter engine is decoupled from the pages it displays, multiple Flutter pages can reuse a single engine and share the memory cost of loading the necessary libraries. More on incorporating Flutter into existing Android and iOS apps can be found in the article on controlling load order and optimizing performance and memory.

If you are interested in more of Flutter's internal details, the white paper on how Flutter works provides a good introduction to the framework's design philosophy. References:

  1. The hierarchical structure of Flutter
  2. An article on Flutter architecture and rendering
  3. Flutter FAQs and answers
  4. Overview of the Flutter architecture