This article was first published on the "Android platform Kotlin" account. Welcome to follow!

I’m Wanbo. Long time no see.

This year's Google I/O was held online on the 18th and 19th. I found time to watch three of the session videos, and today I'd like to share some of the new technologies that I found interesting.

This installment will focus on the latest developments in AI and Android.

AI progress

In his 2017 Google I/O keynote, Google CEO Sundar Pichai announced that Google was shifting to an AI-first strategy, and since then AI has had a huge impact across Google's platforms and services. AI is everywhere.

At this year's Google I/O, Google introduced two new natural language models: LaMDA and MUM.

LaMDA

LaMDA is a new model for handling conversational language. In real life, human conversation switches context constantly: one minute your family is telling you to eat on time, and the next they're asking whether you've found a date yet.

From an AI's point of view, however, this is very hard to handle. At the current stage, most dialogue systems can only answer questions within a narrow domain; when you suddenly switch to a new topic, the AI may just say: "Sorry, I don't know how to reply to that."

The new LaMDA model unlocks a more natural way to chat, much like talking with a knowledgeable friend who can keep the conversation going even when it drifts off topic.

MUM

MUM (Multitask Unified Model) is a unified multitasking model for search scenarios. Like LaMDA, it is a natural language model built on the Transformer architecture, but MUM can extract key information from context to give you the right feedback on a question.

Just as if you were asking an experienced friend, MUM improves the understanding of human questions and improves search, moving from a keyword-centric search process to intelligent search based on context and intent.

Where MUM goes further than LaMDA is that it is multimodal: it can interpret text, video, images, audio, and other forms of information, analyze the content and understand the intent behind it, and then give you the most appropriate search results.

TPU V4 chip and quantum computer

The new TPU v4 chip is twice as fast as its predecessor. Multiple TPUs can be connected together to form a supercomputer, which Google calls a Pod. A Pod consists of 4,096 v4 chips and can deliver more than 1 exaflop of computing power.

Speaking after announcing the new TPU chip, Pichai said quantum computing is the future of computing, because there are many problems that classical computing won't solve anytime soon.

It’s still early days, but Google plans to deliver a commercially available quantum computer by 2029.

Seeing this, as an Android programmer I suddenly felt a little deflated 😂. We often tell others that we are practitioners of internet technology, but in fact we are far from the real cutting edge; we are just code movers…

Back to our area of concern, let’s take a look at the latest Android developments.

Android progress

This year's Android 12 update focuses on design and interaction, and you could say it has become more iOS-like. I don't think that's a bad thing. (Yes, I'm talking about you 👉 Material Design)

1. New design and interaction

Material You

What impressed me most about Material You, the new design language introduced in Android 12, was a question the speakers asked themselves when presenting the new UI specification: "What would it be like if design followed feelings instead of rules and functions?"

That’s a big change.

The concrete design specification won't be released until the fall, but past Android versions have in fact shipped with built-in theme accent colors, such as cyan in the AOSP code and blue on the Pixel. Android 12 expands this into a rich color palette: the system uses the built-in palette to generate preset styles, and system components adapt to them automatically.

In addition to the preset styles, Android 12 also provides developers with APIs to query and combine these colors, assigning background colors, accent colors, foreground colors, and so on, each available in tonal steps from 0 to 1000.

If you're familiar with the Material Design API, you'll know that the Material Design team long ago established standard names for the colors of different roles in an app: backgroundColor for the background, surfaceColor for containers drawn above the background, and primaryColor for the theme color.

So Android 12 will most likely follow this convention when combining styles from the built-in palette. Whether domestic ROMs will retain this feature, though, is still unclear.
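As a sketch of what this might look like in practice: Android 12 (API 31) exposes the generated palette as public color resources such as system_accent1_500 and system_neutral1_10, where the suffix is the tonal value on the 0 to 1000 scale. The theme name and parent below are illustrative assumptions; only the @android:color/system_* resources are the actual Android 12 API.

```xml
<!-- values-v31/themes.xml: theme name and parent are illustrative -->
<style name="Theme.MyApp" parent="Theme.MaterialComponents.DayNight.NoActionBar">
    <!-- Tone 500 of the primary accent palette generated by the system -->
    <item name="colorPrimary">@android:color/system_accent1_500</item>
    <item name="colorSecondary">@android:color/system_accent2_500</item>
    <!-- A very light neutral tone (10 on the 0-1000 scale) for backgrounds -->
    <item name="android:colorBackground">@android:color/system_neutral1_10</item>
</style>
```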

The new Widget

In Android 12, widgets have been completely redesigned, improving both their UI and their interactivity. iOS 14 was the first iOS release to support home screen widgets, but an iOS widget is display-only: any interaction requires jumping into the app. Android 12 widgets remain interactive on the home screen, letting you perform certain shortcuts without opening the app.

Android 12 also provides a new widget picker. Another difference from iOS is sizing: iOS widgets come only in three fixed sizes (small, medium, and large), while Android 12 continues the previous development model, in which widgets can be freely resized within an NxN grid.
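For reference, a minimal widget metadata sketch is below; targetCellWidth, targetCellHeight, and previewLayout are the attributes Android 12 (API 31) added for grid-based sizing and the new picker preview. The file, layout, and string names are hypothetical.

```xml
<!-- res/xml/clock_widget_info.xml (file and resource names are hypothetical) -->
<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="110dp"
    android:minHeight="110dp"
    android:targetCellWidth="2"
    android:targetCellHeight="2"
    android:resizeMode="horizontal|vertical"
    android:updatePeriodMillis="1800000"
    android:initialLayout="@layout/clock_widget"
    android:previewLayout="@layout/clock_widget"
    android:description="@string/clock_widget_description" />
```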

Frosted glass, rounded corners, list damping effect

I had long wondered why Android didn't support blur all these years. In my mind, Gaussian blur and this kind of blur are different things: Gaussian blur achieves its effect by manipulating a Bitmap's pixel matrix, while system blur looks more like a live effect overlaid on the original View.

At this year's Google I/O I finally got the answer: the very first Android phone, the T-Mobile G1, actually supported blur, but Android later removed it due to performance and design concerns. It wasn't until this year's Android 12 release that the effect was restored.
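The restored effect is exposed through the new RenderEffect API in Android 12 (API 31). A minimal sketch, assuming you have an existing View you want to blur:

```kotlin
import android.graphics.RenderEffect
import android.graphics.Shader
import android.os.Build
import android.view.View

// Blur a View's content on Android 12+ (API 31); no-op on older versions.
fun View.applyBlur(radius: Float = 20f) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
        setRenderEffect(
            RenderEffect.createBlurEffect(radius, radius, Shader.TileMode.CLAMP)
        )
    }
}
```

Passing null to setRenderEffect removes the effect again.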

Also updated are an API for customizing View corner radii and a damping (overscroll stretch) effect for lists, something I've envied on iOS for a while 😂. Android also gets its own special touch: a ripple animation with a particle sparkle effect.

Android 12 also provides a customizable app launch (splash screen) animation; developers can supply a custom AnimatedVectorDrawable for the launch icon. But given how common splash-screen ads are in domestic apps, I suspect few of them will adopt it.
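The launch animation is configured through new theme attributes in Android 12 (API 31). A minimal sketch; the theme, drawable, and color names here are placeholders:

```xml
<!-- values-v31/themes.xml: theme, drawable, and color names are placeholders -->
<style name="Theme.MyApp.Starting" parent="Theme.MaterialComponents.DayNight.NoActionBar">
    <item name="android:windowSplashScreenBackground">@color/splash_background</item>
    <!-- An AnimatedVectorDrawable played while the app starts -->
    <item name="android:windowSplashScreenAnimatedIcon">@drawable/avd_launch_logo</item>
    <item name="android:windowSplashScreenAnimationDuration">800</item>
</style>
```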

In addition to all this, toasts have been updated to include the app icon. Notifications have changed yet again, too; I'm honestly not sure why Google redesigns notifications every single year.

Overall, the design and interaction updates are definitely the highlight of Android 12 this year, and they make me feel that Android really is different now.

2. Privacy updates

An indicator when the microphone or camera is in use

Privacy dashboard

(Don't a lot of domestic phones already support this?)

A notification when an app reads the clipboard

Location permission now lets you choose between precise and approximate location

3. Image system update

On the imaging side, Android 12 adds support for the AVIF image format, which can produce smaller files than JPEG while retaining more detail.

Video encoding compatibility has also been optimized: an app can now declare which video formats it supports, and when the user opens media in a format the app doesn't support, such as HEVC (H.265), HDR10, or HDR10+, the system automatically transcodes it into AVC (H.264).
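The declaration lives in a resource file that the manifest references through a property named android.media.PROPERTY_MEDIA_CAPABILITIES; format names such as HEVC, HDR10, and HDR10Plus are the documented values for this mechanism. A minimal sketch:

```xml
<!-- res/xml/media_capabilities.xml -->
<media-capabilities xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- The app can play HEVC itself, so no transcoding is needed for it -->
    <format android:name="HEVC" supported="true"/>
    <!-- HDR formats the app cannot handle will be transcoded to AVC -->
    <format android:name="HDR10" supported="false"/>
    <format android:name="HDR10Plus" supported="false"/>
</media-capabilities>
```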

So much for Android. Finally, I’d like to share one of the most amazing products in Google I/O.

Project Starline

By capturing a person with ultra-high-resolution cameras and depth sensors, Project Starline displays a life-size 3D image of them on a special screen, so that two people can talk as if they were face to face. This product is genuinely awesome.

I haven't yet watched all of the Google I/O session videos. If the remaining videos contain other interesting content worth learning from, I'll share it with you as soon as I can.