This is the second day of my participation in the August More Text Challenge. For details, see: August More Text Challenge.

Nayuta has already implemented this effect in Flutter, so how could Jetpack Compose fall behind?

First of all, I would like to thank the team and Nayuta. Some of the material used below also comes from Nayuta; thanks again.

Approach

From the materials provided by the team, the glasses-free 3D effect splits the whole image into three layers: upper, middle, and lower. When the phone is rotated left, right, up, or down, the upper and lower images move in opposite directions while the middle one stays put, producing a visual 3D effect.

For Jetpack Compose, the main idea is as follows:

  1. Use the Compose Canvas to draw the three layers of images, and use translate to shift the upper and lower layers;
  2. Register a listener for the phone's gyroscope sensor to get the rotation angles around the x, y, and z axes as the phone rotates;
  3. Calculate the translation distance of each image from its rotation angle, capping it at a maximum translation distance;
  4. Store the translation distance in a variable marked with mutableStateOf, so that updating it refreshes the UI with the translation effect.
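Steps 2 to 4 can be sketched end to end as a small state holder, independent of Compose and the sensor APIs. This is only an illustrative sketch; all names here (ParallaxTracker, maxAngle, maxOffset, onGyroscope) are hypothetical and not from the original article:

```kotlin
// Hypothetical sketch of steps 2-4: accumulate gyroscope angular velocity
// into rotation angles, clamp them to a maximum angle, then map the angles
// proportionally to translation distances for the Canvas.
class ParallaxTracker(private val maxAngle: Float, private val maxOffset: Float) {
    private var angleX = 0f
    private var angleY = 0f

    // gyroX/gyroY: angular velocities in rad/s; dt: seconds since the last event.
    // Returns the (xDistance, yDistance) pair that would feed the state values.
    fun onGyroscope(gyroX: Float, gyroY: Float, dt: Float): Pair<Float, Float> {
        angleX = (angleX + gyroX * dt).coerceIn(-maxAngle, maxAngle)
        angleY = (angleY + gyroY * dt).coerceIn(-maxAngle, maxAngle)
        // translation distance = rotation angle / max angle * max translation distance
        val xDistance = angleY / maxAngle * maxOffset
        val yDistance = angleX / maxAngle * maxOffset
        return xDistance to yDistance
    }
}
```

Feeding the returned pair into the mutableStateOf-backed xDistance and yDistance from step 4 would then trigger recomposition.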

Implementation

Following the idea above, we first draw the three static images with Compose. Compose offers several ways to draw images, such as Image and Canvas; since the images will need to move later, Canvas is used here.


val imageBack = ImageBitmap.imageResource(id = R.drawable.back)
val imageMid = ImageBitmap.imageResource(id = R.drawable.mid)
val imageFore = ImageBitmap.imageResource(id = R.drawable.fore)

Canvas(modifier = Modifier.fillMaxSize()) {
    // bottom layer
    drawImage(imageBack)
    // middle layer
    drawImage(imageMid)
    // top layer
    drawImage(imageFore)
}

The static effect diagram is as follows:

Still image loading is easy, so how do you make the image move?

Compose's Canvas provides a translate method that offsets the x and y coordinates by a given number of pixels; its parameters are the translation distances along the x and y axes, defined here as xDistance and yDistance. Since only the upper and lower images move, apply translate to those two layers in the Canvas, as follows:

translate(-xDistance, -yDistance) {
    drawImage(imageBack)
}

drawImage(imageMid)

translate(xDistance, yDistance) {
    drawImage(imageFore)
}

The values of xDistance and yDistance are passed in. Note that the upper and lower images move in opposite directions, so one of the layers receives the negated values. With this, each image is translated by xDistance and yDistance.

How do the values of xDistance and yDistance change dynamically?

Compose provides state via mutableStateOf: data wrapped in mutableStateOf is declared as state, and when that state later changes, every composable that reads it is redrawn. In other words, make xDistance and yDistance state values; since the Canvas reads xDistance, the image is redrawn whenever it changes, which produces the translation effect.

As follows:

var xDistance by remember { mutableStateOf(0f) }
var yDistance by remember { mutableStateOf(0f) }

xDistance and yDistance are now state-backed. Next, their values need to be driven by the phone's gyroscope. Before getting to the sensor, there is one problem: when the images translate, the screen background would show at the edges they move away from. To prevent this, enlarge the images so that even at the maximum translation the background never becomes visible; here the Canvas is scaled to 1.3 times its original size.

Canvas(
    modifier = Modifier
        .fillMaxSize()
        .scale(1.3f)
) {}

The result is something like this:

Mobile phone gyroscope sensor

The image moves with the phone's rotation thanks to the sensor.

As shown in the figure, the sensor coordinate system has three axes: X, Y, and Z. Flipping the phone left and right rotates it around the Y axis; flipping it up and down rotates it around the X axis; laying it flat on a table and spinning it rotates it around the Z axis.

As the phone rotates, the sensor reports the angular velocity around each of the three axes, and this determines the translation distance of the images.

First of all, how do we listen to the sensor? Android already wraps this for us in an API, SensorManager; just create it as documented.

val context = LocalContext.current
val sensorManager: SensorManager? =
    getSystemService(context, SensorManager::class.java)
val sensor = sensorManager?.getDefaultSensor(Sensor.TYPE_GYROSCOPE)

After obtaining the SensorManager via getSystemService, request the TYPE_GYROSCOPE sensor, that is, the gyroscope, and then listen for the angular velocities around the x, y, and z axes.

sensorManager?.registerListener(object : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent?) {
        // X-axis angular velocity
        speedX = event?.values?.get(0)!!
        // Y-axis angular velocity
        speedY = event?.values?.get(1)!!
        // Z-axis angular velocity
        speedZ = event?.values?.get(2)!!
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}, sensor, SensorManager.SENSOR_DELAY_GAME)

The SensorEventListener reports the phone's angular velocity around each of the three axes. Since the gyroscope reads angular velocity, multiplying it by the elapsed time gives the rotation angle, so we just accumulate the rotation angle.

// Accumulate the phone's rotation angle around each axis
angularX += event.values[0] * dT
angularY += event.values[1] * dT
angularZ += event.values[2] * dT

// Clamp the x and y angles to a maximum boundary
if (angularY > mMaxAnular) {
    angularY = mMaxAnular.toFloat()
} else if (angularY < -mMaxAnular) {
    angularY = -mMaxAnular.toFloat()
}

if (angularX > mMaxAnular) {
    angularX = mMaxAnular.toFloat()
} else if (angularX < -mMaxAnular) {
    angularX = -mMaxAnular.toFloat()
}
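The accumulate-and-clamp logic can also be expressed with Kotlin's coerceIn. A small pure helper as a sketch, with hypothetical names not taken from the article:

```kotlin
// Hypothetical helper: accumulate angular velocity over dtSeconds and
// clamp the resulting angle to [-maxAngle, maxAngle].
fun accumulateAngle(
    current: Float,
    angularVelocity: Float,
    dtSeconds: Float,
    maxAngle: Float
): Float = (current + angularVelocity * dtSeconds).coerceIn(-maxAngle, maxAngle)
```

Being a pure function, this is easy to unit test, unlike logic buried inside onSensorChanged.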

Once the angle is calculated, we still need the translation distance for the images. A maximum translation boundary was set for the images earlier, and a maximum rotation angle is set here as well, so the translation distance can be derived from the angle proportionally.

From the proportion rotation angle / maximum angle = translation distance / maximum translation distance, we get: translation distance = rotation angle / maximum angle × maximum translation distance.

val xRadio: Float = (angularY / mMaxAnular).toFloat()
val yRadio: Float = (angularX / mMaxAnular).toFloat()
xDistance = xRadio * maxOffset
yDistance = yRadio * maxOffset

With the distance calculation done, the images translate as the phone moves. But there is a problem: the onSensorChanged callback fires very frequently, and when the phone rotates around the Y axis the images translate vertically as well, causing irregular jitter. Rotation around the Y axis should only translate the images horizontally, and likewise rotation around the X axis should only translate them vertically. So add a condition that picks the dominant axis of motion.

x = Math.abs(event.values[0])
y = Math.abs(event.values[1])
z = Math.abs(event.values[2])

if (x > y + z) {
    xDistance = 0f
    yDistance = yRadio * maxOffset
} else if (y > x + z) {
    xDistance = xRadio * maxOffset
    yDistance = 0f
}
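The dominant-axis check can also be isolated into a pure function for testing. A sketch with hypothetical names (gx, gy, gz stand for the raw angular velocities; the xRadio/yRadio/maxOffset parameters mirror the snippet's variables):

```kotlin
import kotlin.math.abs

// Hypothetical sketch of the dominant-axis rule: rotation mostly around the
// X axis moves the images vertically only, rotation mostly around the Y axis
// horizontally only; otherwise both offsets apply.
fun dominantAxisOffsets(
    gx: Float, gy: Float, gz: Float,
    xRadio: Float, yRadio: Float, maxOffset: Float
): Pair<Float, Float> {
    val x = abs(gx)
    val y = abs(gy)
    val z = abs(gz)
    return when {
        x > y + z -> 0f to yRadio * maxOffset // X-axis rotation: vertical only
        y > x + z -> xRadio * maxOffset to 0f // Y-axis rotation: horizontal only
        else -> xRadio * maxOffset to yRadio * maxOffset
    }
}
```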

Ok, that’s it. Let’s look at the final result:

Conclusion

App designs on the market are mostly alike, and an interesting idea always makes people take a second look. Thanks again to the Free app team for providing this idea. By the way, while collecting energy in Ant Forest today, I noticed the trees have a bit of this effect too; take a look if you are curious.

References:

Implementation of the Free app's naked-eye 3D effect

Here you go! The Free app's naked-eye 3D effect, imitated in Flutter | August More Text Challenge

Recommended Reading:

Play with the Compose Theme list