Author | white well-off (dust)
New Retail Product | Alibaba Tao Technology


Traditional content enhancement enriches content and improves its expressiveness through filters, labels, stickers and similar means. In the era of fast short-video consumption typified by Douyin, playful, game-like effects lower the cost of creation for users and add tension and variety to the content form; this is one direction and line of thinking that content communities can learn from.






What is interactive content

In a monotonous feed of community content, everything on the timeline is presented mechanically. Interactive, innovative gameplay turns content cards into something personalized and scene-aware, giving users fresher and more interesting ways to interact. Content is no longer dead to the user: it becomes immersive, interactive and playable, delivering a novel experience.




Interactive gameplay is a form of visual reinforcement: it reconstructs content through interactive innovation and gives static content more spatial expressiveness. The main means are:
1. Dimension promotion: depth maps and dimensional reconstruction; naked-eye 3D and dimensional breakthrough.
2. Perspective transformation: multi-view images and perspective changes; deformation maps and planar transformations.
3. Global correlated deformation: characters, aging, makeup, etc.
4. Local correlated deformation: local wobble, local flow.
5. Other transformations: physical, particle and weather transformations.
6. Scene synthesis, scene enhancement, etc.
Gameplay innovations can be made in picture, album and video scenes, combined with the specific cases below.




Specific case introduction

▐ Depth map



People, still lifes and scenery are separated from a 2D image using depth information, and the perspective is then shifted according to the gyroscope to produce a pseudo-3D interactive effect. The classic example is Facebook's 3D photos.


Gameplay Analysis:



1. First extract a depth map from the original image. The dual-camera system on iOS can capture photos with depth information, or deep learning can be used to estimate depth directly from the original image.




2. The attitude angles output by the gyroscope represent the change of the user's viewpoint; pitch and roll determine how the view shifts relative to the depth map, as sketched below.
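A minimal sketch of step 2, assuming the pitch and roll are passed into the fragment shader as uniforms; uPitch, uRoll and uParams are hypothetical names, not the project's actual interface, and the final sampling line follows the core shader shown below:

precision mediump float;
varying vec2 vTextCoord;
uniform sampler2D colorMap;
uniform sampler2D depthMap;
uniform float uPitch;   // device pitch in radians (assumed uniform name)
uniform float uRoll;    // device roll in radians (assumed uniform name)
uniform vec2  uParams;  // x: focus depth, y: displacement strength

void main() {
    // map the tilt to a small parallax offset; clamp so large tilts do not tear the image
    vec2 offset = vec2(clamp(uRoll, -0.35, 0.35), clamp(uPitch, -0.35, 0.35)) * 0.05;
    float depth = texture2D(depthMap, vTextCoord).r;
    // pixels far from the focus plane move more, producing the pseudo-3D parallax
    vec2 coord = vTextCoord + offset * (depth * -1.0 + uParams.x) * uParams.y;
    gl_FragColor = texture2D(colorMap, coord);
}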





Core implementation: use the depth map, combined with the focus plane and depth, to produce a gyroscope-driven pixel offset that shifts the perspective.

vec4 dep = texture2D(depthMap, vTextCoord);
vec2 disCoords = vTextCoord + offset * (dep.r * -1.0 + params.x) * params.y;
gl_FragColor = texture2D(colorMap, disCoords);

Effect display:


Meaning of gameplay:



The depth map is really a pseudo-3D effect (2D plus depth). Slightly rotating the viewpoint with the gyroscope gives viewers a sense of closeness; the same static image gains a surprising interaction, so instead of being swiped past, a picture gives viewers a changing perspective and full control, and every interesting detail can be played with over and over again.


▐ Local wobble



Based on a locally selected wobble area, part of the content follows gestures in the feed stream to create a wobble interaction: coffee, cake, pudding, fried eggs, faces, pets, and all kinds of novel and interesting creative gameplay.


Gameplay Analysis:



1. Select the wobble area through a Bezier-curve circle generated from control points.
2. As the gyroscope moves, a pixel offset is generated according to each point's position relative to the center.
3. The displacement falls off linearly with distance from the center, and the offset decays gradually over time (see the sketch below).
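The getOffset helper is referenced in the core code but not shown. The following is only a hedged sketch of what such a helper could look like under the rules above (linear falloff from the center, damped oscillation over time); all uniform names are assumptions:

precision mediump float;
varying vec2 TextureCoordsVarying;
uniform sampler2D Texture;
uniform vec2  uCenter;     // wobble center (assumed uniform names)
uniform float uRadius;     // radius of the selected wobble area
uniform float uTime;       // seconds since the gesture / gyro impulse
uniform vec2  uDirection;  // unit direction of the shake
uniform float uAmplitude;  // maximum displacement in texture units

// linear falloff from the center, damped oscillation over time
// (the real helper also uses the four control points to bound the area)
vec2 wobbleOffset(vec2 uv) {
    float falloff = max(1.0 - distance(uv, uCenter) / uRadius, 0.0);
    float wave = sin(uTime * 12.0) * exp(-uTime * 3.0);
    return uDirection * uAmplitude * falloff * wave;
}

void main() {
    vec2 offset = wobbleOffset(TextureCoordsVarying);
    gl_FragColor = vec4(texture2D(Texture, TextureCoordsVarying + offset).rgb, 1.0);
}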




Core implementation: based on each point's position relative to the offset center, combined with the gyroscope shake, generate a periodic pixel offset:

vec2 offset = getOffset(sketch.PointLT, sketch.PointRT, sketch.PointRB, sketch.PointLB, sketch.Center, sketch.Time, TextureCoordsVarying, sketch.Direction, sketch.Amplitude);
vec4 mask = texture2D(Texture, TextureCoordsVarying + offset);
gl_FragColor = vec4(mask.rgb, 1.0);

Effect display:



Meaning of gameplay:



By animating local content, browsing and consumption gain a more tangible and interesting feel, and the core expressiveness of the material comes through better than with labels or filters. Shaking a gesture shakes the content, giving users a more physical sense of its details.


▐ Local flow



Blue sky and white clouds, sea and sky, plush hair, water, beaches, clothes and skirts, smoke and other materials flow dynamically, making the content more vivid and expressive.





Gameplay Analysis:

1. Collect the flow directions through touch gestures; these become the matched moving points in the transformation.
2. Fix anchor points on the stationary parts of the picture to keep non-fluid regions out of the motion.
3. Continuously update the progress through triangulation and affine transformation so that the whole picture flows.





Core implementation: triangulate the one-to-one corresponding feature points and apply affine transformations:

Rect rect(0, 0, size.width, size.height);
Subdiv2D subdiv(rect);
for (vector<Point2f>::iterator it = points.begin(); it != points.end(); it++)
    subdiv.insert(*it);
std::vector<Vec6f> triangleList;
subdiv.getTriangleList(triangleList);
applyAffineTransform(warpImage1, img1Rect, t1Rect, tRect);
applyAffineTransform(warpImage2, img2Rect, t2Rect, tRect);
Mat imgRect = (1.0 - alpha) * warpImage1 + alpha * warpImage2;
multiply(imgRect, mask, imgRect);
multiply(img(r), Scalar(1.0, 1.0, 1.0, 1.0) - mask, img(r));
img(r) = img(r) + imgRect;



Effect display:

Meaning of gameplay:



A dynamic image displays the picture's hidden information more vividly, making the visuals more three-dimensional and full.


▐ Multi-view



A new way for users to interact: richer than a single picture yet simpler than a short video, it lets users actively control the viewpoint and free their imagination, covering time wheels, long exposures, small animations, 3D display, panoramic preview and other effects.


This interaction of twisting the phone to see different photos creates a subtle sensation for the viewer: "I can control this!" or "I can look at whichever picture I want!" Putting control in the user's hands improves the playability of images and increases content consumption.




Gameplay Analysis:



1. Shoot a 6s short video and capture 4 frames per second, i.e. one picture every 250ms, similar to taking photos. Alternatively, 24 images can be taken independently; they can be related or relatively independent and need not be taken at the same time. To convey a feeling of time passing, such as the progress of a decoration project, old and new photos can perfectly well be shown together.
2. Switch between the photos from different angles according to the phone's gyroscope, as sketched below.
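In practice the frame to display would most likely be chosen on the CPU from the captured image list; the following is only a minimal GLSL sketch of the angle-to-frame mapping, assuming the 24 frames have been packed into a 6x4 texture atlas (uAtlas, uAngle and uMaxAngle are hypothetical names):

precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D uAtlas;    // assumed: 24 frames packed in a 6x4 grid
uniform float uAngle;        // current roll angle, radians
uniform float uMaxAngle;     // tilt that maps to the last frame

void main() {
    // normalise the tilt into [0,1] and pick one of the 24 frames
    float t = clamp(uAngle / uMaxAngle * 0.5 + 0.5, 0.0, 1.0);
    float index = floor(t * 23.0 + 0.5);           // 0..23
    vec2 cell = vec2(mod(index, 6.0), floor(index / 6.0));
    vec2 cellSize = vec2(1.0 / 6.0, 1.0 / 4.0);
    gl_FragColor = texture2D(uAtlas, (cell + vTexCoord) * cellSize);
}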

Effect display:


Meaning of gameplay:

  • Compared with a static image: more detailed expression, richer content, more three-dimensional, with story, fun and interactivity.
  • Compared with video: lighter-weight content expression, more interesting interaction, and no notion of repeated playback.
  • Compared with a GIF: expresses mood, is lightweight and interactive, extends the fun, and can carry larger scenes than an emoticon-style animation; viewers can slowly study the details from each viewpoint, making the picture more scene-like, with context and story.
Examples: a time wheel can dynamically show the stages of decorating a house, or the change of indoor lighting from cold to warm; 3D display can show an indoor lamp or pendant from any angle; scene preview works in panorama mode, turning the phone to see the layout of the whole room and pausing to observe from any angle.


▐ Scene enhancement



Unlike AR/VR over a real-time video stream, scene enhancement uses scene synthesis on static content to create extra elements fused with the scene, enhancing the dynamism and subtle atmosphere of the whole picture. Water vapor, water droplets, candles, flashes, fire particles, smoke, flowers and trees can all enhance a scene, and can also be combined with content and campaigns for festive, interactive scene play, such as fireworks, Christmas and snowflake atmospheres, to draw users into interacting.



Gameplay Analysis:



1. Prepare the synthetic scene as a video file containing the scene layer and its transparency mask layer.
2. In pre-processing, fuse the original image with the scene through the mask layer at the edited position.





Core implementation: pre-render the scene and mask layers to an FBO, then composite the scene according to transparency:

if (uTextureType == 0) {
    // render only the top half (the scene layer) to the FBO
    vec2 topTexCoord = vec2(vTexCoord.x, vTexCoord.y * 0.5);
    gl_FragColor = texture2D(sTexture, topTexCoord);
} else if (uTextureType == 1) {
    // render the bottom half (the mask layer) to the FBO
    vec2 botTexCoord = vec2(vTexCoord.x, 1.0 - mod(1.0 - vTexCoord.y * 0.5, 0.5));
    gl_FragColor = texture2D(sTexture, botTexCoord);
} else if (uTextureType == 2) {
    // merge the FBO layers into the main scene according to the mask layer
    vec4 bgColor = texture2D(samplerBg, vTexCoord);
    vec4 topColor = texture2D(samplerTop, vTexCoord);
    vec4 btmColor = texture2D(samplerBtm, vTexCoord);
    gl_FragColor = bgColor * (1.0 - btmColor.r) + topColor * btmColor.r;
}

Effect display:





Meaning of gameplay:



Through scene composition, thousands of interesting variations can be spun off, enriching the content with extra expressiveness.


▐ Weather simulation



Simulated weather effects such as fog, sunshine, white clouds, snowflakes, raindrops on the ground, misted glass, ripples, rainbows, bubbles and stars.


Gameplay Analysis:



The weather effects are implemented purely in shaders and blended with the original content; a minimal fog sketch follows.
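As an illustration only, here is a minimal fog/mist fragment shader of the kind described: two octaves of value noise drift over time and are blended over the original content. uTime and uIntensity are assumed uniform names, not the project's actual interface:

precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D sTexture;
uniform float uTime;        // seconds, drives the drift
uniform float uIntensity;   // 0..1 fog strength (assumed uniform)

float hash(vec2 p) { return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453); }

// simple value noise built from the hash above
float noise(vec2 p) {
    vec2 i = floor(p), f = fract(p);
    f = f * f * (3.0 - 2.0 * f);
    return mix(mix(hash(i), hash(i + vec2(1.0, 0.0)), f.x),
               mix(hash(i + vec2(0.0, 1.0)), hash(i + vec2(1.0, 1.0)), f.x), f.y);
}

void main() {
    vec4 base = texture2D(sTexture, vTexCoord);
    // two octaves of noise scrolled in different directions read as drifting mist
    float fog = 0.6 * noise(vTexCoord * 4.0 + vec2(uTime * 0.05, 0.0))
              + 0.4 * noise(vTexCoord * 8.0 - vec2(0.0, uTime * 0.03));
    fog = smoothstep(0.35, 0.85, fog) * uIntensity;
    gl_FragColor = vec4(mix(base.rgb, vec3(1.0), fog), base.a);
}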


Effect display:





Meaning of gameplay:



Native GLSL can achieve a variety of high-frame-rate simulated weather effects at low cost, easy to verify and release online. Combining some weather effects with the gyroscope also creates interesting interactive gameplay.


▐ Transitions & particles

Transition gameplay: picture switching and video transition effects.
Douyin-style effects: glitch, flicker, out-of-body and other Douyin effects.
Particle systems: snowflakes, ribbons, light rings and other standard plist particle files.


Gameplay Analysis:



Bring the special-effects gameplay of Douyin into the content community through transitions and particle systems; a minimal transition sketch follows.
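As a hedged illustration of the transition side, a minimal directional wipe between two pictures might look like this; uTexFrom, uTexTo and uProgress are hypothetical names, not the actual component interface:

precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D uTexFrom;   // outgoing picture (assumed uniform names)
uniform sampler2D uTexTo;     // incoming picture
uniform float uProgress;      // 0 -> 1 over the life of the transition

void main() {
    vec4 src = texture2D(uTexFrom, vTexCoord);
    vec4 dst = texture2D(uTexTo, vTexCoord);
    // soft wipe moving left to right: everything left of the edge already shows the new picture
    float edge = smoothstep(uProgress - 0.05, uProgress + 0.05, vTexCoord.x);
    gl_FragColor = mix(dst, src, edge);
}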


Effect display:




Meaning of gameplay:



Bring effects common in short video into the content community, and let the collision between motion and stillness spark more interesting gameplay.


▐ Face effects



Douyin's popular facial effects include aging, face swapping and baby faces, and with the right gameplay they can also be brought into the content community.


★ Face transform

Multiple images of different periods of time are transformed to create the effect of time passing by.


Gameplay Analysis:



Triangulate the facial key points, then apply affine transformations between the figures to produce Douyin-style face morphing.



Effect display:


Meaning of gameplay:



It is more interesting and playable than several separate selfies. Outside Douyin's video environment, morphing a gallery of people can also do interesting things when combined with the gyroscope or gallery autoplay: for example, a dynamic transition effect for a face or pet gallery, rolled out as an experimental feature for users, degrading gracefully to the normal picture gallery.


★ Blink interaction



When a picture receives a "like", the people in the picture respond, for example with a "bling-bling" blink effect, enriching the emotion and interactivity of the "like". Interaction details like this enhance the user's sense of engagement and give more positive feedback.




Gameplay Analysis:



The eye positions are obtained from face-recognition key points, and the blink is realized by locally squeezing the texture, as sketched below.
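A minimal sketch of the local squeeze, assuming the eye centers and a blink progress value are supplied as uniforms; all names here are hypothetical:

precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D sTexture;
uniform vec2  uLeftEye;    // eye centers from face keypoints (assumed uniforms)
uniform vec2  uRightEye;
uniform float uEyeRadius;  // radius of the affected region, texture units
uniform float uProgress;   // 0 = eyes open, 1 = fully "closed"

// vertically pinch the sampling coordinate around an eye center so the
// surrounding skin is pulled in over the eye, reading as a blink
vec2 blink(vec2 uv, vec2 eye) {
    vec2 d = uv - eye;
    float w = 1.0 - smoothstep(0.0, uEyeRadius, length(d));  // 1 at the center, 0 at the edge
    d.y *= 1.0 + uProgress * w * 1.5;                        // sample from further away vertically
    return eye + d;
}

void main() {
    vec2 uv = blink(vTexCoord, uLeftEye);
    uv = blink(uv, uRightEye);
    gl_FragColor = texture2D(sTexture, uv);
}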


Effect display:



Meaning of gameplay:



Small interaction details generate additional feedback for user actions, increase goodwill and fun, and encourage users to explore more interactions.


▐ Other gameplay



Naked-eye 3D: split depth, in two classic forms (a hedged sketch of the white-bar variant follows).
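For the white-bar variant of split depth, a minimal sketch assuming a pre-computed foreground/subject mask is available (uMask is a hypothetical uniform, not part of the original code):

precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D sTexture;
uniform sampler2D uMask;   // assumed foreground mask: white = subject

void main() {
    vec4 color = texture2D(sTexture, vTexCoord);
    float fg = texture2D(uMask, vTexCoord).r;
    // two vertical white bars, the classic split-depth framing
    float bar = (abs(vTexCoord.x - 0.33) < 0.02 || abs(vTexCoord.x - 0.67) < 0.02) ? 1.0 : 0.0;
    vec3 withBars = mix(color.rgb, vec3(1.0), bar);
    // the subject is drawn over the bars, so it appears to float in front of them
    gl_FragColor = vec4(mix(withBars, color.rgb, fg), color.a);
}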








Picture reconstruction, scene reconstruction and so on





Technology accumulation & summary and outlook

Through exploration and learning in the direction of interactive gameplay, we have accumulated an OpenGL-based library of interactive components. Why OpenGL rather than implementing these effects natively?

1. High performance, making full use of the GPU's compute and rendering capability.

2. Dynamic: standard inputs and outputs are encapsulated, and effect scripts can be delivered dynamically (a hedged sketch of such a contract follows).
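A hedged sketch of what a standardized contract for a dynamically delivered effect script might look like; none of these uniform names are the library's real interface, they only illustrate the idea of fixed inputs and outputs with a swappable effect body:

precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D uContent;   // the content card being rendered
uniform float uTime;          // playback time in seconds
uniform vec2  uTouch;         // last touch position, normalised
uniform vec3  uAttitude;      // gyroscope pitch / roll / yaw

void main() {
    // an individual effect only fills in this body; inputs and the output are fixed by the contract
    vec2 offset = uAttitude.xy * 0.02;
    gl_FragColor = texture2D(uContent, vTexCoord + offset);
}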

Through continuous construction and improvement, it provides an extensible interactive component library and product data analysis capability.

Content-based social interaction is only a form; exposure is users' core demand. Through interactive innovation, we hope to help content communities expand and diversify their gameplay, improve the expressiveness and interest of content, impress users with interaction details, and build reputation. Interactive gameplay can create breakout moments and extend influence by becoming a signature interaction, like WeChat's shake, Tantan's left/right swipe, or QQ's drag-to-dismiss bubbles. Let the quality and warmth of the content drive more interaction and retention, and hopefully land in more valuable scenarios.