Cutting-edge technology is redefining entertainment.

Earlier, the TME Live X Billie Eilish concert used XR technology to create visually striking imagery such as a giant spider, outer space and a tropical rainforest. Lady Gaga has used Intel’s RealSense technology to transform her face in a variety of ways, even having a realistic spider crawl across her cheek.

At this year’s CCTV Spring Festival Gala, XR technology also delivered a blockbuster effect: through a “cloud stage”, Jay Chou and Andy Lau, who could not attend in person, appeared vividly on the Gala stage.

This Friday and Saturday, THE9’s first concert will be held on iQiyi. Unlike the live streams the public is accustomed to, it is the first “interactive, immersive, virtual” online performance launched under iQiyi’s Cloud Performance brand, and the world’s first XR live concert to use film-and-television-grade LED realistic virtual production.

With its novel concept and use of technology, the concert attracted the attention of the industry and fans early on. According to iQiyi, more than 100,000 people booked the concert within 24 hours of its official announcement, and tickets sold out almost instantly.

Executive producer Wu Lei told Deep Sound that the concert had been in the works since September last year. At the time, the team could have chosen to put a live concert online within a month, but it ultimately decided to take the time to polish THE9’s works and present a truly memorable show. XR technology became the biggest highlight and surprise of this concert.

 

The use of XR technology in performances is not new, but it remains rare on stages built for domestic artists. After seven months of polishing by a team of more than 500 people, what makes THE9’s concert different and innovative?

 

With this question in mind, Deep Sound spoke with Zhu Liang, vice president of iQiyi and head of its intelligent production department; Wu Lei, executive producer of the concert; and Tan Yinzi, the concert’s general director.

An audiovisual feast

XR, or extended reality, is an umbrella term covering VR (virtual reality), AR (augmented reality) and MR (mixed reality). Put simply, XR scenes are built on real scenes combined with AR content rendered in the background.

This time, iQiyi applied film-and-television-grade LED realistic virtual production and XR technology to a live concert for the first time. Tan Yinzi told Deep Sound that the realistic XR and the stylized XR used two separate sets of equipment, and that the technical team and the director team coordinated early on to turn the content ideas into reality.

To make the experience feel real, the director team selected and designed scenes suited to realism in advance. For the Sphinx stage, for example, they consulted a large amount of material and stories about the Sphinx of Egypt, along with inscriptions and stone steps from that era. The directors even carefully painted the texture of the Sphinx’s surface to ensure the authenticity of the scene. In the final presentation, the audience feels as if they are in the desert, watching the performance at the foot of the Sphinx, and the sense of immersion is powerful.

Traditional stage design gives the artist a single stage, and little can be changed once construction is finished; at most the background colors or small props can be adjusted. XR, by contrast, gives the directors a great deal of creative room: the stage can be continually reshaped to match the song or the mood. It might be a desert one moment and a rooftop the next.

The charm of XR lies in its fantastical, futuristic blurring of the real and the virtual, and in the rich ways it “deceives” the senses. These striking visual effects have become a major attraction of the concert.

At the same time, the new technology opens up new possibilities for content creation, so that technology and content reinforce each other.

According to the production team, the directors designed a main storyline for the concert, using the nine girls’ growth as its thread and telling a story of breaking through the self: from “DumbDumbBomb”, about being treated as transparent, being selected and questioned from a young age; to “NOT ME”, about finding the courage to push back against outside rumors; to “Sphinx”, about gradually cutting through the fog and seeing one’s goal clearly; then “LION” and “Hunt”, about becoming the “hunter” of one’s own life; and finally the pursuit of infinite possibilities and the promise to become a better self.

Each of the nine girls also has her own exclusive territory and stage in this virtual world. Because the concert introduces the concept of a “city”, fans can follow the artists as the show progresses, “shuttling” between different spaces and blocks of the city and taking in different scenery.

In addition, new technology is placing greater demands on artists.

On the one hand, artists have to adapt their performances to the technology’s requirements. Instead of performing in front of a green screen and having effects added later, XR lets artists see the effects directly on the LED screens, Zhu said. This means the artist needs more precise positioning to fit into the scene.

On the other hand, it requires the active participation of the artists. The directors said that after the “Sphinx X Mystery” and “Unreal X” were unveiled, they had in-depth conversations with THE9’s members about each song to understand their interpretations and ideas. The director team then created a script from this, which was discussed and revised many times until it finally reflected what the nine members had in mind.

For example, when communicating with the director team, THE9’s Lu Keran said she hoped to overturn her previous image, and the directors helped her create an exotic palace. THE9’s Liu Yuxin also had very detailed requirements for her stage: she felt that the song “BiuBiu” has gunshot sound effects, so it would suit having several bullet holes in the logo. It is clear that the technology, working almost invisibly, has greatly enhanced the artists’ sense of participation.

Interactive “New tricks”

Compared with traditional concerts, online concerts have natural “disadvantages” when it comes to interaction, which is precisely why offline entertainment cannot easily be replaced.

But that does not mean there is no way to improve. Even if offline scenes and experiences cannot be fully recreated online, technology can make online content more interactive and create new experiences in new ways.

For example, in this concert the production team enables fan-club linkage, video calls, bullet-screen (danmaku) interaction and many other forms of fan interaction. It also tries plenty of “new tricks” that have never been done before, such as using AR to wrap the stage in bullet comments, presenting a virtual audience, and, on top of that, letting fans sing on the same stage as the artists and co-create the stage.

For the co-created stage, the director team offered six interactive games, including “KTV Carnival Night”, “Tacit Understanding of You and Me”, “See Your Voice” and “Do I Have You or Not”, and fans can vote on which two will appear in the live show. The organizers also launched a video call for fan recordings of the chorus of the song “Promise”; selected videos will have the chance to appear on the big screen during the concert.

Another very interesting part is the virtual auditorium. After purchasing tickets, users can open the virtual auditorium, buy extra costumes and props on the dress-up page, and “dress up” their virtual avatar. Before the live broadcast starts, avatars are seated in the order they enter the live broadcast room, and a seating reminder pops up, giving people something of an on-site feeling.
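As a rough illustration of how such order-of-arrival seating could work, a minimal sketch is shown below. The article does not describe iQiyi’s actual implementation; the row layout, names and function signature here are assumptions for illustration only.

```python
# Hypothetical sketch: assign virtual-auditorium seats in the order viewers
# enter the live broadcast room. The row width and all names are assumed,
# not taken from iQiyi's actual system.
from dataclasses import dataclass

SEATS_PER_ROW = 50  # assumed row width


@dataclass
class Seat:
    row: int
    number: int


def assign_seats(join_events):
    """join_events: list of (user_id, join_timestamp) tuples."""
    seats = {}
    # Earlier arrivals get the front seats.
    for index, (user_id, _ts) in enumerate(sorted(join_events, key=lambda e: e[1])):
        seats[user_id] = Seat(row=index // SEATS_PER_ROW + 1,
                              number=index % SEATS_PER_ROW + 1)
    return seats


# Example: three viewers joining at different times.
print(assign_seats([("fan_a", 10.0), ("fan_b", 8.5), ("fan_c", 12.3)]))
```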

Of course, the most anticipated interactive segment is the “one-on-one” with the idols. In the concert’s live video-link segment, fans will interact with the artists in real time through the large on-site screen. The director team will randomly select 300 people to play games with the artists on the big screen, and six of them will be lucky enough to get a live video connection with THE9.
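The two-stage draw described above (300 on-screen players, six of whom get the live link) amounts to a simple nested random sample. The sketch below is purely illustrative and says nothing about how the production team actually runs its selection.

```python
# Illustrative only: a two-stage random draw like the one described above.
import random


def draw_participants(ticket_holders, on_screen=300, live_link=6, seed=None):
    rng = random.Random(seed)
    players = rng.sample(ticket_holders, on_screen)  # fans who play on the big screen
    lucky = rng.sample(players, live_link)           # fans who get the live connection
    return players, lucky


# Example with placeholder user IDs.
fans = [f"user_{i}" for i in range(100_000)]
players, lucky = draw_participants(fans, seed=9)
print(len(players), lucky)
```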

Fans who attended rehearsals have already experienced the thrill of connecting directly with their idols. “I remember the host asked me who I was, and then I said I was a fan of Yu Yan, and my mind just went blank,” one fan wrote on Weibo.

 

To ensure interactivity, iQiyi made a trade-off.

One of the biggest benefits of online concerts for producers is that they break through the space constraints of offline venues. But to ensure interactivity, iQiyi chose to cap attendance. Wu Lei said the concert has to limit numbers in order to give fans a better chance of dialogue, interaction and communication with the artists. For this concert, the production team is less concerned with how many tickets are sold than with bringing THE9 and their fans closer together and letting the artists see more of their fans’ love.

In fact, technological innovation, exploration and real-world deployment are never achieved overnight. For iQiyi, the short-term economic return of a single concert is by no means the most important reason for pushing XR and other technologies into video content.

In recent years, iQiyi has invested in interactive content, 5G, XR, VR/AR and other fields, laying out plans and acting in both hardware manufacturing and content development. On the hardware side, it keeps launching and updating VR devices; on the content side, it has released interactive dramas, experimented with interactive ads in variety shows, and produced VR films. This concert is another attempt by iQiyi to widen the use of innovative technology in content and keep pace with innovation internationally.

Overall, XR brings new audiovisual and interactive experiences, and it may also be the “key” to the next generation of entertainment, in which more personalized expression and highly interactive, immersive experiences are waiting to emerge.

Behind the deployment of the technology is also an innovation in industry workflow. As Tan Yinzi put it, one of the most important steps in preparing THE9’s concert was training the director team and helping the content production side systematically understand the technology and the logic behind it. Moving from the traditional linear workflow of pre-production, shooting and post-production to one in which much of that work is front-loaded requires close cooperation between the content and technical teams, and it requires the platform to coordinate and drive it all.

Ten years ago, no one could have imagined that after “Avatar” the world’s cinema would officially enter the 3D era, with IMAX 3D, full-CG motion capture and other film technologies appearing regularly in movies. Nor would anyone have thought that stars in different places could appear in the same live show, or that the audience in front of the screen would take part through avatars and share their reactions in real time via bullet comments.

It is equally impossible to predict what “next-generation entertainment” will look like ten years from now. But with technical teams actively exploring and internet platforms joining in, the path of content innovation and development is already clear enough.

The following is the transcript of the interview compiled by Deep Sound:

Without changing the meaning, Deep Sound has grouped the questions by theme and adjusted their order.

In addition, iQiyi’s virtual anchor “Qi Xiaozhi” automatically generated an audio version of the interview, attempting to recreate the interview scene.

Why do this concert?

Deep Sound: Why did you choose to make THE9’s first concert an online virtual one? What was the opportunity? Or, put another way, why was THE9 chosen for this technology’s debut?

Wu Lei: Early in the epidemic we were inspired by how technology was being applied in a few cases. There were two main ones. The first was “The Mandalorian”, which used virtual production to make an excellent American series. The second was Disney bringing the musical “Hamilton” online, whose box-office results made a particularly big impression on the industry. At the same time, offline performances had stalled during the outbreak and everyone was looking online. We wondered whether we could put these things together: good box office, good content, and a viewing experience for users that is different from an offline concert. So we started trying to do this, and we have been at it for half a year.

Deep Sound: When did you start preparing this project, and what was your thinking?

Wu Lei: I think we started working on this project last September. If we had rushed a live concert online, the lead time would have been a month, but we didn’t do that. First of all, going online quickly would have been irresponsible toward THE9, because THE9 also needed time to refine and polish their works, and developing the concert’s stage pieces requires a relatively long cycle of communication.

Secondly, since we want to make next-generation entertainment content products, we need to solve virtual production, interactive elements and the cloud ticketing system. Only when such a concert works audiovisually, is interactive and has a simple way to pay can we confidently bring it to the public.

What are the challenges of the concert?

Deep Sound: As a director, compared with a traditional concert, what is the biggest challenge of a concert that uses this much technology? Is it any different for the performers?

Tan Yinzi: In a traditional concert, interaction with the audience is mostly about things like when to let the crowd sing along, or which camera to cut to so that an audience member’s face appears on the big screen; it is largely the crowd enjoying itself. Online, we have to do much more. The hardest thing about an online concert is how to make people who are not physically there feel the atmosphere, and how they interact with the performers. We have a variety of interactive means, so there is a new interactive experience. At the same time, the stage of this concert is full of imagination: it can suddenly turn into a desert, a rooftop or a forest, which is completely different from the atmosphere of traditional offline stage design and offers far more imagination, space and expressive power.

Deep Sound: Why does this concert try to create realistic scenes?

Tan Yinzi: What we are doing in this concert is the concept of a “city”, and everyone has their own understanding of what a city is. If every song were something ethereal, the city would feel completely ungrounded. So we found some scenes that suit a realistic treatment and made them completely authentic, so the audience can really enjoy them while watching.

Deep Sound: The director team set the requirements for the concert’s live scenes. When their plans and requirements were translated into concrete implementation, where did the main difficulties lie?

Zhu Liang: There are many difficulties. This time we set up several virtual camera positions, a scale we had not attempted in previous productions. In addition, the hardware, the software control and the show’s sequencing all needed special design to ensure that the live broadcast gives the audience a more immersive visual experience. We want viewers to look at these performers and feel they are simply moving through a virtual scene, rather than being pulled out of it and told “this is a stage”. From the very start of the creative process, all our technical preparation pointed in that direction.

Deep Sound: During the preparation of this concert, did the technical team receive requests that pushed beyond its “upper limit”?

Zhu Liang: A lot. Every day the “crazy testing” pushed at the technology’s limits. Nothing like this had been done before; the form, the content, the ideas and the gameplay were all being invented. The way XR is used in this concert may look like pure imagination now, but after the concert it may become the starting point of our technological innovation. In the future we need to explore with greater imagination and make more breakthroughs, and we hope iQiyi can be a brave explorer here.

Deep Sound: Compared with traditional concerts, does adding so much technology place special demands on the artists?

Tan Yinzi: For every song we talked with the artists about how they felt; each wanted to convey her own understanding of the song, and each had a very clear idea of what it meant to her and how she imagined the scene. We put all the scenes of this city into a multi-dimensional, three-dimensional space that is pieced together and interconnected, and each of them has her own little patch of the city, her own stage.

What will technology bring in the future?

Deep Sound: XR is appearing in more and more concerts and gala shows. Do you think this technology already dominates the industry? Where are the future opportunities?

Zhu Liang: When you watch some so-called XR performances done with a green screen, the surrounding environment has almost no effect on the performer. The result is flat lighting that looks like a New Year’s poster or a studio photo: very fake, and certainly not immersive. LED walls, by contrast, have the advantage of being part of the environment; they are bright enough to cast light onto the performer’s body, skin and clothes, which adds to the sense of reality.

Deep Sound: If this model works, will it feed back into other content production?

Wu Lei: Yes. Eventually there will be better forms of content, and as costs come down there will be more diverse ways to monetize. But even though there are many possible business forms and ways to expand, we don’t think about that much at this stage. We think about quality content first; everything else will follow.

Deep Sound: Technology has brought many possibilities to the performing arts and film industries, but turning possibility into reality also means considering cost. How do you think about the balance between technology and cost going forward?

Zhu Liang: Since LEDs entered advertising and concerts, their performance has kept rising while their cost has fallen rapidly, because the technology is advancing quickly. As far as I know, at least 200 studios in North America have started installing LED shooting setups, and it is going to be a very big trend. As this market grows, more LED manufacturers will pay attention to the demand and expand capacity, and costs will gradually become more reasonable.

At the same time, on the software side there are specialized server and software vendors for the performing arts. If they serve only the high end of that field, the market is relatively small, but combined with film and television production the market becomes much bigger, which will naturally draw more new players into developing dedicated servers and software tools. So I believe there will be more and more solutions, changing prices, and more and more ways to play in the future.


This article is reproduced with authorization from Zhou Yongliang of Deep Sound.