

QUIC first came to light in 2013 and has been a hot topic of discussion for the last two years. As a transport-layer protocol, QUIC combines the strengths of TCP and UDP, adds built-in encryption, and improves performance in several other ways; because it runs in user space rather than in the kernel, it can also be deployed and updated on devices much faster.

While you might think that a transport-layer protocol should be designed independently of the applications that run on it, the history of QUIC is inextricably linked to HTTP/2: QUIC and HTTP/2-over-QUIC were developed almost simultaneously. As of IETF 103, the QUIC working group actually had to limit its ongoing work to that single use case. The technology is hot, and many companies have invested heavily in it, which is why there are multiple implementations today.

The main players behind QUIC are, of course, the large Internet companies, along with the CDN vendors. Akamai is a heavy participant in this technology, and many of its employees are among the specification's editors and contributors.

Media on the Web is often divided into two ecosystems: broadcast and real time. In the broadcast world, most distribution is file-based and delivered over HTTP. In the real-time world, most communication is based on RTP (RTSP/RTCP/SRTP/WebRTC…).

This raises an additional question about RTP and QUIC that needs to be addressed: should we keep using RTP as the live-media carrier, or abandon it because some of RTP's mechanisms are redundant with mechanisms already built into QUIC? If we keep RTP, how should we plan the architecture, and how should we multiplex these protocols? If we abandon it, how do we provide the media mechanisms that QUIC does not have?
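To make the trade-off concrete, here is a minimal TypeScript sketch of the two options, assuming a hypothetical QUIC transport interface (sendDatagram and openUnidirectionalStream are illustrative names, not a real API): either tunnel existing RTP packets through QUIC, or send encoded frames over QUIC streams directly and re-invent the framing RTP used to provide.

```typescript
// Hypothetical QUIC transport surface; the method names below are
// illustrative assumptions, not a real API.
interface QuicTransportLike {
  sendDatagram(payload: Uint8Array): void; // unreliable, unordered delivery
  openUnidirectionalStream(): { write(chunk: Uint8Array): void; close(): void };
}

// Option 1: keep RTP. Each serialized RTP packet is tunneled as-is, so the
// existing packetization, sequence numbers and RTCP machinery keep working.
function sendRtpOverQuic(quic: QuicTransportLike, rtpPacket: Uint8Array): void {
  quic.sendDatagram(rtpPacket);
}

// Option 2: drop RTP. Each encoded frame travels on its own QUIC stream;
// QUIC supplies ordering and retransmission, but timing metadata and
// feedback must be reinvented, here as a minimal 4-byte timestamp prefix.
function sendFrameOverQuic(
  quic: QuicTransportLike,
  encodedFrame: Uint8Array,
  timestamp: number
): void {
  const header = new Uint8Array(4);
  new DataView(header.buffer).setUint32(0, timestamp);
  const stream = quic.openUnidirectionalStream();
  stream.write(header);
  stream.write(encodedFrame);
  stream.close();
}
```

The second path is leaner on the wire, but everything RTP carried implicitly (timestamps, sequence numbers, RTCP feedback, congestion-control signalling) has to be re-specified somewhere.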

In fact, many organizations and individuals are interested in delivering (live) media over QUIC and have already started to do so, even while the QUIC team itself still has reasons to hesitate.

Here are some of the initiatives we know about, and there may be more.

A. On the ORTC side, some people have implemented an early QUIC transport and QUIC stream API; the code can be found in the Chromium code base. The goal is to allow only data transmission, not media.
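For context, a data-only exchange over such an API might look roughly like the sketch below. The RTCQuicTransport/RTCQuicStream shapes are assumptions loosely modelled on that ORTC-style Chromium experiment, not a stable or standardized API.

```typescript
// Assumed shapes for the ORTC-style QUIC objects; these interfaces are
// illustrative only and do not match any shipped or standardized API.
interface RTCQuicStreamLike {
  write(chunk: { data: Uint8Array; finish?: boolean }): void;
}
interface RTCQuicTransportLike {
  createStream(): RTCQuicStreamLike;
}

// Data-only usage: arbitrary bytes go over a QUIC stream. Nothing here
// knows about codecs, frames or RTP semantics -- media is out of scope.
function sendGreeting(transport: RTCQuicTransportLike): void {
  const stream = transport.createStream();
  stream.write({
    data: new TextEncoder().encode("hello over QUIC"),
    finish: true,
  });
}
```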

B. In order to provide a more flexible pipeline in the media stack, as presented at the meeting in Stockholm, the Google team is pushing for more modular classes in WebRTC so that people can plug in their own codecs, encryption methods, and media and network transports.

Here’s some information about the next version of WebRTC:

Support adding, at the video-frame level, RTP header extensions that differ from those of the first packet

Refactor the class that represents an encoded video frame

Reduce the number of classes representing video codec configuration to a reasonable level

Integrate per-frame encryption interfaces into WebRTC

Implement pluggable media transport (a rough sketch follows this list)

Add the picture ID to generic RTP packetization

Add frame encryption and decryption to the media channel
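To illustrate what "pluggable" could mean here, the following is a rough, purely hypothetical TypeScript sketch of a per-frame encryptor and a swappable media transport; the interface names and the toy XOR cipher are my own assumptions, not the actual WebRTC interfaces.

```typescript
// Hypothetical interfaces; names and the toy XOR cipher are assumptions for
// illustration, not the real WebRTC frame-encryption or transport APIs.
interface FrameEncryptor {
  encrypt(mediaType: "audio" | "video", frame: Uint8Array): Uint8Array;
}

interface MediaTransport {
  // A pluggable transport only has to move encoded (already encrypted)
  // frames; it could be backed by RTP, QUIC, or something else entirely.
  sendFrame(mediaType: "audio" | "video", payload: Uint8Array): void;
}

// Toy encryptor: XOR with a static key. A real deployment would use an
// authenticated cipher with keys negotiated end to end.
class XorFrameEncryptor implements FrameEncryptor {
  constructor(private readonly key: Uint8Array) {}
  encrypt(_mediaType: "audio" | "video", frame: Uint8Array): Uint8Array {
    return frame.map((byte, i) => byte ^ this.key[i % this.key.length]);
  }
}

function deliverVideoFrame(
  encryptor: FrameEncryptor,
  transport: MediaTransport,
  frame: Uint8Array
): void {
  transport.sendFrame("video", encryptor.encrypt("video", frame));
}
```

The point is simply that once frames are encrypted independently of the transport, the transport itself (RTP today, possibly QUIC tomorrow) becomes replaceable.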

C. The chair of the RMCAT working group, which deals with bandwidth estimation and congestion control, together with another member from callstats.io, is working on media-directly-over-QUIC and RTP-over-QUIC.

D. The AVTCORE working group, which manages everything related to RTP, is looking at how QUIC is multiplexed alongside RTP and the other protocols RTP already has to coexist with.

E. The TAPS working group is looking at how to support QUIC as one of their transport protocols.

These groups have different objectives, and there may be further branches within a single group. The set of potential QUIC use cases is essentially the union of today's UDP and TCP use cases, and of course everyone considers their own use case the most important.

No new features until 1.0

This is the clear position of many companies, including Apple, though different people have different reasons for it. The W3C working group is closing out its current charter, but delays in implementing some features, and in the APIs required for simulcast, make simulcast hard to test. As was said at the recent meeting in Lyon, simulcast is the biggest remaining mountain to climb, and the real question is how much time it will take. This is a major concern for the W3C staff and chairs. Apple and other vendors also want to stabilize WebRTC 1.0 first, and some have said they are already working on other things, including QUIC.

QUIC is still immature

This is something Mozilla has been saying for the last year, not only at the face-to-face meeting in Stockholm but also recently at the TPAC meeting in Lyon. Those who disagree point out that the chair of the QUIC working group (a Mozilla employee) is committed to completing the standards documents in Q4, and that other groups (including WebRTC) should not wait any longer, so whether WebRTC should adopt QUIC becomes a tricky question. Others argue that QUIC is already in use, so if the WebRTC group does not decide, developers will have to deal with it on their own. (The same argument came up around SVC codecs.)

Personally, I think:

QUIC is the future. We can postpone it, but we can’t avoid it. WebRTC had the same experience.

Abandoning RTP outright would affect much of the existing WebRTC infrastructure. The team behind QUIC initially spent a lot of time on design choices that let QUIC improve on current transport technology while still being adopted widely and quickly, and I believe they are making the same effort in applying QUIC to real-time transmission.
