Supported media formats

This article describes the media codecs, containers, and network protocol support provided by the Android platform.

As an application developer, you are free to use any media codec that is available on any Android-powered device, including codecs provided by the Android platform and device-specific codecs. However, it is a best practice to use device-independent media encoding profiles.
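
If you need to confirm at run time which codecs a particular device actually exposes, the framework's MediaCodecList API can enumerate them. The sketch below is a minimal example of that query, assuming API level 21 or higher; the class name and log tag are placeholders.

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public final class CodecSurvey {
    private static final String TAG = "CodecSurvey";

    // Logs every codec the current device exposes, together with the MIME types it handles.
    public static void logAvailableCodecs() {
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : codecList.getCodecInfos()) {
            String role = info.isEncoder() ? "encoder" : "decoder";
            for (String mimeType : info.getSupportedTypes()) {
                Log.d(TAG, role + " " + info.getName() + " supports " + mimeType);
            }
        }
    }
}
```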

The following table describes the media format support built into the Android platform. Codecs that are not guaranteed to be available on all Android platform versions are noted in parentheses, for example: (Android 3.0+). Note that any given mobile device may support additional formats or file types that are not listed in the table.

Section 5 of the Android Compatibility Definition specifies the media formats that a device must support in order to be compatible with Android 8.1.

Audio support

Audio formats and codecs

Video support

Video formats and codecs

Video encoding recommendations

The following table lists the recommended Android media framework video encoding profiles and parameters for playback using the H.264 Baseline Profile codec. The same recommendations apply to the Main Profile codec, which is available only in Android 6.0 and later.

The following table lists the recommended Android media framework video encoding parameters for playback using the VP8 media codec.
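
As a minimal sketch of how such recommendations are applied in code, the following configures an H.264 Baseline Profile encoder through the standard MediaFormat and MediaCodec APIs. The resolution, bit rate, frame rate, and level values below are illustrative placeholders rather than the recommended values from the tables, and setting KEY_LEVEL assumes API level 23 or higher.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

import java.io.IOException;

public final class BaselineEncoderConfig {
    // Builds an H.264 Baseline Profile encoder; the numeric parameters are
    // placeholders, not the recommended values from the encoding tables.
    public static MediaCodec createAvcEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);   // 2 Mbps, placeholder
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);        // 30 fps, placeholder
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);   // one keyframe per second
        format.setInteger(MediaFormat.KEY_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline);
        format.setInteger(MediaFormat.KEY_LEVEL,
                MediaCodecInfo.CodecProfileLevel.AVCLevel31);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        return encoder;
    }
}
```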

Video decoding recommendations

Device implementations must support dynamic video resolution and frame rate switching within the same stream, in real time, through the standard Android APIs for all VP8, VP9, H.264, and H.265 codecs, up to the maximum resolution supported by each codec on the device.
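
From an application's point of view, this capability surfaces through the adaptive-playback codec feature. The sketch below, assuming API level 19 or higher, checks for that feature and sets an illustrative maximum frame size so the decoder can switch resolutions within a stream; the class name and the 1920x1080 ceiling are placeholders.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;

public final class AdaptiveDecoderConfig {
    // Configures an H.264 decoder for in-stream resolution switching when the
    // device advertises adaptive playback; the 1920x1080 ceiling is a placeholder.
    public static MediaCodec createAdaptiveAvcDecoder(Surface outputSurface) throws IOException {
        MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        MediaCodecInfo.CodecCapabilities caps =
                decoder.getCodecInfo().getCapabilitiesForType(MediaFormat.MIMETYPE_VIDEO_AVC);

        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
        if (caps.isFeatureSupported(MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback)) {
            // Tell the codec the largest frames it may have to handle so it can
            // switch resolution mid-stream without being torn down.
            format.setInteger(MediaFormat.KEY_MAX_WIDTH, 1920);
            format.setInteger(MediaFormat.KEY_MAX_HEIGHT, 1080);
        }
        decoder.configure(format, outputSurface, null, 0);
        return decoder;
    }
}
```

During playback, the application observes a switch when dequeueOutputBuffer returns MediaCodec.INFO_OUTPUT_FORMAT_CHANGED and can read the new dimensions from the updated output format.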

Implementations that support the Dolby Vision decoder must follow these guidelines:

  • Provide a Dolby Vision-capable extractor.
  • Correctly display Dolby Vision content on the device screen or on a standard video output port (such as HDMI).
  • Set the track index of the backward-compatible base layer (if present) to be the same as the track index of the combined Dolby Vision layer.
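
At the application level, you can check whether a device exposes a Dolby Vision decoder before selecting a Dolby Vision track. The following is a minimal sketch using MediaCodecList.findDecoderForFormat, assuming API level 21 or higher; the MIME string matches the MIMETYPE_VIDEO_DOLBY_VISION constant added in API level 24, and the probe resolution is a placeholder.

```java
import android.media.MediaCodecList;
import android.media.MediaFormat;

public final class DolbyVisionCheck {
    // Returns true if the device exposes a decoder for the Dolby Vision MIME type.
    // The 1920x1080 size is only a probe value for the capability query.
    public static boolean hasDolbyVisionDecoder() {
        MediaFormat probe = MediaFormat.createVideoFormat("video/dolby-vision", 1920, 1080);
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        return codecList.findDecoderForFormat(probe) != null;
    }
}
```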

Video streaming requirements

For video content that is streamed over HTTP or RTSP, there are additional requirements:

  • For 3GPP and MPEG-4 containers, the moov atom must precede any mdat atoms, but must come after the ftyp atom (see the sketch after this list).
  • For 3GPP, MPEG-4, and WebM containers, audio and video samples corresponding to the same time offset may be no more than 500 KB apart. To minimize this audio/video drift, consider interleaving audio and video in smaller chunk sizes.
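
The atom-ordering requirement in the first item can be verified by walking a file's top-level boxes. The sketch below is plain Java I/O rather than an Android API: it reads each box header and returns the box types in file order, so a streamable layout should come back as ftyp before moov before mdat. The class name is hypothetical, and the sketch assumes each box payload fits in an int when skipping it.

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public final class AtomOrderCheck {
    // Walks the top-level boxes ("atoms") of an MP4/3GPP file and returns their
    // four-character types in file order, e.g. [ftyp, moov, mdat] for a streamable layout.
    public static List<String> topLevelAtoms(String path) throws IOException {
        List<String> atoms = new ArrayList<>();
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            byte[] header = new byte[8];
            while (true) {
                try {
                    in.readFully(header);          // 32-bit size followed by 4-character type
                } catch (EOFException endOfFile) {
                    break;                         // clean end of file
                }
                long size = ((header[0] & 0xFFL) << 24) | ((header[1] & 0xFFL) << 16)
                        | ((header[2] & 0xFFL) << 8) | (header[3] & 0xFFL);
                atoms.add(new String(header, 4, 4, StandardCharsets.US_ASCII));

                long payload = size - 8;
                if (size == 1) {
                    payload = in.readLong() - 16;  // 64-bit extended size includes both headers
                } else if (size == 0) {
                    break;                         // box extends to the end of the file
                }
                in.skipBytes((int) payload);       // sketch assumes each box fits in an int
            }
        }
        return atoms;
    }
}
```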

Image support

Network protocols

The following network protocols are supported for audio and video playback:

  • RTSP (RTP, SDP)
  • HTTP/HTTPS progressive streaming
  • HTTP/HTTPS live streaming draft protocol:
      • MPEG-2 TS media files only
      • Protocol version 3 (Android 4.0 and above)
      • Protocol version 2 (Android 3.x)
      • Not supported before Android 3.0
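
Streams delivered over these protocols can typically be played with the framework MediaPlayer. The following sketch starts playback of an HTTP(S) Live Streaming URL; the URL is a placeholder, the class name is hypothetical, and the app must hold the INTERNET permission.

```java
import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;

import java.io.IOException;

public final class StreamPlayback {
    // Starts playback of an HTTP(S) Live Streaming URL with the framework MediaPlayer.
    // The URL is a placeholder; the app also needs the INTERNET permission.
    public static MediaPlayer playHlsStream(Context context) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(context, Uri.parse("https://example.com/stream/index.m3u8"));
        player.setOnPreparedListener(mp -> mp.start());  // begin as soon as the stream is prepared
        player.prepareAsync();                           // prepare off the UI thread
        return player;
    }
}
```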

Note: Prior to Android 3.1, HTTPS was not supported.