The GIF format

GIF (Graphics Interchange Format) is a graphics file format developed by CompuServe. A great deal has been written about GIF; this article focuses on a few important points.

A GIF file consists of three parts: the file header, the GIF data stream and the trailer. The header is the 6-byte version signature, "GIF87a" or "GIF89a"; the trailer is the single byte 0x3B that ends the file.

GIF is built around a global (or local) color table. Each pixel stores an index (0~255) into that table rather than the color value itself; the index stream is then compressed with the LZW algorithm, and the resulting code stream is stored in the image block.

The GIF data stream

Logical screen identifier

The GIF data stream holds the main content. It starts with the logical screen identifier (the "logical screen descriptor" in the GIF specification), which occupies 7 bytes, laid out as follows:

  1. M: Global Color Table Flag. When set, a global color table is present; the Pixel field is meaningful only in this case.
  2. Cr: Color Resolution. Cr + 1 gives the color depth of the image.
  3. S: Sort Flag. When set, the global color table is sorted.
  4. Pixel: size of the global color table; the table holds 2 << Pixel entries.
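To make the bit layout concrete, here is a minimal sketch (plain Java; the class name and field names are mine, not from any library) of unpacking the 7-byte logical screen identifier. The two 16-bit sizes are little-endian, and the packed fifth byte holds M, Cr, S and Pixel:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LogicalScreenDescriptor {
    public final int width, height;           // canvas size in pixels
    public final boolean hasGlobalColorTable; // M flag
    public final int colorResolution;         // Cr + 1
    public final boolean sorted;              // S flag
    public final int globalColorTableSize;    // 2 << Pixel entries

    // Parses the 7 bytes that follow the 6-byte "GIF87a"/"GIF89a" header.
    public LogicalScreenDescriptor(byte[] block) {
        ByteBuffer buf = ByteBuffer.wrap(block).order(ByteOrder.LITTLE_ENDIAN);
        width = buf.getShort() & 0xFFFF;
        height = buf.getShort() & 0xFFFF;
        int packed = buf.get() & 0xFF;         // M | Cr(3 bits) | S | Pixel(3 bits)
        hasGlobalColorTable = (packed & 0x80) != 0;
        colorResolution = ((packed >> 4) & 0x07) + 1;
        sorted = (packed & 0x08) != 0;
        globalColorTableSize = 2 << (packed & 0x07);
        // The remaining two bytes are the background color index
        // and the pixel aspect ratio, which this sketch skips.
    }
}
```

For a 320×240 canvas with a sorted-off, 256-entry global color table, the packed byte would be 0xF7 (M=1, Cr=7, S=0, Pixel=7).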

Global color table

Immediately following the logical screen identifier is the global color table, occupying 2 << Pixel * 3 bytes in total. Each color consists of 3 bytes: the R, G and B components.

The logical screen identifier and the global color table above are global: only one of each exists in a GIF file. Each subsequent image block corresponds to one GIF frame.

Image block

Graphic control extension

GIF89a adds the graphic control extension, which occupies 8 bytes and is generally placed in front of an image block (image identifier) or a plain text extension to control how the block immediately following it is rendered.

The first byte, 0x21, is the GIF extension introducer; the second byte, 0xF9, identifies a graphic control extension block. The delay time is stored in units of 10 ms and gives the display duration of the current frame. The transparent color index names one entry of the global (local) color table: while decoding the current frame, pixels carrying that index are treated as transparent; the table entry itself is left intact for other frames. The bits of the fourth (packed) byte mean the following:

  1. I: User Input Flag. Input can be a key press, a mouse click or a similar event, and the flag can be combined with the delay time: within the set delay, a user input event switches to the next frame immediately; otherwise the frame advances when the delay expires.
  2. T: Transparent Color Flag. When set, the current frame uses a transparent color, given by the transparent color index above.

Disposal Method (very important!!): specifies how the previous frame is processed when the current frame is rendered. Note that the previous frame is processed according to the previous frame's own Disposal Method, not according to the current frame's. It takes the following values:

  1. Unspecified (0) (Nothing): draw a full-size, opaque GIF frame in place of the previous one. Even when two consecutive frames differ only slightly, each frame is stored and drawn independently.
  2. Do Not Dispose (1) (Leave As Is): pixels not covered by the current frame continue to be displayed. This method is often used to optimize GIFs: the current frame only partially refreshes the canvas on top of the previous frame, which saves memory and speeds up decoding.
  3. Restore to Background (2): before the current frame is drawn, the drawing area of the previous frame is restored to the background color. This is often used when many frames share the same background: the background color shows through the transparent areas of the current frame.
  4. Restore to Previous (3): before the current frame is drawn, the canvas is first restored to the most recent frame whose Disposal Method was Unspecified or Do Not Dispose, and the current frame is then drawn on top. This mode performs poorly and has been gradually abandoned.
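The fields described above can be pulled out of the 8-byte extension with a few shifts and masks. The following is a sketch under the GIF89a layout (introducer, label, block size, packed byte, 2-byte little-endian delay, transparent index, terminator); the class name is mine:

```java
public class GraphicControlExt {
    public final int disposalMethod;   // 0..3, as listed above
    public final boolean userInput;    // I flag
    public final boolean transparency; // T flag
    public final int delayMs;          // stored in units of 1/100 s
    public final int transparentIndex;

    // block = the 8 bytes starting with 0x21 0xF9.
    public GraphicControlExt(byte[] block) {
        if ((block[0] & 0xFF) != 0x21 || (block[1] & 0xFF) != 0xF9)
            throw new IllegalArgumentException("not a graphic control extension");
        int packed = block[3] & 0xFF;
        disposalMethod = (packed >> 2) & 0x07; // bits 2..4
        userInput = (packed & 0x02) != 0;      // bit 1
        transparency = (packed & 0x01) != 0;   // bit 0
        int delay = (block[4] & 0xFF) | ((block[5] & 0xFF) << 8); // little-endian
        delayMs = delay * 10;                  // unit is 10 ms
        transparentIndex = block[6] & 0xFF;
    }
}
```

For example, a frame with Restore to Background disposal, a transparent color, and a 100 ms delay has packed byte 0x09 and delay value 10.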

The most important point about the Disposal Method bears repeating: the previous frame is processed according to the previous frame's Disposal Method, not the current frame's. For example, a GIF has two frames A and B, where A's Disposal Method is Restore to Background and B's is Do Not Dispose. When drawing frame B, because A's Disposal Method is Restore to Background, A's drawing area is first restored to the background color, and then B is drawn.

Image identifier

The image identifier marks the beginning of an image block. It occupies 10 bytes, with the first byte fixed at 0x2C to identify it. The image identifier defines the offset, width and height of the current frame.


The bits of the 10th (packed) byte mean the following:

  1. M: Local Color Table Flag. When set, the current frame has a local color table, which immediately follows the image identifier and is used only by the current frame; the Pixel field is meaningful only in this case. When clear, the global color table is used and the Pixel field is ignored.
  2. I: Interlace Flag. When set, the frame image data following the local color table is arranged in interlaced order; otherwise it is sequential.
  3. S: Sort Flag. When set, the local color table is sorted.
  4. R: reserved field, currently 0.
  5. Pixel: size of the local color table; the table holds 2 << Pixel entries.
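Parsing the 10-byte image identifier follows the same pattern as the logical screen identifier: little-endian 16-bit values plus a packed byte. A sketch (class name mine; the S and R bits are skipped for brevity):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ImageDescriptor {
    public final int left, top, width, height; // frame offset and size
    public final boolean hasLocalColorTable;   // M flag
    public final boolean interlaced;           // I flag
    public final int localColorTableSize;      // 2 << Pixel entries

    // Parses the 10-byte image identifier that starts an image block.
    public ImageDescriptor(byte[] block) {
        ByteBuffer buf = ByteBuffer.wrap(block).order(ByteOrder.LITTLE_ENDIAN);
        if ((buf.get() & 0xFF) != 0x2C)
            throw new IllegalArgumentException("not an image identifier");
        left = buf.getShort() & 0xFFFF;
        top = buf.getShort() & 0xFFFF;
        width = buf.getShort() & 0xFFFF;
        height = buf.getShort() & 0xFFFF;
        int packed = buf.get() & 0xFF; // M | I | S | R(2 bits) | Pixel(3 bits)
        hasLocalColorTable = (packed & 0x80) != 0;
        interlaced = (packed & 0x40) != 0;
        localColorTableSize = 2 << (packed & 0x07);
    }
}
```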

Local color table

If the M bit of the 10th byte of the image identifier is set, a local color table is present, occupying 2 << Pixel * 3 bytes in total. Each color consists of 3 bytes, the R, G and B components, i.e. it is stored the same way as the global color table. However, the local color table is used only by the current frame; after the current frame is decoded, the decoder switches back to the global color table.

Image data based on the color table

First, to be clear: the image data is the output of the LZW compression algorithm, and consists of two parts:

  • LZW Minimum Code Size
  • Image Data

The LZW algorithm involves three important objects: the data stream, the code stream, and the code table. During encoding, the data stream is the input (the raster sequence of pixel indices) and the code stream is the output (the compressed data). During decoding, the code stream is the input and the data stream is the output. The code table is used by both encoding and decoding, and the image block stores the code stream.

When decoding, the LZW code stream is first extracted from the image block and decoded by the LZW algorithm into the data stream (the sequence of pixel index values); combined with the global (local) color table, this restores the pixel data of one frame.

This article does not introduce the LZW algorithm in detail; see the LZW references below.
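To give a feel for the encode/decode relationship described above, here is a textbook LZW round trip over the byte alphabet. This is a simplification, not GIF's actual variant: GIF additionally packs codes at variable bit widths and uses special Clear and End-of-Information codes, all of which this sketch omits.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SimpleLzw {
    // Encoding: data stream in, code stream out; the code table is built as we go.
    public static List<Integer> encode(byte[] data) {
        Map<String, Integer> table = new HashMap<>();
        for (int i = 0; i < 256; i++) table.put("" + (char) i, i); // init with single bytes
        List<Integer> out = new ArrayList<>();
        String w = "";
        for (byte b : data) {
            String wc = w + (char) (b & 0xFF);
            if (table.containsKey(wc)) {
                w = wc;                       // keep extending the current phrase
            } else {
                out.add(table.get(w));        // emit code for the longest known prefix
                table.put(wc, table.size());  // register the new phrase
                w = "" + (char) (b & 0xFF);
            }
        }
        if (!w.isEmpty()) out.add(table.get(w));
        return out;
    }

    // Decoding: code stream in, data stream out; the table is rebuilt symmetrically.
    public static byte[] decode(List<Integer> codes) {
        List<String> table = new ArrayList<>();
        for (int i = 0; i < 256; i++) table.add("" + (char) i);
        StringBuilder out = new StringBuilder();
        String w = table.get(codes.get(0));
        out.append(w);
        for (int i = 1; i < codes.size(); i++) {
            int code = codes.get(i);
            // A code one past the table handles the classic "cScSc" edge case.
            String entry = code < table.size() ? table.get(code) : w + w.charAt(0);
            out.append(entry);
            table.add(w + entry.charAt(0));
            w = entry;
        }
        byte[] bytes = new byte[out.length()];
        for (int i = 0; i < bytes.length; i++) bytes[i] = (byte) out.charAt(i);
        return bytes;
    }
}
```

In a GIF, the role of `data` is played by the sequence of color-table indices for one frame, and the code stream is what actually sits in the image block's data sub-blocks.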

Other extension blocks

In addition to the graphics control extensions used to control how image blocks are rendered, there are other extensions such as:

  • Comment Extension: records plain text such as graphics or copyright information that is neither image data nor control data. It does not affect processing of the image data stream, and decoders may ignore it entirely. It can be stored anywhere in the data stream; placing it at the beginning or end is recommended.
  • Plain Text Extension: used to draw simple text images, consisting of parameters that control the drawing plus the plain text to draw. A plain text extension is also an image block, and a graphic control extension can be defined in front of it to control its rendering (just like an ordinary image block). Therefore, when counting GIF frames, a plain text extension block counts as a frame.
  • Application Extension: lets an application define its own extension information; in practice this usually includes the GIF loop count. For GIF metadata parsing, see Fresco's GifMetadataStreamDecoder.

How Fresco parses GIFs

Fresco supports decoding and rendering both GIFs and animated WebP. In Fresco v1.11.0, a decoded GIF is encapsulated in an AnimatedDrawable2, which is played by calling AnimatedDrawable2.start().

For GIFs, Fresco implements two decoding paths: native decoding through the giflib library in the native layer, and decoding through the system Movie class. By default, giflib is used. To decode through Movie, import Fresco's animated-gif-lite library and specify GifDecoder as the decoder, as shown below:

Fresco.newDraweeControllerBuilder().setImageRequest(
    ImageRequestBuilder.newBuilderWithSource(imageUri)
        .setImageDecodeOptions(
            ImageDecodeOptions.newBuilder()
                .setCustomImageDecoder(GifDecoder(true))
                .build())
        .build()
)

When the GifDecoder is created with the parameter set to true, GifMetadataMovieDecoder is used: it parses the GIF metadata only roughly (for example, the loop count is fixed as an infinite loop). With the parameter set to false, GifMetadataStreamDecoder parses the GIF metadata in detail.

Let’s take a look at Fresco’s process for loading images:

  1. When ImagePipeline fetches an image, it builds a different Producer Sequence depending on the request (fetch a decoded image: fetchDecodedImage; fetch an undecoded image: fetchEncodedImage). A Producer Sequence is really a chain of Producers, each of which is one link in the chain, e.g. NetworkFetchProducer downloads the image and DecodeProducer decodes it.
  2. After obtaining the Producer Sequence, ImagePipeline wraps it in a CloseableProducerToDataSourceAdapter, i.e. a DataSource, and triggers production along the whole chain. When the Producer Sequence produces a result, it is delivered through the DataSource to the subscribing DataSubscriber. With ImagePipeline.fetchEncodedImage the subscriber receives a CloseableReference<PooledByteBuffer>, i.e. undecoded bytes; with ImagePipeline.fetchDecodedImage it receives a CloseableReference<CloseableImage>, i.e. decoded image data.
  3. Normally, AbstractDraweeController obtains the decoded image data (CloseableReference<CloseableImage>), encapsulates it in a Drawable (a BitmapDrawable or OrientedDrawable; a GIF becomes an AnimatedDrawable2), and hands it to the DraweeHierarchy. The DraweeHierarchy holds the Drawable layer array and displays different Drawables according to the DraweeView state.
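The chain-of-responsibility idea behind the Producer Sequence can be sketched with a toy analogue. These are not Fresco's actual interfaces; the names below are simplified stand-ins to show how a cache producer either short-circuits the chain or delegates downward and caches on the way back up:

```java
import java.util.function.Consumer;

// Simplified analogue of Fresco's Producer: each link either produces the
// result itself or delegates to the next producer in the chain.
interface Producer<T> {
    void produce(Consumer<T> consumer);
}

class NetworkProducer implements Producer<String> {
    int fetchCount = 0; // counts real "downloads" for demonstration

    @Override public void produce(Consumer<String> consumer) {
        fetchCount++;
        consumer.accept("encoded-image-bytes"); // stand-in for downloaded data
    }
}

class MemoryCacheProducer implements Producer<String> {
    private final Producer<String> next;
    private String cached; // toy single-entry "cache"

    MemoryCacheProducer(Producer<String> next) { this.next = next; }

    @Override public void produce(Consumer<String> consumer) {
        if (cached != null) {        // cache hit: short-circuit the chain
            consumer.accept(cached);
            return;
        }
        next.produce(result -> {     // cache miss: delegate, cache on the way back
            cached = result;
            consumer.accept(result);
        });
    }
}
```

This mirrors the "references held from bottom to top, data returned from top to bottom" structure: the first request walks down to the network, the second is answered from the cache at the top of the chain.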

Next, let's look at the whole Producer Sequence that ImagePipeline.fetchDecodedImage uses to fetch an image from the network:

  1. NetworkFetchProducer: downloads the image data from the network. Internally it delegates to a NetworkFetcher, which implements the download logic with different HTTP stacks, e.g. HttpUrlConnectionNetworkFetcher, OkHttpNetworkFetcher, VolleyNetworkFetcher.
  2. WebpTranscodeProducer: not every Android platform supports WebP (see WebpTranscodeProducer.shouldTranscode), so on unsupported platforms WebP must be converted to JPG/PNG. Lossless or transparent WebP (DefaultImageFormats.WEBP_LOSSLESS and DefaultImageFormats.WEBP_EXTENDED_WITH_ALPHA) is converted to PNG by decoding WebP to RGBA and then encoding RGBA to PNG. Simple or extended WebP (DefaultImageFormats.WEBP_SIMPLE and DefaultImageFormats.WEBP_EXTENDED) is converted to JPEG by decoding WebP to RGB and then encoding RGB to JPEG.
  3. PartialDiskCacheProducer: this and the next two producers handle the disk cache of EncodedImage.
  4. DiskCacheWriteProducer
  5. DiskCacheReadProducer
  6. EncodedMemoryCacheProducer: the memory cache for undecoded data.
  7. EncodedCacheKeyMultiplexProducer
  8. AddImageTransformMetaDataProducer
  9. ResizeAndRotateProducer: handles downsampling and image rotation.
  10. DecodeProducer: all the producers above operate on EncodedImage; DecodeProducer decodes the EncodedImage into a CloseableImage.
  11. BitmapMemoryCacheProducer: this and the next producer handle the Bitmap memory cache.
  12. BitmapMemoryCacheKeyMultiplexProducer
  13. ThreadHandoffProducer: responsible for switching threads.
  14. BitmapMemoryCacheGetProducer
  15. PostprocessorProducer
  16. PostprocessedBitmapMemoryCacheProducer
  17. BitmapPrepareProducer

References are held from bottom to top; data is returned from top to bottom.

Now let's focus on the GIF-related logic. Walking down from DecodeProducer, we find that AnimatedImageFactoryImpl.decodeGif decodes the EncodedImage into a GifImage, which represents one GIF. At this point only the GIF metadata is parsed; the GIF frames are not actually decoded until they need to be displayed, and then only on demand. Similarly, AnimatedImageFactoryImpl.decodeWebP decodes the EncodedImage into a WebPImage. If a custom GifDecoder is used, decoding produces a MovieAnimatedImage instead. These three xxxImage classes are all subclasses of AnimatedImage, which provides all GIF-related operations.

So how is each frame obtained? Call AnimatedImage.getFrame to get an AnimatedImageFrame (GifFrame, WebPFrame and MovieFrame correspond to the xxxImage classes above), then call AnimatedImageFrame.renderFrame to render the GIF frame onto a given Bitmap.

There is a big difference between how GifImage and MovieFrame render to a Bitmap. MovieFrame is implemented through the MovieDrawer class (with the help of the system Movie class), so the rendered GIF frame is a complete frame: all the partial-frame logic has already been handled according to the Disposal Method, which keeps things simple. GifImage is implemented through the third-party giflib library; the Bitmap obtained from GifImage.renderFrame is a partial (residual) frame, and the Disposal Method strategy has to be handled by Fresco itself.

Let's see how Fresco displays GIFs and handles the Disposal Method: AnimatedDrawable2.draw -> AnimationBackendDelegate.drawFrame -> BitmapAnimationBackend.drawFrame -> BitmapAnimationBackend.drawFrameOrFallback -> BitmapAnimationBackend.renderFrameInBitmap -> AnimatedDrawableBackendFrameRenderer.renderFrame -> AnimatedImageCompositor.renderFrame -> AnimatedDrawableBackendImpl.renderFrame -> AnimatedDrawableBackendImpl.renderImageDoesNotSupportScaling -> AnimatedImageFrame.renderFrame -> native decoding through giflib.

Let's focus on the key methods. BitmapAnimationBackend.drawFrameOrFallback renders a full GIF frame to the given Canvas. It first looks up the current full frame in the cache; if it is missing, it looks for a reusable Bitmap, draws the current frame into it, and renders that Bitmap to the Canvas; if there is no reusable Bitmap either, it creates a new Bitmap and does the same. If all else fails, the previous frame is drawn instead.

AnimatedImageCompositor.renderFrame renders the specified GIF frame into the given full-size Bitmap. It has to handle the various Disposal Methods and the blend operation (whether transparent pixels are blended with the previous frame). The key code is as follows:

// Generates a complete frame for the given index of the GIF
public void renderFrame(int frameNumber, Bitmap bitmap) {
    Canvas canvas = new Canvas(bitmap);
    canvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.SRC); // clear the Bitmap

    // If blending is required, prepare the canvas with the nearest cached frame.
    int nextIndex;
    if (!isKeyFrame(frameNumber)) {
      // nextIndex points to the next frame index to render onto the canvas
      nextIndex = prepareCanvasWithClosestCachedFrame(frameNumber - 1, canvas);
    } else {
      // Blending isn't required. Start at the frame we're trying to render.
      nextIndex = frameNumber;
    }

    // Iterate from nextIndex up to the frame just preceding the one we're trying
    // to render, drawing each frame onto the Canvas and compositing them in
    // order according to the Disposal Method.
    for (int index = nextIndex; index < frameNumber; index++) {
      AnimatedDrawableFrameInfo frameInfo = mAnimatedDrawableBackend.getFrameInfo(index);
      DisposalMethod disposalMethod = frameInfo.disposalMethod;
      if (disposalMethod == DisposalMethod.DISPOSE_TO_PREVIOUS) {
        continue;
      }
      // If transparent pixel blending is not required, overwrite the frame's
      // drawing area with transparent pixels first
      if (frameInfo.blendOperation == BlendOperation.NO_BLEND) {
        disposeToBackground(canvas, frameInfo);
      }
      // Draw the frame
      mAnimatedDrawableBackend.renderFrame(index, canvas);
      // Call back with the intermediate Bitmap for this frame
      mCallback.onIntermediateResult(index, bitmap);
      // disposalMethod is this frame's own strategy, applied to prepare for the
      // next frame: overwrite the frame's drawing area with the background color
      if (disposalMethod == DisposalMethod.DISPOSE_TO_BACKGROUND) {
        disposeToBackground(canvas, frameInfo);
      }
    }

    AnimatedDrawableFrameInfo frameInfo = mAnimatedDrawableBackend.getFrameInfo(frameNumber);
    // Drawing blends pixels by default, but the frameNumber frame does not need
    // blending, so its overlay area must be cleared first
    if (frameInfo.blendOperation == BlendOperation.NO_BLEND) {
      disposeToBackground(canvas, frameInfo);
    }
    // Finally, render the current frame. It is not disposed here.
    mAnimatedDrawableBackend.renderFrame(frameNumber, canvas);
}

In the code above, the first step is to work out which frame to start drawing from when generating the Bitmap for frame frameNumber (see AnimatedImageCompositor.prepareCanvasWithClosestCachedFrame for details). The second is to apply each frame's Disposal Method, which mainly means pre-clearing the drawn area to the background color for frames in Restore to Background mode.

Key frame: either the current frame is a full-size frame whose transparent pixels need not be blended with the previous frame (i.e. transparent pixels overwrite the previous pixels), or the previous frame is a full-size frame whose Disposal Method is Restore to Background.
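That key-frame definition can be written down as a small predicate. This is a sketch of the idea only; the types and names below are hypothetical simplifications, not Fresco's actual AnimatedDrawableFrameInfo API:

```java
// Hypothetical frame metadata, loosely mirroring AnimatedDrawableFrameInfo.
class FrameInfo {
    boolean fullSize;   // the frame covers the whole canvas
    boolean noBlend;    // transparent pixels overwrite instead of blending
    int disposalMethod; // 0..3, as in the Disposal Method section
}

class KeyFrameCheck {
    static final int DISPOSE_TO_BACKGROUND = 2; // Restore to Background

    // A frame is a key frame if it can be rendered without any pixels from
    // earlier frames: either it is full-size and does not blend, or the
    // previous frame is full-size and is disposed to the background color.
    static boolean isKeyFrame(FrameInfo current, FrameInfo previous) {
        if (current.fullSize && current.noBlend) return true;
        return previous != null
            && previous.fullSize
            && previous.disposalMethod == DISPOSE_TO_BACKGROUND;
    }
}
```

A key frame is exactly the case where `renderFrame` above can skip `prepareCanvasWithClosestCachedFrame` and start compositing from the frame itself.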

AnimatedDrawableBackendImpl.renderImageDoesNotSupportScaling draws the partial frame at its specified position within the full-size frame. The code is as follows:

  private void renderImageDoesNotSupportScaling(Canvas canvas, AnimatedImageFrame frame) {
    // Get the width, height and starting offset of the partial frame
    int frameWidth = frame.getWidth();
    int frameHeight = frame.getHeight();
    int xOffset = frame.getXOffset();
    int yOffset = frame.getYOffset();
    synchronized (this) {
      prepareTempBitmapForThisSize(frameWidth, frameHeight);
      // Draw the partial frame onto a temporary Bitmap
      frame.renderFrame(frameWidth, frameHeight, mTempBitmap);

      // The temporary bitmap can be bigger than the frame, so draw only the
      // rendered area of the bitmap
      mRenderSrcRect.set(0, 0, frameWidth, frameHeight);
      mRenderDstRect.set(0, 0, frameWidth, frameHeight);

      // Translate the Canvas so the partial frame lands at its offset
      canvas.save();
      canvas.translate(xOffset, yOffset);
      canvas.drawBitmap(mTempBitmap, mRenderSrcRect, mRenderDstRect, null);
      canvas.restore();
    }
  }

AnimatedImageFrame.renderFrame is responsible for drawing the GIF partial frame onto the specified Bitmap (producing the partial frame at its real size), mainly in the native layer through giflib; see gif.cpp for details.

All in all, Fresco's support for GIFs is a treasure trove of image-processing techniques!

Reference documentation

  1. Detailed analysis of the GIF image format
  2. GIF Disposal Method
  3. Animated GIFs
  4. Android source code reading: GIF decoding
  5. GIF official documentation (GIF89a specification)
  6. Fresco source code