How to freeze network layers in Keras

 

When fine-tuning a pretrained model in Keras, you sometimes need to freeze some of its network layers to speed up training.

Keras provides a simple way to freeze an individual layer: set layer.trainable = False.

How should this be used? Here are a few examples.

1. Freeze all network layers of a model

from keras.applications.densenet import DenseNet121

# Load DenseNet121 without its classifier head, using ImageNet weights
base_model = DenseNet121(include_top=False, weights="imagenet", input_shape=(224, 224, 3))

# Set every layer to non-trainable so its weights stay fixed during training
for layer in base_model.layers:
    layer.trainable = False
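For context, a typical fine-tuning setup then puts a new classification head on top of the frozen base and compiles the model. The sketch below is only an illustration and not part of the original example; the pooling/Dense head and the num_classes value are assumptions:

from keras.applications.densenet import DenseNet121
from keras.layers import GlobalAveragePooling2D, Dense
from keras.models import Model

base_model = DenseNet121(include_top=False, weights="imagenet", input_shape=(224, 224, 3))
for layer in base_model.layers:
    layer.trainable = False

num_classes = 10  # hypothetical number of target classes

# New head, trained from scratch on top of the frozen base
x = GlobalAveragePooling2D()(base_model.output)
outputs = Dense(num_classes, activation='softmax')(x)
model = Model(inputs=base_model.input, outputs=outputs)

# Compile after setting the trainable flags so the changes take effect
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])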

2. Freeze specific network layers of a model

In Keras, in addition to retrieving layers from model.layers, you can also retrieve a single layer with model.get_layer(layer_name), as shown below.

from keras.applications.vgg19 import VGG19

base_model = VGG19(weights='imagenet')

# Freeze only the layer named 'block4_pool'
base_model.get_layer('block4_pool').trainable = False
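In practice a pooling layer has no weights of its own, so you will more often freeze a whole range of layers rather than a single one. Here is a minimal sketch of that idea, assuming you want to freeze everything up to and including 'block4_pool' and leave the rest trainable (this variant is an addition, not part of the original post):

from keras.applications.vgg19 import VGG19

base_model = VGG19(weights='imagenet')

# Find the index of the named layer, then freeze every layer up to and including it
cutoff = base_model.layers.index(base_model.get_layer('block4_pool'))
for layer in base_model.layers[:cutoff + 1]:
    layer.trainable = False
for layer in base_model.layers[cutoff + 1:]:
    layer.trainable = True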

You might be wondering where layer_name comes from. The answer is to print model.summary().

As shown below, the left-most column is the layer_name (the part before the parentheses; the parentheses show the layer type).

__________________________________________________________________________________________________
Layer (type)                    Output Shape          Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, 224, 224, 3)   0
__________________________________________________________________________________________________
NASNet (Model)                  (None, 7, 7, 1056)    4269716     input_1[0][0]
__________________________________________________________________________________________________
resnet50 (Model)                (None, 7, 7, 2048)    23587712    input_1[0][0]
__________________________________________________________________________________________________
densenet121 (Model)             (None, 7, 7, 1024)    7037504     input_1[0][0]
__________________________________________________________________________________________________
global_average_pooling2d_1 (Glo (None, 1056)          0           NASNet[1][0]
__________________________________________________________________________________________________
global_average_pooling2d_2 (Glo (None, 2048)          0           resnet50[1][0]
__________________________________________________________________________________________________
global_average_pooling2d_3 (Glo (None, 1024)          0           densenet121[1][0]
__________________________________________________________________________________________________
concatenate_5 (Concatenate)     (None, 4128)          0           global_average_pooling2d_1[0][0]
                                                                  global_average_pooling2d_2[0][0]
                                                                  global_average_pooling2d_3[0][0]
__________________________________________________________________________________________________
dropout_1 (Dropout)             (None, 4128)          0           concatenate_5[0][0]
__________________________________________________________________________________________________
classifier (Dense)              (None, 200)           825800      dropout_1[0][0]
==================================================================================================
Total params: 35,720,732
Trainable params: 825,800
Non-trainable params: 34,894,932
__________________________________________________________________________________________________
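If the summary output is hard to scan, you can also list the layer names directly. This short loop is just a convenience added here, not part of the original post, and assumes model is the model you built:

# Print the index, name and trainable flag of every layer
for i, layer in enumerate(model.layers):
    print(i, layer.name, layer.trainable)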

Hope this helps.