1. Custom layer

For simple, stateless custom operations, you can often get by with a Lambda layer (keras.layers.Lambda). But for custom layers that contain trainable weights, you should implement the layer yourself.
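As a minimal sketch of the stateless case, here is a Lambda layer that doubles its input. This uses the tf.keras API for illustration; with standalone Keras the import would be from keras.layers import Lambda instead.

```python
import tensorflow as tf

# A stateless operation (doubling the input) wrapped in a Lambda layer:
# no trainable weights, so no custom Layer subclass is required.
inputs = tf.keras.Input(shape=(4,))
doubled = tf.keras.layers.Lambda(lambda x: 2.0 * x)(inputs)
model = tf.keras.Model(inputs, doubled)

out = model.predict(tf.ones((2, 4)), verbose=0)
```

Because the operation has no state, there is nothing to train or serialize beyond the function itself.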

Below is the skeleton of a Keras layer as of Keras 2.0 (if you are using an older version, please upgrade). There are only three methods you need to implement:

  • build(input_shape): This is where you define your weights. This method must set self.built = True at the end, which can be done by calling super([Layer], self).build().
  • call(x): This is where the layer's logic lives. Unless you want your layer to support masking, you only need to care about the first argument passed to call: the input tensor.
  • compute_output_shape(input_shape): If your layer changes the shape of its input, you should define the shape transformation logic here. This allows Keras to perform automatic shape inference.
from keras import backend as K
from keras.engine.topology import Layer

class MyLayer(Layer):

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight for this layer
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        return K.dot(x, self.kernel)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.output_dim)
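A quick way to sanity-check a layer like this is to drop it into a model and verify the inferred output shape. The sketch below re-implements the same skeleton on top of tf.keras (an assumption for runnability; the Keras 2.0 imports shown above work the same way), swapping K.dot for tf.matmul:

```python
import tensorflow as tf

# tf.keras re-implementation of the skeleton above, for illustration only.
class MyLayer(tf.keras.layers.Layer):

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight for this layer
        self.kernel = self.add_weight(name='kernel',
                                      shape=(int(input_shape[1]), self.output_dim),
                                      initializer='random_uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        return tf.matmul(x, self.kernel)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.output_dim)

inputs = tf.keras.Input(shape=(10,))
outputs = MyLayer(32)(inputs)
model = tf.keras.Model(inputs, outputs)
```

Keras uses compute_output_shape to infer that a (None, 10) input produces a (None, 32) output without running any data through the layer.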

It is also possible to define Keras layers with multiple input and output tensors. To do this, you should assume that the inputs and outputs of the methods build(input_shape), call(x), and compute_output_shape(input_shape) are lists. Here is an example, similar to the one above:

from keras import backend as K
from keras.engine.topology import Layer

class MyLayer(Layer):

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight for this layer
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[0][1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        assert isinstance(x, list)
        a, b = x
        return [K.dot(a, self.kernel) + b, K.mean(b, axis=-1)]

    def compute_output_shape(self, input_shape):
        assert isinstance(input_shape, list)
        shape_a, shape_b = input_shape
        return [(shape_a[0], self.output_dim), shape_b[:-1]]
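The same multi-input layer can be exercised end to end. Again this is a tf.keras sketch (an assumption, not the original imports), and note that output_dim must match the last dimension of the second input for the addition in call to broadcast:

```python
import tensorflow as tf

# tf.keras sketch of the multi-input example: build, call, and
# compute_output_shape all receive and return lists.
class MyLayer(tf.keras.layers.Layer):

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # input_shape is a list: one shape per input tensor
        self.kernel = self.add_weight(name='kernel',
                                      shape=(int(input_shape[0][1]), self.output_dim),
                                      initializer='random_uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        assert isinstance(x, (list, tuple))
        a, b = x
        return [tf.matmul(a, self.kernel) + b, tf.reduce_mean(b, axis=-1)]

    def compute_output_shape(self, input_shape):
        assert isinstance(input_shape, list)
        shape_a, shape_b = input_shape
        return [(shape_a[0], self.output_dim), shape_b[:-1]]

a = tf.keras.Input(shape=(32,))
b = tf.keras.Input(shape=(32,))
out1, out2 = MyLayer(32)([a, b])
model = tf.keras.Model([a, b], [out1, out2])

o1, o2 = model.predict([tf.ones((3, 32)), tf.ones((3, 32))], verbose=0)
```

Calling the layer on a list of tensors returns a list of output tensors, which can then feed a multi-output Model.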

The existing Keras layers are good examples of how to implement any kind of layer. Don't hesitate to read the source code!

2. Custom evaluation functions

Custom evaluation functions (metrics) should be passed in at compile time. Such a function takes (y_true, y_pred) as input arguments and returns a tensor as output.

import keras.backend as K

def mean_pred(y_true, y_pred):
    return K.mean(y_pred)

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', mean_pred])

3. Custom loss functions

Custom loss functions should also be passed in at compile time. Such a function takes (y_true, y_pred) as input arguments and returns a tensor as output.

import keras.backend as K

def my_loss(y_true, y_pred):
    # Example loss: mean squared error
    return K.mean(K.square(y_pred - y_true), axis=-1)

model.compile(optimizer='rmsprop',
              loss=my_loss,
              metrics=['accuracy'])

4. Working with custom layers (or other custom objects) in saved models

If the model to be loaded contains custom layers or other custom classes or functions, they can be passed to the loading mechanism via the custom_objects argument:

model = load_model('my_model.h5', custom_objects={'AttentionLayer': AttentionLayer})

Alternatively, you can use custom object scopes:

from keras.utils import CustomObjectScope

with CustomObjectScope({'AttentionLayer': AttentionLayer}):
    model = load_model('my_model.h5')
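The full save-and-load round trip can be sketched as follows. The layer Double here is a hypothetical stand-in for AttentionLayer above (which the text does not define), and tf.keras is assumed for runnability:

```python
import os
import tempfile
import tensorflow as tf

# Minimal custom layer used only to illustrate the round trip;
# it has no constructor arguments, so the default get_config suffices.
class Double(tf.keras.layers.Layer):
    def call(self, x):
        return 2.0 * x

inputs = tf.keras.Input(shape=(4,))
model = tf.keras.Model(inputs, Double()(inputs))

# Save to an HDF5 file, as in the text's 'my_model.h5'
path = os.path.join(tempfile.mkdtemp(), 'my_model.h5')
model.save(path)

# Load inside a custom object scope so Keras can resolve the class name
with tf.keras.utils.CustomObjectScope({'Double': Double}):
    loaded = tf.keras.models.load_model(path)

restored = loaded.predict(tf.ones((1, 4)), verbose=0)
```

Without the scope (or the equivalent custom_objects argument), load_model would fail because it cannot map the serialized name 'Double' back to a Python class.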