Now, we can specify the autoencoder network. The model and its summary are as follows:
# Autoencoder
ae_model <- keras_model(inputs = input_layer, outputs = decoder)
summary(ae_model)
______________________________________________________________________
Layer (type)                     Output Shape              Param #
======================================================================
input_3 (InputLayer)             (None, 28, 28, 1)         0
______________________________________________________________________
conv2d_11 (Conv2D)               (None, 28, 28, 32)        320
______________________________________________________________________
max_pooling2d_5 (MaxPooling2D)   (None, 14, 14, 32)        0
______________________________________________________________________
conv2d_12 (Conv2D)               (None, 14, 14, 32)        9248
______________________________________________________________________
max_pooling2d_6 (MaxPooling2D)   (None, 7, 7, 32)          0
______________________________________________________________________
conv2d_13 (Conv2D)               (None, 7, 7, 32)          9248
______________________________________________________________________
up_sampling2d_5 (UpSampling2D)   (None, 14, 14, 32)        0
______________________________________________________________________
conv2d_14 (Conv2D)               (None, 14, 14, 32)        9248
______________________________________________________________________
up_sampling2d_6 (UpSampling2D)   (None, 28, 28, 32)        0
______________________________________________________________________
conv2d_15 (Conv2D)               (None, 28, 28, 1)         289
======================================================================
Total params: 28,353
Trainable params: 28,353
Non-trainable params: 0
______________________________________________________________________
From the preceding summary of the autoencoder network, we can see that there are 28,353 parameters in total. Next, we will compile this model using the following code:
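As a quick sanity check on that total, each Conv2D layer contributes (kernel height x kernel width x input channels + 1 bias) x filters parameters, while pooling and upsampling layers contribute none. Assuming 3 x 3 kernels throughout (consistent with the counts in the summary), the figures can be reproduced as follows:

# Conv2D parameters: (kernel_h * kernel_w * in_channels + 1 bias) * filters
conv_params <- function(k, in_ch, filters) (k * k * in_ch + 1) * filters

conv_params(3, 1, 32)    # conv2d_11: 320
conv_params(3, 32, 32)   # conv2d_12, conv2d_13, conv2d_14: 9248 each
conv_params(3, 32, 1)    # conv2d_15: 289

# Pooling and upsampling layers have no parameters
320 + 3 * 9248 + 289     # total: 28353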
# Compile model
ae_model %>% compile(
  loss = 'binary_crossentropy',
  optimizer = 'adam')
For denoising autoencoders, the binary_crossentropy loss function performs better than other options.
When compiling the autoencoder model, we will use binary_crossentropy for the loss function since the input values are between 0 and 1. For the optimizer, we will use adam. After compiling the model, we are ready to fit it.
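The fitting step might be sketched as follows. Note that the object names (train_noisy for the noise-corrupted inputs, train_x for the clean targets) and the epochs, batch_size, and validation_split values are illustrative assumptions, not taken from the original text:

# Fit the denoising autoencoder: noisy images in, clean images as targets
# (variable names and hyperparameter values below are assumed for illustration)
ae_model %>% fit(
  x = train_noisy,        # assumed: noise-corrupted training images
  y = train_x,            # assumed: original clean training images
  epochs = 20,
  batch_size = 128,
  validation_split = 0.2)

For a denoising autoencoder, the key point is that x and y differ: the network receives corrupted images and is trained to reconstruct the clean originals.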