How to do it...

Now let's proceed to data processing:

  1. We start by defining generators for the training and testing data. We will use these generators to load data into our environment and to perform real-time data augmentation:
# train generator: rescaling plus random augmentation transforms
train_augmentor <- image_data_generator(
  rescale = 1/255,            # scale pixel values to [0, 1]
  rotation_range = 300,       # random rotation range, in degrees
  width_shift_range = 0.15,   # random horizontal shift (fraction of width)
  height_shift_range = 0.15,  # random vertical shift (fraction of height)
  shear_range = 0.2,          # random shearing transformations
  zoom_range = 0.2,           # random zooming
  horizontal_flip = TRUE,     # randomly flip images horizontally
  fill_mode = "nearest"       # fill pixels created by a transform
)

# test generator: only rescaling, no augmentation on evaluation data
test_augmentor <- image_data_generator(rescale = 1/255)

Now let's load the training, testing, and validation data into our environment:

# load train data
train_data <- flow_images_from_directory(
  train_path,
  train_augmentor,
  target_size = c(150, 150),
  batch_size = 20,
  class_mode = "binary"
)

# load test data
test_data <- flow_images_from_directory(
  test_path,
  test_augmentor,
  target_size = c(150, 150),
  batch_size = 20,
  class_mode = "binary"
)

# load validation data
validation_data <- flow_images_from_directory(
  validation_path,
  test_augmentor,
  target_size = c(150, 150),
  batch_size = 20,
  class_mode = "binary"
)

We can print the shape of the images produced by the generator using the following code:

train_data$image_shape
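
It is also worth checking how each class has been mapped to a numeric label and how many images were found; a minimal sketch using the class_indices and n attributes exposed by the underlying Keras iterator:

# mapping from class names (directory names) to numeric labels
train_data$class_indices
# total number of training images found
train_data$n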
  2. After loading our data, let's instantiate a pre-trained VGG16 model. Going forward, we will refer to this model as the base model:
pre_trained_base <- application_vgg16(
  weights = "imagenet",         # weights pre-trained on ImageNet
  include_top = FALSE,          # drop the fully connected classifier on top
  input_shape = c(150, 150, 3)  # match the generator's target size
)

Let's now take a look at the summary of the base model:

summary(pre_trained_base)

Here is the description of the base model:
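
If you prefer to list just the layer names rather than the full summary, you can iterate over the base model's layers; a minimal sketch:

# print the layer names of the VGG16 base (block1_conv1 through block5_pool)
for (layer in pre_trained_base$layers) {
  cat(layer$name, "\n")
}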

After instantiating the base model, we add dense layers to it and build a holistic model:

model_with_pretrained <- keras_model_sequential() %>%
  pre_trained_base %>%
  layer_flatten() %>%
  layer_dense(units = 8, activation = "relu") %>%
  layer_dense(units = 16, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")

Now we visualize the summary of the model:

summary(model_with_pretrained)

The screenshot shows a summary of the holistic model:

We can print the number of trainable weight tensors (kernels and biases) in our model using the following code:

length(model_with_pretrained$trainable_weights)

Let's freeze the pre-trained weights of the base model:

freeze_weights(pre_trained_base)

We can check how many trainable weights we have after freezing the base model by executing the following code:

length(model_with_pretrained$trainable_weights)
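
If you later want to fine-tune the model, the keras package also provides unfreeze_weights() to make part of the base trainable again; a minimal sketch, assuming the standard VGG16 layer name block5_conv1:

# unfreeze only the top convolutional block for fine-tuning
unfreeze_weights(pre_trained_base, from = "block5_conv1")
# after changing trainability, the model must be compiled again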
  3. After configuring the model, we compile and train it.

Let's compile the model using binary cross-entropy as the loss function and RMSprop() as the optimizer:

model_with_pretrained %>% compile(
  loss = "binary_crossentropy",
  optimizer = optimizer_rmsprop(lr = 0.0001),
  metrics = c("accuracy")
)

After compiling, we now train the model:

model_with_pretrained %>% fit_generator(
  generator = train_data,
  steps_per_epoch = 20,
  epochs = 10,
  validation_data = validation_data
)
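
Note that fit_generator() returns a history object containing the per-epoch metrics; a minimal sketch of the same call, capturing the result so that the learning curves can be plotted:

# capture the training history to inspect loss and accuracy per epoch
history <- model_with_pretrained %>% fit_generator(
  generator = train_data,
  steps_per_epoch = 20,
  epochs = 10,
  validation_data = validation_data
)
plot(history)  # plots training and validation curves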

Next, we evaluate the performance of the trained model on the test data and print the evaluation metrics:

scores <- model_with_pretrained %>%
  evaluate_generator(generator = test_data, steps = 20)

# Output metrics
paste('Test loss:', scores[[1]])
paste('Test accuracy:', scores[[2]])

The screenshot shows the model performance on the test data:

The test accuracy is around 83%.
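
Beyond the aggregate metrics, you may want per-image predictions; a minimal sketch using predict_generator(), which returns the sigmoid output (a probability between 0 and 1) for each test image:

# predicted probabilities for the positive class, one row per test image
predictions <- model_with_pretrained %>%
  predict_generator(generator = test_data, steps = 20)
head(predictions)

Finally, if you want to reuse the trained model without retraining it, you can persist it to disk; a minimal sketch, where the file name is an arbitrary choice:

# save the full model (architecture, weights, optimizer state) to an HDF5 file;
# "model_with_pretrained.h5" is a hypothetical file name
save_model_hdf5(model_with_pretrained, "model_with_pretrained.h5")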
