Cycle consistency loss

The adversarial loss alone does not guarantee a meaningful mapping between individual images. For instance, a generator can map images from the source domain to any random permutation of images in the target domain, and the result will still match the target distribution.

So, to avoid this, we introduce an additional loss called the cycle consistency loss. It enforces both generators, G and F, to be cycle-consistent.

Let's recollect the function of the generators:

  • Generator G: Converts x to y
  • Generator F: Converts y to x

We know that generator G takes the source image x and converts it to a fake target image y. Now, if we feed this generated fake target image y to generator F, it should return the original source image x. Confusing, right?

Look at the following figure; we have a source image, x. First, we feed this image to generator G, and it returns the fake target image. Now we take this fake target image, y, and feed it to generator F, which has to return the original source image:

The above mapping can be represented as follows:

$$x \rightarrow G(x) \rightarrow F(G(x)) \approx x$$

This is called the forward consistency loss and can be represented as follows:

$$\mathcal{L}_{forward} = \mathbb{E}_{x \sim p_{data}(x)}\left[ \left\| F(G(x)) - x \right\|_1 \right]$$
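The forward consistency loss can be sketched numerically. The following is a minimal NumPy illustration; the toy generators `G` and `F` are illustrative stand-ins (simple invertible functions), not the actual neural networks:

```python
import numpy as np

def forward_cycle_loss(G, F, x):
    """L1 forward cycle-consistency loss: mean |F(G(x)) - x|."""
    return np.mean(np.abs(F(G(x)) - x))

# Toy "generators" that happen to invert each other exactly:
# G doubles pixel values, F halves them.
G = lambda x: 2.0 * x
F = lambda y: 0.5 * y

x = np.array([0.2, 0.5, 0.9])
print(forward_cycle_loss(G, F, x))  # 0.0 — F(G(x)) recovers x exactly
```

When G and F are perfect inverses the loss is zero; during training, minimizing this term pushes the real generators toward that behavior.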

Similarly, we can specify the backward consistency loss, as shown in the following figure. Let's say we have an original target image, y. We take this y and feed it to generator F, and it returns the fake source image x. Now we feed this fake source image x to generator G, and it has to return the original target image y:

The preceding mapping can be represented as:

$$y \rightarrow F(y) \rightarrow G(F(y)) \approx y$$

The backward consistency loss can be represented as follows:

$$\mathcal{L}_{backward} = \mathbb{E}_{y \sim p_{data}(y)}\left[ \left\| G(F(y)) - y \right\|_1 \right]$$

So, combining the forward and backward consistency losses, we can write the cycle consistency loss as:

$$\mathcal{L}_{cyc}(G, F) = \mathbb{E}_{x \sim p_{data}(x)}\left[ \left\| F(G(x)) - x \right\|_1 \right] + \mathbb{E}_{y \sim p_{data}(y)}\left[ \left\| G(F(y)) - y \right\|_1 \right]$$
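Both directions can be combined in one function. Here is a minimal NumPy sketch; again, the generators `G` and `F` are toy stand-ins chosen to be exact inverses, not real networks:

```python
import numpy as np

def cycle_consistency_loss(G, F, x, y):
    # Forward: x -> G(x) -> F(G(x)) should recover x.
    forward = np.mean(np.abs(F(G(x)) - x))
    # Backward: y -> F(y) -> G(F(y)) should recover y.
    backward = np.mean(np.abs(G(F(y)) - y))
    return forward + backward

G = lambda x: 2.0 * x   # toy generator: source -> target
F = lambda y: 0.5 * y   # toy generator: target -> source

x = np.array([0.1, 0.4])   # "source" images
y = np.array([0.6, 1.0])   # "target" images
print(cycle_consistency_loss(G, F, x, y))  # 0.0 for these exact inverse maps
```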

We want our generators to be cycle-consistent, so we add the cycle consistency loss, scaled by a hyperparameter $\lambda$, to the adversarial losses. So, the final loss function can be given as:

$$\mathcal{L}(G, F, D_x, D_y) = \mathcal{L}_{GAN}(G, D_y, x, y) + \mathcal{L}_{GAN}(F, D_x, y, x) + \lambda \, \mathcal{L}_{cyc}(G, F)$$
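The way the terms combine can be sketched as follows. This is a minimal illustration with placeholder scalar values standing in for the two adversarial losses and the cycle consistency loss; in a real CycleGAN these values would come from the discriminators and generators at each training step:

```python
def total_loss(adv_loss_G, adv_loss_F, cyc_loss, lam=10.0):
    """Full objective: both adversarial losses plus the
    cycle consistency loss weighted by lambda (lam)."""
    return adv_loss_G + adv_loss_F + lam * cyc_loss

# Hypothetical loss values for one training step:
print(round(total_loss(0.7, 0.6, 0.05), 6))  # 0.7 + 0.6 + 10 * 0.05 = 1.8
```

A larger `lam` puts more weight on reconstructing the original image, at the cost of the adversarial (realism) terms.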
