Here we describe some techniques that make life easier when training GANs:
- Normalize inputs to the range [-1, 1]
- Use BatchNorm
- Use LeakyReLU in the discriminator
- Use ReLU in the generator, with tanh on the generator's output layer
- For downsampling, use average pooling or strided convolutions
- Use the Adam optimizer
- If the discriminator's loss drops to near zero very quickly, something is wrong: the discriminator has likely overpowered the generator
- Use dropout in the generator, in both the train and test phases
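The first tip can be sketched in a few lines of NumPy; the image shapes and the 8-bit [0, 255] input range here are illustrative assumptions:

```python
import numpy as np

# A fake batch of 8-bit images, shape (batch, height, width, channels).
images = np.random.randint(0, 256, size=(4, 32, 32, 3), dtype=np.uint8)

# Map pixel values from [0, 255] to [-1, 1]. This matches the tanh
# output range of the generator, so real and fake samples live in the
# same value range when fed to the discriminator.
normalized = images.astype(np.float32) / 127.5 - 1.0
```

The same rescaling must be undone (`(x + 1) * 127.5`) when converting generated samples back to displayable images.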
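The architecture and optimizer tips (BatchNorm, LeakyReLU in the discriminator, ReLU plus a tanh output in the generator, strided convolutions for downsampling, Adam) can be combined into a small DCGAN-style sketch in PyTorch. All layer widths, the 16x16 image size, and the hyperparameters (including `betas=(0.5, 0.999)`) are illustrative assumptions, not prescriptions from the list above:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_dim=64, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            # Upsample a latent vector to a 16x16 image.
            nn.ConvTranspose2d(z_dim, 128, 4, 1, 0),   # 1x1 -> 4x4
            nn.BatchNorm2d(128),
            nn.ReLU(),                                 # ReLU in the generator
            nn.ConvTranspose2d(128, 64, 4, 2, 1),      # 4x4 -> 8x8
            nn.BatchNorm2d(64),
            nn.ReLU(),
            nn.ConvTranspose2d(64, channels, 4, 2, 1), # 8x8 -> 16x16
            nn.Tanh(),                                 # output in [-1, 1]
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            # Strided convolutions instead of pooling for downsampling.
            nn.Conv2d(channels, 64, 4, 2, 1),          # 16x16 -> 8x8
            nn.LeakyReLU(0.2),                         # LeakyReLU in D
            nn.Conv2d(64, 128, 4, 2, 1),               # 8x8 -> 4x4
            nn.BatchNorm2d(128),
            nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, 1, 0),                # 4x4 -> 1x1 score
        )

    def forward(self, x):
        return self.net(x).view(-1)

G, D = Generator(), Discriminator()
# Adam for both networks; beta1 = 0.5 is a common GAN choice.
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

z = torch.randn(4, 64, 1, 1)
fake = G(z)        # tanh keeps samples inside [-1, 1]
scores = D(fake)   # one raw score per image
```

The tanh output pairs with the input normalization above: the discriminator always sees images in [-1, 1], whether real or generated.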
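For the last tip, PyTorch's `nn.Dropout` is disabled by `model.eval()`, so keeping generator dropout active at test time needs a small workaround. One option is the functional form with `training=True`, which applies dropout regardless of the module's train/eval mode; this toy module is a hypothetical sketch, not a full generator:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropoutBlock(nn.Module):
    """Toy generator block whose dropout stays on even in eval mode."""

    def __init__(self, dim=64):
        super().__init__()
        self.fc = nn.Linear(dim, dim)

    def forward(self, x):
        # training=True forces dropout on in both train and test phases,
        # instead of following self.training as nn.Dropout would.
        return F.dropout(self.fc(x), p=0.5, training=True)

g = DropoutBlock()
g.eval()  # even in eval mode, the dropout mask is still applied
x = torch.ones(1, 64)
a, b = g(x), g(x)
# Two forward passes on the same input differ, because a fresh dropout
# mask is sampled on every call.
```

An alternative is to keep `nn.Dropout` layers and switch only those submodules back to train mode at inference; the functional form above just makes the intent explicit in one place.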