unsupervised_learning

Generative adversarial networks

GAN

[paper] [example]

GANs - Generative Adversarial Networks - were introduced by Ian Goodfellow et al. in 2014. A GAN creates new data by pitting two neural networks against each other: a generator, which produces fake samples from random noise, and a discriminator, which tries to tell real samples from generated ones.


Fig.1: GAN architecture
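The two-network setup can be sketched as follows in PyTorch. This is a minimal illustration, not the paper's architecture; the layer sizes and dimensions are arbitrary choices for the sketch.

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 64, 784  # illustrative sizes, not from the paper

# Generator: maps random noise z to a fake sample.
G = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)

# Discriminator: outputs the probability that its input is real.
D = nn.Sequential(
    nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

z = torch.randn(16, latent_dim)
fake = G(z)            # a batch of generated samples, shape (16, 784)
p_fake = D(fake)       # discriminator's real/fake scores, shape (16, 1)

bce = nn.BCELoss()
# The discriminator tries to push D(fake) toward 0,
# while the generator tries to push D(fake) toward 1.
d_loss = bce(p_fake, torch.zeros_like(p_fake))
g_loss = bce(p_fake, torch.ones_like(p_fake))
```

In a full training loop the two losses are minimized alternately, with the discriminator also seeing real samples.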

Conditional GAN

[paper]

A plain GAN or DCGAN cannot specify the class of the generated image, since the generator depends only on random noise. In a conditional GAN, the generator receives a label in addition to the noise as input, and the desired output is the image corresponding to that label. The discriminator likewise receives the label along with the real or generated image.


Fig.3: Conditional GAN architecture
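The conditioning can be sketched by concatenating a one-hot label to both networks' inputs. This is a minimal illustration with made-up sizes, not the paper's model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

n_classes, latent_dim, data_dim = 10, 64, 784  # illustrative sizes

# Generator input = noise concatenated with a one-hot class label.
G = nn.Sequential(nn.Linear(latent_dim + n_classes, 128), nn.ReLU(),
                  nn.Linear(128, data_dim), nn.Tanh())
# Discriminator also sees the label alongside the (real or fake) image.
D = nn.Sequential(nn.Linear(data_dim + n_classes, 128), nn.LeakyReLU(0.2),
                  nn.Linear(128, 1), nn.Sigmoid())

labels = torch.randint(0, n_classes, (16,))
y = F.one_hot(labels, n_classes).float()   # (16, 10)
z = torch.randn(16, latent_dim)

fake = G(torch.cat([z, y], dim=1))         # class-conditional sample
p = D(torch.cat([fake, y], dim=1))         # label-aware real/fake score
```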

DCGAN

[paper]

DCGAN - Deep Convolutional GAN - is an updated version of the GAN presented in 2015. It replaces fully connected layers with convolutional layers and uses batch normalization and leaky ReLU activations.


Fig.2: DCGAN generator used for LSUN scene modeling
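A toy version of the DCGAN layer pattern might look like this. The channel counts and the 16x16 output are deliberately small for the sketch; the paper's LSUN generator is much larger and outputs 64x64 images.

```python
import torch
import torch.nn as nn

# Toy DCGAN-style generator: transposed convolutions with batch norm.
G = nn.Sequential(
    nn.ConvTranspose2d(100, 128, 4, 1, 0, bias=False),  # 1x1 -> 4x4
    nn.BatchNorm2d(128), nn.ReLU(),
    nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),   # 4x4 -> 8x8
    nn.BatchNorm2d(64), nn.ReLU(),
    nn.ConvTranspose2d(64, 3, 4, 2, 1, bias=False),     # 8x8 -> 16x16
    nn.Tanh(),
)

# The discriminator mirrors it with strided convolutions and leaky ReLU.
D = nn.Sequential(
    nn.Conv2d(3, 64, 4, 2, 1, bias=False), nn.LeakyReLU(0.2),
    nn.Conv2d(64, 128, 4, 2, 1, bias=False),
    nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
    nn.Conv2d(128, 1, 4, 1, 0, bias=False), nn.Sigmoid(),
)

z = torch.randn(8, 100, 1, 1)
img = G(z)          # (8, 3, 16, 16) generated images
score = D(img)      # (8, 1, 1, 1) real/fake scores
```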

LSGAN

[paper]

GANs suffer from unstable training. LSGAN - Least Squares GAN - was introduced to improve training stability by replacing the binary cross-entropy loss with a least-squares loss.
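The least-squares objective can be sketched as below, using the common target choice of 0 for fake, 1 for real, and 1 for the generator's target; the example discriminator scores are made up.

```python
import torch

# LSGAN discriminator loss: push scores on real data toward 1
# and scores on fakes toward 0, under a squared-error penalty.
def d_loss_ls(d_real, d_fake):
    return 0.5 * ((d_real - 1) ** 2).mean() + 0.5 * (d_fake ** 2).mean()

# LSGAN generator loss: push the discriminator's score on fakes toward 1.
def g_loss_ls(d_fake):
    return 0.5 * ((d_fake - 1) ** 2).mean()

d_real = torch.full((16, 1), 0.9)   # hypothetical scores on real images
d_fake = torch.full((16, 1), 0.1)   # hypothetical scores on fakes

d_loss = d_loss_ls(d_real, d_fake)  # ≈ 0.01
g_loss = g_loss_ls(d_fake)          # ≈ 0.405
```

Unlike the sigmoid cross-entropy loss, this objective penalizes samples even when they are on the correct side of the decision boundary, which is what yields smoother gradients for the generator.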

CycleGAN

[paper]

Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image. Unlike approaches that require a training set of aligned image pairs, CycleGAN learns this mapping from unpaired examples by training two generators, one for each direction, with adversarial losses plus a cycle-consistency loss.
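The cycle-consistency idea can be sketched with tiny stand-in "generators": translating forward and then back should recover the original input in either direction.

```python
import torch
import torch.nn as nn

# Stand-in generators for the two translation directions
# (real CycleGAN generators are deep convolutional networks).
G = nn.Linear(8, 8)    # G: X -> Y
F_ = nn.Linear(8, 8)   # F: Y -> X

x = torch.randn(4, 8)  # batch from domain X
y = torch.randn(4, 8)  # batch from domain Y

l1 = nn.L1Loss()
# ||F(G(x)) - x|| + ||G(F(y)) - y||: round trips should be identity-like.
cycle_loss = l1(F_(G(x)), x) + l1(G(F_(y)), y)
```

In the full model this term is added, with a weighting factor, to the adversarial losses of the two discriminators.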

Autoencoders

Data-driven Discovery of Nonlinear Dynamical Systems

[paper] [example]

Consider a dynamical system of the form

dx/dt = f(x(t)),

where x(t) denotes the state of the system at time t and the function f describes the evolution of the system. An additional term u(t) can represent external forcing or feedback control. The goal is to determine the function f and consequently discover the underlying dynamical system from data.

Applying the general form of a linear multistep method with M steps to the equation above, given measurements x_n = x(t_n) of the state from t_0 to t_N with step size Δt, yields

Σ_{m=0}^{M} α_m x_{n-m} = Δt Σ_{m=0}^{M} β_m f(x_{n-m}),  n = M, ..., N.

The function f is approximated by a neural network, which is trained so that the residual of the equation above (left-hand side minus right-hand side) approaches zero.

When M = 1, α_0 = 1, α_1 = -1, and β_0 = β_1 = 1/2, the equation reduces to the trapezoidal rule:

x_n - x_{n-1} = (Δt/2) (f(x_n) + f(x_{n-1})).
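The trapezoidal-rule residual can be sketched as a training loss for a small network standing in for the unknown dynamics f. The state dimension, step size, and data here are toy placeholders, not values from the paper.

```python
import torch
import torch.nn as nn

# Small network f_theta approximating the unknown dynamics f.
f = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))

# x holds measurements x_0, ..., x_N of a 2-dimensional state (toy data).
N, dt = 50, 0.01
x = torch.randn(N + 1, 2)

# Trapezoidal residual: x_n - x_{n-1} - dt/2 * (f(x_n) + f(x_{n-1})),
# evaluated for n = 1, ..., N via slicing.
fx = f(x)
residual = x[1:] - x[:-1] - 0.5 * dt * (fx[1:] + fx[:-1])
loss = (residual ** 2).mean()   # train f so this approaches zero
loss.backward()                 # gradients for an optimizer step
```

A higher-order multistep scheme only changes how many shifted copies of x and f(x) enter the residual; the training loop is otherwise the same.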