asperti/BalancingVAE

This repository contains an implementation of Variational Autoencoders with a new balancing strategy between the reconstruction error and the KL-divergence in the loss function.

Specifically, we enforce a constant balance between these two components by normalizing the reconstruction error with an estimate of its current value, computed over minibatches.
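
The following is a minimal sketch of this idea, not code from the repository: the function names, the moving-average estimator, and the momentum value are our own illustrative assumptions.

```python
import numpy as np

def balanced_vae_loss(x, x_rec, mu, logvar, mse_estimate):
    """Loss with a constant balance between reconstruction and KL terms.

    The reconstruction error is divided by an estimate of its own current
    value, so the ratio between the two components stays (approximately)
    constant during training.
    """
    # Per-sample squared reconstruction error, averaged over the minibatch.
    mse = np.mean(np.sum((x - x_rec) ** 2, axis=-1))
    # KL divergence between q(z|x) = N(mu, exp(logvar)) and N(0, I).
    kl = -0.5 * np.mean(np.sum(1.0 + logvar - mu ** 2 - np.exp(logvar), axis=-1))
    # The estimate is treated as a constant: no gradient flows through it.
    return mse / mse_estimate + kl

def update_mse_estimate(mse_estimate, minibatch_mse, momentum=0.99):
    # One possible estimator (hypothetical): an exponential moving
    # average of the reconstruction error over minibatches.
    return momentum * mse_estimate + (1.0 - momentum) * minibatch_mse
```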

We derived this technique from an investigation of the loss function used by Dai and Wipf for their Two-Stage VAE, where the balancing parameter was instead learned during training.
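
For context, the loss studied by Dai and Wipf has roughly the following form (a sketch based on our reading of their paper, up to additive constants, not a verbatim transcription), where the balancing parameter $\gamma$ is learned during training and $d$ is the dimension of $x$:

$$\mathcal{L}(x) = \mathbb{E}_{q(z|x)}\!\left[\frac{\lVert x - \hat{x}(z)\rVert^2}{2\gamma^2}\right] + \frac{d}{2}\log\gamma^2 + \mathrm{KL}\big(q(z|x)\,\Vert\,\mathcal{N}(0,I)\big)$$

Replacing the learned $\gamma^2$ with a minibatch estimate of the current reconstruction error yields the constant balance described above.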

Our technique seems to outperform previous variational approaches, allowing us to obtain unprecedented FID scores on traditional datasets such as CIFAR-10 and CelebA.

The code is largely based on Dai and Wipf's code at https://github.com/daib13/TwoStageVAE

The code for computing the FID is a minor adaptation of the code at https://github.com/tsc2017/Frechet-Inception-Distance
