This repository provides an example of using Batch-Instance Normalization (NIPS 2018) for classification on CIFAR-10/100, written by Hyeonseob Nam and Hyo-Eun Kim at Lunit Inc.
Acknowledgement: This code is based on Wei Yang's pytorch-classification.
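Batch-Instance Normalization mixes a batch-normalized and an instance-normalized version of each feature map using a learnable, per-channel gate ρ kept in [0, 1], letting the network decide how much style information to keep in each channel. Below is a minimal, illustrative PyTorch sketch of that idea (class and attribute names are my own, not the repository's implementation; the paper clips ρ after each optimizer step, which the clamp below only approximates):

```python
import torch
import torch.nn as nn

class BatchInstanceNorm2d(nn.Module):
    """Sketch of Batch-Instance Normalization: mix BN and IN outputs with a learnable gate rho."""

    def __init__(self, num_features, eps=1e-5, momentum=0.1):
        super().__init__()
        # Affine transforms are disabled inside BN/IN; a single shared affine transform
        # (weight, bias) is applied after the two normalizations are mixed.
        self.bn = nn.BatchNorm2d(num_features, eps=eps, momentum=momentum, affine=False)
        self.inst = nn.InstanceNorm2d(num_features, eps=eps, affine=False)
        self.rho = nn.Parameter(torch.ones(1, num_features, 1, 1))    # per-channel gate
        self.weight = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_features, 1, 1))

    def forward(self, x):
        # The paper constrains rho to [0, 1]; clamping here is a simple way to enforce that.
        rho = self.rho.clamp(0, 1)
        out = rho * self.bn(x) + (1 - rho) * self.inst(x)
        return out * self.weight + self.bias


# Example: a BIN layer over 64 channels on a CIFAR-sized batch.
layer = BatchInstanceNorm2d(64)
y = layer(torch.randn(8, 64, 32, 32))
```

When ρ is close to 1 the layer behaves like batch normalization, and when ρ is close to 0 it behaves like instance normalization, which is what allows the network to become adaptively style-invariant.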
If you use this code for your research, please cite:
@inproceedings{nam2018batch,
  title={Batch-Instance Normalization for Adaptively Style-Invariant Neural Networks},
  author={Nam, Hyeonseob and Kim, Hyo-Eun},
  booktitle={Advances in Neural Information Processing Systems},
  year={2018}
}
Prerequisites:
- PyTorch 0.4.0+
- Python 3.5+
- CUDA 8.0+
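A quick sanity check that the environment meets these requirements (a small sketch; it only prints versions and CUDA availability, and does not verify the installed CUDA toolkit version itself):

```python
import sys
import torch

# Print the versions relevant to the prerequisites above.
print("Python:", sys.version.split()[0])              # expect 3.5 or newer
print("PyTorch:", torch.__version__)                  # expect 0.4.0 or newer
print("CUDA available:", torch.cuda.is_available())   # needs CUDA 8.0+ and a compatible GPU driver
```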
Training ResNet-50 on CIFAR-100:

- Batch Normalization (BN):
  `python main.py --dataset cifar100 --depth 50 --norm bn --checkpoint checkpoints/cifar100-resnet50-bn`
- Instance Normalization (IN):
  `python main.py --dataset cifar100 --depth 50 --norm in --checkpoint checkpoints/cifar100-resnet50-in`
- Batch-Instance Normalization (BIN):
  `python main.py --dataset cifar100 --depth 50 --norm bin --checkpoint checkpoints/cifar100-resnet50-bin`
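The `--norm` flag selects which normalization layer the network is built with. The following is a hypothetical sketch of how such a flag could be mapped to a layer constructor when assembling the ResNet (the helper name and wiring are assumptions, not the repository's actual `main.py` or model code; `BatchInstanceNorm2d` refers to the illustrative module sketched near the top of this README):

```python
import torch.nn as nn

def make_norm_layer(norm, num_features):
    """Hypothetical helper: map the --norm argument to a 2D normalization layer."""
    if norm == 'bn':
        return nn.BatchNorm2d(num_features)
    if norm == 'in':
        return nn.InstanceNorm2d(num_features, affine=True)
    if norm == 'bin':
        # Illustrative Batch-Instance Normalization module sketched earlier.
        return BatchInstanceNorm2d(num_features)
    raise ValueError('unknown normalization type: {}'.format(norm))
```

In such a setup, each convolutional block would call `make_norm_layer(args.norm, planes)` in place of a hard-coded `nn.BatchNorm2d`, so the same ResNet definition can be trained with BN, IN, or BIN.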
Experiments reported in the paper:
- Classification on CIFAR-10/100 (ResNet-110) and ImageNet (ResNet-18)
- Classification on CIFAR-100 with different architectures
- Mixed-domain classification on Office-Home (ResNet-18)

Related resources:
- Lunit tech blog post (Korean)
- TensorFlow implementation by @taki0112