Developing perceptual distance metrics is a major topic in recent image processing research. LPIPS [1] is a state-of-the-art perceptual metric based on human similarity judgments. The official implementation is not only publicly available as a metric, but also enables users to train new metrics themselves. As a result, it is less simple than it could be, carrying high-level wrapping that is only needed for training. This repository provides an alternative simple and useful implementation of LPIPS. Its outputs are identical to the original's because the weights are converted from the original ones.
This implementation requires:

- torch >= 1.3
- torchvision >= 0.4
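If your environment does not already satisfy these, compatible versions can be installed with pip; the pins below simply mirror the minimums above:

```shell
$ pip install "torch>=1.3" "torchvision>=0.4"
```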
Example:

```python
from lpips_pytorch import LPIPS, lpips

# define as a criterion module (recommended)
criterion = LPIPS(
    net_type='alex',  # choose a network type from ['alex', 'squeeze', 'vgg']
    version='0.1'  # currently, v0.1 is supported
)
loss = criterion(x, y)

# functional call
loss = lpips(x, y, net_type='alex', version='0.1')
```
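Since `x` and `y` above are placeholders, here is a self-contained sketch with random inputs; the `(N, 3, H, W)` shapes are illustrative, and scaling to [-1, 1] follows the usual LPIPS input convention:

```python
import torch

from lpips_pytorch import LPIPS

# two batches of RGB images in (N, 3, H, W) layout,
# scaled to [-1, 1] as LPIPS conventionally expects
x = torch.rand(4, 3, 64, 64) * 2 - 1
y = torch.rand(4, 3, 64, 64) * 2 - 1

criterion = LPIPS(net_type='alex', version='0.1')
with torch.no_grad():
    distance = criterion(x, y)

print(distance)  # smaller values mean perceptually more similar inputs
```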
- Clone this repository and move the `lpips_pytorch` package into your project:

  ```shell
  ~ $ git clone https://github.com/S-aiueo32/lpips-pytorch.git
  ~ $ mv lpips-pytorch/lpips_pytorch <YOUR_PROJECT>
  ```
- Install via `pip`:

  ```shell
  $ pip install git+https://github.com/S-aiueo32/lpips-pytorch.git
  ```
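Either way, a quick smoke test such as the one below (the tensor shape is arbitrary) confirms that the package imports and runs:

```shell
$ python -c "import torch; from lpips_pytorch import lpips; print(lpips(torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64), net_type='alex', version='0.1'))"
```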
This project is licensed under the BSD 2-Clause "Simplified" License.
[1] Zhang, Richard, et al. "The Unreasonable Effectiveness of Deep Features as a Perceptual Metric." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2018.
This project directly uses the original weights; many thanks to the authors.