Myzhencai/METASRandPESR

This repo merges METASR and PESR for the TianCHI competition.


PESR

Official implementation of Perception-Enhanced Image Super-Resolution via Relativistic Generative Adversarial Networks (ECCV Workshops 2018)

Citation

Please cite our project if it is helpful for your research:

@InProceedings{Vu_2018_ECCV_Workshops,
author = {Vu, Thang and Luu, Tung M. and Yoo, Chang D.},
title = {Perception-Enhanced Image Super-Resolution via Relativistic Generative Adversarial Networks},
booktitle = {The European Conference on Computer Vision (ECCV) Workshops},
month = {September},
year = {2018}
}

(Figure: PSNR vs PESR)

Dependencies

  • Nvidia GPUs (training takes 1 day on 4 Titan Xp GPUs)
  • At least 16 GB of RAM
  • Python 3
  • PyTorch 0.4
  • tensorboardX
  • tqdm
  • imageio
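
A quick sanity check of the environment before training or testing (a minimal sketch; it only assumes PyTorch is installed as listed above):

    import torch

    # Mirrors the dependency list above; training assumes Nvidia GPUs.
    print("PyTorch version:", torch.__version__)          # expected: 0.4.x
    print("CUDA available:", torch.cuda.is_available())
    print("GPU count:", torch.cuda.device_count())        # the reported setup used 4 Titan Xp GPUs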

Datasets, models, and results

Dataset

  • Train: DIV2K (800 2K-resolution images)
  • Valid (for visualization): DIV2K (100 val images), PIRM (100 self-val images)
  • Test: Set5, Set14, B100, Urban100, PIRM (100 self-val images), DIV2K (100 val images)
  • Download the train+val+test datasets
  • Download the test-only dataset

Pretrained models

  • Download the pretrained models, including one PSNR-optimized model and one perception-optimized model

Paper results

Quick start

  • Download the test dataset and put it into the data/origin/ directory
  • Download the pretrained models and put them into the check_point/ directory
  • Run python test.py --dataset <DATASET_NAME> (a batch-run sketch over all test sets is shown below)
  • Results will be saved into the results/ directory
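
A convenience sketch for batch-running the command above over all listed test sets. It only wraps the documented python test.py --dataset <DATASET_NAME> call; the exact dataset-name strings expected by test.py are an assumption taken from the Test list above.

    import subprocess

    # Adjust the names if test.py expects different spellings.
    for dataset in ["Set5", "Set14", "B100", "Urban100", "PIRM", "DIV2K"]:
        subprocess.run(["python", "test.py", "--dataset", dataset], check=True)
    # Each call saves its outputs into the results/ directory.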

Training

  • Download the train+val+test dataset and put it into the data/origin/ directory
  • Pretrain with L1 loss: python train.py --phase pretrain --learning_rate 1e-4
  • Finetune on the pretrained model with GAN: python train.py (both phases are chained in the sketch below)
  • Models will be saved into the check_point/ directory
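
The two phases above can be chained in a small driver script. This is only a sketch around the documented commands; it assumes train.py finds the pretrained weights in check_point/ on its own, as the steps above describe.

    import subprocess

    # Phase 1: pretrain with L1 loss.
    subprocess.run(
        ["python", "train.py", "--phase", "pretrain", "--learning_rate", "1e-4"],
        check=True,
    )

    # Phase 2: finetune the pretrained model with the relativistic GAN objective.
    subprocess.run(["python", "train.py"], check=True)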

Visualization

  • Start tensorboard: tensorboard --logdir check_point
  • Open YOUR_IP:6006 in your web browser.
  • The Tensorboard output when finetuning on the pretrained model should look similar to:

(Screenshots: Tensorboard, Tensorboard_imgs)
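
For reference, the curves in those screenshots are produced with the tensorboardX dependency listed above. A minimal logging sketch (the tag name is illustrative, not one that train.py actually uses):

    from tensorboardX import SummaryWriter

    # Write into the same directory that is passed to --logdir above.
    writer = SummaryWriter("check_point")
    for step in range(100):
        writer.add_scalar("loss/example", 1.0 / (step + 1), step)  # dummy scalar
    writer.close()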

Comprehensive testing

  • Test perceptual model: follow Quick start
  • Interpolate between the perceptual model and the PSNR model: python test.py --dataset <DATASET> --alpha <ALPHA> (with alpha being the perceptual weight; see the interpolation sketch below)
  • Test perceptual quality: refer to PIRM validation code
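
A common way to realize this kind of interpolation is a parameter-wise blend of the two checkpoints. The sketch below is illustrative only: it is not the project's test.py code, and the checkpoint file names and their state-dict format are assumptions.

    import torch

    alpha = 0.8  # perceptual weight, as passed to --alpha above

    # Hypothetical file names; use the files actually downloaded into check_point/.
    psnr_params = torch.load("check_point/psnr_model.pt", map_location="cpu")
    perc_params = torch.load("check_point/perceptual_model.pt", map_location="cpu")

    # alpha = 1 recovers the perceptual model, alpha = 0 the PSNR model.
    blended = {name: (1.0 - alpha) * psnr_params[name] + alpha * perc_params[name]
               for name in psnr_params}
    torch.save(blended, "check_point/interpolated_model.pt")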

Quantitative and Qualitative results

RED and BLUE indicate the best and second-best results, respectively.
