(QoL improvements for) DLAS - A configuration-driven trainer for generative models

MMSR

MMSR is an open source image and video super-resolution toolbox based on PyTorch. It is a part of the open-mmlab project developed by Multimedia Laboratory, CUHK. MMSR is based on our previous projects: BasicSR, ESRGAN, and EDVR.

Highlights

  • A unified framework suitable for image and video super-resolution tasks. It is also easy to adapt to other restoration tasks, e.g., deblurring, denoising, etc.
  • State of the art: It includes several methods that won recent competitions, such as ESRGAN (PIRM18) and EDVR (NTIRE19).
  • Easy to extend: It is easy to try new research ideas based on the code base.

Updates

[2019-07-25] MMSR v0.1 is released.

Dependencies and Installation

Dataset Preparation

We store datasets in LMDB format for faster I/O. Please refer to DATASETS.md for more details.

Training and Testing

Please see the wiki page Training and Testing for the basic usage, i.e., training and testing.

Model Zoo and Baselines

Results and pre-trained models are available in the wiki page Model Zoo.

Contributing

We appreciate all contributions. Please refer to mmdetection for the contributing guidelines.

Python code style
We adopt PEP8 as the preferred code style. We use flake8 as the linter and yapf as the formatter. Please upgrade yapf to the latest version (>=0.27.0) and refer to the yapf and flake8 configuration files.

Before you create a PR, make sure that your code passes the linter and is formatted with yapf.

License

This project is released under the Apache 2.0 license.