The concept here is to use switching to split the generator into two functions: interpretation and transformation. Transformation is done at the pixel level by relatively simple conv layers, while interpretation is computed at various levels by far more complicated conv stacks. The two are merged using the switching mechanism. This architecture is far less computationally intensive than RRDB.
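Below is a minimal sketch of that split, assuming a PyTorch-style module. The class name, branch count, and layer widths are illustrative and are not the repository's actual implementation: a few cheap conv branches transform the pixels directly, a deeper conv stack interprets the image, and its softmax output switches between the branch outputs per pixel.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SwitchedGenerator(nn.Module):
    """Illustrative sketch of the interpretation/transformation split.

    Lightweight conv branches do the pixel-level transformation; a deeper
    conv stack interprets the image and emits per-pixel switching weights
    that merge the branch outputs. All names and sizes are hypothetical.
    """

    def __init__(self, channels=3, num_transforms=4, interp_width=64, interp_depth=6):
        super().__init__()
        # Transformation: cheap, pixel-level conv branches.
        self.transforms = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, 32, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(32, channels, 3, padding=1),
            )
            for _ in range(num_transforms)
        ])
        # Interpretation: a deeper stack that decides, per pixel, which
        # transformation(s) should apply.
        layers = [nn.Conv2d(channels, interp_width, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(interp_depth - 1):
            layers += [nn.Conv2d(interp_width, interp_width, 3, padding=1), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(interp_width, num_transforms, 1)]
        self.interpreter = nn.Sequential(*layers)

    def forward(self, x):
        # Candidate transformations: (B, T, C, H, W).
        candidates = torch.stack([t(x) for t in self.transforms], dim=1)
        # Switching weights: softmax over the T branches, one weight per pixel.
        weights = F.softmax(self.interpreter(x), dim=1).unsqueeze(2)  # (B, T, 1, H, W)
        # Merge: weighted sum of the candidates, with a residual connection.
        return x + (candidates * weights).sum(dim=1)


if __name__ == "__main__":
    net = SwitchedGenerator()
    out = net(torch.randn(1, 3, 64, 64))
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

The compute saving relative to a dense stack like RRDB comes from keeping the per-pixel transformation branches shallow; only the interpretation path is deep, and it outputs just a small switching map rather than full feature tensors.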
archs
__init__.py
base_model.py
loss.py
lr_scheduler.py
networks.py
SR_model.py
SRGAN_model.py