DL-Art-School/codes/models/archs
James Betker (eb11a08d1c): Enable disjoint feature networks
This is done by pre-training a feature net that predicts the features
of HR images from LR images. The original feature network and this new
one are then used in tandem, operating only on LR/Gen images.
2020-07-31 16:29:47 -06:00
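
Below is a minimal PyTorch sketch of the idea described in that commit message, not the repo's actual implementation (which lives in feature_arch.py). The class names VGGFeatureExtractor, LRFeaturePredictor, and both loss helpers are hypothetical; the VGG cutoff layer, channel counts, and downsampling factors are illustrative assumptions for a 4x super-resolution setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights


class VGGFeatureExtractor(nn.Module):
    """Frozen VGG19 trunk standing in for the 'original' feature network.
    The cutoff index (through conv5_4, before its ReLU) is an illustrative choice."""
    def __init__(self, last_layer=35):
        super().__init__()
        self.features = vgg19(weights=VGG19_Weights.IMAGENET1K_V1).features[:last_layer]
        for p in self.features.parameters():
            p.requires_grad = False

    def forward(self, x):
        # In practice the input would be ImageNet-normalized first.
        return self.features(x)


class LRFeaturePredictor(nn.Module):
    """Hypothetical 'disjoint' feature net: predicts HR-image VGG features
    directly from the LR input. Two stride-2 convs take an LR image (1/4 of
    HR resolution) down to 1/16 of HR resolution, matching the VGG feature map."""
    def __init__(self, in_ch=3, feat_ch=512):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, feat_ch, 3, padding=1),
        )

    def forward(self, lr):
        return self.body(lr)


def pretrain_loss(predictor, vgg_feat, lr, hr):
    """Stage 1: teach the predictor to reproduce the frozen VGG features
    of the ground-truth HR image from the LR image alone."""
    with torch.no_grad():
        target = vgg_feat(hr)
    return F.l1_loss(predictor(lr), target)


def disjoint_perceptual_loss(predictor, vgg_feat, lr, gen):
    """Stage 2: perceptual loss that touches only the LR and generated images.
    The (now frozen) predictor supplies the HR-like feature target from LR."""
    with torch.no_grad():
        target = predictor(lr)
    return F.l1_loss(vgg_feat(gen), target)
```

Used this way, the stage-2 loss never extracts features from the HR ground truth directly: the pre-trained predictor estimates them from the LR input, and the original VGG network only ever sees generator output.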
__init__.py mmsr 2019-08-23 21:42:47 +08:00
arch_util.py Huge set of mods to support progressive generator growth 2020-07-18 14:18:48 -06:00
AttentionResnet.py Add attention resnet 2020-05-29 20:02:10 -06:00
discriminator_vgg_arch.py Fix SRG4 & switch disc 2020-07-25 17:16:54 -06:00
DiscriminatorResnet_arch_passthrough.py Allow passthrough discriminator to have passthrough disabled from config 2020-05-19 09:41:16 -06:00
DiscriminatorResnet_arch.py Fixup upconv for the next attempt! 2020-05-01 19:56:14 -06:00
feature_arch.py Enable disjoint feature networks 2020-07-31 16:29:47 -06:00
FlatProcessorNet_arch.py Add more batch norms to FlatProcessorNet_arch 2020-04-30 11:47:21 -06:00
FlatProcessorNetNew_arch.py Full resnet corrupt, no BN 2020-04-30 19:17:30 -06:00
HighToLowResNet.py Misc changes 2020-04-28 11:50:16 -06:00
NestedSwitchGenerator.py Move ExpansionBlock to arch_util 2020-07-10 15:53:41 -06:00
ProgressiveSrg_arch.py Add a way to disable grad on portions of the generator graph to save memory 2020-07-22 11:40:42 -06:00
ResGen_arch.py More NSG improvements (v3) 2020-06-29 20:26:51 -06:00
RRDBNet_arch.py Remove RRDB with switching 2020-07-01 12:08:32 -06:00
spinenet_arch.py SRG3 work 2020-07-07 13:46:40 -06:00
SRG1_arch_new.py Remove all biases from generator 2020-07-04 22:19:55 -06:00
SRG1_arch.py Move ExpansionBlock to arch_util 2020-07-10 15:53:41 -06:00
SRResNet_arch.py mmsr 2019-08-23 21:42:47 +08:00
SwitchedResidualGenerator_arch.py Fix SRG4 & switch disc 2020-07-25 17:16:54 -06:00