DL-Art-School/codes/models/archs
Latest commit: 42a10b34ce by James Betker, 2020-06-24 21:15:17 -06:00
Re-enable batch norm on switch processing blocks

Found that the switches initialize really poorly without batch norm - they
end up not using a significant number of transforms. Might be a good time to
reconsider using the attention norm, but for now, just re-enable batch norm.
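
For context, here is a minimal sketch of the structure the commit message refers to: a "switch" that holds several candidate transforms (processing blocks) and an attention-style selector that softly mixes their outputs, with batch norm toggleable inside each block. This is an illustrative assumption, not the repository's code - the real logic lives in SwitchedResidualGenerator_arch.py, and the names ProcessingBlock, SwitchSketch, num_transforms, and use_bn are invented here.

```python
# Illustrative sketch only - the actual implementation lives in
# SwitchedResidualGenerator_arch.py. ProcessingBlock, SwitchSketch,
# num_transforms, and use_bn are hypothetical names, not the repo's API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProcessingBlock(nn.Module):
    """One candidate transform inside a switch, with toggleable batch norm."""

    def __init__(self, channels, use_bn=True):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        # This is the normalization the commit re-enables; per the commit
        # message, the switches initialized poorly without it.
        self.norm = nn.BatchNorm2d(channels) if use_bn else nn.Identity()

    def forward(self, x):
        return F.relu(self.norm(self.conv(x)))


class SwitchSketch(nn.Module):
    """Softly selects among several processing blocks at each spatial location."""

    def __init__(self, channels, num_transforms=8, use_bn=True):
        super().__init__()
        self.transforms = nn.ModuleList(
            [ProcessingBlock(channels, use_bn) for _ in range(num_transforms)])
        # Attention head: one selection logit per transform per pixel.
        self.selector = nn.Conv2d(channels, num_transforms, kernel_size=1)

    def forward(self, x):
        attn = torch.softmax(self.selector(x), dim=1)               # (B, T, H, W)
        outs = torch.stack([t(x) for t in self.transforms], dim=1)  # (B, T, C, H, W)
        return (attn.unsqueeze(2) * outs).sum(dim=1)                # (B, C, H, W)


if __name__ == "__main__":
    # Diagnostic for the symptom described in the commit: at init, the mean
    # selection weight should be spread across transforms rather than collapse
    # onto one or two ("not using a significant number of transforms").
    switch = SwitchSketch(channels=16)
    x = torch.randn(4, 16, 32, 32)
    with torch.no_grad():
        usage = torch.softmax(switch.selector(x), dim=1).mean(dim=(0, 2, 3))
    print(usage)  # near-uniform across the 8 transforms indicates a healthy init
```
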
File                                       Last commit message                                                        Date
__init__.py
arch_util.py                               Fix initialization in mhead switched rrdb                                  2020-06-15 21:32:03 -06:00
AttentionResnet.py                         Add attention resnet                                                       2020-05-29 20:02:10 -06:00
discriminator_vgg_arch.py                  Add capability to place additional conv into discriminator                2020-06-23 09:40:33 -06:00
DiscriminatorResnet_arch_passthrough.py    Allow passthrough discriminator to have passthrough disabled from config  2020-05-19 09:41:16 -06:00
DiscriminatorResnet_arch.py                Fixup upconv for the next attempt!                                         2020-05-01 19:56:14 -06:00
feature_arch.py                            Fix process_video bugs                                                     2020-05-29 12:47:22 -06:00
FlatProcessorNet_arch.py                   Add more batch norms to FlatProcessorNet_arch                              2020-04-30 11:47:21 -06:00
FlatProcessorNetNew_arch.py                Full resnet corrupt, no BN                                                 2020-04-30 19:17:30 -06:00
HighToLowResNet.py                         Misc changes                                                               2020-04-28 11:50:16 -06:00
ResGen_arch.py                             Apply fixes to resgen                                                      2020-05-24 07:43:23 -06:00
RRDBNet_arch.py                            Fix initialization in mhead switched rrdb                                  2020-06-15 21:32:03 -06:00
SRResNet_arch.py
SwitchedResidualGenerator_arch.py          Re-enable batch norm on switch processing blocks                           2020-06-24 21:15:17 -06:00