DL-Art-School/codes/models/archs
Latest commit: 42a10b34ce by James Betker, 2020-06-24 21:15:17 -06:00
Re-enable batch norm on switch processing blocks

Found out that disabling batch norm caused the switches to initialize very
poorly: they ended up using only an insignificant number of transforms. It
might be a good time to reconsider the attention norm, but for now batch
norm is simply re-enabled.
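For context, here is a minimal sketch of what a switch processing block with
batch norm re-enabled might look like. The class and argument names below are
hypothetical illustrations, not the actual code in
SwitchedResidualGenerator_arch.py; the commit does not spell out the block's
exact structure.

import torch
import torch.nn as nn

class SwitchProcessingBlock(nn.Module):
    # Hypothetical processing block: conv -> batch norm -> ReLU.
    # The use_bn flag mirrors the toggle the commit re-enables.
    def __init__(self, channels, use_bn=True):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        # With batch norm on, the activations feeding the switch stay well
        # scaled at init, which (per the commit message) helps the switch
        # spread its selections across more of the available transforms.
        self.norm = nn.BatchNorm2d(channels) if use_bn else nn.Identity()
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.norm(self.conv(x)))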
__init__.py
arch_util.py
AttentionResnet.py
discriminator_vgg_arch.py
DiscriminatorResnet_arch.py
DiscriminatorResnet_arch_passthrough.py
feature_arch.py
FlatProcessorNet_arch.py
FlatProcessorNetNew_arch.py
HighToLowResNet.py
ResGen_arch.py
RRDBNet_arch.py
SRResNet_arch.py
SwitchedResidualGenerator_arch.py