10f7e49214
ReLU produced good performance gains over LeakyReLU, but GAN performance degraded significantly. Try SiLU as an alternative to see whether it's the leakiness we are looking for or the smooth activation curvature.
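A minimal sketch of the experiment this commit describes, assuming a PyTorch residual block (the names `make_act` and `ResBlock` are hypothetical, not code from this repo): the block is parameterized by activation so ReLU, LeakyReLU, and SiLU can be swapped in and compared directly. LeakyReLU isolates the "leakiness" hypothesis (nonzero negative slope), while SiLU isolates the smooth-curvature hypothesis (it is smooth everywhere but has effectively no leak for large negative inputs).

```python
import torch
import torch.nn as nn

def make_act(name: str) -> nn.Module:
    # ReLU: hard gate, zero negative slope, non-smooth at 0.
    # LeakyReLU: hard gate with a small negative slope ("leakiness").
    # SiLU (x * sigmoid(x)): smooth curvature, but saturates to ~0
    #   for large negative inputs, so effectively no leak there.
    return {
        "relu": nn.ReLU(inplace=True),
        "lrelu": nn.LeakyReLU(0.2, inplace=True),
        "silu": nn.SiLU(inplace=True),
    }[name]

class ResBlock(nn.Module):
    # A generic conv-act-conv residual block; only the activation varies
    # between experiment runs.
    def __init__(self, nf: int = 64, act: str = "silu"):
        super().__init__()
        self.conv1 = nn.Conv2d(nf, nf, 3, padding=1)
        self.conv2 = nn.Conv2d(nf, nf, 3, padding=1)
        self.act = make_act(act)

    def forward(self, x):
        return x + self.conv2(self.act(self.conv1(x)))
```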
__init__.py
arch_util.py
AttentionResnet.py
discriminator_vgg_arch.py
DiscriminatorResnet_arch_passthrough.py
DiscriminatorResnet_arch.py
feature_arch.py
FlatProcessorNet_arch.py
FlatProcessorNetNew_arch.py
HighToLowResNet.py
NestedSwitchGenerator.py
ResGen_arch.py
RRDBNet_arch.py
spinenet_arch.py
SRG1_arch_new.py
SRG1_arch.py
SRResNet_arch.py
SwitchedResidualGenerator_arch.py