DL-Art-School/codes/models
James Betker 4328c2f713 Change default ReLU slope to .2 BREAKS COMPATIBILITY
This brings my ConvGnLelu implementation into line with the generally accepted negative_slope=.2; I have no idea where I got .1. This will break backwards compatibility with some older models, but it will likely improve their performance when they are freshly trained. I audited which models those might be, and I am not actively using any of them, so this is probably OK. A sketch of what the change looks like follows below.
2020-12-19 08:28:03 -07:00
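A minimal sketch of the kind of block this commit touches, assuming ConvGnLelu is a Conv2d -> GroupNorm -> LeakyReLU unit; the class name matches arch_util.py, but the constructor arguments shown here are illustrative and the real signature has more options:

```python
import torch.nn as nn

class ConvGnLelu(nn.Module):
    """Illustrative Conv2d -> GroupNorm -> LeakyReLU block.

    The point of this sketch is only the default negative_slope, which the
    commit changes from 0.1 to the conventional 0.2. The actual class in
    arch_util.py carries additional options (bias, norm/activation toggles,
    weight init, etc.).
    """
    def __init__(self, in_channels, out_channels, kernel_size=3,
                 num_groups=8, negative_slope=0.2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2)
        self.gn = nn.GroupNorm(num_groups, out_channels)
        # Previously defaulted to 0.1; 0.2 is the value commonly used for
        # LeakyReLU in SR/GAN architectures.
        self.lelu = nn.LeakyReLU(negative_slope=negative_slope, inplace=True)

    def forward(self, x):
        return self.lelu(self.gn(self.conv(x)))
```

Checkpoints trained against the old 0.1 default will see different activations when loaded into code using 0.2, which is why the change breaks compatibility and why affected models should be retrained rather than reused.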
byol
fixup_resnet
flownet2@db2b7899ea
glean
srflow
stylegan
switched_conv@cb520afd4d
tecogan
transformers
__init__.py
arch_util.py Change default ReLU slope to .2 BREAKS COMPATIBILITY 2020-12-19 08:28:03 -07:00
discriminator_vgg_arch.py
feature_arch.py
ProgressiveSrg_arch.py
ResGen_arch.py
RRDBNet_arch.py
spinenet_arch.py
srg2_classic.py
SwitchedResidualGenerator_arch.py