forked from mrq/DL-Art-School
10f7e49214
ReLU produced good performance gains over LeakyReLU, but GAN performance degraded significantly. Try SiLU as an alternative to see whether it's the leakiness we are looking for or the smooth activation curvature.
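To make the experiment in this commit concrete, here is a minimal sketch of how a switchable activation might be wired up in a PyTorch codebase like this one. The `make_activation` helper, the `act_type` parameter, and `ConvBlock` are hypothetical illustrations, not the repo's actual API; the real networks live in `networks.py`.

```python
import torch
import torch.nn as nn

def make_activation(act_type: str = "silu") -> nn.Module:
    """Return an activation module by name (hypothetical helper).

    ReLU is cheap and sharp but zeroes all negative inputs; LeakyReLU keeps
    a small negative slope; SiLU (x * sigmoid(x)) is smooth and also passes
    some negative signal, which is the distinction this experiment isolates.
    """
    acts = {
        "relu": nn.ReLU(inplace=True),
        "lrelu": nn.LeakyReLU(negative_slope=0.2, inplace=True),
        "silu": nn.SiLU(inplace=True),
    }
    return acts[act_type]

class ConvBlock(nn.Module):
    """Conv + activation block whose nonlinearity is chosen by config."""
    def __init__(self, in_ch: int, out_ch: int, act_type: str = "silu"):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.act = make_activation(act_type)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.conv(x))

# Swap the activation without touching the rest of the architecture:
block = ConvBlock(64, 64, act_type="silu")
y = block(torch.randn(1, 64, 32, 32))
```

Keeping the swap behind a single factory means the generator and discriminator activations can be ablated independently, which is what the ReLU-vs-SiLU comparison described above requires.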
archs
__init__.py
base_model.py
loss.py
lr_scheduler.py
networks.py
SR_model.py
SRGAN_model.py