forked from mrq/DL-Art-School
10f7e49214
ReLU produced good performance gains over LeakyReLU, but GAN performance degraded significantly. Try SiLU as an alternative to determine whether it is the leakiness we need or the smooth activation curvature.
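The hypothesis above can be sketched numerically. The definitions below are the standard formulas for these activations (ReLU, LeakyReLU with a 0.01 negative slope, and SiLU = x · sigmoid(x)), not code from this repository; they show that SiLU, like LeakyReLU, passes a small negative signal, while also being smooth everywhere like neither of the others:

```python
import math

def relu(x):
    # Hard zero for negative inputs; kink at x = 0.
    return max(x, 0.0)

def leaky_relu(x, slope=0.01):
    # Small linear negative slope; still has a kink at x = 0.
    return x if x >= 0 else slope * x

def silu(x):
    # x * sigmoid(x): smooth curvature, slightly negative near zero.
    return x * (1.0 / (1.0 + math.exp(-x)))

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x={x:+.1f}  relu={relu(x):+.4f}  "
          f"leaky={leaky_relu(x):+.4f}  silu={silu(x):+.4f}")
```

Comparing GAN runs with these three should separate the two explanations: if LeakyReLU and SiLU both recover the lost GAN performance, leakiness is what matters; if only SiLU does, the smooth curvature is.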
.idea
data
data_scripts
metrics
models
options
scripts
temp
utils
distill_torchscript.py
onnx_inference.py
process_video.py
recover_tensorboard_log.py
requirements.txt
run_scripts.sh
test.py
train.py