DL-Art-School/codes
Latest commit: 42a10b34ce by James Betker (2020-06-24 21:15:17 -06:00)
Re-enable batch norm on switch processing blocks

Found out that disabling batch norm was causing the switches to initialize poorly, routing to only a small number of their transforms. This might be a good time to reconsider using the attention norm, but for now, just re-enable batch norm.
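The commit concerns switches that route among several transform branches, with a normalization layer inside each processing block influencing how evenly the router initializes. The repository's actual modules live under models/ and are not reproduced here; the following is only a minimal sketch of the idea, and every name in it (SwitchProcessingBlock, Switch, use_bn, num_transforms) is an assumption for illustration, not the repository's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchProcessingBlock(nn.Module):
    """One transform branch; batch norm is the piece being re-enabled."""
    def __init__(self, channels: int, use_bn: bool = True):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        # Re-enabled here: per the commit, without this norm the switches
        # were observed to initialize poorly, using only a few transforms.
        self.norm = nn.BatchNorm2d(channels) if use_bn else nn.Identity()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.relu(self.norm(self.conv(x)))

class Switch(nn.Module):
    """Soft-selects among several transforms per spatial location.

    If the selector's logits are badly scaled at initialization, the
    softmax saturates and nearly all weight collapses onto a handful of
    transforms, which is the failure mode the commit describes.
    """
    def __init__(self, channels: int, num_transforms: int = 8, use_bn: bool = True):
        super().__init__()
        self.transforms = nn.ModuleList(
            [SwitchProcessingBlock(channels, use_bn) for _ in range(num_transforms)])
        self.selector = nn.Conv2d(channels, num_transforms, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.selector(x), dim=1)            # (B, N, H, W)
        outs = torch.stack([t(x) for t in self.transforms], dim=1)  # (B, N, C, H, W)
        return (weights.unsqueeze(2) * outs).sum(dim=1)             # (B, C, H, W)
```

Under these assumptions, normalizing each branch's activations keeps the routing inputs on a consistent scale early in training, so the softmax stays spread across transforms instead of saturating at init.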
Name                    Last commit message                               Last commit date
.idea
data                    Add doResizeLoss to dataset                       2020-06-08 11:27:06 -06:00
data_scripts            Add simple resize to extract images               2020-06-23 09:39:51 -06:00
metrics
models                  Re-enable batch norm on switch processing blocks  2020-06-24 21:15:17 -06:00
options
scripts
temp
utils                   Add GPU mem tracing module                        2020-06-14 11:02:54 -06:00
distill_torchscript.py  Misc                                              2020-06-14 11:03:02 -06:00
onnx_inference.py       Misc                                              2020-06-14 11:03:02 -06:00
process_video.py
requirements.txt
run_scripts.sh
test.py                 Misc                                              2020-06-02 09:35:52 -06:00
train.py