DL-Art-School/codes/models
Latest commit: 42a10b34ce "Re-enable batch norm on switch processing blocks"
Author: James Betker, 2020-06-24 21:15:17 -06:00

    Found out that batch norm is causing the switches to init really poorly:
    not using a significant number of transforms. Might be a great time to
    reconsider using the attention norm, but for now just re-enable it.
archs/           Re-enable batch norm on switch processing blocks             2020-06-24 21:15:17 -06:00
__init__.py
base_model.py    Fix inverse temperature curve logic and add upsample factor  2020-06-19 09:18:30 -06:00
loss.py
lr_scheduler.py  Enable forced learning rates                                 2020-06-07 16:56:05 -06:00
networks.py      Add ConfigurableSwitchComputer                               2020-06-24 19:49:37 -06:00
SR_model.py      Allow validating in batches, remove val size limit           2020-06-02 08:41:22 -06:00
SRGAN_model.py   Add profiling to SRGAN for testing timings                   2020-06-18 11:29:10 -06:00