DL-Art-School/codes/models
James Betker 747ded2bf7 Fixes to the spsr3
Some lessons learned:
- Biases are fairly important as a relief valve. They don't need to be everywhere, but
  most computationally heavy branches should have a bias.
- GroupNorm in SPSR is not a great idea. Because this model represents image gradients,
   ordinary means and standard deviations are not meaningful to normalize against
   (imggrad has a high representation of 0).
- Don't fuck with the mainline of any generative model. As much as possible, all
   additions should be made through residual connections. Never pollute the mainline
   with reference data; do that in branches, since injecting it directly basically
   leaves the model untrainable. (See the sketch below.)
2020-09-09 15:28:14 -06:00
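The pattern from the lessons above can be sketched as a small PyTorch module. This is a minimal, hypothetical illustration (the class and argument names are not from the actual spsr3/SPSR code): the computationally heavy conv branch keeps its biases, no GroupNorm is applied to the gradient-domain features, and reference data enters only through a residual side branch so the mainline is modified purely by addition.

```python
import torch
import torch.nn as nn


class ReferenceResidualBranch(nn.Module):
    """Hypothetical block: computes a correction from reference features and
    adds it to the trunk, leaving the mainline otherwise untouched."""

    def __init__(self, trunk_channels, ref_channels):
        super().__init__()
        # Computationally heavy branch: biases act as a relief valve here.
        self.branch = nn.Sequential(
            nn.Conv2d(trunk_channels + ref_channels, trunk_channels, 3, padding=1, bias=True),
            nn.ReLU(inplace=True),
            nn.Conv2d(trunk_channels, trunk_channels, 3, padding=1, bias=True),
        )
        # Note: no GroupNorm. Gradient-domain features are mostly zero, so
        # per-channel means/standard deviations are not meaningful statistics.

    def forward(self, trunk, reference):
        # Reference data only enters through this side branch; the trunk
        # (mainline) is changed solely by residual addition.
        correction = self.branch(torch.cat([trunk, reference], dim=1))
        return trunk + correction


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)    # mainline features
    ref = torch.randn(1, 16, 32, 32)  # reference features
    block = ReferenceResidualBranch(64, 16)
    print(block(x, ref).shape)        # torch.Size([1, 64, 32, 32])
```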
archs Fixes to the spsr3 2020-09-09 15:28:14 -06:00
steps Enable amp to be disabled 2020-09-09 10:45:59 -06:00
__init__.py ExtensibleTrainer work 2020-08-22 08:24:34 -06:00
base_model.py Fix multistep optimizer (feeding from wrong config params) 2020-08-04 16:42:58 -06:00
ExtensibleTrainer.py Enable amp to be disabled 2020-09-09 10:45:59 -06:00
feature_model.py feamod fix 2020-08-30 08:08:49 -06:00
loss.py Update loss with lr crossgan 2020-08-26 17:57:22 -06:00
lr_scheduler.py Extensible trainer (in progress) 2020-08-12 08:45:23 -06:00
networks.py CSNLN changes (removed because it doesnt train well) 2020-09-08 08:04:16 -06:00
novograd.py Add novograd optimizer 2020-09-06 17:27:08 -06:00
SR_model.py Allow validating in batches, remove val size limit 2020-06-02 08:41:22 -06:00
SRGAN_model.py Fix SRGAN_model/fullimgdataset compatibility 1 2020-09-08 11:34:35 -06:00