Commit Graph

49 Commits

Author  SHA1  Message  Date

James Betker  13f97e1e97  Add recursive loss  2020-10-04 20:48:15 -06:00
James Betker  8197fd646f  Don't accumulate losses for metrics when the loss isn't a tensor  2020-10-03 11:03:55 -06:00
James Betker  19a4075e1e  Allow checkpointing to be disabled in the options file  2020-10-03 11:03:28 -06:00
    Also makes options a global variable for usage in utils.
James Betker  dd9d7b27ac  Add more sophisticated mechanism for balancing GAN losses  2020-10-02 22:53:42 -06:00
James Betker  39865ca3df  TOTAL_loss, dumbo  2020-10-02 21:06:10 -06:00
James Betker  4e44fcd655  Loss accumulator fix  2020-10-02 20:55:33 -06:00
James Betker  567b4d50a4  ExtensibleTrainer - don't compute backward when there is no loss  2020-10-02 20:54:06 -06:00
James Betker  146a9125f2  Modify geometric & translational losses so they can be used with embeddings  2020-10-02 20:40:13 -06:00
James Betker  66d4512029  Fix up translational equivariance loss so it's ready for prime time  2020-09-30 12:01:00 -06:00
James Betker  dc8f3b24de  Don't let duplicate keys be used for injectors and losses  2020-09-29 16:59:44 -06:00
James Betker  f9b83176f1  Fix bugs in extensibletrainer  2020-09-28 22:09:42 -06:00
James Betker  7e240f2fed  Recurrent / teco work  2020-09-28 22:06:56 -06:00
James Betker  31641d7f63  Add ImagePatchInjector and TranslationalLoss  2020-09-26 21:25:32 -06:00
James Betker  6d0490a0e6  Tecogan implementation work  2020-09-25 16:38:23 -06:00
James Betker  05963157c1  Several things  2020-09-23 11:56:36 -06:00
    - Fixes to 'after' and 'before' defs for steps (turns out they werent working)
    - Feature nets take in a list of layers to extract. Not fully implemented yet.
    - Fixes bugs with RAGAN
    - Allows real input into generator gan to not be detached by param
James Betker  4ab989e015  try again..  2020-09-22 18:27:52 -06:00
James Betker  3b6c957194  Fix? it again?  2020-09-22 18:25:59 -06:00
James Betker  7b60d9e0d8  Fix? cosine loss  2020-09-22 18:18:35 -06:00
James Betker  2e18c4c22d  Add CosineEmbeddingLoss to F  2020-09-22 17:10:29 -06:00
James Betker  f40beb5460  Add 'before' and 'after' defs to injections, steps and optimizers  2020-09-22 17:03:22 -06:00
James Betker  17c569ea62  Add geometric loss  2020-09-20 16:24:23 -06:00
James Betker  17dd99b29b  Fix bug with discriminator noise addition  2020-09-20 12:00:27 -06:00
    It wasn't using the scale and was applying the noise to the underlying state variable.
James Betker  3138f98fbc  Allow discriminator noise to be injected at the loss level, cleans up configs  2020-09-19 21:47:52 -06:00
James Betker  e9a39bfa14  Recursively detach all outputs, even if they are nested in data structures  2020-09-19 21:47:34 -06:00
James Betker  e0bd68efda  Add ImageFlowInjector  2020-09-19 10:07:00 -06:00
James Betker  9a17ade550  Some convenience adjustments to ExtensibleTrainer  2020-09-17 21:05:32 -06:00
James Betker  9963b37200  Add a new script for loading a discriminator network and using it to filter images  2020-09-17 13:30:32 -06:00
James Betker  5b85f891af  Only log the name of the first network in the total_loss training set  2020-09-12 16:07:09 -06:00
James Betker  fb595e72a4  Supporting infrastructure in ExtensibleTrainer to train spsr4  2020-09-11 22:57:06 -06:00
    Need to be able to train 2 nets in one step: the backbone will be entirely separate with its own optimizer (for an extremely low LR).
    This functionality was already present, just not implemented correctly.
James Betker  5189b11dac  Add combined dataset for training across multiple datasets  2020-09-11 08:44:06 -06:00
James Betker  313424d7b5  Add new referencing discriminator  2020-09-10 21:35:29 -06:00
    Also extend the way losses work so that you can pass parameters into the discriminator from the config file
James Betker  3027e6e27d  Enable amp to be disabled  2020-09-09 10:45:59 -06:00
James Betker  c04f244802  More mods  2020-09-08 20:36:27 -06:00
James Betker  e8613041c0  Add novograd optimizer  2020-09-06 17:27:08 -06:00
James Betker  21ae135f23  Allow Novograd to be used as an optimizer  2020-09-05 16:50:13 -06:00
James Betker  0dfd8eaf3b  Support injectors that run in eval only  2020-09-05 07:59:45 -06:00
James Betker  6657a406ac  Mods needed to support training a corruptor again:  2020-09-04 15:33:39 -06:00
    - Allow original SPSRNet to have a specifiable block increment
    - Cleanup
    - Bug fixes in code that hasnt been touched in awhile.
James Betker  365813bde3  Add InterpolateInjector  2020-09-03 11:32:47 -06:00
James Betker  d90c96e55e  Fix greyscale injector  2020-09-02 10:29:40 -06:00
James Betker  8b52d46847  Interpreted feature loss to extensibletrainer  2020-09-02 10:08:24 -06:00
James Betker  886d59d5df  Misc fixes & adjustments  2020-09-01 07:58:11 -06:00
James Betker  4b4d08bdec  Enable testing in ExtensibleTrainer, fix it in SRGAN_model  2020-08-31 09:41:48 -06:00
    Also compute fea loss for this.
James Betker  cbd5e7a986  Support old school crossgan in extensibletrainer  2020-08-26 17:52:35 -06:00
James Betker  a65b07607c  Reference network  2020-08-25 11:56:59 -06:00
James Betker  dffc15184d  More ExtensibleTrainer work  2020-08-23 17:22:45 -06:00
    It runs now, just need to debug it to reach performance parity with SRGAN. Sweet.
James Betker  e59e712e39  More ExtensibleTrainer work  2020-08-22 13:08:33 -06:00
James Betker  f40545f235  ExtensibleTrainer work  2020-08-22 08:24:34 -06:00
James Betker  74cdaa2226  Some work on extensible trainer  2020-08-18 08:49:32 -06:00
James Betker  ab04ca1778  Extensible trainer (in progress)  2020-08-12 08:45:23 -06:00