James Betker
254cb1e915
More dataset integration work
2020-09-25 22:19:38 -06:00
James Betker
e8613041c0
Add novograd optimizer
2020-09-06 17:27:08 -06:00
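The commit above adds a NovoGrad optimizer. As a hedged illustration only (not the repository's actual implementation or wiring), here is a minimal PyTorch sketch of the NovoGrad update rule: a single scalar second moment per parameter tensor, plus decoupled weight decay.

```python
import torch
from torch.optim import Optimizer


class NovoGradSketch(Optimizer):
    """Minimal NovoGrad-style optimizer: layer-wise (per-tensor) second
    moments and decoupled weight decay. Illustrative sketch only."""

    def __init__(self, params, lr=1e-3, betas=(0.95, 0.98), eps=1e-8,
                 weight_decay=0.0):
        defaults = dict(lr=lr, betas=betas, eps=eps, weight_decay=weight_decay)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        for group in self.param_groups:
            beta1, beta2 = group['betas']
            lr, eps, wd = group['lr'], group['eps'], group['weight_decay']
            for p in group['params']:
                if p.grad is None:
                    continue
                grad = p.grad
                state = self.state[p]
                g_norm_sq = grad.pow(2).sum()  # squared gradient norm of this tensor
                if len(state) == 0:
                    # First step: seed the moments from the current gradient.
                    state['v'] = g_norm_sq
                    state['m'] = grad / (g_norm_sq.sqrt() + eps) + wd * p
                else:
                    state['v'] = beta2 * state['v'] + (1 - beta2) * g_norm_sq
                    state['m'] = beta1 * state['m'] \
                        + grad / (state['v'].sqrt() + eps) + wd * p
                p.add_(state['m'], alpha=-lr)
```

Usage would mirror any torch optimizer, e.g. `NovoGradSketch(model.parameters(), lr=1e-3)`; in practice a tested implementation from a package such as torch-optimizer could be used instead.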
James Betker
6657a406ac
Mods needed to support training a corruptor again:
- Allow original SPSRNet to have a specifiable block increment
- Cleanup
- Bug fixes in code that hasn't been touched in a while.
2020-09-04 15:33:39 -06:00
James Betker
4b4d08bdec
Enable testing in ExtensibleTrainer, fix it in SRGAN_model
Also compute the feature (fea) loss during testing.
2020-08-31 09:41:48 -06:00
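The "fea loss" above is the feature (perceptual) loss. As a hedged sketch of what computing it at test time can look like, assuming a torchvision VGG-19 backbone and L1 distance on the conv5_4 feature map (a common choice, not necessarily the one used here); `fake_H`/`real_H` are illustrative names.

```python
import torch
import torch.nn as nn
from torchvision.models import vgg19


class VGGFeatureLoss(nn.Module):
    """Perceptual ('fea') loss: L1 distance between VGG-19 feature maps."""

    def __init__(self, layer_index=34):  # conv5_4 features; an assumed choice
        super().__init__()
        features = vgg19(pretrained=True).features[:layer_index + 1]
        for p in features.parameters():
            p.requires_grad = False
        self.features = features.eval()
        self.criterion = nn.L1Loss()

    def forward(self, sr, hr):
        return self.criterion(self.features(sr), self.features(hr))


# At test time, evaluate without building a graph:
# with torch.no_grad():
#     fea_loss = VGGFeatureLoss()(fake_H, real_H)
```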
James Betker
bdaa67deb7
Misc
2020-08-12 08:46:15 -06:00
James Betker
3320ad685f
Fix mega_batch_factor not set for test
2020-07-24 12:26:44 -06:00
James Betker
76a38b6a53
Misc
2020-06-02 09:35:52 -06:00
James Betker
f6815df58b
Misc
2020-05-27 08:04:47 -06:00
James Betker
987cdad0b6
Misc mods
2020-05-23 21:09:38 -06:00
James Betker
67139602f5
Test modifications
Allows bifurcating large images fed into the test pipeline.
The splitting is currently hard-coded rather than dynamic and needs some fixes.
2020-05-19 09:37:58 -06:00
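The bifurcation described above can be sketched as follows: split a large low-resolution tensor into quadrants, run the model on each, and stitch the outputs back together. This is an illustrative version with assumed names and no seam blending, not the repository's code.

```python
import torch


def bifurcated_forward(model, lq):
    """Split a large LR image into four quadrants, upscale each, and stitch.

    Simple sketch: no overlap or blending at the seams.
    """
    _, _, h, w = lq.shape
    h2, w2 = h // 2, w // 2
    quadrants = [
        lq[:, :, :h2, :w2], lq[:, :, :h2, w2:],
        lq[:, :, h2:, :w2], lq[:, :, h2:, w2:],
    ]
    with torch.no_grad():
        outs = [model(q) for q in quadrants]
    top = torch.cat([outs[0], outs[1]], dim=3)     # left/right halves, top row
    bottom = torch.cat([outs[2], outs[3]], dim=3)  # left/right halves, bottom row
    return torch.cat([top, bottom], dim=2)         # stack rows vertically
```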
James Betker
037a5a3cdb
Config updates
2020-05-13 09:20:28 -06:00
James Betker
62a97c53d1
Handle tuple-returning generators in test
2020-05-11 11:15:26 -06:00
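A generator whose forward pass returns a tuple needs unwrapping before the test code treats its output as an image. A minimal sketch of that guard, assuming (as an illustration) that the super-resolved image is the first tuple element and that `netG`/`lq` name the generator and its low-resolution input:

```python
import torch

with torch.no_grad():
    output = netG(lq)                  # generator forward pass
    if isinstance(output, tuple):
        output = output[0]             # assume the SR image comes first
    sr = output.float().cpu()
```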
James Betker
44b89330c2
Support inference across batches, support inference on CPU, checkpoint
This is a checkpoint of a set of long tests with reduced-complexity networks. Some takeaways:
1) A full GAN using the resnet discriminator does appear to converge, but the quality is capped.
2) A combined GAN + feature loss does not converge: the feature loss is optimized, but the model
appears unable to fight the discriminator, so the G-loss steadily increases.
Going forward, I want to try some bigger models. In particular, I want to change the generator
to increase complexity and capacity. I also want to add skip connections between the
discriminator and generator.
2020-05-04 08:48:25 -06:00
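For batched and CPU inference as in the commit above, here is a hedged sketch of the general pattern (chunk the input along the batch dimension, run each chunk on the chosen device, gather results on the CPU); the function and its parameters are illustrative, not the trainer's actual API.

```python
import torch


def infer_in_batches(model, images, batch_size=8, device='cpu'):
    """Run inference over a large stack of images in fixed-size chunks."""
    model = model.to(device).eval()
    results = []
    with torch.no_grad():
        for chunk in torch.split(images, batch_size):
            out = model(chunk.to(device))
            if isinstance(out, tuple):   # tolerate tuple-returning generators
                out = out[0]
            results.append(out.cpu())
    return torch.cat(results, dim=0)
```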
James Betker
35bd1ecae4
Config changes for discriminator advantage run
Still going from high->low; the discriminator discerns on the low side. Next up: the discriminator works on the high side.
2020-04-25 11:24:28 -06:00
James Betker
e98d92fc77
Allow test to operate on batches
2020-04-23 23:59:09 -06:00
James Betker
f4b33b0531
Some random fixes/adjustments
2020-04-22 00:38:53 -06:00
XintaoWang
037933ba66
mmsr
2019-08-23 21:42:47 +08:00