James Betker
11155aead4
Directly use dataset keys
This has been a long time coming. Cleans up messy "GT" nomenclature and simplifies ExtensibleTrainer.feed_data
2020-12-04 20:14:53 -07:00
James Betker
da604752e6
Misc RRDB changes
2020-11-29 12:21:31 -07:00
James Betker
fd356580c0
Play with lambdas
2020-11-26 20:30:55 -07:00
James Betker
ade0a129da
Include psnr in test.py
2020-10-27 10:25:42 -06:00
James Betker
76e4f0c086
Restore test.py for use as standalone validator
2020-10-19 15:26:07 -06:00
James Betker
8ca566b621
Revert "Misc"
This reverts commit 0e3ea63a14.
# Conflicts:
# codes/test.py
# codes/train.py
2020-10-19 13:34:54 -06:00
James Betker
24792bdb4f
Codebase cleanup
Removed a lot of legacy stuff I have no intention of using again.
Plan is to shape this repo into something more extensible (get it? hah!)
2020-10-13 20:56:39 -06:00
James Betker
c96f5b2686
Import switched_conv as a submodule
2020-10-07 23:10:54 -06:00
James Betker
0e3ea63a14
Misc
2020-10-05 18:01:50 -06:00
James Betker
254cb1e915
More dataset integration work
2020-09-25 22:19:38 -06:00
James Betker
e8613041c0
Add novograd optimizer
2020-09-06 17:27:08 -06:00
James Betker
6657a406ac
Mods needed to support training a corruptor again:
- Allow original SPSRNet to have a specifiable block increment
- Cleanup
- Bug fixes in code that hasn't been touched in a while.
2020-09-04 15:33:39 -06:00
James Betker
4b4d08bdec
Enable testing in ExtensibleTrainer, fix it in SRGAN_model
Also compute fea loss for this.
2020-08-31 09:41:48 -06:00
James Betker
bdaa67deb7
Misc
2020-08-12 08:46:15 -06:00
James Betker
3320ad685f
Fix mega_batch_factor not set for test
2020-07-24 12:26:44 -06:00
James Betker
76a38b6a53
Misc
2020-06-02 09:35:52 -06:00
James Betker
f6815df58b
Misc
2020-05-27 08:04:47 -06:00
James Betker
987cdad0b6
Misc mods
2020-05-23 21:09:38 -06:00
James Betker
67139602f5
Test modifications
Allows bifurcating large images put into the test pipeline
This code is currently hard-coded rather than dynamic and needs some fixes.
2020-05-19 09:37:58 -06:00
James Betker
037a5a3cdb
Config updates
2020-05-13 09:20:28 -06:00
James Betker
62a97c53d1
Handle tuple-returning generators in test
2020-05-11 11:15:26 -06:00
James Betker
44b89330c2
Support inference across batches, support inference on cpu, checkpoint
This is a checkpoint of a set of long tests with reduced-complexity networks. Some takeaways:
1) A full GAN using the resnet discriminator does appear to converge, but the quality is capped.
2) Likewise, a combination GAN/feature loss does not converge. The feature loss is optimized but
the model appears unable to fight the discriminator, so the G-loss steadily increases.
Going forwards, I want to try some bigger models. In particular, I want to change the generator
to increase complexity and capacity. I also want to add skip connections between the
disc and generator.
2020-05-04 08:48:25 -06:00
James Betker
35bd1ecae4
Config changes for discriminator advantage run
Still going from high->low; the discriminator discerns on low. Next up: the disc works on high.
2020-04-25 11:24:28 -06:00
James Betker
e98d92fc77
Allow test to operate on batches
2020-04-23 23:59:09 -06:00
James Betker
f4b33b0531
Some random fixes/adjustments
2020-04-22 00:38:53 -06:00
XintaoWang
037933ba66
mmsr
2019-08-23 21:42:47 +08:00