James Betker | c04f244802 | More mods | 2020-09-08 20:36:27 -06:00
James Betker | dffbfd2ec4 | Allow SRG checkpointing to be toggled | 2020-09-08 15:14:43 -06:00
James Betker | e6207d4c50 | SPSR3 work | 2020-09-08 15:14:23 -06:00
    SPSR3 is meant to fix whatever is causing the switching units inside of the
    newer SPSR architectures to fail and basically not use the multiplexers.
James Betker | 5606e8b0ee | Fix SRGAN_model/fullimgdataset compatibility 1 | 2020-09-08 11:34:35 -06:00
James Betker | 22c98f1567 | Move MultiConvBlock to arch_util | 2020-09-08 08:17:27 -06:00
James Betker | 146ace0859 | CSNLN changes (removed because it doesnt train well) | 2020-09-08 08:04:16 -06:00
James Betker | f43df7f5f7 | Make ExtensibleTrainer compatible with process_video | 2020-09-08 08:03:41 -06:00
James Betker | a18ece62ee | Add updated spsr net for test | 2020-09-07 17:01:48 -06:00
James Betker | 55475d2ac1 | Clean up unused archs | 2020-09-07 11:38:11 -06:00
James Betker | e8613041c0 | Add novograd optimizer | 2020-09-06 17:27:08 -06:00
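The NovoGrad optimizer added in e8613041c0 differs from Adam in that its second moment is a single scalar per layer (the squared norm of the whole layer's gradient) rather than per-parameter. A minimal pure-Python sketch of one update step, assuming the standard NovoGrad update rule; the function and state names (`novograd_step`, `state`) are illustrative, not the repository's implementation:

```python
import math

def novograd_step(w, g, state, lr=0.01, beta1=0.95, beta2=0.98,
                  eps=1e-8, wd=0.0):
    """One NovoGrad update for a single layer (w, g are flat lists)."""
    # Layer-wise second moment: one scalar for the whole layer.
    gnorm_sq = sum(gi * gi for gi in g)
    if "v" not in state:
        # First step: initialize moments from the raw gradient.
        state["v"] = gnorm_sq
        state["m"] = [gi / (math.sqrt(gnorm_sq) + eps) + wd * wi
                      for gi, wi in zip(g, w)]
    else:
        state["v"] = beta2 * state["v"] + (1 - beta2) * gnorm_sq
        denom = math.sqrt(state["v"]) + eps
        # Gradient is normalized by the layer norm *before* momentum,
        # and decoupled weight decay is added inside the momentum term.
        state["m"] = [beta1 * mi + (gi / denom + wd * wi)
                      for mi, gi, wi in zip(state["m"], g, w)]
    return [wi - lr * mi for wi, mi in zip(w, state["m"])]
```

Because the normalizer is per-layer, the first momentum buffer already carries unit-scale updates, which is what makes NovoGrad comparatively robust to the learning-rate choice.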
James Betker | b1238d29cb | Fix trainable not applying to discriminators | 2020-09-05 20:31:26 -06:00
James Betker | 21ae135f23 | Allow Novograd to be used as an optimizer | 2020-09-05 16:50:13 -06:00
James Betker | 912a4d9fea | Fix srg computer bug | 2020-09-05 07:59:54 -06:00
James Betker | 0dfd8eaf3b | Support injectors that run in eval only | 2020-09-05 07:59:45 -06:00
James Betker | 44c75f7642 | Undo SRG change | 2020-09-04 17:32:16 -06:00
James Betker | 6657a406ac | Mods needed to support training a corruptor again: | 2020-09-04 15:33:39 -06:00
    - Allow original SPSRNet to have a specifiable block increment
    - Cleanup
    - Bug fixes in code that hasnt been touched in awhile.
James Betker | bfdfaab911 | Checkpoint RRDB | 2020-09-04 15:32:00 -06:00
    Greatly reduces memory consumption with a low performance penalty
James Betker | 696242064c | Use tensor checkpointing to drastically reduce memory usage | 2020-09-03 11:33:36 -06:00
    This comes at the expense of computation, but since we can use much larger
    batches, it results in a net speedup.
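The trade described in 696242064c (recompute activations during backward instead of storing them) is exposed in PyTorch as `torch.utils.checkpoint`. A minimal sketch of the technique on a toy stand-in for a deep trunk; the module stack here is illustrative, not the repository's RRDB or SRG networks:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

# Toy stand-in for a deep trunk, e.g. a stack of residual blocks.
trunk = nn.Sequential(
    *[nn.Sequential(nn.Conv2d(8, 8, 3, padding=1), nn.ReLU())
      for _ in range(6)]
)

x = torch.randn(2, 8, 16, 16, requires_grad=True)
# Split the stack into 3 segments: only segment-boundary activations are
# kept; the ones inside each segment are recomputed during backward.
y = checkpoint_sequential(trunk, 3, x)
y.sum().backward()
```

Activation memory then scales with the number of segments rather than the number of layers, which is why the commit can afford much larger batches and still come out ahead on wall-clock time.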
James Betker | 365813bde3 | Add InterpolateInjector | 2020-09-03 11:32:47 -06:00
James Betker | d90c96e55e | Fix greyscale injector | 2020-09-02 10:29:40 -06:00
James Betker | 8b52d46847 | Interpreted feature loss to extensibletrainer | 2020-09-02 10:08:24 -06:00
James Betker | 886d59d5df | Misc fixes & adjustments | 2020-09-01 07:58:11 -06:00
James Betker | 0a9b85f239 | Fix vgg_gn input_img_factor | 2020-08-31 09:50:30 -06:00
James Betker | 4b4d08bdec | Enable testing in ExtensibleTrainer, fix it in SRGAN_model | 2020-08-31 09:41:48 -06:00
    Also compute fea loss for this.
James Betker | b2091cb698 | feamod fix | 2020-08-30 08:08:49 -06:00
James Betker | a56e906f9c | train HR feature trainer | 2020-08-29 09:27:48 -06:00
James Betker | 0e859a8082 | 4x spsr ref (not workin) | 2020-08-29 09:27:18 -06:00
James Betker | 25832930db | Update loss with lr crossgan | 2020-08-26 17:57:22 -06:00
James Betker | cbd5e7a986 | Support old school crossgan in extensibletrainer | 2020-08-26 17:52:35 -06:00
James Betker | 8a6a2e6e2e | Rev3 of the full image ref arch | 2020-08-26 17:11:01 -06:00
James Betker | f35b3ad28f | Fix val behavior for ExtensibleTrainer | 2020-08-26 08:44:22 -06:00
James Betker | 434ed70a9a | Wrap vgg disc | 2020-08-25 18:14:45 -06:00
James Betker | 83f2f8d239 | more debugging | 2020-08-25 18:12:12 -06:00
James Betker | 3f60281da7 | Print when wrapping | 2020-08-25 18:08:46 -06:00
James Betker | bae18c05e6 | wrap disc grad | 2020-08-25 17:58:20 -06:00
James Betker | f85f1e21db | Turns out, can't do that | 2020-08-25 17:18:52 -06:00
James Betker | 935a735327 | More dohs | 2020-08-25 17:05:16 -06:00
James Betker | 53e67bdb9c | Distribute get_grad_no_padding | 2020-08-25 17:03:18 -06:00
James Betker | 2f706b7d93 | I an inept. | 2020-08-25 16:42:59 -06:00
James Betker | 8bae0de769 | ffffffffffffffffff | 2020-08-25 16:41:01 -06:00
James Betker | 1fe16f71dd | Fix bug reporting spsr gan weight | 2020-08-25 16:37:45 -06:00
James Betker | 96586d6592 | Fix distributed d_grad | 2020-08-25 16:06:27 -06:00
James Betker | 09a9079e17 | Check rank before doing image logging. | 2020-08-25 16:00:49 -06:00
James Betker | a1800f45ef | Fix for referencingmultiplexer | 2020-08-25 15:43:12 -06:00
James Betker | a65b07607c | Reference network | 2020-08-25 11:56:59 -06:00
James Betker | f9276007a8 | More fixes to corrupt_fea | 2020-08-23 17:52:18 -06:00
James Betker | 0005c56cd4 | dbg | 2020-08-23 17:43:03 -06:00
James Betker | 4bb5b3c981 | corfea debugging | 2020-08-23 17:39:02 -06:00
James Betker | 7713cb8df5 | Corrupted features in srgan | 2020-08-23 17:32:03 -06:00
James Betker | dffc15184d | More ExtensibleTrainer work | 2020-08-23 17:22:45 -06:00
    It runs now, just need to debug it to reach performance parity with SRGAN. Sweet.