Commit Graph

715 Commits

Author SHA1 Message Date
James Betker
58d8bf8f69 Add network architecture built for teco 2020-10-09 08:40:14 -06:00
James Betker
b3d0baaf17 Improve multiframe dataset memory usage 2020-10-09 08:40:00 -06:00
James Betker
afe6af88af Fix attention print issue 2020-10-08 18:34:00 -06:00
James Betker
4c85ee51a4 Converge SSG architectures into unified switching base class
Also adds attention norm histogram to logging
2020-10-08 17:23:21 -06:00
James Betker
3cc56cd00b Merge remote-tracking branch 'origin/gan_lab' into gan_lab 2020-10-08 16:12:05 -06:00
James Betker
7d8d9dafbb misc 2020-10-08 16:12:00 -06:00
James Betker
856ef4d21d Update switched_conv 2020-10-08 16:10:23 -06:00
James Betker
1eb516d686 Fix more distributed bugs 2020-10-08 14:32:45 -06:00
James Betker
b36ba0460c Fix multi-frame dataset OBO error 2020-10-08 12:21:04 -06:00
James Betker
fba29d7dcc Move to apex DistributedDataParallel and add switch all_reduce
Torch's DistributedDataParallel is missing "delay_allreduce", which is
necessary to get gradient checkpointing to work with recurrent models.
2020-10-08 11:20:05 -06:00
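For reference, a minimal sketch of the wrapper swap described above, assuming NVIDIA Apex is installed and the job is launched with one process per GPU (the GRU is just a stand-in for the actual recurrent generator):

```python
import torch
import torch.distributed as dist
from apex.parallel import DistributedDataParallel as ApexDDP

# Assumes the launcher set MASTER_ADDR/RANK/WORLD_SIZE
# (e.g. torch.distributed.launch).
dist.init_process_group(backend='nccl')

model = torch.nn.GRU(64, 64).cuda()  # stand-in for the recurrent generator

# delay_allreduce=True makes Apex buffer all gradients and run a single
# all-reduce after the entire backward pass, instead of reducing each
# parameter's gradient the moment it is produced. Per the commit message,
# that deferral is what lets gradient checkpointing (which re-runs parts of
# the forward pass during backward) coexist with recurrent models.
model = ApexDDP(model, delay_allreduce=True)
```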
James Betker
c174ac0fd5 Allow tecogan to support generators that only output a tensor (instead of a list) 2020-10-08 09:26:25 -06:00
James Betker
969bcd9021 Use local checkpoint in SSG 2020-10-08 08:54:46 -06:00
James Betker
c93dd623d7 Tecogan losses work 2020-10-07 23:11:58 -06:00
James Betker
29bf78d791 Update switched_conv submodule 2020-10-07 23:11:50 -06:00
James Betker
c96f5b2686 Import switched_conv as a submodule 2020-10-07 23:10:54 -06:00
James Betker
c352c8bce4 More tecogan fixes 2020-10-07 12:41:17 -06:00
James Betker
a62a5dbb5f Clone and detach in recursively_detach 2020-10-07 12:41:00 -06:00
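A plausible shape for such a helper, sketched here from the commit message alone (this is the general pattern, not the repository's actual recursively_detach):

```python
import torch

def recursively_detach(val):
    # Sketch only. Walks nested containers and detaches every tensor from
    # the autograd graph; the extra clone() also breaks storage aliasing,
    # so callers cannot accidentally mutate the original tensor in place.
    if isinstance(val, torch.Tensor):
        return val.detach().clone()
    if isinstance(val, dict):
        return {k: recursively_detach(v) for k, v in val.items()}
    if isinstance(val, (list, tuple)):
        return type(val)(recursively_detach(v) for v in val)
    return val
```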
James Betker
1c44d395af Tecogan work
It's training! There are still probably plenty of bugs, though...
2020-10-07 09:03:30 -06:00
James Betker
e9d7371a61 Add concatenate injector 2020-10-07 09:02:42 -06:00
James Betker
8a7e993aea Merge remote-tracking branch 'origin/gan_lab' into gan_lab 2020-10-06 20:41:58 -06:00
James Betker
b2c4b2a16d Move gpu_ids out of if statement 2020-10-06 20:40:20 -06:00
James Betker
1e415b249b Add tag that can be applied to prevent parameter training 2020-10-06 20:39:49 -06:00
James Betker
2f2e3f33f8 StackedSwitchedGenerator_5lyr 2020-10-06 20:39:32 -06:00
James Betker
6217b48e3f Fix spsr_arch bug 2020-10-06 20:38:47 -06:00
James Betker
4290918359 Add distributed_checkpoint for more efficient checkpoints 2020-10-06 20:38:38 -06:00
James Betker
cffc596141 Integrate flownet2 into codebase, add teco visual debugs 2020-10-06 20:35:39 -06:00
James Betker
e4b89a172f Reduce spsr7 memory usage 2020-10-05 22:05:56 -06:00
James Betker
4111942ada Support attention deferral in deep ssgr 2020-10-05 19:35:55 -06:00
James Betker
840927063a Work on tecogan losses 2020-10-05 19:35:28 -06:00
James Betker
0e3ea63a14 Misc 2020-10-05 18:01:50 -06:00
James Betker
2875822024 SPSR9 arch
Takes some of the stuff I learned with SGSR yesterday and applies it to SPSR.
2020-10-05 08:47:51 -06:00
James Betker
51044929af Don't compute attention statistics on multiple generator invocations of the same data 2020-10-05 00:34:29 -06:00
James Betker
e760658fdb Another fix... 2020-10-04 21:08:00 -06:00
James Betker
a890e3a9c0 Fix geometric loss not handling 0 index 2020-10-04 21:05:01 -06:00
James Betker
c3ef8a4a31 Stacked switches - return a tuple 2020-10-04 21:02:24 -06:00
James Betker
13f97e1e97 Add recursive loss 2020-10-04 20:48:15 -06:00
James Betker
ffd069fd97 Lots of SSG work
- Checkpointed pretty much the entire model, enabling recurrent inputs (see the checkpointing sketch below)
- Added two new models for testing: one adding depth (again), one removing SPSR (in lieu of the new losses)
2020-10-04 20:48:08 -06:00
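Since checkpointing recurs throughout these commits, here is a minimal illustration of the underlying PyTorch utility (an illustrative wrapper under assumed names, not the SSG code itself): it recomputes a block's activations during backward instead of storing them, which is the memory-for-compute trade that makes checkpointing "pretty much the entire model" feasible.

```python
import torch
from torch.utils.checkpoint import checkpoint

class CheckpointedBlock(torch.nn.Module):
    # Illustrative wrapper: activations inside `block` are discarded after
    # the forward pass and recomputed during backward, cutting memory use.
    def __init__(self, block):
        super().__init__()
        self.block = block

    def forward(self, x):
        # checkpoint() needs at least one input that requires grad;
        # otherwise the recomputed subgraph is disconnected and no
        # gradients flow back through it.
        return checkpoint(self.block, x)
```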
James Betker
aca2c7ab41 Full checkpoint-ize SSG1 2020-10-04 18:24:52 -06:00
James Betker
fc396baf1a Move loaded_options to util
Doesn't seem to work with Python 3.6
2020-10-03 20:29:06 -06:00
James Betker
2d8e9a9d30 Options fix? 2020-10-03 20:27:12 -06:00
James Betker
e3294939b0 Revert "SSG: offer option to use BN-based attention normalization"
Didn't work. Oh well.

This reverts commit 5cd2b37591.
2020-10-03 17:54:53 -06:00
James Betker
43c6c67fd1 Merge remote-tracking branch 'origin/gan_lab' into gan_lab 2020-10-03 16:17:31 -06:00
James Betker
5cd2b37591 SSG: offer option to use BN-based attention normalization
Not sure how this is going to work; let's try it.
2020-10-03 16:16:19 -06:00
James Betker
c896939523 Fix recursive checkpoint 2020-10-03 16:15:52 -06:00
James Betker
3cbb9ecd45 Misc 2020-10-03 16:15:42 -06:00
James Betker
35731502c3 Fix checkpoint recursion 2020-10-03 12:52:50 -06:00
James Betker
9b4ed82093 Get rid of unused convs in spsr7 2020-10-03 11:36:26 -06:00
James Betker
b2b81b13a4 Remove recursive utils import 2020-10-03 11:30:05 -06:00
James Betker
3561cc164d Fix up fea_loss calculator (for validation)
Not sure how this was working in regular training mode, but it
was failing in DDP.
2020-10-03 11:19:20 -06:00
James Betker
21d3bb83b2 Use tqdm reporting with validation 2020-10-03 11:16:39 -06:00
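For reference, a minimal sketch of tqdm-wrapped validation as the last commit describes (the function and loader names are illustrative, not the repository's):

```python
import torch
from tqdm import tqdm

def validate(model, val_loader):
    model.eval()
    # tqdm wraps any iterable and renders a live progress bar with rate/ETA.
    with torch.no_grad():
        for batch in tqdm(val_loader, desc='validation'):
            model(batch)
```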