James Betker | 24792bdb4f | Codebase cleanup | 2020-10-13 20:56:39 -06:00
    Removed a lot of legacy stuff I have no intention of using again.
    The plan is to shape this repo into something more extensible (get it? hah!)
James Betker | e620fc05ba | Mods to support video processing with teco networks | 2020-10-13 20:47:05 -06:00
James Betker | 17d78195ee | Mods to SRG to support returning switch logits | 2020-10-13 20:46:37 -06:00
James Betker | cc915303a5 | Fix SPSR calls into SwitchComputer | 2020-10-13 10:14:47 -06:00
James Betker | bdf4c38899 | Merge remote-tracking branch 'origin/gan_lab' into gan_lab | 2020-10-13 10:12:26 -06:00
    # Conflicts:
    #   codes/models/archs/SwitchedResidualGenerator_arch.py
James Betker | 9a5d6162e9 | Add the "BigSwitch" | 2020-10-13 10:11:10 -06:00
James Betker | 8014f050ac | Clear metrics properly | 2020-10-13 10:07:49 -06:00
    Holy cow, what a PITA bug.
James Betker | 4d52374e60 | Merge remote-tracking branch 'origin/gan_lab' into gan_lab | 2020-10-12 17:43:51 -06:00
James Betker | 731700ab2c | Checkpoint in SSG | 2020-10-12 17:43:28 -06:00
James Betker | ca523215c6 | Fix recurrent std in arch | 2020-10-12 17:42:32 -06:00
James Betker | 05377973bf | Allow initial recurrent input to be specified (optionally) | 2020-10-12 17:36:43 -06:00
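
Illustrative sketch for 05377973bf: an optional initial recurrent input usually means the forward pass synthesizes a zero state when the caller passes nothing. The class and argument names below are hypothetical, not the repo's actual SSG signature.

    import torch
    import torch.nn as nn

    class RecurrentGenerator(nn.Module):
        # Hypothetical module: accepts an optional initial recurrent state.
        def __init__(self, channels=64):
            super().__init__()
            self.conv = nn.Conv2d(3 + channels, channels, 3, padding=1)

        def forward(self, x, initial_recurrent=None):
            # Default to a zero state when the caller does not supply one.
            if initial_recurrent is None:
                b, _, h, w = x.shape
                initial_recurrent = torch.zeros(
                    b, self.conv.out_channels, h, w,
                    device=x.device, dtype=x.dtype)
            return self.conv(torch.cat([x, initial_recurrent], dim=1))
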
James Betker | 597b6e92d6 | Add ssgr1 recurrence | 2020-10-12 17:18:19 -06:00
James Betker | d7d7590f3e | Fix constant injector - wasn't working in test | 2020-10-12 10:36:30 -06:00
James Betker | ce163ad4a9 | Update SSGdeep | 2020-10-12 10:22:08 -06:00
James Betker | 3409d88a1c | Add PANet arch | 2020-10-12 10:20:55 -06:00
James Betker | a9c2e97391 | Constant injector and teco fixes | 2020-10-11 08:20:07 -06:00
James Betker | e785029936 | Mods needed to support SPSR archs with teco gan | 2020-10-10 22:39:55 -06:00
James Betker | 120072d464 | Add constant injector | 2020-10-10 21:50:23 -06:00
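
Illustrative sketch for 120072d464: assuming the trainer's injectors follow a forward(state) -> dict contract, a constant injector plausibly emits a fixed tensor under a configured output key. All option names here are assumptions.

    import torch

    class ConstantInjector:
        # Hypothetical: writes a constant-filled tensor into the trainer
        # state, sized to match the batch of a reference input.
        def __init__(self, opt):
            self.value = opt['value']   # scalar fill value
            self.shape = opt['shape']   # per-sample shape, e.g. (64, 16, 16)
            self.ref_key = opt['in']    # state key to copy batch size from
            self.out_key = opt['out']   # state key to write

        def forward(self, state):
            ref = state[self.ref_key]
            const = torch.full((ref.shape[0], *self.shape), self.value,
                               device=ref.device, dtype=ref.dtype)
            return {self.out_key: const}
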
James Betker | f99812e14d | Fix tecogan_losses errors | 2020-10-10 20:30:14 -06:00
James Betker | 3a5b23b9f7 | Alter teco_losses to feed the recurrent input in separately | 2020-10-10 20:21:09 -06:00
James Betker | 0d30d18a3d | Add MarginRemoval injector | 2020-10-09 20:35:56 -06:00
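
Illustrative sketch for 0d30d18a3d: a margin-removal injector presumably just crops a fixed border off an image tensor. Option names assumed, as above.

    class MarginRemovalInjector:
        # Hypothetical: trims `margin` pixels from each spatial edge.
        def __init__(self, opt):
            self.margin = opt['margin']
            self.in_key = opt['in']
            self.out_key = opt['out']

        def forward(self, state):
            m = self.margin
            img = state[self.in_key]    # (b, c, h, w)
            if m == 0:
                return {self.out_key: img}
            return {self.out_key: img[:, :, m:-m, m:-m]}
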
James Betker | 0011d445c8 | Fix loss indexing | 2020-10-09 20:20:51 -06:00
James Betker | 202eb11fdc | Add for-element loss | 2020-10-09 19:51:44 -06:00
James Betker | fe50d6f9d0 | Fix attention images | 2020-10-09 19:21:55 -06:00
James Betker | 7e777ea34c | Allow tecogan to be used in process_video | 2020-10-09 19:21:43 -06:00
James Betker | 58d8bf8f69 | Add network architecture built for teco | 2020-10-09 08:40:14 -06:00
James Betker | afe6af88af | Fix attention print issue | 2020-10-08 18:34:00 -06:00
James Betker | 4c85ee51a4 | Converge SSG architectures into unified switching base class | 2020-10-08 17:23:21 -06:00
    Also adds an attention norm histogram to logging.
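
Illustrative sketch for 4c85ee51a4: the attention-norm histogram part maps naturally onto TensorBoard's add_histogram. This is a generic logging pattern, not the repo's actual hook.

    from torch.utils.tensorboard import SummaryWriter

    writer = SummaryWriter('tb_logs')

    def log_attention_norms(attention_maps, step):
        # attention_maps: iterable of per-switch attention weight tensors.
        for i, att in enumerate(attention_maps):
            writer.add_histogram(f'attention_norm/switch_{i}',
                                 att.detach().cpu(), step)
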
James Betker | 1eb516d686 | Fix more distributed bugs | 2020-10-08 14:32:45 -06:00
James Betker | fba29d7dcc | Move to apex DistributedDataParallel and add switch all_reduce | 2020-10-08 11:20:05 -06:00
    Torch's DistributedDataParallel is missing "delay_allreduce", which is
    necessary to get gradient checkpointing to work with recurrent models.
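
Illustrative sketch for fba29d7dcc: apex's wrapper exposes the flag the message refers to. Setup details (the process group init, build_model) are placeholders.

    import torch.distributed as dist
    from apex.parallel import DistributedDataParallel as ApexDDP

    dist.init_process_group(backend='nccl')
    model = build_model().cuda()  # build_model stands in for the real constructor

    # delay_allreduce=True defers the gradient all-reduce until the whole
    # backward pass completes, which is what lets gradient checkpointing
    # (re-running forward segments during backward) work with recurrent models.
    model = ApexDDP(model, delay_allreduce=True)
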
James Betker | c174ac0fd5 | Allow tecogan to support generators that only output a tensor (instead of a list) | 2020-10-08 09:26:25 -06:00
James Betker | 969bcd9021 | Use local checkpoint in SSG | 2020-10-08 08:54:46 -06:00
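
Illustrative sketch for 969bcd9021: "local checkpoint" reads like torch.utils.checkpoint, which trades compute for memory by re-running a block's forward during backward. Generic pattern, not the actual SSG code.

    from torch.utils.checkpoint import checkpoint

    def forward_with_checkpointing(blocks, x):
        # Activations inside each block are recomputed during backward
        # rather than stored, cutting peak activation memory.
        for block in blocks:
            x = checkpoint(block, x)
        return x
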
James Betker | c93dd623d7 | Tecogan losses work | 2020-10-07 23:11:58 -06:00
James Betker | c96f5b2686 | Import switched_conv as a submodule | 2020-10-07 23:10:54 -06:00
James Betker | c352c8bce4 | More tecogan fixes | 2020-10-07 12:41:17 -06:00
James Betker | 1c44d395af | Tecogan work | 2020-10-07 09:03:30 -06:00
    It's training! There are probably still plenty of bugs, though.
James Betker | e9d7371a61 | Add concatenate injector | 2020-10-07 09:02:42 -06:00
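
Illustrative sketch for e9d7371a61, under the same assumed injector contract: concatenate a list of state tensors along one dimension.

    import torch

    class ConcatenateInjector:
        # Hypothetical: cats several state tensors (channel dim by default).
        def __init__(self, opt):
            self.in_keys = opt['in']        # list of state keys
            self.dim = opt.get('dim', 1)
            self.out_key = opt['out']

        def forward(self, state):
            tensors = [state[k] for k in self.in_keys]
            return {self.out_key: torch.cat(tensors, dim=self.dim)}
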
James Betker | 8a7e993aea | Merge remote-tracking branch 'origin/gan_lab' into gan_lab | 2020-10-06 20:41:58 -06:00
James Betker | 1e415b249b | Add tag that can be applied to prevent parameter training | 2020-10-06 20:39:49 -06:00
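
Illustrative sketch for 1e415b249b: one plausible mechanism for such a tag is matching a marker substring in parameter names and disabling their gradients. The tag string here is invented.

    def freeze_tagged_parameters(model, tag='_DO_NOT_TRAIN'):
        # Hypothetical: parameters whose names contain the tag are excluded
        # from training; pair this with an optimizer constructed over only
        # the still-trainable parameters.
        for name, param in model.named_parameters():
            if tag in name:
                param.requires_grad = False
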
James Betker | 2f2e3f33f8 | StackedSwitchedGenerator_5lyr | 2020-10-06 20:39:32 -06:00
James Betker | 6217b48e3f | Fix spsr_arch bug | 2020-10-06 20:38:47 -06:00
James Betker | cffc596141 | Integrate flownet2 into codebase, add teco visual debugs | 2020-10-06 20:35:39 -06:00
James Betker | e4b89a172f | Reduce spsr7 memory usage | 2020-10-05 22:05:56 -06:00
James Betker | 4111942ada | Support attention deferral in deep ssgr | 2020-10-05 19:35:55 -06:00
James Betker | 840927063a | Work on tecogan losses | 2020-10-05 19:35:28 -06:00
James Betker | 2875822024 | SPSR9 arch | 2020-10-05 08:47:51 -06:00
    Takes some of the stuff I learned with SGSR yesterday and applies it to SPSR.
James Betker | 51044929af | Don't compute attention statistics on multiple generator invocations of the same data | 2020-10-05 00:34:29 -06:00
James Betker | e760658fdb | Another fix... | 2020-10-04 21:08:00 -06:00
James Betker | a890e3a9c0 | Fix geometric loss not handling 0 index | 2020-10-04 21:05:01 -06:00
James Betker | c3ef8a4a31 | Stacked switches - return a tuple | 2020-10-04 21:02:24 -06:00