James Betker | 3e3d2af1f3 | Add multi-modal trainer | 2020-10-22 13:27:32 -06:00

James Betker | 43c4f92123 | Collapse progressive zoom candidates into the batch dimension | 2020-10-21 22:37:23 -06:00
  This contributes a significant speedup to training this type of network, since losses can operate on the entire prediction spectrum at once.

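The commit above folds per-scale candidate predictions into the batch dimension so one loss call covers every candidate at once, instead of looping a loss over each scale. A minimal NumPy sketch of the idea (tensor names and shapes are illustrative, not the repo's actual API):

```python
import numpy as np

def collapse_candidates(candidates):
    """Fold a list of candidate batches, each shaped (batch, channels, h, w),
    into one (batch * n_candidates, channels, h, w) tensor so a single loss
    invocation sees every prediction at once."""
    # All candidates must share a shape; real multi-scale outputs would be
    # resized to a common resolution before this step.
    return np.concatenate(candidates, axis=0)

batch = np.zeros((4, 3, 16, 16))
collapsed = collapse_candidates([batch, batch, batch])
# One loss call now covers 12 samples instead of three calls over 4 each.
print(collapsed.shape)  # (12, 3, 16, 16)
```

The speedup comes from amortizing the per-call overhead of the loss (and any discriminator forward pass inside it) across all candidates.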
James Betker | 680d635420 | Enable ExtensibleTrainer to skip steps when state keys are missing | 2020-10-21 22:22:28 -06:00

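ExtensibleTrainer threads a shared state dict through its configured steps; this commit lets a step be skipped when an input it needs was never produced. A simplified sketch of the guard (the tuple-based step representation and key names here are assumptions, not the trainer's real structure):

```python
def run_steps(steps, state):
    """Run only the steps whose required state keys are present; skip the
    rest instead of crashing. `steps` is a list of (name, required_keys, fn)
    tuples -- a simplified stand-in for ExtensibleTrainer's step objects."""
    executed = []
    for name, required, fn in steps:
        if any(k not in state for k in required):
            continue  # a needed key was never produced this iteration
        state[name] = fn(state)
        executed.append(name)
    return executed

state = {"lq": 1}
steps = [
    ("gen", ["lq"], lambda s: s["lq"] + 1),
    ("disc", ["hq"], lambda s: 0),  # skipped: "hq" is missing from state
]
ran = run_steps(steps, state)
print(ran)  # ['gen']
```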
James Betker | d1175f0de1 | Add FFT injector | 2020-10-21 22:22:00 -06:00

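Injectors in this codebase read tensors from the trainer state and write derived tensors back under new keys. A hedged sketch of what an FFT injector might compute (the function signature and key names are assumptions; only the state-in/state-out pattern is from the repo's design):

```python
import numpy as np

def fft_inject(state, in_key, out_key):
    """Hypothetical sketch of an FFT injector: read a spatial tensor from the
    trainer state, write its centered 2D spectral magnitude under a new key."""
    spec = np.fft.fft2(state[in_key])
    state[out_key] = np.abs(np.fft.fftshift(spec))  # DC moved to the center
    return state

state = {"gen_out": np.ones((8, 8))}
fft_inject(state, "gen_out", "gen_spectrum")
print(state["gen_spectrum"].shape)  # (8, 8)
```

A spectral view like this can then feed a frequency-domain loss downstream.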
James Betker | 931aa65dd0 | Allow recurrent losses to be weighted | 2020-10-21 16:59:44 -06:00

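Per-step weighting of a recurrent loss can be sketched in a few lines (the function name and list-based interface are illustrative, not the repo's API):

```python
def weighted_recurrent_loss(per_step_losses, weights):
    """Weight the loss from each recurrent step before summing, so earlier
    or later recurrence steps can count for more in the total."""
    assert len(per_step_losses) == len(weights)
    return sum(l * w for l, w in zip(per_step_losses, weights))

# Down-weight later steps: 1*1.0 + 2*0.5 + 4*0.25
print(weighted_recurrent_loss([1.0, 2.0, 4.0], [1.0, 0.5, 0.25]))  # 3.0
```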
James Betker | b28e4d9cc7 | Add spread loss | 2020-10-19 11:31:19 -06:00
  Experimental loss that peaks around 0.

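The commit does not show the formulation; one plausible reading of "a loss that peaks around 0" is a penalty maximal at the origin, so minimizing it pushes values to spread away from 0. A speculative sketch only, not the commit's actual formula:

```python
import numpy as np

def spread_loss(x, sharpness=1.0):
    """Speculative sketch: a Gaussian bump centered at zero. The penalty is
    maximal when values sit at 0 and decays as they spread away from it."""
    return float(np.mean(np.exp(-sharpness * x ** 2)))

print(spread_loss(np.zeros(4)))       # 1.0 -- maximal exactly at 0
print(spread_loss(np.full(4, 10.0)))  # ~0.0 -- negligible far from 0
```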
James Betker | 668cafa798 | Push correct patch of recurrent embedding to upstream image, rather than whole thing | 2020-10-18 22:39:52 -06:00

James Betker | 7df378a944 | Remove separated VGG discriminator | 2020-10-18 12:10:24 -06:00
  Checkpointing happens inline instead. Was a dumb idea.
  Also fixes some loss-reporting issues.

James Betker | c709d38cd5 | Fix memory leak with recurrent loss | 2020-10-18 10:22:10 -06:00

James Betker | 552e70a032 | Get rid of excessive checkpointed disc params | 2020-10-18 10:09:37 -06:00

James Betker | 6a0d5f4813 | Add a checkpointable discriminator | 2020-10-18 09:57:47 -06:00

James Betker | 9ead2c0a08 | Multiscale training in! | 2020-10-17 22:54:12 -06:00

James Betker | eda75c9779 | Cleanup fixes | 2020-10-15 10:13:17 -06:00

James Betker | 24792bdb4f | Codebase cleanup | 2020-10-13 20:56:39 -06:00
  Removed a lot of legacy stuff I have no intention of using again.
  Plan is to shape this repo into something more extensible (get it? hah!)

James Betker | e620fc05ba | Mods to support video processing with teco networks | 2020-10-13 20:47:05 -06:00

James Betker | 17d78195ee | Mods to SRG to support returning switch logits | 2020-10-13 20:46:37 -06:00

James Betker | 8014f050ac | Clear metrics properly | 2020-10-13 10:07:49 -06:00
  Holy cow, what a PITA bug.

James Betker | 05377973bf | Allow initial recurrent input to be specified (optionally) | 2020-10-12 17:36:43 -06:00

James Betker | d7d7590f3e | Fix constant injector - wasn't working in test | 2020-10-12 10:36:30 -06:00

James Betker | a9c2e97391 | Constant injector and teco fixes | 2020-10-11 08:20:07 -06:00

James Betker | e785029936 | Mods needed to support SPSR archs with teco gan | 2020-10-10 22:39:55 -06:00

James Betker | 120072d464 | Add constant injector | 2020-10-10 21:50:23 -06:00

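A constant injector is the simplest member of the injector family: it places a fixed-value tensor into the trainer state so later steps or losses can consume it (useful as a zero reference, an initial recurrent input, and the like). The signature and key names below are assumptions for illustration:

```python
import numpy as np

def constant_inject(state, out_key, value, shape):
    """Sketch of a constant injector: write a fixed-value tensor into the
    shared trainer state under `out_key` (signature is hypothetical)."""
    state[out_key] = np.full(shape, value)
    return state

state = {}
constant_inject(state, "zeros_ref", 0.0, (1, 3, 8, 8))
print(state["zeros_ref"].shape)  # (1, 3, 8, 8)
```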
James Betker | f99812e14d | Fix tecogan_losses errors | 2020-10-10 20:30:14 -06:00

James Betker | 3a5b23b9f7 | Alter teco_losses to feed the recurrent input in separately | 2020-10-10 20:21:09 -06:00

James Betker | 0d30d18a3d | Add MarginRemoval injector | 2020-10-09 20:35:56 -06:00

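A margin-removal injector would crop border pixels off a tensor before a loss consumes it, so edge artifacts (e.g. from padding or warping) do not dominate the penalty. A sketch, with the key names and (batch, channel, h, w) layout assumed:

```python
import numpy as np

def remove_margin(state, in_key, out_key, margin):
    """Sketch of a MarginRemoval injector: crop `margin` pixels from each
    spatial border and store the result under a new state key."""
    x = state[in_key]
    state[out_key] = x[..., margin:-margin, margin:-margin]
    return state

state = {"gen": np.zeros((1, 3, 64, 64))}
remove_margin(state, "gen", "gen_cropped", margin=4)
print(state["gen_cropped"].shape)  # (1, 3, 56, 56)
```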
James Betker | 0011d445c8 | Fix loss indexing | 2020-10-09 20:20:51 -06:00

James Betker | 202eb11fdc | For element loss added | 2020-10-09 19:51:44 -06:00

James Betker | 7e777ea34c | Allow tecogan to be used in process_video | 2020-10-09 19:21:43 -06:00

James Betker | 1eb516d686 | Fix more distributed bugs | 2020-10-08 14:32:45 -06:00

James Betker | c174ac0fd5 | Allow tecogan to support generators that only output a tensor (instead of a list) | 2020-10-08 09:26:25 -06:00

James Betker | c93dd623d7 | Tecogan losses work | 2020-10-07 23:11:58 -06:00

James Betker | c352c8bce4 | More tecogan fixes | 2020-10-07 12:41:17 -06:00

James Betker | 1c44d395af | Tecogan work | 2020-10-07 09:03:30 -06:00
  It's training! There are still probably plenty of bugs, though.

James Betker | e9d7371a61 | Add concatenate injector | 2020-10-07 09:02:42 -06:00

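A concatenate injector would join several tensors from the trainer state along one axis (typically channels) and store the result under a new key, e.g. to build a multi-input conditioning tensor. Key names and the channel-axis default below are assumptions:

```python
import numpy as np

def concat_inject(state, in_keys, out_key, axis=1):
    """Sketch of a concatenate injector: join several state tensors along
    the given axis (axis=1 assumes a batch-first, channels-second layout)."""
    state[out_key] = np.concatenate([state[k] for k in in_keys], axis=axis)
    return state

state = {"a": np.zeros((2, 3, 8, 8)), "b": np.zeros((2, 1, 8, 8))}
concat_inject(state, ["a", "b"], "ab")
print(state["ab"].shape)  # (2, 4, 8, 8)
```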
James Betker | cffc596141 | Integrate flownet2 into codebase, add teco visual debugs | 2020-10-06 20:35:39 -06:00

James Betker | 840927063a | Work on tecogan losses | 2020-10-05 19:35:28 -06:00

James Betker | 51044929af | Don't compute attention statistics on multiple generator invocations of the same data | 2020-10-05 00:34:29 -06:00

James Betker | e760658fdb | Another fix. | 2020-10-04 21:08:00 -06:00

James Betker | a890e3a9c0 | Fix geometric loss not handling 0 index | 2020-10-04 21:05:01 -06:00

James Betker | 13f97e1e97 | Add recursive loss | 2020-10-04 20:48:15 -06:00

James Betker | 8197fd646f | Don't accumulate losses for metrics when the loss isn't a tensor | 2020-10-03 11:03:55 -06:00

James Betker | 19a4075e1e | Allow checkpointing to be disabled in the options file | 2020-10-03 11:03:28 -06:00
  Also makes options a global variable for use in utils.

James Betker | dd9d7b27ac | Add more sophisticated mechanism for balancing GAN losses | 2020-10-02 22:53:42 -06:00

James Betker | 39865ca3df | TOTAL_loss, dumbo | 2020-10-02 21:06:10 -06:00

James Betker | 4e44fcd655 | Loss accumulator fix | 2020-10-02 20:55:33 -06:00

James Betker | 567b4d50a4 | ExtensibleTrainer - don't compute backward when there is no loss | 2020-10-02 20:54:06 -06:00

James Betker | 146a9125f2 | Modify geometric & translational losses so they can be used with embeddings | 2020-10-02 20:40:13 -06:00

James Betker | 66d4512029 | Fix up translational equivariance loss so it's ready for prime time | 2020-09-30 12:01:00 -06:00

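A translational equivariance loss penalizes a network whose output does not shift when its input does: shifting then applying the network should match applying the network then shifting. A minimal sketch of that check (using a circular shift for simplicity; the repo's actual formulation is not shown here):

```python
import numpy as np

def translational_equivariance_loss(f, x, shift):
    """Sketch of a translational-equivariance penalty: compare f(shift(x))
    against shift(f(x)). `f` stands in for a generator or embedding net."""
    a = f(np.roll(x, shift, axis=-1))      # shift input, then apply f
    b = np.roll(f(x), shift, axis=-1)      # apply f, then shift output
    return float(np.mean((a - b) ** 2))

x = np.arange(16.0).reshape(1, 16)
# A pointwise map commutes with shifts, so the penalty is exactly zero:
print(translational_equivariance_loss(lambda t: 2 * t, x, shift=3))  # 0.0
```

A position-dependent map (one that mixes in absolute coordinates) would score a strictly positive penalty, which is what the loss is meant to discourage.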
James Betker | dc8f3b24de | Don't let duplicate keys be used for injectors and losses | 2020-09-29 16:59:44 -06:00

James Betker | f9b83176f1 | Fix bugs in ExtensibleTrainer | 2020-09-28 22:09:42 -06:00