James Betker
9b4ed82093
Get rid of unused convs in spsr7
2020-10-03 11:36:26 -06:00
James Betker
19a4075e1e
Allow checkpointing to be disabled in the options file
Also makes options a global variable for use in utils.
2020-10-03 11:03:28 -06:00
James Betker
146a9125f2
Modify geometric & translational losses so they can be used with embeddings
2020-10-02 20:40:13 -06:00
James Betker
e30a1443cd
Change sw2 refs
2020-10-02 09:01:18 -06:00
James Betker
e38716925f
Fix spsr8 class init
2020-10-02 09:00:18 -06:00
James Betker
35469f08e2
Spsr 8
2020-10-02 08:58:15 -06:00
James Betker
e3053e4e55
Exchange SpsrNet for SpsrNetSimplified
2020-09-30 17:01:04 -06:00
James Betker
896b4f5be2
Revert "spsr7 adjustments"
This reverts commit 9fee1cec71.
2020-09-29 18:30:41 -06:00
James Betker
9fee1cec71
spsr7 adjustments
2020-09-29 17:19:59 -06:00
James Betker
0b5a033503
spsr7 + cleanup
SPSR7 adds ref onto spsr6 and makes more "common sense" mods.
2020-09-29 16:59:26 -06:00
James Betker
db52bec4ab
spsr6
This is meant to be a variant of SPSR5 that harkens
back to the simpler earlier architectures that do not
have embeddings or ref_ inputs, but do have deep
multiplexers. It does, however, use some of the new
conjoin mechanisms.
2020-09-28 22:09:27 -06:00
James Betker
5a27187c59
More mods to accommodate new dataset
2020-09-25 22:45:57 -06:00
James Betker
ce4613ecb9
Finish up single_image_dataset work
Sweet!
2020-09-25 16:37:54 -06:00
James Betker
ea565b7eaf
More fixes
2020-09-24 17:51:52 -06:00
James Betker
553917a8d1
Fix torchvision import bug
2020-09-24 17:38:34 -06:00
James Betker
58886109d4
Update how spsr arches do attention to conform with sgsr
2020-09-24 16:53:54 -06:00
James Betker
9a17ade550
Some convenience adjustments to ExtensibleTrainer
2020-09-17 21:05:32 -06:00
James Betker
ccf8438001
SPSR5
This is SPSR4, but the multiplexers have access to the output of the transformations
for making their decision.
2020-09-13 20:10:24 -06:00
James Betker
4e44bca611
SPSR4
aka - return of the backbone! I'm tired of massively overparameterized generators
with pile-of-shit multiplexers. Let's give this another try.
2020-09-11 22:55:37 -06:00
James Betker
19896abaea
Clean up old SwitchedSpsr arch
It didn't work anyway, so why not?
2020-09-11 16:09:28 -06:00
James Betker
1086f0476b
Fix ref branch using fixed filters
2020-09-11 08:58:35 -06:00
James Betker
9e5aa166de
Report the standard deviation of ref branches
This patch also ups the contribution
2020-09-10 16:34:41 -06:00
James Betker
668bfbff6d
Back to best arch for spsr3
2020-09-10 14:58:14 -06:00
James Betker
992b0a8d98
spsr3 with conjoin stage as part of the switch
2020-09-10 09:11:37 -06:00
James Betker
e0fc5eb50c
Temporary commit - noise
2020-09-09 17:12:52 -06:00
James Betker
00da69d450
Temporary commit - ref
2020-09-09 17:09:44 -06:00
James Betker
df59d6c99d
More spsr3 mods
- Most branches get their own noise vector now.
- First attention branch has the intended sole purpose of raw image processing
- Remove norms from joiner block
2020-09-09 16:46:38 -06:00
James Betker
747ded2bf7
Fixes to the spsr3
Some lessons learned:
- Biases are fairly important as a relief valve. They don't need to be everywhere, but
most computationally heavy branches should have a bias.
- GroupNorm in SPSR is not a great idea. Since image gradients are represented
in this model, normal means and standard deviations are not applicable (imggrad
has a high representation of 0).
- Don't fuck with the mainline of any generative model. As much as possible, all
additions should be done through residual connections. Never pollute the mainline
with reference data; do that in branches instead, since polluting the mainline
basically leaves the model untrainable.
2020-09-09 15:28:14 -06:00
James Betker
0ffac391c1
SPSR with ref joining
2020-09-09 11:17:07 -06:00
James Betker
c04f244802
More mods
2020-09-08 20:36:27 -06:00
James Betker
e6207d4c50
SPSR3 work
SPSR3 is meant to fix whatever is causing the switching units
inside of the newer SPSR architectures to fail and basically
not use the multiplexers.
2020-09-08 15:14:23 -06:00
James Betker
22c98f1567
Move MultiConvBlock to arch_util
2020-09-08 08:17:27 -06:00
James Betker
f43df7f5f7
Make ExtensibleTrainer compatible with process_video
2020-09-08 08:03:41 -06:00
James Betker
a18ece62ee
Add updated spsr net for test
2020-09-07 17:01:48 -06:00
James Betker
55475d2ac1
Clean up unused archs
2020-09-07 11:38:11 -06:00
James Betker
6657a406ac
Mods needed to support training a corruptor again:
- Allow original SPSRNet to have a specifiable block increment
- Cleanup
- Bug fixes in code that hasn't been touched in a while.
2020-09-04 15:33:39 -06:00
James Betker
0e859a8082
4x spsr ref (not working)
2020-08-29 09:27:18 -06:00
James Betker
8a6a2e6e2e
Rev3 of the full image ref arch
2020-08-26 17:11:01 -06:00
James Betker
a65b07607c
Reference network
2020-08-25 11:56:59 -06:00
James Betker
9d77a4db2e
Allow initial temperature to be specified to SPSR net for inference
2020-08-20 11:57:34 -06:00
James Betker
24bdcc1181
Let SwitchedSpsr transform count be specified
2020-08-18 09:10:25 -06:00
James Betker
868d0aa442
Undo early dim reduction on grad branch for SPSR_arch
2020-08-14 16:23:42 -06:00
James Betker
2d205f52ac
Unite spsr_arch switched gens
Found a pretty good basis model.
2020-08-12 17:04:45 -06:00
James Betker
3d0ece804b
SPSR LR2
2020-08-12 08:45:49 -06:00
James Betker
f0e2816239
Denoise attention maps
2020-08-10 14:59:58 -06:00
James Betker
59aba1daa7
LR switched SPSR arch
This variant doesn't do conv processing at HR, which should save
a ton of memory in inference. Let's see how it works.
2020-08-10 13:03:36 -06:00
James Betker
4e972144ae
More attention fixes for switched_spsr
2020-08-07 21:11:50 -06:00
James Betker
d02509ef97
spsr_switched missing import
2020-08-07 21:05:29 -06:00
James Betker
887806ffa0
Finish up spsr_switched
2020-08-07 21:03:48 -06:00
James Betker
299ee13988
More RAGAN fixes
2020-08-05 11:03:06 -06:00