Commit Graph

375 Commits

Author SHA1 Message Date
James Betker
c58c2b09ca Back to remove all biases (looks like a ConvBnRelu made its way in..) 2020-07-04 22:41:02 -06:00
James Betker
86cda86e94 Re-add biases, also add new init
A/B testing where we lost our GAN competitiveness.
2020-07-04 22:24:42 -06:00
James Betker
b03741f30e Remove all biases from generator
Continuing to investigate loss of GAN competitiveness, this is a big difference
between "old" SRG1 and "new".
2020-07-04 22:19:55 -06:00
James Betker
726e946e79 Turn BN off in SRG1
This won't work well, but just testing whether GAN performance comes back
2020-07-04 14:51:27 -06:00
James Betker
0ee39d419b OrderedDict not needed 2020-07-04 14:09:27 -06:00
James Betker
9048105b72 Break out SRG1 as separate network
Something strange is going on. These networks no longer respond properly
to discriminator gradients. SRG1 did, however, so reverting to the last
known good state to figure out why.
2020-07-04 13:28:50 -06:00
James Betker
188de5e15a Misc changes 2020-07-04 13:22:50 -06:00
James Betker
510b2f887d Remove RDB from srg2
Doesn't seem to work so well.
2020-07-03 22:31:20 -06:00
James Betker
77d3765364 Fix new feature loss calc 2020-07-03 22:20:13 -06:00
James Betker
ed6a15e768 Add feature to dataset which allows it to force images to be a certain size. 2020-07-03 15:19:16 -06:00
James Betker
da4335c25e Add a feature-based validation test 2020-07-03 15:18:57 -06:00
James Betker
703dec4472 Add SpineNet & integrate with SRG
New version of SRG uses SpineNet for a switch backbone.
2020-07-03 12:07:31 -06:00
James Betker
3ed7a2b9ab Move ConvBnRelu/Lelu to arch_util 2020-07-03 12:06:38 -06:00
James Betker
ea9c6765ca Move train imports into init_dist 2020-07-02 15:11:21 -06:00
James Betker
e9ee67ff10 Integrate RDB into SRG
The last RDB for each cluster is switched.
2020-07-01 17:19:55 -06:00
James Betker
6ac6c95177 Fix scaling bug 2020-07-01 16:42:27 -06:00
James Betker
30653181ba Experiment: get rid of post_switch_conv 2020-07-01 16:30:40 -06:00
James Betker
17191de836 Experiment: bring initialize_weights back again
Something really strange is going on here...
2020-07-01 15:58:13 -06:00
James Betker
d1d573de07 Experiment: new init and post-switch-conv 2020-07-01 15:25:54 -06:00
James Betker
480d1299d7 Remove RRDB with switching
This idea never really panned out, removing it.
2020-07-01 12:08:32 -06:00
James Betker
e2398ac83c Experiment: revert initialization changes 2020-07-01 12:08:09 -06:00
James Betker
78276afcaa Experiment: Back to lelu 2020-07-01 11:43:25 -06:00
James Betker
b945021c90 SRG v2 - Move to Relu, rely on Module-based initialization 2020-07-01 11:33:32 -06:00
James Betker
ee6443ad7d Add numeric stability computation script 2020-07-01 11:30:34 -06:00
James Betker
c0bb123504 Misc changes 2020-07-01 11:28:23 -06:00
James Betker
604763be68 NSG r7
Converts the switching trunk to a VGG-style network to make it more comparable
to SRG architectures.
2020-07-01 09:54:29 -06:00
James Betker
87f1e9c56f Invert ResGen2 to operate in LR space 2020-06-30 20:57:40 -06:00
James Betker
e07d8abafb NSG rev 6
- Disable style passthrough
- Process multiplexers starting at base resolution
2020-06-30 20:47:26 -06:00
James Betker
3ce1a1878d NSG improvements (r5)
- Get rid of forwards(); it prevents numeric_stability.py from working properly.
- Do stability auditing across layers.
- Upsample last instead of first, work in much higher dimensionality for transforms.
2020-06-30 16:59:57 -06:00
James Betker
75f148022d Even more NSG improvements (r4) 2020-06-30 13:52:47 -06:00
James Betker
773753073f More NSG improvements (v3)
Move to a fully fixup residual network for the switch (no
batch norms). Fix a bunch of other small bugs. Add in a
temporary latent feed-forward from the bottom of the
switch. Fix several initialization issues.
2020-06-29 20:26:51 -06:00
James Betker
4b82d0815d NSG improvements
- Just use resnet blocks for the multiplexer trunk of the generator
- Every block initializes itself, rather than everything at the end
- Cleans up some messy parts of the architecture, including unnecessary
  kernel sizes and places where BN is not used properly.
2020-06-29 10:09:51 -06:00
James Betker
978036e7b3 Add NestedSwitchGenerator
An evolution of SwitchedResidualGenerator, this variant nests attention
modules upon themselves to significantly extend the representational
capacity of the model.
2020-06-28 21:22:05 -06:00
James Betker
6f2bc36c61 Distill_torchscript mods
Starts down the path of writing a custom trace that works using torch's hook mechanism.
2020-06-27 08:28:09 -06:00
James Betker
db08dedfe2 Add recover_tensorboard_log
Generates a tb_logger from raw console output. Useful for Colab sessions
that crash.
2020-06-27 08:26:57 -06:00
James Betker
c8a670842e Missed networks.py in last commit 2020-06-25 18:36:06 -06:00
James Betker
407224eba1 Re-work SwitchedResgen2
Got rid of the converged multiplexer bases but kept the configurable architecture. The
new multiplexers look a lot like the old ones.

Took some cues from the transformer architecture: translate the image into a higher filter-space
and stay there for the duration of the model's computation. Also perform convs after each
switch to allow the model to anneal issues that arise.
2020-06-25 18:17:05 -06:00
James Betker
42a10b34ce Re-enable batch norm on switch processing blocks
Found out that batch norm is causing the switches to initialize really poorly -
not using a significant number of transforms. Might be a good time to
reconsider using the attention norm, but for now just re-enable it.
2020-06-24 21:15:17 -06:00
James Betker
4001db1ede Add ConfigurableSwitchComputer 2020-06-24 19:49:37 -06:00
James Betker
83c3b8b982 Add parameterized noise injection into resgen 2020-06-23 10:16:02 -06:00
James Betker
0584c3b587 Add negative_transforms switch to resgen 2020-06-23 09:41:12 -06:00
James Betker
dfcbe5f2db Add capability to place additional conv into discriminator
This should allow us to support larger image sizes. May need
to add another one of these.
2020-06-23 09:40:33 -06:00
James Betker
bad33de906 Add simple resize to extract images 2020-06-23 09:39:51 -06:00
James Betker
030648f2bc Remove batchnorms from resgen 2020-06-22 17:23:36 -06:00
James Betker
68bcab03ae Add growth channel to switch_growths for flat networks 2020-06-22 10:40:16 -06:00
James Betker
3b81712c49 Remove BN from transforms 2020-06-19 16:52:56 -06:00
James Betker
61364ec7d0 Fix inverse temperature curve logic and add upsample factor 2020-06-19 09:18:30 -06:00
James Betker
0551139b8d Fix resgen temperature curve below 1
It needs to be inverted to maintain a true linear curve.
2020-06-18 16:08:07 -06:00
James Betker
efc80f041c Save & load amp state 2020-06-18 11:38:48 -06:00
James Betker
2e3b6bad77 Log tensorboard directly into experiments directory 2020-06-18 11:33:02 -06:00