Commit Graph

25 Commits

Author SHA1 Message Date
James Betker
87f1e9c56f Invert ResGen2 to operate in LR space 2020-06-30 20:57:40 -06:00
James Betker
3ce1a1878d NSG improvements (r5)
- Get rid of forwards(); it makes numeric_stability.py not work properly.
- Do stability auditing across layers.
- Upsample last instead of first, work in much higher dimensionality for transforms.
2020-06-30 16:59:57 -06:00
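
A minimal sketch of the kind of cross-layer stability audit this commit describes, assuming PyTorch forward hooks; numeric_stability.py itself is not reproduced here and every name below is illustrative:

    import torch
    import torch.nn as nn

    def audit_stability(model: nn.Module, sample: torch.Tensor):
        """Log per-layer activation statistics for one forward pass so layers
        whose outputs drift or blow up stand out."""
        stats, handles = [], []

        def make_hook(name):
            def hook(module, inputs, output):
                if torch.is_tensor(output):
                    stats.append((name, output.mean().item(), output.std().item()))
            return hook

        for name, module in model.named_modules():
            if not list(module.children()):  # leaf layers only
                handles.append(module.register_forward_hook(make_hook(name)))
        with torch.no_grad():
            model(sample)
        for h in handles:
            h.remove()
        for name, mean, std in stats:
            print(f"{name:40s} mean={mean:+.4f} std={std:.4f}")
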
James Betker
75f148022d Even more NSG improvements (r4) 2020-06-30 13:52:47 -06:00
James Betker
773753073f More NSG improvements (v3)
Move to a fully fixup residual network for the switch (no
batch norms). Fix a bunch of other small bugs. Add in a
temporary latent feed-forward from the bottom of the
switch. Fix several initialization issues.
2020-06-29 20:26:51 -06:00
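
A rough sketch of what a fixup-style residual block with no batch norm can look like, following the standard Fixup recipe (scaled first-conv init, zero-initialized last conv, learned scalar biases); the repo's actual switch blocks may differ:

    import torch
    import torch.nn as nn

    class FixupResBlock(nn.Module):
        """Residual block with no batch norm: the first conv is scaled down by
        num_blocks ** -0.5 and the last conv is zero-initialized so the whole
        stack starts out close to an identity mapping."""
        def __init__(self, channels, num_blocks):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bias1 = nn.Parameter(torch.zeros(1))
            self.bias2 = nn.Parameter(torch.zeros(1))
            self.scale = nn.Parameter(torch.ones(1))
            self.relu = nn.ReLU(inplace=True)
            nn.init.kaiming_normal_(self.conv1.weight)
            with torch.no_grad():
                self.conv1.weight.mul_(num_blocks ** -0.5)
            nn.init.zeros_(self.conv2.weight)

        def forward(self, x):
            out = self.relu(self.conv1(x + self.bias1))
            out = self.conv2(out + self.bias2) * self.scale
            return x + out
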
James Betker
4b82d0815d NSG improvements
- Just use resnet blocks for the multiplexer trunk of the generator
- Every block initializes itself, rather than everything at the end
- Cleans up some messy parts of the architecture, including unnecessary
  kernel sizes and places where BN is not used properly.
2020-06-29 10:09:51 -06:00
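
The "every block initializes itself" pattern, sketched minimally (names are illustrative):

    import torch.nn as nn

    class SelfInitConvBlock(nn.Module):
        """Each block owns its own weight initialization instead of relying on a
        network-wide init pass after the whole model is assembled."""
        def __init__(self, channels):
            super().__init__()
            self.conv = nn.Conv2d(channels, channels, 3, padding=1)
            self.init_weights()

        def init_weights(self):
            nn.init.kaiming_normal_(self.conv.weight, nonlinearity='relu')
            nn.init.zeros_(self.conv.bias)

A multiplexer trunk assembled from blocks like this stays correctly initialized even as blocks are added or reconfigured.
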
James Betker
978036e7b3 Add NestedSwitchGenerator
An evolution of SwitchedResidualGenerator, this variant nests attention
modules upon themselves to extend the representational capacity of the
model significantly.
2020-06-28 21:22:05 -06:00
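
A hypothetical sketch of the nesting idea: each transform selected by a switch is itself another switch, so the number of distinct transform paths grows roughly as branches ** depth. All names and shapes below are assumptions, not the repo's NestedSwitchGenerator:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class NestedSwitch(nn.Module):
        """Each transform selected by a switch is itself another switch until the
        innermost level, where plain conv blocks do the work."""
        def __init__(self, channels, branches, depth):
            super().__init__()
            if depth == 0:
                make = lambda: nn.Sequential(
                    nn.Conv2d(channels, channels, 3, padding=1),
                    nn.ReLU(inplace=True),
                    nn.Conv2d(channels, channels, 3, padding=1))
            else:
                make = lambda: NestedSwitch(channels, branches, depth - 1)
            self.transforms = nn.ModuleList([make() for _ in range(branches)])
            # Multiplexer: one selection logit per branch at every pixel.
            self.multiplexer = nn.Conv2d(channels, branches, 1)

        def forward(self, x):
            weights = F.softmax(self.multiplexer(x), dim=1)             # (B, branches, H, W)
            outs = torch.stack([t(x) for t in self.transforms], dim=1)  # (B, branches, C, H, W)
            return (outs * weights.unsqueeze(2)).sum(dim=1)
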
James Betker
407224eba1 Re-work SwitchedResgen2
Got rid of the converged multiplexer bases but kept the configurable architecture. The
new multiplexers look a lot like the old one.

Took some cues from the transformer architecture: translate the image to a higher filter-space
and stay there for the duration of the model's computation. Also perform convs after each
switch to allow the model to anneal out issues that arise.
2020-06-25 18:17:05 -06:00
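
A sketch of the transformer-inspired layout described above: lift the LR image into a wide filter space once, keep every switch at that width, and follow each switch with a conv. switch_cls below is a stand-in for the repo's switch module (e.g. ConfigurableSwitchComputer); the channel counts are assumptions:

    import torch.nn as nn

    class HighDimTrunk(nn.Module):
        """Embed once into a wide filter space, stay there through every switch,
        and run a conv after each switch so the network can smooth over artifacts
        the selection step introduces."""
        def __init__(self, switch_cls, filters=192, num_switches=4):
            super().__init__()
            self.embed = nn.Conv2d(3, filters, 3, padding=1)
            self.stages = nn.ModuleList()
            for _ in range(num_switches):
                self.stages.append(nn.ModuleList([
                    switch_cls(filters),                              # selection among transforms
                    nn.Sequential(nn.Conv2d(filters, filters, 3, padding=1),
                                  nn.ReLU(inplace=True)),             # post-switch conv
                ]))
            self.to_rgb = nn.Conv2d(filters, 3, 3, padding=1)

        def forward(self, x):
            feat = self.embed(x)
            for switch, post in self.stages:
                feat = post(switch(feat))
            return self.to_rgb(feat)
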
James Betker
42a10b34ce Re-enable batch norm on switch processing blocks
Found out that batch norm is causing the switches to init really poorly -
not using a significant number of transforms. Might be a great time to
re-consider using the attention norm, but for now just re-enable it.
2020-06-24 21:15:17 -06:00
James Betker
4001db1ede Add ConfigurableSwitchComputer 2020-06-24 19:49:37 -06:00
James Betker
83c3b8b982 Add parameterized noise injection into resgen 2020-06-23 10:16:02 -06:00
James Betker
0584c3b587 Add negative_transforms switch to resgen 2020-06-23 09:41:12 -06:00
James Betker
030648f2bc Remove batchnorms from resgen 2020-06-22 17:23:36 -06:00
James Betker
68bcab03ae Add growth channel to switch_growths for flat networks 2020-06-22 10:40:16 -06:00
James Betker
3b81712c49 Remove BN from transforms 2020-06-19 16:52:56 -06:00
James Betker
61364ec7d0 Fix inverse temperature curve logic and add upsample factor 2020-06-19 09:18:30 -06:00
James Betker
0551139b8d Fix resgen temperature curve below 1
It needs to be inverted to maintain a true linear curve
2020-06-18 16:08:07 -06:00
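
One way to read the "inverted below 1" fix, as a standalone sketch (the repo's actual schedule is not reproduced here): the switch softmax temperature is annealed linearly while it is above 1, but once the target drops below 1 the interpolation is carried out on 1/T, since softmax sharpening scales with the inverse temperature and a plain linear ramp in T would no longer behave linearly there.

    def switch_temperature(step, total_steps, t_initial=10.0, t_final=0.5):
        """Hypothetical anneal: linear in T while T >= 1, linear in 1/T below 1."""
        frac = min(max(step / total_steps, 0.0), 1.0)
        target = t_initial + (t_final - t_initial) * frac
        if t_final >= 1.0 or target >= 1.0:
            return target
        # Fraction of the schedule at which the linear ramp crosses T = 1,
        # then run the remainder on the inverted axis (1/T from 1 up to 1/t_final).
        cross = (1.0 - t_initial) / (t_final - t_initial)
        sub_frac = (frac - cross) / (1.0 - cross)
        inv = 1.0 + (1.0 / t_final - 1.0) * sub_frac
        return 1.0 / inv
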
James Betker
778e7b6931 Add a double-step to attention temperature 2020-06-18 11:29:31 -06:00
James Betker
59b0533b06 Fix attimage step size 2020-06-17 18:45:24 -06:00
James Betker
645d0ca767 ResidualGen mods
- Add filters_mid spec which allows an expansion->squeeze for the transformation layers.
- Add scale and bias AFTER the switch
- Remove identity transform (models were converging on this)
- Move attention image generation and temperature setting into a new function which gets called every step with a save path
2020-06-17 17:18:28 -06:00
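
Illustrative PyTorch shapes for two of the items above: a transformation branch with the filters_mid expand->squeeze profile, and a learned scale/bias applied after the switch output. Names are hypothetical, not the repo's API:

    import torch
    import torch.nn as nn

    class MidExpandTransform(nn.Module):
        """Transformation branch with the filters_mid expand->squeeze profile."""
        def __init__(self, filters, filters_mid):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(filters, filters_mid, 3, padding=1),   # expand
                nn.ReLU(inplace=True),
                nn.Conv2d(filters_mid, filters, 3, padding=1))   # squeeze back

        def forward(self, x):
            return self.body(x)

    class PostSwitchAffine(nn.Module):
        """Learned per-channel scale and bias applied AFTER the switch output."""
        def __init__(self, filters):
            super().__init__()
            self.scale = nn.Parameter(torch.ones(1, filters, 1, 1))
            self.bias = nn.Parameter(torch.zeros(1, filters, 1, 1))

        def forward(self, switched):
            return switched * self.scale + self.bias
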
James Betker
6f8406fbdc Fixed ConfigurableSwitchedGenerator bug 2020-06-16 16:53:57 -06:00
James Betker
7d541642aa Get rid of SwitchedResidualGenerator
Just use the configurable one instead.
2020-06-16 16:23:29 -06:00
James Betker
379b96eb55 Output histograms with SwitchedResidualGenerator
This also fixes the weight initialization for the configurable generator.
2020-06-16 15:54:37 -06:00
James Betker
2def96203e Mods to SwitchedResidualGenerator_arch
- Increased processing for high-resolution switches
- Do stride=2 first in HalvingProcessingBlock
2020-06-16 14:19:12 -06:00
James Betker
70c764b9d4 Create a configurable SwitchedResidualGenerator
Also move the attention image generator out of the repo
2020-06-16 13:24:07 -06:00
James Betker
df1046c318 New arch: SwitchedResidualGenerator_arch
The concept here is to use switching to split the generator into two functions:
interpretation and transformation. Transformation is done at the pixel level by
relatively simple conv layers, while interpretation is computed at various levels
by far more complicated conv stacks. The two are merged using the switching
mechanism.

This architecture is far less computationally intensive than RRDB.
2020-06-16 11:23:50 -06:00
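
For reference, a minimal sketch of the interpretation/transformation split this commit introduces: a heavier "interpretation" stack predicts per-pixel weights that choose among cheap pixel-level "transformation" convs, and the weighted sum re-enters the trunk as a residual. Channel counts, stack depth, and the temperature argument below are assumptions, not the repo's exact SwitchedResidualGenerator_arch:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SwitchedResidualBlock(nn.Module):
        """Interpretation predicts per-pixel weights over cheap transformation
        convs; the weighted result is added back to the trunk as a residual."""
        def __init__(self, channels=64, num_transforms=8):
            super().__init__()
            # Transformation: simple convs working directly at pixel resolution.
            self.transforms = nn.ModuleList([
                nn.Sequential(nn.Conv2d(channels, channels, 3, padding=1),
                              nn.ReLU(inplace=True),
                              nn.Conv2d(channels, channels, 3, padding=1))
                for _ in range(num_transforms)])
            # Interpretation: a deeper stack whose only job is to decide, per
            # pixel, how much each transform should contribute.
            self.interpret = nn.Sequential(
                nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(channels, num_transforms, 1))

        def forward(self, x, temperature=1.0):
            logits = F.interpolate(self.interpret(x), size=x.shape[-2:],
                                   mode='bilinear', align_corners=False)
            weights = F.softmax(logits / temperature, dim=1)             # (B, T, H, W)
            outs = torch.stack([t(x) for t in self.transforms], dim=1)   # (B, T, C, H, W)
            return x + (outs * weights.unsqueeze(2)).sum(dim=1)

Because each transformation branch is cheap and only the interpretation path is deep, the per-pixel cost stays well below a dense trunk like RRDB, which is the trade-off the commit message calls out.
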