Commit Graph

266 Commits

Author SHA1 Message Date
James Betker
7659bd6818 Fix temperature equation 2020-07-14 10:17:14 -06:00
James Betker
853468ef82 Allow legacy state_dicts in srg2 2020-07-14 10:03:45 -06:00
James Betker
1b1431133b Add DualOutputSRG
Also removes the old multi-return mechanism that Generators supported.
Also fixes AttentionNorm.
2020-07-14 09:28:24 -06:00
James Betker
a2285ff2ee Scale anorm by transform count 2020-07-13 08:49:09 -06:00
James Betker
dd0bbd9a7c Enable AttentionNorm on SRG2 2020-07-13 08:38:17 -06:00
James Betker
4c0f770f2a Fix inverted temperature curve bug 2020-07-12 11:02:50 -06:00
James Betker
14d23b9d20 Fixes, do fake swaps less often in pixgan discriminator 2020-07-11 21:22:11 -06:00
James Betker
ba6187859a err5 2020-07-10 23:02:56 -06:00
James Betker
902527dfaa err4 2020-07-10 23:00:21 -06:00
James Betker
020b3361fa err3 2020-07-10 22:57:34 -06:00
James Betker
b3a2c21250 err2 2020-07-10 22:52:02 -06:00
James Betker
716433db1f err1 2020-07-10 22:50:56 -06:00
James Betker
ef9f1307eb Sometimes don't use compression artifacts 2020-07-10 22:25:53 -06:00
James Betker
0b7193392f Implement unet disc
The latest discriminator architecture was already pretty much a unet. This
one makes that official and uses shared layers. It also upsamples one additional
time and throws out the lowest upsampling result.

The intent is to delete the old vgg pixdisc, but I'll keep it around for a bit since
I'm still trying out a few models with it.
2020-07-10 16:24:42 -06:00
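The description above suggests something like the following minimal PyTorch sketch of a U-Net-style discriminator whose up path shares the down path's features via skip connections; the module name, channel counts, and layer choices are illustrative assumptions, not this repo's actual implementation:

```python
# Minimal sketch of a U-Net-style discriminator, assuming PyTorch.
# Names and channel counts are illustrative, not the repo's actual API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UnetDisc(nn.Module):
    def __init__(self, nf=64):
        super().__init__()
        # Shared downsampling trunk.
        self.down1 = nn.Conv2d(3, nf, 3, stride=2, padding=1)
        self.down2 = nn.Conv2d(nf, nf * 2, 3, stride=2, padding=1)
        self.down3 = nn.Conv2d(nf * 2, nf * 4, 3, stride=2, padding=1)
        # Upsampling path conjoins with the down features (skip connections).
        self.up2 = nn.Conv2d(nf * 4 + nf * 2, nf * 2, 3, padding=1)
        self.up1 = nn.Conv2d(nf * 2 + nf, nf, 3, padding=1)
        self.out = nn.Conv2d(nf, 1, 1)  # per-pixel real/fake logits

    def forward(self, x):
        d1 = F.leaky_relu(self.down1(x), 0.2)
        d2 = F.leaky_relu(self.down2(d1), 0.2)
        # Lowest-resolution features: used only as input to the up path,
        # no prediction is taken from them (the "thrown out" result).
        d3 = F.leaky_relu(self.down3(d2), 0.2)
        u2 = F.interpolate(d3, scale_factor=2, mode="nearest")
        u2 = F.leaky_relu(self.up2(torch.cat([u2, d2], dim=1)), 0.2)
        u1 = F.interpolate(u2, scale_factor=2, mode="nearest")
        u1 = F.leaky_relu(self.up1(torch.cat([u1, d1], dim=1)), 0.2)
        return self.out(u1)
```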
James Betker
812c684f7d Update pixgan swap algorithm
- Swap multiple blocks in the image instead of just one (see the sketch below). The discriminator was clearly
  learning that each image has only one region that needs to be fixed.
- Relax block size constraints. This was in place to guarantee that the discriminator
  signal was clean. Instead, just downsample the "loss image" with bilinear interpolation.
  The result is noisier, but this is actually probably healthy for the discriminator.
2020-07-10 15:56:14 -06:00
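A rough sketch of the swap scheme this commit describes, assuming PyTorch; the block count, size fractions, and target-map resolution are made-up parameters:

```python
# Illustrative multi-block swap with a bilinearly downsampled "loss image".
import random
import torch
import torch.nn.functional as F

def pixgan_swap(real, fake, n_blocks=4, min_frac=0.125, max_frac=0.25, map_size=16):
    """Swap several random blocks between real and fake images and return the
    swapped batches plus downsampled per-pixel target maps (1=real, 0=fake)."""
    b, _, h, w = real.shape
    target_real = torch.ones((b, 1, h, w), device=real.device)
    target_fake = torch.zeros((b, 1, h, w), device=real.device)
    real, fake = real.clone(), fake.clone()
    for _ in range(n_blocks):
        bh = random.randint(int(h * min_frac), int(h * max_frac))
        bw = random.randint(int(w * min_frac), int(w * max_frac))
        t, l = random.randint(0, h - bh), random.randint(0, w - bw)
        real[:, :, t:t+bh, l:l+bw], fake[:, :, t:t+bh, l:l+bw] = \
            fake[:, :, t:t+bh, l:l+bw].clone(), real[:, :, t:t+bh, l:l+bw].clone()
        target_real[:, :, t:t+bh, l:l+bw] = 0.0
        target_fake[:, :, t:t+bh, l:l+bw] = 1.0
    # Downsample the "loss image" bilinearly: the labels get noisier, but block
    # boundaries no longer need to align with the discriminator's output grid.
    target_real = F.interpolate(target_real, size=(map_size, map_size),
                                mode="bilinear", align_corners=False)
    target_fake = F.interpolate(target_fake, size=(map_size, map_size),
                                mode="bilinear", align_corners=False)
    return real, fake, target_real, target_fake
```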
James Betker
33ca3832e1 Move ExpansionBlock to arch_util
Also makes all processing blocks have a conformant signature.

Alters ExpansionBlock to perform a processing conv on the passthrough
before the conjoin operation - this will break backwards compatibility with SRG2.
2020-07-10 15:53:41 -06:00
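For illustration, the altered ExpansionBlock could look roughly like this; the channel arithmetic and activation choices are assumptions rather than the repo's exact code:

```python
# Rough sketch of the altered ExpansionBlock, assuming PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExpansionBlock(nn.Module):
    def __init__(self, in_ch):
        super().__init__()
        out_ch = in_ch // 2
        self.decimate = nn.Conv2d(in_ch, out_ch, 1)  # reduce channels after upsampling
        # New: process the passthrough before conjoining. This extra conv is
        # what breaks backwards compatibility with SRG2 checkpoints.
        self.process_passthrough = nn.Conv2d(out_ch, out_ch, 3, padding=1)
        self.conjoin = nn.Conv2d(out_ch * 2, out_ch, 3, padding=1)
        self.process = nn.Conv2d(out_ch, out_ch, 3, padding=1)

    def forward(self, x, passthrough):
        x = F.interpolate(x, scale_factor=2, mode="nearest")
        x = F.relu(self.decimate(x))
        p = F.relu(self.process_passthrough(passthrough))
        x = F.relu(self.conjoin(torch.cat([x, p], dim=1)))
        return F.relu(self.process(x))
```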
James Betker
5e8b52f34c Misc changes 2020-07-10 09:45:48 -06:00
James Betker
5f2c722a10 SRG2 revival
Big update to SRG2 architecture to pull in a lot of things that have been learned:
- Use group norm instead of batch norm
- Initialize the weights on the transformations low, as is done in RRDB, rather than using the scalar (see the sketch after this entry). Models live or die by their early stages, and this one's early stage is pretty weak.
- Change the transform multiplexer to use a U-Net-like architecture.
- Just use one set of configuration variables instead of a list - flat networks performed fine in this regard.
2020-07-09 17:34:51 -06:00
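The "initialize low" point above can be sketched as follows; the 0.1 factor is the scale RRDB-style initializers conventionally use, and the function name is hypothetical:

```python
# Minimal sketch of RRDB-style "initialize low": kaiming-init the convs, then
# scale the weights down so each transformation starts close to a no-op.
import torch.nn as nn

def init_weights_low(module, scale=0.1):
    for m in module.modules():
        if isinstance(m, nn.Conv2d):
            nn.init.kaiming_normal_(m.weight, a=0, mode="fan_in")
            m.weight.data *= scale  # start each transform near-identity
            if m.bias is not None:
                m.bias.data.zero_()
```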
James Betker
12da993da8 More fixes... 2020-07-08 22:07:09 -06:00
James Betker
7d6eb28b87 More fixes 2020-07-08 22:00:57 -06:00
James Betker
b2507be13c Fix up pixgan loss and pixdisc 2020-07-08 21:27:48 -06:00
James Betker
26a4a66d1c Bug fixes and new gan mechanism
- Removed a bunch of unnecessary image loggers. These were just consuming space and never being viewed
- Got rid of artificial var_ref support. The new pixdisc is what I wanted to implement back then - it's much better.
- Add pixgan GAN mechanism. This is purpose-built for the pixdisc. It is intended to promote a healthy discriminator
- Megabatchfactor was applied twice on metrics; fixed that.

Adds pix_gan (untested), which swaps a portion of the fake and real images with each other, then expects the discriminator
to properly discriminate the swapped regions.
2020-07-08 17:40:26 -06:00
James Betker
4305be97b4 Update log metrics
They should now be universal regardless of job configuration
2020-07-07 15:33:22 -06:00
James Betker
8a4eb8241d SRG3 work
Operates on top of a pre-trained SpineNet backbone (trained on COCO 2017 with RetinaNet)

This variant is extremely shallow.
2020-07-07 13:46:40 -06:00
James Betker
0acad81035 More SRG2 adjustments.. 2020-07-06 22:40:40 -06:00
James Betker
086b2f0570 More bugs 2020-07-06 22:28:07 -06:00
James Betker
d4d4f85fc0 Bug fixes 2020-07-06 22:25:40 -06:00
James Betker
3c31bea1ac SRG2 architectural changes 2020-07-06 22:22:29 -06:00
James Betker
9a1c3241f5 Switch discriminator to groupnorm 2020-07-06 20:59:59 -06:00
James Betker
60c6352843 Misc 2020-07-06 20:44:07 -06:00
James Betker
6beefa6d0c PixDisc - Add two more levels of losses coming from this gen at higher resolutions 2020-07-06 11:15:52 -06:00
James Betker
2636d3b620 Fix assertion error 2020-07-06 09:23:53 -06:00
James Betker
8f92c0a088 Interpolate attention well before softmax 2020-07-06 09:18:30 -06:00
James Betker
72f90cabf8 More pixdisc fixes 2020-07-05 22:03:16 -06:00
James Betker
909007ee6a Add G_warmup
Let the Generator get to a point where it is at least competing with the discriminator before the GAN loss fires off.

This is backwards from most GAN setups, but this architecture is a bit different from most.
2020-07-05 21:58:35 -06:00
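A minimal sketch of how a G_warmup gate might sit in the training loop; `g_warmup_steps` and the loss names are assumptions, not actual configuration keys from this repo:

```python
# Illustrative G_warmup: train the generator on its direct losses only until
# the warmup step count is reached, then blend in the adversarial loss.
def generator_loss(step, pixel_loss, gan_loss, g_warmup_steps=5000, gan_weight=5e-3):
    if step < g_warmup_steps:
        # Warmup: let the generator catch up before the GAN loss fires off.
        return pixel_loss
    return pixel_loss + gan_weight * gan_loss
```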
James Betker
a47a5dca43 Fix pixdisc bug 2020-07-05 21:57:52 -06:00
James Betker
d0957bd7d4 Alter weight initialization for transformation blocks 2020-07-05 17:32:46 -06:00
James Betker
16d1bf6dd7 Replace ConvBnRelus in SRG2 with Silus 2020-07-05 17:29:20 -06:00
James Betker
10f7e49214 Add ConvBnSilu to replace ConvBnRelu
Relu produced good performance gains over LeakyRelu, but
GAN performance degraded significantly. Try SiLU as an alternative
to see if it's the leakiness we are looking for or the smooth activation
curvature.
2020-07-05 13:39:08 -06:00
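A minimal ConvBnSilu sketch, assuming PyTorch; SiLU is written inline as x * sigmoid(x), and the exact layer options here are assumptions:

```python
# Conv -> BatchNorm -> SiLU block; SiLU is the smooth activation being tried
# in place of ReLU.
import torch
import torch.nn as nn

class ConvBnSilu(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, bias=False):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride,
                              padding=kernel_size // 2, bias=bias)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        x = self.bn(self.conv(x))
        return x * torch.sigmoid(x)  # SiLU / swish: smooth, slightly non-monotone
```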
James Betker
9934e5d082 Make SRG1 identical to new 2020-07-05 08:49:34 -06:00
James Betker
416538f31c SRG1 conjoined except ConvBnRelu 2020-07-05 08:44:17 -06:00
James Betker
c58c2b09ca Back to removing all biases (looks like a ConvBnRelu made its way in...) 2020-07-04 22:41:02 -06:00
James Betker
86cda86e94 Re-add biases, also add new init
A/B testing where we lost our GAN competitiveness.
2020-07-04 22:24:42 -06:00
James Betker
b03741f30e Remove all biases from generator
Continuing to investigate loss of GAN competitiveness, this is a big difference
between "old" SRG1 and "new".
2020-07-04 22:19:55 -06:00
James Betker
726e946e79 Turn BN off in SRG1
This won't work well, but just testing whether GAN performance comes back
2020-07-04 14:51:27 -06:00
James Betker
0ee39d419b OrderedDict not needed 2020-07-04 14:09:27 -06:00
James Betker
9048105b72 Break out SRG1 as separate network
Something strange is going on. These networks no longer respond properly to
discriminator gradients. SRG1 did, however, so reverting back to the last
known good state to figure out why.
2020-07-04 13:28:50 -06:00
James Betker
188de5e15a Misc changes 2020-07-04 13:22:50 -06:00
James Betker
510b2f887d Remove RDB from srg2
Doesn't seem to work so great.
2020-07-03 22:31:20 -06:00
James Betker
77d3765364 Fix new feature loss calc 2020-07-03 22:20:13 -06:00