Commit Graph

1208 Commits

Author SHA1 Message Date
James Betker
cc4571eb8d Randomize blur effect 2020-05-24 12:35:41 -06:00
James Betker
27a548c019 Enable blurring via settings 2020-05-24 11:56:39 -06:00
James Betker
3c2e5a0250 Apply fixes to resgen 2020-05-24 07:43:23 -06:00
James Betker
446322754a Support generators that don't output intermediary values. 2020-05-23 21:09:54 -06:00
James Betker
987cdad0b6 Misc mods 2020-05-23 21:09:38 -06:00
James Betker
9b44f6f5c0 Add AssistedRRDB and remove RRDBNetXL 2020-05-23 21:09:21 -06:00
James Betker
445e7e7053 Extract subimages mod 2020-05-23 21:07:41 -06:00
James Betker
90073fc761 Update LQ_dataset to support inference on split image videos 2020-05-23 21:05:49 -06:00
James Betker
74bb0fad33 Allow dataset classes to add noise internally 2020-05-23 21:04:24 -06:00
James Betker
af1968f9e5 Allow passthrough discriminator to have passthrough disabled from config 2020-05-19 09:41:16 -06:00
James Betker
67139602f5 Test modifications
Allows large images fed into the test pipeline to be bifurcated.

This code is currently hard-coded rather than dynamic and still needs some fixes.
2020-05-19 09:37:58 -06:00
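
The bifurcation described in the commit above could look roughly like the sketch below: split a large low-quality image into fixed strips, run inference per strip, and stitch the results back together. The helper name `bifurcate_infer` and the strip count are illustrative assumptions, not taken from the repo:

```python
import torch

def bifurcate_infer(model, lq, chunks=2):
    """Split a large LQ image into vertical strips, run the model on each,
    and stitch the outputs back together. The strip count is fixed here,
    mirroring the commit's note that the logic is not yet dynamic."""
    strips = torch.chunk(lq, chunks, dim=3)      # split along the width axis
    with torch.no_grad():
        outputs = [model(s) for s in strips]     # upscale each strip separately
    return torch.cat(outputs, dim=3)             # reassemble along the width axis
```

Without any overlap between strips, visible seams can appear at the strip boundaries.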
James Betker
6400607fc5 ONNX export support 2020-05-19 09:36:04 -06:00
James Betker
89c71293ce IDEA update 2020-05-19 09:35:26 -06:00
James Betker
9cde58be80 Make RRDB usable in the current iteration 2020-05-16 18:36:30 -06:00
James Betker
b95c4087d1 Allow an alt_path for saving models and states 2020-05-16 09:10:51 -06:00
James Betker
f911ef0d3e Add corruptor_usage_probability
Governs how often a corruptor is applied versus feeding uncorrupted images through.
2020-05-16 09:05:43 -06:00
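
As a rough sketch of how such an option might gate corruption inside a dataset, assuming a hypothetical helper (the actual wiring in the repo may differ):

```python
import random

def maybe_corrupt(hq_image, corruptor, corruptor_usage_probability=0.5):
    """With probability corruptor_usage_probability, produce the LQ input by
    running the corruptor; otherwise pass the uncorrupted image through."""
    if corruptor is not None and random.random() < corruptor_usage_probability:
        return corruptor(hq_image)
    return hq_image
```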
James Betker
635c53475f Restore swapout models just before a checkpoint 2020-05-16 07:45:19 -06:00
James Betker
877be4d88c README update 2020-05-15 14:03:44 -06:00
James Betker
a33ec3e22b Fix skips & image samples
- Makes skip connections between the generator and discriminator more
  extensible by adding configuration options for them and supporting
  1 and 0 skips.
- Places the temp/ directory holding sample images from the training process
  in the training directory instead of the codes/ directory.
2020-05-15 13:50:49 -06:00
James Betker
cdf641e3e2 Remove working options from repo 2020-05-15 07:50:55 -06:00
James Betker
bd4d478572 config changes 2020-05-15 07:41:18 -06:00
James Betker
79593803f2 biggan arch, initial work (not implemented) 2020-05-15 07:40:45 -06:00
James Betker
61ed51d9e4 Improve corruptor logic: switch corruptors randomly 2020-05-14 23:14:32 -06:00
James Betker
d72e154442 Allow more LQ than GT images in corrupt mode 2020-05-14 20:46:20 -06:00
James Betker
8a514b9645 Misc changes 2020-05-14 20:45:38 -06:00
James Betker
a946483f1c Fix discriminator noise floor 2020-05-14 20:45:06 -06:00
James Betker
c8ab89d243 Add model swapout
Model swapout is a feature where, at specified intervals,
a random D and G model is swapped in to replace the one
being trained. After a short period of time, this model
is swapped back out. This is intended to increase training
diversity.
2020-05-13 16:53:38 -06:00
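
A minimal sketch of the swapout idea described above, assuming snapshots of earlier checkpoints are kept in memory as state dicts; the class and parameter names (ModelSwapper, swap_interval, swap_duration) are illustrative, not the repo's:

```python
import copy
import random

class ModelSwapper:
    """At every swap_interval steps, stash the live model's weights and load a
    randomly chosen earlier snapshot; swap_duration steps later, restore the
    stashed weights."""
    def __init__(self, snapshots, swap_interval=1000, swap_duration=100):
        self.snapshots = snapshots            # list of state_dicts from earlier checkpoints
        self.swap_interval = swap_interval
        self.swap_duration = swap_duration
        self.stashed = None
        self.swap_step = None

    def step(self, model, it):
        if self.stashed is None and self.snapshots and it > 0 and it % self.swap_interval == 0:
            self.stashed = copy.deepcopy(model.state_dict())
            model.load_state_dict(random.choice(self.snapshots))
            self.swap_step = it
        elif self.stashed is not None and it - self.swap_step >= self.swap_duration:
            model.load_state_dict(self.stashed)   # swap the original model back in
            self.stashed = None
```

The later commit 635c53475f ("Restore swapout models just before a checkpoint") suggests the original weights are also put back before saving, so swapped-in snapshots never end up in a checkpoint.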
James Betker
c336d32fd3 Config updates 2020-05-13 15:27:49 -06:00
James Betker
5bcf187fb6 Disable LMDB support
It doesn't play nicely with multiple dataroots, and I don't
see any need to continue supporting it since I'm not
testing it.
2020-05-13 15:27:33 -06:00
James Betker
e36f22e14a Allow "corruptor" network to be specified
This network is just a fixed (pre-trained) generator
that performs a corruption transformation that the
generator-in-training is expected to undo alongside
SR.
2020-05-13 15:26:55 -06:00
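
The corrupt-then-restore pairing described above might look roughly like this; a sketch only, with hypothetical names, assuming both networks are PyTorch modules:

```python
import torch

def make_training_pair(hq_batch, corruptor, netG):
    """corruptor: a fixed, pre-trained generator; netG: the generator being
    trained to undo the corruption while also super-resolving."""
    with torch.no_grad():            # the corruptor's weights stay frozen
        corrupted = corruptor(hq_batch)
    restored = netG(corrupted)       # expected to undo the corruption alongside SR
    return restored, hq_batch        # loss is computed against the clean target
```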
James Betker
f389025b53 Change ResGen noise feature
It now injects noise directly into the input filters, rather than
using a pure noise filter. The pure noise filter was producing really
poor results (and I'm honestly not quite sure why).
2020-05-13 09:22:06 -06:00
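
To make the distinction concrete, a hedged sketch of the two approaches (the function names are illustrative, not the repo's):

```python
import torch

def inject_noise(features, strength=0.1):
    """New approach per the commit: add noise directly onto the input feature
    maps rather than carrying it in a separate channel."""
    return features + torch.randn_like(features) * strength

def concat_noise_channel(features):
    """Old approach: append one extra filter that is pure noise."""
    b, _, h, w = features.shape
    noise = torch.randn(b, 1, h, w, device=features.device)
    return torch.cat([features, noise], dim=1)
```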
James Betker
343af70a8d Add code for compiling the model to TorchScript
I want to be able to export it to other formats too in the future.
2020-05-13 09:21:13 -06:00
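
One common way to compile a PyTorch model to TorchScript is tracing; the commit doesn't say whether tracing or scripting is used, so the sketch below is an assumption, with a placeholder module standing in for the trained generator:

```python
import torch

# Placeholder module standing in for the trained generator (illustration only).
model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1).eval()

example_lq = torch.randn(1, 3, 64, 64)            # representative LQ input
scripted = torch.jit.trace(model, example_lq)     # compile to TorchScript by tracing
scripted.save("generator_torchscript.pt")         # reload later with torch.jit.load()
```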
James Betker
585b05e66b Cap test workers at 10 2020-05-13 09:20:45 -06:00
James Betker
037a5a3cdb Config updates 2020-05-13 09:20:28 -06:00
James Betker
fc3ec8e3a2 Add a noise floor to the discriminator noise factor 2020-05-13 09:19:22 -06:00
James Betker
5d1b4caabf Allow noise to be injected at the generator inputs for resgen 2020-05-12 16:26:29 -06:00
James Betker
06d18343f7 Allow noise to be added to discriminator inputs 2020-05-12 16:25:38 -06:00
James Betker
9210a62f58 Add rotating log buffer to trainer
Should stabilize stats output.
2020-05-12 10:09:45 -06:00
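
A rotating (fixed-length) buffer that reports running means is one straightforward way to stabilize logged stats; a sketch under that assumption, with illustrative names:

```python
from collections import deque

class RotatingLogBuffer:
    """Keep only the last maxlen values of each logged stat and report their
    mean, so printed numbers vary less from step to step."""
    def __init__(self, maxlen=50):
        self.maxlen = maxlen
        self.buffers = {}

    def add(self, name, value):
        self.buffers.setdefault(name, deque(maxlen=self.maxlen)).append(value)

    def mean(self, name):
        buf = self.buffers[name]
        return sum(buf) / len(buf)
```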
James Betker
f217216c81 Implement ResGenv2
Implements a ResGenv2 architecture which slightly increases the complexity
of the final output layer but causes it to be shared across all skip outputs.
2020-05-12 10:09:15 -06:00
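
A hedged sketch of what sharing one slightly heavier output head across all skip outputs can look like; the module below is illustrative, not the repo's ResGenv2:

```python
import torch.nn as nn

class SharedOutputHead(nn.Module):
    """One output head applied to every skip-level feature map, instead of a
    separate final layer per output."""
    def __init__(self, nf=64, out_nc=3):
        super().__init__()
        self.head = nn.Sequential(
            nn.Conv2d(nf, nf, 3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(nf, out_nc, 3, padding=1),
        )

    def forward(self, skip_features):
        # The same weights produce the image-space output at every skip level.
        return [self.head(f) for f in skip_features]
```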
James Betker
1596a98493 Get rid of skip layers from vgg disc 2020-05-12 10:08:12 -06:00
James Betker
c540244789 Config file update 2020-05-12 10:07:58 -06:00
James Betker
62a97c53d1 Handle tuple-returning generators in test 2020-05-11 11:15:26 -06:00
James Betker
f994466289 Initialize test dataloader with a worker count proportional to the batch size. 2020-05-10 10:49:37 -06:00
James Betker
ef48e819aa Allow resgen to have a conditional number of upsamples applied to it 2020-05-10 10:48:37 -06:00
James Betker
8969a3ce70 Add capability to start at arbitrary frames 2020-05-10 10:48:05 -06:00
James Betker
03351182be Use amp in SR_model for inference 2020-05-07 21:45:33 -06:00
James Betker
dbca0d328c Fix multi-lq bug 2020-05-06 23:16:35 -06:00
James Betker
aa0305def9 Resnet discriminator overhaul
It's been a tough day figuring out WTH is going on with my discriminators.
It appears the raw FixUp discriminator can get into a "defective" state where
it stops trying to learn and just predicts D_fake and D_real as close to 0 as
possible. In this state it provides no feedback to the generator and never
recovers. Adding batch norm back in seems to fix this, so it must be some sort
of parameterization error. Should look into fixing this in the future.
2020-05-06 17:27:30 -06:00
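
For context, a batch-normalized residual discriminator block of the general kind the fix implies might look like this; a generic sketch, not the repo's actual architecture:

```python
import torch.nn as nn

class DiscBlock(nn.Module):
    """Residual discriminator block with BatchNorm reinstated, in place of a
    normalization-free (FixUp-style) block."""
    def __init__(self, nf=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(nf, nf, 3, padding=1),
            nn.BatchNorm2d(nf),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(nf, nf, 3, padding=1),
            nn.BatchNorm2d(nf),
        )
        self.act = nn.LeakyReLU(0.2, inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))
```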
James Betker
602f86bfa4 Random config changes 2020-05-06 17:25:48 -06:00
James Betker
574e7e882b Fix up OOM issues when running a disjoint D update ratio and megabatches 2020-05-06 17:25:25 -06:00