Commit Graph

611 Commits

Author SHA1 Message Date
James Betker
06d18343f7 Allow noise to be added to discriminator inputs 2020-05-12 16:25:38 -06:00
James Betker
9210a62f58 Add rotating log buffer to trainer
Should stabilize stats output.
2020-05-12 10:09:45 -06:00
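The rotating log buffer described in this commit can be approximated with a bounded deque per stat, so reported averages reflect only recent iterations. This is a hypothetical sketch, not the trainer's actual code; the class and stat names are illustrative.

```python
from collections import deque


class RotatingStats:
    """Keep only the last N values per stat so logged averages
    track recent training behavior instead of the whole run."""

    def __init__(self, maxlen=50):
        self.maxlen = maxlen
        self.buffers = {}

    def add(self, name, value):
        # deque(maxlen=...) silently drops the oldest value when full.
        self.buffers.setdefault(name, deque(maxlen=self.maxlen)).append(value)

    def mean(self, name):
        buf = self.buffers[name]
        return sum(buf) / len(buf)
```

With `maxlen=50`, a single early loss spike stops influencing the reported mean after 50 further steps, which is the stabilization effect the commit message refers to.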
James Betker
f217216c81 Implement ResGenv2
Implements a ResGenv2 architecture which slightly increases the complexity
of the final output layer but causes it to be shared across all skip outputs.
2020-05-12 10:09:15 -06:00
James Betker
1596a98493 Get rid of skip layers from vgg disc 2020-05-12 10:08:12 -06:00
James Betker
c540244789 Config file update 2020-05-12 10:07:58 -06:00
James Betker
62a97c53d1 Handle tuple-returning generators in test 2020-05-11 11:15:26 -06:00
James Betker
f994466289 Initialize test dataloader with a worker count proportional to the batch size. 2020-05-10 10:49:37 -06:00
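The worker-count heuristic this commit describes could look something like the following. This is a guess at the shape of the logic, not the repository's code; the function name and constants are assumptions.

```python
def workers_for_batch(batch_size, max_workers=8):
    """Pick a DataLoader worker count proportional to batch size,
    capped to avoid oversubscribing the machine."""
    # Roughly one worker per two samples in the batch, at least one.
    return max(1, min(max_workers, batch_size // 2))
```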
James Betker
ef48e819aa Allow resgen to have a conditional number of upsamples applied to it 2020-05-10 10:48:37 -06:00
James Betker
8969a3ce70 Add capability to start at arbitrary frames 2020-05-10 10:48:05 -06:00
James Betker
03351182be Use amp in SR_model for inference 2020-05-07 21:45:33 -06:00
James Betker
dbca0d328c Fix multi-lq bug 2020-05-06 23:16:35 -06:00
James Betker
aa0305def9 Resnet discriminator overhaul
It's been a tough day figuring out WTH is going on with my discriminators.
It appears the raw FixUp discriminator can get into a "defective" state where
it stops trying to learn and just predicts D_fake and D_real as close to "0" as
possible. In this state it provides no feedback to the generator and never
recovers. Adding batch norm back in seems to fix this, so it must be some sort
of parameterization error. Should look into fixing this in the future.
2020-05-06 17:27:30 -06:00
James Betker
602f86bfa4 Random config changes 2020-05-06 17:25:48 -06:00
James Betker
574e7e882b Fix up OOM issues when running a disjoint D update ratio and megabatches 2020-05-06 17:25:25 -06:00
James Betker
eee9d6d9ca Support skip connections in vgg arch discriminator. 2020-05-06 17:24:34 -06:00
James Betker
5c1832e124 Add support for multiple LQ paths
I want to be able to specify many different transformations onto
the target data; the model should handle them all. Do this by
allowing multiple LQ paths to be selected and the dataset class
selects one at random.
2020-05-06 17:24:17 -06:00
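The multi-LQ-path idea above — several degraded versions of the same targets, with one chosen at random per sample — can be sketched as follows. This is an illustrative reconstruction, not the dataset class from the repo; all names here are hypothetical.

```python
import random


class MultiLQDataset:
    """Each LQ root holds a different degradation of the same target set;
    a random root is picked per lookup so the model sees all of them."""

    def __init__(self, lq_roots, seed=None):
        self.lq_roots = list(lq_roots)
        self.rng = random.Random(seed)

    def lq_path_for(self, filename):
        # Select one degradation source at random for this sample.
        root = self.rng.choice(self.lq_roots)
        return f"{root}/{filename}"
```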
James Betker
3cd85f8073 Implement ResGen arch
This is a simpler resnet-based generator which performs mutations
on an input interspersed with interpolate-upsampling. It is a two
part generator:
1) A component that "fixes" LQ images with a long string of resnet
    blocks. This component is intended to remove compression artifacts
    and other noise from a LQ image.
2) A component that can double the image size. The idea is that this
    component be trained so that it can work at most reasonable
    resolutions, such that it can be repeatedly applied to itself to
    perform multiple upsamples.

The motivation here is to simplify what is being done inside of RRDB.
I don't believe the complexity inside of that network is justified.
2020-05-05 11:59:46 -06:00
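Structurally, the two-part generator described above composes a fixed-resolution "fix" stage with a doubling stage applied repeatedly until the target scale is reached. The sketch below shows only that composition, with the stages passed in as callables; it is a schematic of the idea, not the ResGen implementation.

```python
def resgen(image, fix_stage, double_stage, scale):
    """Two-part generator: clean the LQ input at 1x, then reuse one
    2x-upsampling component until the requested scale is reached."""
    out = fix_stage(image)       # part 1: remove artifacts at native size
    ups = 0
    while 2 ** ups < scale:      # part 2: same doubling component, reapplied
        out = double_stage(out)
        ups += 1
    return out
```

Because the same `double_stage` is reused at every step, it has to work at any reasonable input resolution, which is exactly the constraint the commit message places on the second component.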
James Betker
9f4581aacb Fix megabatch scaling, log low and med-res gen images 2020-05-05 08:34:57 -06:00
James Betker
3b4e54c4c5 Add support for passthrough disc/gen
Add RRDBNetXL, which performs processing at multiple image sizes.
Add DiscResnet_passthrough, which allows passthrough of image at different sizes for discrimination.
Adjust the rest of the repo to allow generators that return more than just a single image.
2020-05-04 14:01:43 -06:00
James Betker
44b89330c2 Support inference across batches, support inference on cpu, checkpoint
This is a checkpoint of a set of long tests with reduced-complexity networks. Some takeaways:
1) A full GAN using the resnet discriminator does appear to converge, but the quality is capped.
2) Likewise, a combination GAN/feature loss does not converge. The feature loss is optimized but
    the model appears unable to fight the discriminator, so the G-loss steadily increases.

Going forwards, I want to try some bigger models. In particular, I want to change the generator
to increase complexity and capacity. I also want to add skip connections between the
disc and generator.
2020-05-04 08:48:25 -06:00
James Betker
9c7debe75c Add colab option 2020-05-02 17:47:25 -06:00
James Betker
832f3587c5 Turn off EVDR (so we don't need the weird convs) 2020-05-02 17:47:14 -06:00
James Betker
8341bf7646 Enable megabatching 2020-05-02 17:46:59 -06:00
James Betker
61d3040cf5 Add doCrop into LQGT 2020-05-02 17:46:30 -06:00
James Betker
9e1acfe396 Fixup upconv for the next attempt! 2020-05-01 19:56:14 -06:00
James Betker
7eaabce48d Full resnet corrupt, no BN
And it works! Thanks, fixup.
2020-04-30 19:17:30 -06:00
James Betker
03258445bc tblogger.. 2020-04-30 12:35:51 -06:00
James Betker
b6e036147a Add more batch norms to FlatProcessorNet_arch 2020-04-30 11:47:21 -06:00
James Betker
66e91a3d9e Revert "Enable skip-through connections from disc to gen"
This reverts commit b7857f35c3.
2020-04-30 11:45:07 -06:00
James Betker
f027e888ed Clear out tensorboard on job restart. 2020-04-30 11:44:53 -06:00
James Betker
b7857f35c3 Enable skip-through connections from disc to gen 2020-04-30 11:30:11 -06:00
James Betker
bf634fc9fa Make resnet w/ BN discriminator use leaky relus 2020-04-30 11:28:59 -06:00
James Betker
3781ea725c Add Resnet Discriminator with BN 2020-04-29 20:51:57 -06:00
James Betker
a5188bb7ca Remove fixup code from arch_util
Going into its own arch.
2020-04-29 15:17:43 -06:00
James Betker
5b8a77f02c Discriminator part 1
New discriminator. Includes spectral norming.
2020-04-28 23:00:29 -06:00
James Betker
2c145c39b6 Misc changes 2020-04-28 11:50:16 -06:00
James Betker
46f550e42b Change downsample_dataset to do no image modification
I'm preprocessing the images myself now. There's no need to have
the dataset do this processing as well.
2020-04-28 11:50:04 -06:00
James Betker
8ab595e427 Add FlatProcessorNet
After doing some thinking and reading on the subject, it occurred to me that
I was treating the generator like a discriminator by focusing the network
complexity at the feature levels. It makes far more sense to process each conv
level equally for the generator, hence the FlatProcessorNet in this commit. This
network borrows some of the residual pass-through logic from RRDB which makes
the gradient path exceptionally short for pretty much all model parameters and
can be trained in O1 optimization mode without overflows again.
2020-04-28 11:49:21 -06:00
James Betker
b8f67418d4 Retool HighToLowResNet
The receptive field of the original was *really* low. This new one has a
receptive field of 36x36px patches. It also has some gradient issues
that need to be worked out.
2020-04-26 01:13:42 -06:00
James Betker
02ff4a57fd Enable HighToLowResNet to do a 1:1 transform 2020-04-25 21:36:32 -06:00
James Betker
35bd1ecae4 Config changes for discriminator advantage run
Still going from high->low, discriminator discerns on low. Next up disc works on high.
2020-04-25 11:24:28 -06:00
James Betker
d95808f4ef Implement downsample GAN
This bad boy is for a workflow where you train a model on disjoint image sets to
downsample a "good" set of images like a "bad" set of images looks. You then
use that downsampler to generate a training set of paired images for supersampling.
2020-04-24 00:00:46 -06:00
James Betker
ea54c7618a Print error when image read fails 2020-04-23 23:59:32 -06:00
James Betker
e98d92fc77 Allow test to operate on batches 2020-04-23 23:59:09 -06:00
James Betker
8ead9ae183 Lots more config files 2020-04-23 23:58:27 -06:00
James Betker
ea5f432f5a Log total gen loss 2020-04-22 14:02:10 -06:00
James Betker
79aff886b5 Modifications that allow developer to explicitly specify a different image set for PIX and feature losses 2020-04-22 10:11:14 -06:00
James Betker
12d92dc443 Add GTLQ dataset 2020-04-22 00:40:38 -06:00
James Betker
4d269fdac6 Support independent PIX dataroot 2020-04-22 00:40:13 -06:00
James Betker
05aafef938 Support variant input sizes and scales 2020-04-22 00:39:55 -06:00
James Betker
ebda70fcba Fix AMP 2020-04-22 00:39:31 -06:00
James Betker
f4b33b0531 Some random fixes/adjustments 2020-04-22 00:38:53 -06:00
James Betker
2538ca9f33 Add my own configs 2020-04-22 00:37:54 -06:00
James Betker
af5dfaa90d Change GT_size to target_size 2020-04-22 00:37:41 -06:00
James Betker
cc834bd5a3 Support >128px image squares 2020-04-21 16:32:59 -06:00
James Betker
4f6d3f0dfb Enable AMP optimizations & write sample train images to folder. 2020-04-21 16:28:06 -06:00
PRAGMA
1fb12871fd Create requirements.txt 2019-11-24 07:48:52 +00:00
XintaoWang
a25ee9464d test w/o GT 2019-09-01 22:20:10 +08:00
XintaoWang
0098663b6b SRGAN model supports dist training 2019-09-01 22:14:29 +08:00
XintaoWang
866a858e59 add deform_conv_cuda_kernel.cu 2019-08-27 17:49:12 +08:00
XintaoWang
037933ba66 mmsr 2019-08-23 21:42:47 +08:00